PHP does not have JavaScript-style asynchronous callbacks, so achieving concurrency takes some extra work. However, PHP has solid support for the libcurl library, and for connection and communication between servers a batch of concurrent requests can be issued easily through the curl_multi_init family of functions.
Usually, a simple CURL request is assembled and sent like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$data = curl_exec($ch);
curl_close($ch);
Once curl_exec returns, $data holds the response body (because CURLOPT_RETURNTRANSFER is set); on failure it is false instead.
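Since curl_exec returns false on failure, it is worth checking curl_errno before trusting the result. A minimal sketch of the same pattern with error handling added (a file:// URL to a temporary file stands in for a real server so the snippet runs without network access):

```php
// Hypothetical target: a local file fetched via curl's file:// protocol.
$path = tempnam(sys_get_temp_dir(), 'curl_demo');
file_put_contents($path, 'hello');
$url = 'file://' . $path;

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$data = curl_exec($ch);

// curl_exec returns false on failure; curl_error explains why.
if ($data === false) {
    $error = curl_error($ch);
}
curl_close($ch);
unlink($path);
```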
PHP's libcurl binding also exposes curl_multi_exec, which processes a batch of CURL handles in parallel. So how do we run a batch of concurrent requests? We create a multi session with curl_multi_init, add any number of individual CURL handles to it, and then drive the whole stack with curl_multi_exec until every transfer has finished. Encapsulated into a function, it looks roughly like this:
function curl_multi($query_arr) {
    $mh = curl_multi_init();
    $count = count($query_arr);
    $ch_arr = array();
    $results = array();
    // Create one easy handle per URL and add it to the multi session
    for ($i = 0; $i < $count; $i++) {
        $ch_arr[$i] = curl_init($query_arr[$i]);
        curl_setopt($ch_arr[$i], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch_arr[$i]);
    }
    // Drive all transfers; curl_multi_select blocks until there is
    // socket activity, which avoids a CPU-burning busy-wait loop
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh);
        }
    } while ($running > 0);
    // Collect each response, then detach and free the handles
    for ($i = 0; $i < $count; $i++) {
        $results[$i] = curl_multi_getcontent($ch_arr[$i]);
        curl_multi_remove_handle($mh, $ch_arr[$i]);
        curl_close($ch_arr[$i]);
    }
    curl_multi_close($mh);
    return $results;
}
The returned value is an array of response bodies, one per CURL handle, in the same order as the input URLs. Because the requests run concurrently, the total wall-clock time is roughly that of the slowest single request rather than the sum of all of them.
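A quick usage sketch of the function above (the function body is repeated so the snippet is self-contained, and file:// URLs to temporary files stand in for real endpoints so it runs without network access):

```php
// Same curl_multi function as defined above, repeated for a runnable sketch.
function curl_multi($query_arr) {
    $mh = curl_multi_init();
    $count = count($query_arr);
    $ch_arr = array();
    $results = array();
    for ($i = 0; $i < $count; $i++) {
        $ch_arr[$i] = curl_init($query_arr[$i]);
        curl_setopt($ch_arr[$i], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch_arr[$i]);
    }
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh);
        }
    } while ($running > 0);
    for ($i = 0; $i < $count; $i++) {
        $results[$i] = curl_multi_getcontent($ch_arr[$i]);
        curl_multi_remove_handle($mh, $ch_arr[$i]);
        curl_close($ch_arr[$i]);
    }
    curl_multi_close($mh);
    return $results;
}

// Two local files stand in for remote servers.
$a = tempnam(sys_get_temp_dir(), 'multi_a');
$b = tempnam(sys_get_temp_dir(), 'multi_b');
file_put_contents($a, 'hello');
file_put_contents($b, 'world');

// Responses come back in the same order as the input URLs.
$results = curl_multi(array('file://' . $a, 'file://' . $b));

unlink($a);
unlink($b);
```

With real HTTP URLs, both transfers overlap instead of running one after the other, which is where the time savings come from.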