I'm trying to request multiple APIs as quickly as possible, so I tried curl_multi. But I get slower results than a foreach loop with file_get_contents. What did I do wrong?
Using file_get_contents:

```php
<?php
$start = microtime(true);

$urls = array(
    "https://www.example1.com/",
    "https://www.example2.com/",
    "https://www.example3.com/",
);

// Fetch each URL one after another (sequential requests).
foreach ($urls as $url) {
    $result = file_get_contents($url);
}

echo microtime(true) - $start;
?>
```
Using curl_multi:

```php
<?php
$start = microtime(true);

$urls = array(
    "https://www.example1.com/",
    "https://www.example2.com/",
    "https://www.example3.com/",
);
$urls_count = count($urls);

$curl_arr = array();
$master = curl_multi_init();

// Create one easy handle per URL and attach it to the multi handle.
for ($i = 0; $i < $urls_count; $i++) {
    $curl_arr[$i] = curl_init($urls[$i]);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $curl_arr[$i]);
}

// Drive all transfers until none are still running.
do {
    curl_multi_exec($master, $running);
} while ($running > 0);

// Collect the response bodies.
$results = array();
for ($i = 0; $i < $urls_count; $i++) {
    $results[$i] = curl_multi_getcontent($curl_arr[$i]);
}

echo microtime(true) - $start;
?>
```
The problem is that curl_multi has a lot of overhead. I'm assuming it has to create a shell process for each request, then execute curl in that process, and finally return the content to the calling script. file_get_contents, on the other hand, is a native PHP feature, optimized as part of the language itself. This is a great lesson in when to use libraries versus the language's built-in features. Additionally, libraries can optionally be multi-threaded and take advantage of multi-core processors, which may speed up requests; that is something to look up and test yourself.
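For what it's worth, one concrete source of overhead in the posted loop is that it calls curl_multi_exec() in a tight spin, burning CPU while the transfers are in flight. Below is a minimal sketch of the same benchmark that waits on curl_multi_select() between calls instead. The URLs are the same placeholders as in the question, and whether this actually closes the timing gap in your environment is something to measure rather than assume:

```php
<?php
$start = microtime(true);

$urls = array(
    "https://www.example1.com/",
    "https://www.example2.com/",
    "https://www.example3.com/",
);

$master = curl_multi_init();
$handles = array();

// One easy handle per URL, attached to the multi handle.
foreach ($urls as $i => $url) {
    $handles[$i] = curl_init($url);
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $handles[$i]);
}

do {
    $status = curl_multi_exec($master, $running);
    if ($running) {
        // Block until there is activity on any handle (or the timeout
        // elapses) instead of spinning; -1 means select() is unavailable,
        // in which case a short sleep avoids a busy loop.
        if (curl_multi_select($master, 1.0) === -1) {
            usleep(1000);
        }
    }
} while ($running > 0 && $status === CURLM_OK);

// Collect responses and clean up the handles.
$results = array();
foreach ($handles as $i => $handle) {
    $results[$i] = curl_multi_getcontent($handle);
    curl_multi_remove_handle($master, $handle);
    curl_close($handle);
}
curl_multi_close($master);

echo microtime(true) - $start;
?>
```

Since the requests run concurrently, the elapsed time should in principle approach that of the slowest single request rather than the sum of all three, which is the whole point of curl_multi over a sequential file_get_contents loop.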