In real-world PHP development, we often use file_get_contents() to fetch the content of a remote page. However, if the remote server responds very slowly, file_get_contents() can block indefinitely without ever timing out. When that happens, you may notice the load on the web server's Linux machine suddenly spike; running the top command shows many php-cgi processes with CPU usage close to 100%.
We know that php.ini has a max_execution_time parameter that sets the maximum execution time of a PHP script. However, under php-cgi (php-fpm) this parameter does not take effect. What actually controls the maximum execution time of a PHP script is the request_terminate_timeout parameter in the php-fpm.conf configuration file:
```ini
; The timeout for serving a single request, after which the worker process
; will be terminated. Should be used when the 'max_execution_time' ini option
; does not stop script execution for some reason. '0s' means 'off'.
request_terminate_timeout = 0s
```
The default value is 0 seconds; in other words, a PHP script is allowed to run forever. As a result, once all php-cgi processes are stuck inside file_get_contents(), the web server can no longer handle any new PHP requests. It is therefore necessary to change this parameter and set a maximum execution time for PHP scripts, for example a finite value such as 30 seconds. However, this only treats the symptom, not the root cause: stuck workers are eventually killed, but slow remote requests continue to tie them up in the meantime.
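As a concrete illustration of that stop-gap (the 30-second value here is only an example, not a recommendation from the original text), the change in php-fpm.conf would look like:

```ini
; Kill any worker that spends more than 30 seconds serving a single request.
; This frees stuck php-cgi processes but does not fix the slow remote call.
request_terminate_timeout = 30s
```

After editing the file, php-fpm must be reloaded or restarted for the new value to take effect.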
For a complete fix, PHP programmers must change the habit of calling file_get_contents("http://example.com/") directly. Instead, make a small modification: add a timeout, and implement HTTP GET requests as shown below. If this feels cumbersome, you can wrap the code in a helper function of your own.
```php
<?php
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1  // set a timeout, in seconds
    )
));
file_get_contents("http://example.com/", false, $ctx);
?>
```
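As the text suggests, the snippet above can be wrapped in a reusable helper. This is a sketch; the function name http_get_with_timeout and its default timeout are my own choices, not from the original article:

```php
<?php
// Fetch a URL via file_get_contents() with an explicit timeout (in seconds).
// Returns the response body as a string, or false on failure or timeout.
function http_get_with_timeout($url, $timeout = 1)
{
    $ctx = stream_context_create(array(
        'http' => array(
            'timeout' => $timeout  // give up after $timeout seconds
        )
    ));
    // '@' suppresses the PHP warning on failure; callers check for false.
    return @file_get_contents($url, false, $ctx);
}
```

Call it as `$html = http_get_with_timeout('http://example.com/', 2);` and always check the return value against `false` before using it.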