The weather has finally cleared up, but a problem has arisen. When synchronizing user data between two sites, using the PHP function file_get_contents() to fetch and run a remote page can emit a fatal error or hang for a long time if the connection times out, which prevents the code that follows from running. First, let's review the PHP file_get_contents() function.
Definition and usage

The file_get_contents() function reads an entire file into a string.

It is the same as file(), except that file_get_contents() returns the file as a single string rather than an array of lines.

file_get_contents() is the preferred way to read the contents of a file into a string. If the operating system supports it, memory-mapping techniques are also used to enhance performance.
Syntax

file_get_contents(path, include_path, context, start, max_length)

Parameter description:

path - Required. Specifies the file to read.
include_path - Optional. Set this parameter to "1" if you also want to search for the file in the include_path.
context - Optional. Specifies the context of the file handle. A context is a set of options that can modify the behavior of a stream. Passing null skips it.
start - Optional. Specifies the position in the file at which to start reading. Added in PHP 5.1.
max_length - Optional. Specifies the number of bytes to read. Added in PHP 5.1.
Note
Support for context was added in PHP 5.0.0.
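Before tackling timeouts, here is a minimal sketch of the start and max_length parameters described above, using a throwaway local file (the file name and contents are just for illustration):

```php
<?php
// Demonstrate the start (offset) and max_length parameters of
// file_get_contents() on a local file.
$tmp = tempnam(sys_get_temp_dir(), 'fgc');
file_put_contents($tmp, "Hello, world!");

// Read 5 bytes starting at byte offset 7.
$slice = file_get_contents($tmp, false, null, 7, 5);
var_dump($slice); // string(5) "world"

unlink($tmp);
```

The same signature applies when reading remote URLs, which is what makes the context parameter useful below.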
For timeouts or slow pages, there are generally two solutions:
1. Use the third parameter of file_get_contents()
The code is as follows:

$url = "http://zhoz.com/zhoz.php";
$ctx = stream_context_create(array(
    'http' => array('timeout' => 10) // seconds
));
$result = @file_get_contents($url, false, $ctx);
if ($result) {
    var_dump($result);
}
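A stricter variant of the check above: file_get_contents() returns false on failure, while a page may legitimately return an empty string, so comparing with === false distinguishes the two. This is a sketch; the URL is a placeholder:

```php
<?php
// Build a stream context with a 10-second timeout and check for
// failure explicitly instead of relying on truthiness.
$ctx = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 10, // seconds
    ),
));

$result = @file_get_contents('http://example.com/', false, $ctx);
if ($result === false) {
    echo "Request failed or timed out\n";
} else {
    echo strlen($result) . " bytes received\n";
}
```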
Method 1 worked well in local tests, but over an external network (environment: China → a US server) it still timed out almost every time, and adjusting the timeout value made little difference. The following method is recommended instead.
2. Use the curl extension library
The code is as follows:

$url = "http://zhoz.com/zhoz.php";
echo date('h:i:s');
//$buffer = file_get_contents($url);
$buffer = zhoz_get_contents($url);
echo date('h:i:s');
if (empty($buffer)) {
    echo "Buffer is empty";
} else {
    echo $buffer;
}

function zhoz_get_contents($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10); // give up after 10 seconds
    $content = curl_exec($ch);
    curl_close($ch);
    return $content;
}

You can also combine the two approaches and choose which method to apply according to the system environment:
The code is as follows:
function vita_get_url_content($url) {
    if (function_exists('file_get_contents')) {
        $file_contents = file_get_contents($url);
    } else {
        $ch = curl_init();
        $timeout = 5;
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
        $file_contents = curl_exec($ch);
        curl_close($ch);
    }
    return $file_contents;
}
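Note that the combined function above applies no timeout on the file_get_contents() branch. A sketch of a variant that honors a timeout on both branches, preferring cURL when it is available (the function name and the 5-second default are illustrative, not from the original):

```php
<?php
// Fetch a URL with a timeout applied regardless of which branch runs:
// cURL when the extension is loaded, otherwise file_get_contents()
// with a stream context carrying the same timeout.
function vita_get_url_content_with_timeout($url, $timeout = 5) {
    if (function_exists('curl_init')) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // connect phase
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);        // whole transfer
        $content = curl_exec($ch);
        curl_close($ch);
        return $content;
    }
    $ctx = stream_context_create(array(
        'http' => array('timeout' => $timeout),
    ));
    return @file_get_contents($url, false, $ctx);
}
```

CURLOPT_CONNECTTIMEOUT bounds only the connection phase, while CURLOPT_TIMEOUT bounds the entire transfer; for a slow remote page like the one described in this article, setting both is the safer choice.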