In real applications we often need content such as news or weather forecasts, but as a personal or small site we rarely have the manpower, materials, or money to produce all of it ourselves. So what can we do?
Fortunately, the Internet is built on resource sharing: we can use a program to fetch pages from other sites automatically and process them for our own use.
What should we use? PHP actually has this capability built in, through its curl library. Look at the code below:
<?php
// Fetch the Sina news page and save the response body to a local file.
$ch = curl_init("http://dailynews.sina.com.cn");
$fp = fopen("php_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);   // do not include the HTTP headers
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Sometimes curl_exec() raises a warning even though the download has actually completed. I asked around abroad but got no answer, so the workaround is simply to prefix the call with PHP's @ error-suppression operator: @curl_exec($ch). With the page saved, a bit of analysis of the downloaded text is enough to quietly extract Sina's news. Still, it's better not to actually do this — to avoid legal disputes. I just want to show you that PHP is very powerful, and there is a lot you can do with it!
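As a sketch of the "analysis" step: instead of writing to a file, curl can return the page into a string variable (via CURLOPT_RETURNTRANSFER), which you can then parse. The regex below is only an illustration — real pages like Sina's need a pattern matched to their actual markup:

```php
<?php
// Assumption: we just want the text of every <a>...</a> link on the page.
// A simplistic pattern like this is fine for a sketch, not for production HTML.
function extract_link_texts($html) {
    preg_match_all('/<a[^>]*>(.*?)<\/a>/is', $html, $matches);
    return $matches[1];
}

// Fetching into a string instead of a file (same URL as above):
// $ch = curl_init("http://dailynews.sina.com.cn");
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body as a string
// curl_setopt($ch, CURLOPT_HEADER, 0);
// $txt = curl_exec($ch);
// curl_close($ch);

// Demonstrate the parsing step on a small sample page:
$txt = '<html><body><a href="/a">First story</a> <a href="/b">Second story</a></body></html>';
print_r(extract_link_texts($txt));
?>
```

For real-world scraping, a proper HTML parser (such as PHP's DOMDocument) is more robust than regular expressions.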
[The copyright of this article is jointly owned by the author and Oso.com. If you need to reprint, please indicate the author and source]