You can implement the crawler with the Scrapy framework. If a fetch fails, retry it up to 3 times; if it still fails after that, write a custom entry to a log so you can deal with it manually later.
requests.get also accepts a timeout parameter, measured in seconds (so 500 would be over 8 minutes; a few seconds is usually what you want):
a = requests.get("http://www.baidu.com", timeout=5)
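The retry-then-log approach above can be sketched as a small helper. This is a minimal sketch, not Scrapy's built-in retry middleware: the function name `fetch_with_retries` and its parameters are made up for illustration, and it takes the fetch callable as an argument (e.g. `lambda u: requests.get(u, timeout=5)`) so the retry logic itself stays library-agnostic.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("crawler")

def fetch_with_retries(fetch, url, retries=3, delay=1.0):
    """Call fetch(url) up to `retries` times, sleeping `delay` seconds
    between attempts. If every attempt fails, log the URL for later
    manual handling and return None.

    `fetch` is any callable that raises an exception on failure,
    e.g. lambda u: requests.get(u, timeout=5)  (hypothetical usage).
    """
    for attempt in range(1, retries + 1):
        try:
            return fetch(url)
        except Exception as exc:
            log.warning("attempt %d/%d for %s failed: %s",
                        attempt, retries, url, exc)
            if attempt < retries:
                time.sleep(delay)
    # All retries exhausted: record the URL so it can be handled later.
    log.error("giving up on %s after %d attempts", url, retries)
    return None
```

In Scrapy itself you would instead set `RETRY_TIMES = 3` in the project settings and let the retry middleware handle this, but the helper above shows the same idea for a plain-requests crawler.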