python - pyspider fetcher reports a timeout while crawling: fetcher/: 504 Gateway Time-out
高洛峰 2017-04-18 09:45:14

Sometimes the crawler reports the following timeout error:

Traceback (most recent call last):
  File "/opt/pyspider/pyspider/run.py", line 351, in <lambda>
    app.config['fetch'] = lambda x: umsgpack.unpackb(fetcher_rpc.fetch(x).data)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1233, in __call__
    return self.__send(self.__name, args)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1587, in __request
    verbose=self.__verbose
  File "/usr/lib/python2.7/xmlrpclib.py", line 1273, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1321, in single_request
    response.msg,
ProtocolError: <ProtocolError for fetcher/: 504 Gateway Time-out>

Is there a good way to avoid this?
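
For context, a minimal sketch (not the asker's project) of where the per-request fetch timeouts live in a pyspider handler. The option names connect_timeout, timeout and retries are documented self.crawl parameters; the values and URL below are illustrative only:

from pyspider.libs.base_handler import *

class Handler(BaseHandler):
    # crawl_config is merged into every self.crawl() call in this project
    crawl_config = {
        'connect_timeout': 60,  # seconds to establish the connection (default 20)
        'timeout': 300,         # seconds for the whole fetch (default 120)
        'retries': 3,           # retry a failed fetch (default 3)
    }

    def on_start(self):
        self.crawl('http://example.com/', callback=self.index_page)

    def index_page(self, response):
        return {'url': response.url, 'title': response.doc('title').text()}

Note that the 504 in the traceback comes from whatever gateway or proxy sits between the webui and the fetcher's XML-RPC endpoint, so a sufficiently long fetch can still hit that gateway's own time limit even with these raised.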


Replies (2)
巴扎黑

This error will only appear during debugging

左手右手慢动作

@zuzhaochong
It is indeed the front end losing contact during debugging, and the fetcher in the background logs an error like this:
[E 161014 23:45:09 tornado_fetcher:202] [599] douban:f25b579c7b441d19bc800412cccb145b https://movie.douban.com/revi... ValueError('No JSON object could be decoded',) 50.00s
After I finished debugging and actually started the crawl, a lot of these errors appeared after a while, and the crawler status on the page shows "PAUSED". What is the problem, and how do I solve it?
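
A minimal sketch, assuming the 599 entries above are failed fetches: @catch_status_code_error is a pyspider decorator that lets the callback run even when a fetch fails, so the failure can be logged instead of only counting against the project. The callback name and handling below are illustrative, not the poster's code:

import logging

from pyspider.libs.base_handler import *

class Handler(BaseHandler):
    crawl_config = {
        'retries': 3,  # let the scheduler retry failed fetches
    }

    @catch_status_code_error
    def detail_page(self, response):
        if response.status_code != 200:
            # fetch failed (e.g. a 599); log it and give up on this URL
            logging.warning('fetch failed for %s (status %s)',
                            response.url, response.status_code)
            return
        return {'url': response.url, 'title': response.doc('title').text()}

As for the PAUSED status: pyspider's scheduler pauses a project after a run of consecutive fetch failures, which is presumably what happens once these errors pile up.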
