Web crawler - Python urlopen error: timeout: timed out
伊谢尔伦 2017-05-18 11:01:23

Today the script keeps reporting timeout errors. How should the timeout exception be handled? The try/except below doesn't catch it:

        try:
            url_open = urllib.request.urlopen(url)
        except urllib.error.HTTPError:
            print('HTTPError')
            continue
        except urllib.error.URLError:
            print('URLError')
            continue

It still fails with:

  File "F:\Program Files (x86)\Anaconda3\lib\http\client.py", line 612, in _safe_read
    chunk = self.fp.read(min(amt, MAXAMOUNT))
  File "F:\Program Files (x86)\Anaconda3\lib\socket.py", line 586, in readinto
    return self._sock.recv_into(b)
   timeout: timed out

Writing the except as

    except Exception:

doesn't help either.
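
The traceback above shows the timeout being raised inside self.fp.read(...), i.e. while the response body is being read. A timeout during connection setup gets wrapped by urlopen into a URLError, but a timeout during a later .read() call propagates as a bare socket.timeout; and if that .read() sits outside the try block, no except clause there can see it, which would also explain why except Exception: appears useless. A minimal sketch of one way to handle it, assuming a hypothetical urls list and an illustrative 10-second timeout, keeping the read inside the try and catching socket.timeout explicitly:

    import socket
    import urllib.request
    import urllib.error

    urls = ['http://example.com/']  # hypothetical list of pages to crawl

    for url in urls:
        try:
            # timeout value is illustrative; without it a stalled server can hang the read
            url_open = urllib.request.urlopen(url, timeout=10)
            # keep the read inside the try: the traceback shows the timeout
            # firing in _safe_read(), i.e. while the body is being read
            html = url_open.read()
        except urllib.error.HTTPError:
            print('HTTPError')
            continue
        except urllib.error.URLError:
            print('URLError')
            continue
        except socket.timeout:
            print('timeout: timed out')
            continue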

Replies (1)
伊谢尔伦

Try requests.get(url)
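
For reference, a minimal sketch of that approach (the URL and timeout value are illustrative): requests raises requests.exceptions.Timeout for both connect and read timeouts, so a single handler covers the failure above.

    import requests

    try:
        # the timeout applies to both connecting and reading; 10s is illustrative
        response = requests.get('http://example.com/', timeout=10)
        html = response.text
    except requests.exceptions.Timeout:
        print('timeout: timed out')
    except requests.exceptions.RequestException as e:
        print('request failed:', e)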
