While doing penetration testing on a relatively large project with hundreds of websites, the first task is to determine which sites are up and which are not. So I wrote a small script to make this easy to reuse.
The specific implementation code is as follows:
#!/usr/bin/python
# -*- coding: UTF-8 -*-
'''
@Author: joy_nick
@Blog: http://byd.dropsec.xyz/
'''

import requests

# read the target URLs, one per line
f = open('url.txt', 'r')
url = f.readlines()
length = len(url)

url_result_success = []
url_result_failed = []

for i in range(0, length):
    try:
        response = requests.get(url[i].strip(), verify=False, allow_redirects=True, timeout=5)
        # treat anything other than HTTP 200 as a failure
        if response.status_code != 200:
            raise requests.RequestException(u"Status code error: {}".format(response.status_code))
    except requests.RequestException as e:
        url_result_failed.append(url[i])
        continue
    url_result_success.append(url[i])

f.close()

result_len = len(url_result_success)
for i in range(0, result_len):
    print 'URL %s opened successfully' % url_result_success[i].strip()
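The script expects a url.txt in the same directory, one URL per line. Also note that verify=False makes urllib3 emit an InsecureRequestWarning for every HTTPS request; if the noise bothers you, it can be silenced near the top of the script (a minimal sketch, assuming a recent requests release with its vendored urllib3):

# url.txt: one target per line, e.g.
#   http://example.com
#   https://example.org

# optional: silence the InsecureRequestWarning caused by verify=False
import requests
requests.packages.urllib3.disable_warnings()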
The test results are as follows:
Problems encountered:
When I first started testing, any URL that could not be reached or did not exist made the script throw an error and stop. I eventually traced this to the response.status_code != 200 check used to read the status code.
Some websites that cannot be opened never return a status code at all, so the program has no status code to compare with 200, and the unhandled exception aborts the run.
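You can see this in isolation: a connection failure raises an exception from requests.get() itself, before any status_code exists. A minimal sketch (the hostname below is a deliberately unresolvable example):

import requests

try:
    requests.get('http://nonexistent.example.invalid', timeout=5)
except requests.RequestException as e:
    # ConnectionError and friends land here: there is no response object,
    # hence no status_code to compare with 200
    print 'request failed: %s' % type(e).__name__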
Solution:
Use try...except to catch the exception, so a failed request is recorded instead of crashing the script.
The specific code is:
try:
    response = requests.get(url[i].strip(), verify=False, allow_redirects=True, timeout=5)
    if response.status_code != 200:
        raise requests.RequestException(u"Status code error: {}".format(response.status_code))
except requests.RequestException as e:
    url_result_failed.append(url[i])
    continue
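Since url_result_failed is collected anyway, it can be worth reporting the failures at the end as well, mirroring the success loop. A small optional addition, not part of the original script:

# optional: report the failed URLs too, mirroring the success loop
for i in range(0, len(url_result_failed)):
    print 'URL %s failed to open' % url_result_failed[i].strip()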
The above is the problem and solution I ran into while using a Python script for batch website liveness detection. I hope it is helpful; if you have any questions, leave me a message and I will reply as soon as I can.