
Python proxy scanning: how to collect available proxy IPs, with examples

Aug 07, 2017 pm 03:36 PM

The following is an example of scanning proxies with Python and collecting the usable proxy IPs. It is shared here as a reference.

Today we will write a very practical tool: one that scans for and collects available proxies.

First, I found a site on Baidu to use as an example: www.xicidaili.com

This site publishes many available proxy IPs and ports, both domestic and foreign.

As usual, we start with some analysis; let's scan all the domestic proxies first.

Inspecting the domestic section shows that the domestic proxy listings live at URLs of the form:

www.xicidaili.com/nn/x

Here x runs to more than 2,000 pages, so it looks like we will need threads again...

As usual, we first try to fetch the content with the simplest possible requests.get().

It returns 503, so we add a simple headers dict

and get 200 back. OK.
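As a minimal sketch of that fix (the User-Agent value below is an assumption for illustration, not taken from the original post), all that changes is passing a browser-like headers dict:

```python
# Sites like xicidaili.com answer 503 to the default python-requests
# User-Agent; sending a browser-like one (any common UA string works)
# usually gets a 200 instead. The value below is illustrative.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) "
                  "Chrome/58.0 Safari/537.36",
}

# Network call left commented out; it would be used like this:
# import requests
# r = requests.get("http://www.xicidaili.com/nn/1", headers=headers)
# r.status_code  # 503 without the headers, 200 with them
```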

OK, let's first analyze the page content and find what we want.

The IP information sits inside the <tr> tags, so we can easily pull those tags out with bs.

But the ip, port, and protocol turn out to be in the 2nd, 3rd, and 6th <td> tags of each extracted <tr>.

So we start writing. The idea is:

When processing the page, first extract the tr tags, then extract the td tags inside each tr.

That means two bs passes, and the second pass needs a str() conversion first,

because from each tr we want items 2, 3, and 6, and the element yielded by the for loop cannot be picked apart as a group directly.

So we simply run bs a second time on each tr and index td entries 2, 3, and 6 directly.

After extraction, appending .string gives the text content.


import re
import requests
from bs4 import BeautifulSoup as bs

# url and headers come from the steps above
r = requests.get(url=url, headers=headers)
soup = bs(r.content, "html.parser")
data = soup.find_all(name='tr', attrs={'class': re.compile('|[^odd]')})
for i in data:
    soup = bs(str(i), 'html.parser')
    data2 = soup.find_all(name='td')
    ip = str(data2[1].string)
    port = str(data2[2].string)
    types = str(data2[5].string).lower()

    proxy = {}
    proxy[types] = '%s:%s' % (ip, port)

This way each loop iteration produces the corresponding proxy dictionary, which we then use to verify the IP's availability. One thing to note here: we lowercase the type, because the protocol name used as a key in the proxies argument of the get method must be lowercase, while the page lists it in uppercase, hence the case conversion.
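A tiny sketch of the dictionary this builds (the IP and port values here are made up for illustration):

```python
# The page lists the protocol in uppercase ("HTTP"/"HTTPS"), but the
# proxies argument of requests.get expects lowercase scheme keys.
ip, port, types = "121.31.0.1", "8123", str("HTTP").lower()

proxy = {}
proxy[types] = '%s:%s' % (ip, port)
print(proxy)  # {'http': '121.31.0.1:8123'}
```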

So what is the idea for verifying an IP's availability?

Very simple: we issue a get with our proxy attached and request the site:

http://1212.ip138.com/ic.asp

This is a handy site that returns your external (WAN) IP.


url = 'http://1212.ip138.com/ic.asp'
r = requests.get(url=url, proxies=proxy, timeout=6)

Here we add a timeout to weed out proxies that wait too long; I set it to 6 seconds.

We try one IP and analyze the page that comes back.

The returned content looks like this:


<html>

<head>

<meta xxxxxxxxxxxxxxxxxx>

<title> 您的IP地址 </title>

</head>

<body style="margin:0px"><center>您的IP是:[xxx.xxx.xxx.xxx] 来自:xxxxxxxx</center></body></html>

Then we only need to extract the content inside the [ ].

If our proxy works, the proxy's own IP is returned.

(It may be that a failing proxy still returns our local external IP instead; I'm not completely sure about that case, but I exclude it anyway and treat such a proxy as unavailable.)

So we can make a judgment: if the returned IP equals the IP in the proxy dictionary, we consider the proxy usable and write it to a file.
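The check itself can be sketched offline against the sample body above (the bracketed IP value here is invented for illustration):

```python
import re

# Sample of the <center> text returned by ip138; the address is made up.
body = "您的IP是:[118.186.1.5] 来自:xxxxxxxx"
ip = "118.186.1.5"

# Pull out whatever sits inside the square brackets and compare it
# with the IP we routed the request through.
a = re.findall(r'\[(.*?)\]', body)
print(a[0] == ip)  # True
```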

That's the whole idea. Finally, we add a Queue and threading to process the pages concurrently.

The full code follows:


# coding=utf-8
import re
import queue
import threading

import requests
from bs4 import BeautifulSoup as bs


class proxyPick(threading.Thread):
    def __init__(self, queue):
        threading.Thread.__init__(self)
        self._queue = queue

    def run(self):
        # Keep pulling page URLs until the queue is drained
        while not self._queue.empty():
            url = self._queue.get()
            proxy_spider(url)


def proxy_spider(url):
    headers = {
        # .......  (fill in your browser headers here)
    }

    r = requests.get(url=url, headers=headers)
    soup = bs(r.content, "html.parser")
    data = soup.find_all(name='tr', attrs={'class': re.compile('|[^odd]')})

    for i in data:
        # Second bs pass on each tr to index its td cells
        soup = bs(str(i), 'html.parser')
        data2 = soup.find_all(name='td')
        ip = str(data2[1].string)
        port = str(data2[2].string)
        types = str(data2[5].string).lower()

        proxy = {}
        proxy[types] = '%s:%s' % (ip, port)
        try:
            proxy_check(proxy, ip)
        except Exception as e:
            print(e)


def proxy_check(proxy, ip):
    url = 'http://1212.ip138.com/ic.asp'
    r = requests.get(url=url, proxies=proxy, timeout=6)

    f = open('E:/url/ip_proxy.txt', 'a+')

    soup = bs(r.text, 'html.parser')
    data = soup.find_all(name='center')
    for i in data:
        # The returned IP sits inside square brackets
        a = re.findall(r'\[(.*?)\]', i.string)
        if a[0] == ip:
            f.write('%s' % proxy + '\n')
            print('write down')

    f.close()


def main():
    q = queue.Queue()
    for i in range(1, 2288):
        q.put('http://www.xicidaili.com/nn/' + str(i))

    threads = []
    thread_count = 10

    for i in range(thread_count):
        spider = proxyPick(q)
        threads.append(spider)

    for i in threads:
        i.start()

    for i in threads:
        i.join()

    print("It's down, sir!")


if __name__ == '__main__':
    main()

This way, all the available proxy IPs published on the site get written into the file ip_proxy.txt.

