Playing with Python HTTP proxy
0x00 Preface
Most developers are familiar with HTTP proxies; they are used in an extremely wide range of scenarios. HTTP proxies fall into two categories: forward proxies and reverse proxies. The latter is generally used to give users access to services behind a firewall, or for load balancing; typical examples include Nginx and HAProxy. This article discusses forward proxies.
The most common uses of an HTTP proxy are network sharing, network acceleration, and bypassing network restrictions. HTTP proxies are also commonly used for debugging web applications and for monitoring and analyzing the web APIs called by Android/iOS apps. Well-known tools in this space include Fiddler, Charles, Burp Suite, and mitmproxy. An HTTP proxy can also be used to modify request/response content, adding functionality to a web application or changing its behavior without touching the server.
0x01 What is an HTTP proxy
An HTTP proxy is essentially a web application, not fundamentally different from any other ordinary web application. After receiving a request, the proxy determines the target host from the host name in the Host header field and from the GET/POST request address, establishes a new HTTP request, forwards the request data, and relays the received response data back to the client.
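To make the target-host decision concrete, here is a minimal sketch in Python. The helper name pick_target is hypothetical; a real proxy does considerably more, but the selection rule itself is this simple:

```python
from urllib.parse import urlsplit

def pick_target(request_url, host_header):
    """Choose the host a proxy should connect to: an absolute URL in the
    request line wins; otherwise fall back to the Host header."""
    parts = urlsplit(request_url)
    if parts.scheme and parts.netloc:
        return parts.netloc
    return host_header

print(pick_target('http://httpbin.org/ip', '192.168.1.2'))  # httpbin.org
print(pick_target('/', '192.168.1.2'))                      # 192.168.1.2
```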
If the request address is an absolute URL, the HTTP proxy uses the host from that URL; otherwise it uses the Host field from the header. Let's run a simple test, assuming the following network environment:
192.168.1.2 Web server
192.168.1.3 HTTP proxy server
Use telnet to test:

$ telnet 192.168.1.3
GET / HTTP/1.0
HOST: 192.168.1.2
Note that two consecutive carriage returns (a blank line) are required at the end; this is an HTTP protocol requirement. Once complete, you will receive the page content of http://192.168.1.2/. Now let's make a small adjustment and use an absolute URL in the GET request:
$ telnet 192.168.1.3
GET http://httpbin.org/ip HTTP/1.0
HOST: 192.168.1.2
Note that HOST is still set to 192.168.1.2, yet the result returns the content of the http://httpbin.org/ip page, i.e. the public IP address information.
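The telnet session above can also be reproduced with Python's standard socket module. The sketch below builds the raw request by hand; note the blank line (two consecutive CRLFs) terminating the headers. The helper names and the example proxy address are illustrative assumptions:

```python
import socket

def build_proxy_request(path, host, method="GET"):
    # The trailing blank line (\r\n\r\n) is the HTTP header terminator
    return (f"{method} {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"\r\n").encode("ascii")

def send_via_proxy(proxy_host, proxy_port, path, host):
    # Open a TCP connection to the proxy and relay the raw request
    with socket.create_connection((proxy_host, proxy_port), timeout=10) as s:
        s.sendall(build_proxy_request(path, host))
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

# e.g. send_via_proxy('192.168.1.3', 8080, 'http://httpbin.org/ip', '192.168.1.2')
```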
As the test above shows, an HTTP proxy is not a complicated thing: it just needs the original request to be sent to the proxy server. When a client offers no way to configure an HTTP proxy, the simplest workaround for a small number of hosts is to point the target host's domain name at the proxy server's IP, which can be done by editing the hosts file.
0x02 Setting an HTTP proxy in a Python program
urllib2/urllib Proxy setting
urllib2 is a very powerful Python standard library module, though a bit cumbersome to use. In Python 3, urllib2 was not retained and its functionality moved into the urllib package. In urllib2, a ProxyHandler is used to set up the proxy server:
proxy_handler = urllib2.ProxyHandler({'http': '121.193.143.249:80'})
opener = urllib2.build_opener(proxy_handler)
r = opener.open('http://httpbin.org/ip')
print(r.read())
You can also use install_opener to install the configured opener into the global environment, so that all urllib2.urlopen calls automatically use the proxy:
urllib2.install_opener(opener)
r = urllib2.urlopen('http://httpbin.org/ip')
print(r.read())
In Python 3, use urllib.request instead:
proxy_handler = urllib.request.ProxyHandler({'http': 'http://121.193.143.249:80/'})
opener = urllib.request.build_opener(proxy_handler)
r = opener.open('http://httpbin.org/ip')
print(r.read())
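The install_opener trick works the same way in Python 3; once installed, every urllib.request.urlopen call goes through the proxy. The proxy address below is just the placeholder used throughout this article:

```python
import urllib.request

proxy_handler = urllib.request.ProxyHandler({'http': 'http://121.193.143.249:80/'})
opener = urllib.request.build_opener(proxy_handler)
urllib.request.install_opener(opener)

# From here on, urlopen uses the proxy automatically:
# r = urllib.request.urlopen('http://httpbin.org/ip')
# print(r.read())
```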
requests proxy settings
requests is one of the best HTTP libraries available, and it is the one I use most when constructing HTTP requests. Its API design is user-friendly and easy to pick up. Setting a proxy for requests is simple: pass a dict of the form {'http': 'x.x.x.x:8080', 'https': 'x.x.x.x:8080'} as the proxies parameter. Note that http and https are configured independently of each other.
In [5]: requests.get('http://httpbin.org/ip', proxies={'http': '121.193.143.249:80'}).json()
Out[5]: {'origin': '121.193.143.249'}
You can also set the proxies attribute of a session directly, which saves passing the proxies parameter with every request:
s = requests.session()
s.proxies = {'http': '121.193.143.249:80'}
print(s.get('http://httpbin.org/ip').json())
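If the proxy requires HTTP basic authentication, requests also accepts credentials embedded in the proxy URL. The user, password, and address below are all placeholders:

```python
import requests

s = requests.Session()
s.proxies = {
    'http': 'http://user:password@121.193.143.249:80',
    'https': 'http://user:password@121.193.143.249:80',
}
# print(s.get('http://httpbin.org/ip').json())
```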
0x03 HTTP_PROXY / HTTPS_PROXY environment variables
Both urllib2 and the requests library recognize the HTTP_PROXY and HTTPS_PROXY environment variables, and automatically use the proxy once these variables are set. This is very useful when debugging with an HTTP proxy, because the proxy server's IP address and port can be adjusted through environment variables without modifying the code. Most software on *nix also honors the HTTP_PROXY environment variable, including curl, wget, axel, and aria2c.
$ http_proxy=121.193.143.249:80 python -c 'import requests; print(requests.get("http://httpbin.org/ip").json())'
{u'origin': u'121.193.143.249'}
$ http_proxy=121.193.143.249:80 curl httpbin.org/ip
{
  "origin": "121.193.143.249"
}
In an IPython interactive session, you often need to debug HTTP requests ad hoc; the HTTP proxy can be enabled or disabled simply by setting os.environ['http_proxy']:
In [245]: os.environ['http_proxy'] = '121.193.143.249:80'
In [246]: requests.get("http://httpbin.org/ip").json()
Out[246]: {u'origin': u'121.193.143.249'}
In [249]: os.environ['http_proxy'] = ''
In [250]: requests.get("http://httpbin.org/ip").json()
Out[250]: {u'origin': u'x.x.x.x'}
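Under the hood, the standard library exposes this environment-variable detection through urllib.request.getproxies(), which can be used to check which proxy a program will pick up. The address is again the placeholder used throughout this article:

```python
import os
import urllib.request

os.environ['http_proxy'] = 'http://121.193.143.249:80'
print(urllib.request.getproxies().get('http'))  # http://121.193.143.249:80

del os.environ['http_proxy']
print(urllib.request.getproxies().get('http'))  # usually None once unset
```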
0x04 MITM-Proxy
MITM comes from Man-in-the-Middle Attack, which generally refers to intercepting, monitoring, and tampering with data flowing between a client and a server on the network.
mitmproxy is an open-source man-in-the-middle proxy tool written in Python. It supports SSL, transparent proxying, reverse proxying, traffic recording and replay, and custom scripts. Its functionality is somewhat similar to Fiddler on Windows, except that mitmproxy is a console program without a GUI, though it is still quite convenient to use. With mitmproxy you can easily review, intercept, and modify any proxied HTTP request/response, and its scripting API even lets you write scripts that intercept and modify HTTP data automatically.
# test.py
def response(flow):
    flow.response.headers["BOOM"] = "boom!boom!boom!"
The script above adds a header named BOOM to every HTTP response passing through the proxy. Start mitmproxy with the command mitmproxy -s 'test.py', and verifying with curl shows that a BOOM header is indeed added.
$ http_proxy=localhost:8080 curl -I 'httpbin.org/get'
HTTP/1.1 200 OK
Server: nginx
Date: Thu, 03 Nov 2016 09:02:04 GMT
Content-Type: application/json
Content-Length: 186
Connection: keep-alive
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
BOOM: boom!boom!boom!
...
Obviously mitmproxy scripts can do far more than this; combined with the power of Python, many applications can be derived from them. Beyond that, mitmproxy also provides a powerful API, on top of which you can build a fully customized proxy server with special features of your own.
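As one more example of what a script can do, the sketch below rewrites response bodies on the fly, in the same event-hook style as test.py above. Exact attribute names can vary between mitmproxy versions, so treat this as a sketch rather than a definitive implementation:

```python
# rewrite.py - run with: mitmproxy -s rewrite.py
def response(flow):
    # Only touch HTML responses; binary bodies are left alone
    content_type = flow.response.headers.get("content-type", "")
    if "text/html" in content_type:
        flow.response.text = flow.response.text.replace("Example", "EXAMPLE")
```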
Performance testing shows that mitmproxy is not particularly efficient. That is fine for debugging purposes, but when used in production with a large number of concurrent requests passing through the proxy, performance falls somewhat short. I implemented a simple proxy with twisted to add features to internal company websites and improve the user experience; I may share it with you another time.
