Load balancing: some questions about an nginx reverse proxy
给我你的怀抱 2017-05-16 17:08:55

I use two servers to set up a reverse proxy.
Server A forwards all requests to the upstream group test with the configuration below (there is only one server in the group):

upstream test {
    server 123.45.567.89:80 weight=1;  # Server B
}

server {
    listen 80;

    root /home/www;

    ......

    location / {
        proxy_pass http://test;
    }
}

Accessing server A in the browser returns the content from server B.
Here are my questions:
1. Server B (the server being proxied) does not seem to require any configuration at all?
2. As long as a server does not block my requests, can I add any server to the upstream group and use it as a backend for my reverse proxy?
3. When using a reverse proxy for load balancing, does the application deployed on server B need to be the same as the one on server A?


Replies (2)
刘奇

Point 1: Simply put, the proxied server does not require any configuration. Note, however, that if you want to handle the original client IP of the request on B, you need to configure nginx on A to put the client IP into a request header, as in the example below.

Point 2: I have never used upstream; I usually just use proxy_pass directly:

location / {
    # Pass the original client IP and Host header on to the backend
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;
    proxy_set_header X-NginX-Proxy true;

    proxy_pass http://127.0.0.1:6101/;
    proxy_redirect off;
    proxy_buffering off;
}
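
For completeness, here is a minimal sketch of the other side, assuming B's nginx includes the standard ngx_http_realip_module and that 10.0.0.1 (a placeholder) is server A's address as seen by B: with these directives, server B restores the client IP from the header that A sets, so $remote_addr and the access log show the real visitor.

    # On server B: trust the X-Real-IP header only when it comes from server A
    server {
        listen 80;
        root /home/www;

        set_real_ip_from 10.0.0.1;     # placeholder for server A's address
        real_ip_header   X-Real-IP;    # header set by A with the client IP

        location / {
            # $remote_addr now holds the original client IP instead of A's IP
        }
    }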
Point 3: Generally speaking, yes.

小葫芦

1. The proxied server B does not require any special settings.
2. Adding multiple servers to the upstream block is exactly what the so-called load-balancing configuration is; then point proxy_pass at that upstream (see the sketch right after these points).
3. The load-balanced servers in principle share data and run the same business logic; they are simply deployed on different machines to spread the load. (You can start several instances of the same application on different ports of one server, configure load balancing across them, and verify this yourself!)
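
For example, a minimal sketch of points 2 and 3, assuming two instances of the same application listening on hypothetical ports 8081 and 8082 of one machine:

    upstream test {
        server 127.0.0.1:8081 weight=1;   # instance 1 (hypothetical port)
        server 127.0.0.1:8082 weight=1;   # instance 2 (hypothetical port)
    }

    server {
        listen 80;

        location / {
            proxy_pass http://test;   # requests are distributed across both instances
        }
    }

With equal weights, nginx simply alternates requests between the two instances.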

nginx currently offers several load-balancing strategies; a sketch showing how to enable the built-in ones follows the list.

  • Round robin (default): requests are distributed to the backend servers one by one in order; if a backend server goes down, it is removed from rotation automatically.

  • weight: sets the round-robin weight; the weight is proportional to the share of requests a server receives, used when backend servers have uneven performance.

  • ip_hash: each request is assigned according to a hash of the client IP, so a given visitor always reaches the same backend server, which solves the session problem.

  • fair (third party): requests are assigned according to backend response time; servers with shorter response times are preferred.

  • url_hash (third party): requests are distributed according to a hash of the requested URL, so a given URL always goes to the same backend server; this is more effective when the backend servers do caching.
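
A minimal sketch of how the built-in strategies are chosen in the upstream block (the backend addresses are placeholders; fair and url_hash require third-party modules to be compiled in):

    # weight: add weight= to individual server lines
    upstream weighted_backend {
        server 10.0.0.2:80 weight=3;   # gets roughly three times the traffic
        server 10.0.0.3:80 weight=1;
    }

    # ip_hash: the ip_hash directive pins each client IP to one backend
    upstream sticky_backend {
        ip_hash;
        server 10.0.0.2:80;
        server 10.0.0.3:80;
    }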
