How to configure nginx as a load balancer


## 1. The role of load balancing

1. Forwarding function

According to a configured algorithm (weight, round-robin), client requests are forwarded to different application servers, reducing the pressure on any single server and increasing the concurrency the system can handle.

2. Fault removal

Heartbeat detection is used to determine whether an application server is currently working normally. If a server goes down, requests are automatically sent to the other application servers.

3. Recovery and re-addition

If a failed application server is detected to have recovered, it is automatically added back to the pool of servers handling user requests, as sketched in the configuration below.
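Note that open-source nginx does not ship an active heartbeat probe; out of the box it performs passive health checks through the max_fails and fail_timeout parameters on each upstream server, which covers both the fault-removal and recovery behavior described above. A minimal sketch (the addresses are the example servers used later in this article):

upstream tomcatserver1 {
    # after 3 failed attempts the server is considered down and is skipped
    # for 30 seconds; after that nginx tries it again and re-adds it if it answers
    server 192.168.72.49:8080 max_fails=3 fail_timeout=30s;
    server 192.168.72.49:8081 max_fails=3 fail_timeout=30s;
}

Active health checks (real heartbeat requests) require NGINX Plus or a third-party module; the stock parameters above only react to proxied requests that actually fail.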

## 2. Nginx implements load balancing

This example uses two Tomcat instances to simulate two application servers, listening on ports 8080 and 8081.

1. Nginx’s load distribution strategy

Nginx's upstream currently supports the following distribution algorithms:

1) Round-robin (polling) - requests are handled in turn, 1:1 (default)

Each request is assigned to a different application server in turn, in chronological order. If an application server goes down, it is automatically removed and the remaining servers continue to be polled.

2) weight - the more capable server takes more requests

Configuring a weight specifies the polling probability; the weight is proportional to the share of requests a server receives. This is used when application server performance is uneven.

3) ip_hash algorithm

Each request is allocated according to the hash of the client IP, so each visitor is consistently routed to the same application server. This avoids the session-sharing problem (see the sketch below).
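For illustration, a minimal upstream block using ip_hash might look like the following (it reuses the example Tomcat addresses and is only a sketch, not the configuration from the next section):

upstream tomcatserver1 {
    ip_hash;   # requests from the same client IP always go to the same backend
    server 192.168.72.49:8080;
    server 192.168.72.49:8081;
}

The weight strategy is shown in the full configuration below.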

2. Configure Nginx's load balancing and distribution strategy


The strategy is implemented simply by adding the appropriate parameters after the application server addresses listed in the upstream block, for example:

upstream tomcatserver1 {
    # tomcat1 receives roughly 3 out of every 4 requests
    server 192.168.72.49:8080 weight=3;
    server 192.168.72.49:8081;
}

server {
    listen       80;
    server_name  8080.max.com;
    #charset koi8-r;
    #access_log  logs/host.access.log  main;

    location / {
        proxy_pass   http://tomcatserver1;
        index  index.html index.htm;
    }
}
With the above configuration, every request to 8080.max.com first passes through the nginx reverse proxy because of the proxy_pass directive. When nginx forwards the request to the destination host, it looks up the tomcatserver1 upstream and applies the distribution policy: since tomcat1 is configured with weight=3, nginx sends most requests to tomcat1 on server 49 (port 8080) and a smaller share to tomcat2, achieving weighted load balancing. The weights should of course reflect the relative hardware capacity of the two servers.
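Since the backends now only ever see connections coming from nginx, it is common (though optional for load balancing itself) to forward the original host name and client address to Tomcat. A sketch of the extra directives inside the location block, using standard nginx variables:

location / {
    proxy_pass   http://tomcatserver1;
    # pass the original Host header and client IP through to the backends
    proxy_set_header Host            $host;
    proxy_set_header X-Real-IP       $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    index  index.html index.htm;
}

Tomcat can then recover the real client IP (for example via its RemoteIpValve) instead of logging the nginx address for every request.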
