## nginx for load balancing

nginx is most commonly used for seven-layer (HTTP) load balancing. This article introduces some basic load balancing concepts and walks through a simple example of load balancing with nginx.
| OSI model layer | Description |
|---|---|
| MAC layer | Load balancing decision based on the MAC address |
| IP layer | Load balancing decision based on the IP address |
| TCP layer | Load balancing decision based on the IP address and port number |
| HTTP layer | In addition to the layer-4 information, can further use layer-7 information such as the URL or the browser type |
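In nginx terms, the layer-4 and layer-7 cases in the table map to the `stream` and `http` contexts respectively. The following sketch illustrates the difference; the backend addresses and ports are placeholders, and the `stream` block requires nginx built with `--with-stream`:

```nginx
# Layer 4: forwards raw TCP based only on IP/port; nginx never parses HTTP
stream {
    upstream tcp_backend {
        server 192.168.163.117:7001;
        server 192.168.163.117:7002;
    }
    server {
        listen 9081;
        proxy_pass tcp_backend;
    }
}

# Layer 7: the routing decision can use HTTP-level information such as the URL
http {
    upstream http_backend {
        server 192.168.163.117:7001;
        server 192.168.163.117:7002;
    }
    server {
        listen 9080;
        location /api/ {                  # layer-7 routing on the request path
            proxy_pass http://http_backend;
        }
    }
}
```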
| Four-layer load balancing | Seven-layer load balancing |
|---|---|
| Lightweight implementation | Supports HTTP and mail; performance similar to haproxy |
| - | Supports seven-layer load balancing |
| Supports four-layer load balancing, but a heavier implementation | - |
| Hardware implementation, high cost | - |
| Load balancing algorithm | Supported by nginx | Description | Applicable scenarios |
|---|---|---|---|
| Round Robin | Supported (default) | Polls the servers with equal weight | External requests and internal server capacities are both relatively balanced |
| Weighted Round Robin | Supported (`weight`) | Polls with a different weight per server | Servers have different processing capacities, or you want to control traffic, e.g. a canary release |
| Random | - | Requests are assigned to servers at random | Both external and internal loads are well balanced, or random assignment is explicitly desired |
| Weighted Random | - | Random assignment combined with weights | A random strategy that can be tuned via weights to better match real-world traffic distribution |
| Response Time | Supported (`fair`, via a third-party module) | Assigns requests based on how quickly each server responds | Takes both server capability and current server state into account; the strategy adjusts dynamically, so even a "capable" server stops receiving a large share of the work once it is overloaded |
| Least Connection | Supported (`least_conn`) | Assigns requests based on the number of active connections | Round robin cannot account for how long each request takes to complete, so connection counts diverge and better reflect the real server load. Suitable for long-lived connection services, e.g. a WebSocket implementation for online customer service, or services such as FTP/SFTP |
| Flash DNS | - | Uses the fastest-returned DNS resolution result and ignores the IP addresses returned by other DNS servers | Global load balancing, e.g. a CDN |
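For the algorithms nginx supports, selection happens in the `upstream` block. A minimal sketch (the upstream name and backend addresses are placeholders; only one algorithm directive should be active at a time):

```nginx
upstream backend {
    # Default is round robin; adding weight= turns it into weighted round robin
    server 192.168.163.117:7001 weight=100;
    server 192.168.163.117:7002 weight=200;

    # least_conn;   # uncomment for least-connection balancing (built in)
    # fair;         # response-time balancing; requires the third-party
                    # nginx-upstream-fair module to be compiled in
}
```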
Modify default.conf as follows
```shell
# cp default.conf default.conf.org
# vi default.conf
# diff default.conf default.conf.org
2,3c2,3
< server 192.168.163.117:7001 weight=100;
< server 192.168.163.117:7002 weight=200;
---
> server 192.168.163.117:7001;
> server 192.168.163.117:7002;
#
```
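The diff only confirms the two changed `server` lines. A full `default.conf` consistent with them might look like the following hypothetical reconstruction; the upstream name and the listen port (mapped to host port 9080 by docker) are assumptions:

```nginx
# Hypothetical reconstruction: only the two weighted server lines
# are confirmed by the diff; everything else is an assumption.
upstream backend {
    server 192.168.163.117:7001 weight=100;
    server 192.168.163.117:7002 weight=200;
}

server {
    listen 80;            # assumed to be published as host port 9080 by docker
    location / {
        proxy_pass http://backend;
    }
}
```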
```shell
[root@kong ~]# docker restart nginx-lb
nginx-lb
[root@kong ~]#
```
From the polling results below, you can see that requests are distributed to the two servers in roughly a 1/3 : 2/3 ratio, matching the weights of 100 and 200:
```shell
[root@kong ~]# curl http://localhost:9080
Hello, Service :User Service 1: 7001
[root@kong ~]# curl http://localhost:9080
Hello, Service :User Service 1: 7002
[root@kong ~]# curl http://localhost:9080
Hello, Service :User Service 1: 7002
[root@kong ~]#
```
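The 1:2 split is a direct consequence of nginx's "smooth" weighted round-robin selection: over any full cycle of 300 picks with weights 100 and 200, the first server is chosen exactly 100 times and the second 200 times. The following is a minimal Python re-implementation of that selection logic for illustration, not nginx's actual source:

```python
def smooth_wrr(weights, n):
    """Return n picks (as server indices) using smooth weighted round robin."""
    current = [0] * len(weights)
    total = sum(weights)
    picks = []
    for _ in range(n):
        # every server gains its weight; the largest current value wins
        for i, w in enumerate(weights):
            current[i] += w
        best = max(range(len(weights)), key=lambda i: current[i])
        current[best] -= total  # the winner "pays" the total weight
        picks.append(best)
    return picks

picks = smooth_wrr([100, 200], 300)
print(picks.count(0), picks.count(1))  # prints "100 200"
```

Note how the algorithm interleaves picks (7002, 7001, 7002, ...) rather than sending a long burst to the heavier server, which matches the alternation seen in the curl output above.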
The above is the detailed content of using nginx for load balancing. For more information, please see the related articles on the PHP Chinese website.