How to solve errors reported under high concurrency in PHP

藏色散人
Release: 2023-03-17 15:52:01
Original
4755 people have browsed it

Solution to errors reported under high concurrency in PHP: 1. Check nginx's configured connection limits and increase the two nginx parameters (worker_connections and worker_rlimit_nofile); 2. Confirm whether php-fpm has enough worker processes, and increase the worker count if not; 3. Disable slow-request logging.


The operating environment of this tutorial: Windows 10 system, PHP version 8.1, Dell G3 computer.

How to solve errors reported under high concurrency in php?

Solving the 502 and 504 errors reported by Nginx + PHP under high concurrency:

Recently I have been helping the company optimize a PHP project, searching Baidu as I went. The project's traffic is quite heavy (about 80,000 requests per minute on average).

We used three AWS servers: two 8-core/16 GB machines and one 4-core/16 GB machine. The smaller one ran Nginx plus a handful of php-fpm processes. Almost as soon as it went live it fell over: requests all returned 502 and 504. The project itself was fine, since it had passed testing before, so I started searching Baidu for the cause.

1. Suspected that nginx's configured connection limits were too low to handle the load, so I increased two nginx parameters.

worker_connections is the maximum number of connections allowed per worker process. In theory, the maximum number of connections an nginx server can handle is worker_processes * worker_connections.

 worker_connections 5000;

worker_rlimit_nofile is the maximum number of file descriptors an nginx process may open. The theoretical value is the system's maximum number of open files (ulimit -n) divided by the number of nginx processes.

worker_rlimit_nofile 20000;
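As a sanity check, the per-worker file-descriptor budget can be computed in the shell. The 65535 limit and the count of 4 workers below are assumed example values, not taken from the article:

```shell
# Show the current open-file limit for this shell session
ulimit -n

# Theoretical ceiling per nginx worker: open-file limit / worker count.
# Assumed example: a 65535 limit split across 4 worker processes.
echo $(( 65535 / 4 ))   # → 16383
```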

FastCGI timeout and buffer settings for PHP requests:

fastcgi_connect_timeout 300;
fastcgi_send_timeout 300;
fastcgi_read_timeout 300;
fastcgi_buffer_size 64k;
fastcgi_buffers 4 64k;
fastcgi_busy_buffers_size 128k;
fastcgi_temp_file_write_size 256k;
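For context, here is a sketch of where these directives sit in nginx.conf. The events/http layout and the fastcgi_pass address are assumptions for illustration; only the values come from the article:

```nginx
worker_processes  4;             # commonly one per CPU core
worker_rlimit_nofile 20000;      # main (top-level) context

events {
    worker_connections 5000;     # per-worker connection cap
}

http {
    server {
        location ~ \.php$ {
            fastcgi_pass 127.0.0.1:9000;   # assumed php-fpm address
            include fastcgi_params;
            fastcgi_connect_timeout 300;
            fastcgi_send_timeout 300;
            fastcgi_read_timeout 300;
            fastcgi_buffer_size 64k;
            fastcgi_buffers 4 64k;
            fastcgi_busy_buffers_size 128k;
            fastcgi_temp_file_write_size 256k;
        }
    }
}
```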

I restarted nginx after applying the settings, but testing showed no improvement at all.

2. Suspected a php-fpm configuration problem.

First, confirm whether php-fpm has enough worker processes; if they are all busy, new requests queue up and time out.

Count the worker processes currently running:

ps -ef | grep 'php-fpm' | grep -v 'master' | grep -v 'grep' | wc -l

Count the worker processes currently in use, i.e. the requests being processed:

netstat -anp | grep 'php-fpm' | grep -v 'LISTEN' | grep -v 'php-fpm.conf' | wc -l

If the two values are close, all workers are busy, and you can consider increasing the number of php-fpm worker processes.
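The two counts above can be compared in one step. This is a sketch: it assumes php-fpm listens on TCP so its connections show up in netstat, and that netstat is installed (otherwise the busy count falls back to 0):

```shell
# Total php-fpm workers (excluding the master process)
total=$(ps -ef | grep 'php-fpm' | grep -v 'master' | grep -v 'grep' | wc -l)

# Workers with an active connection, i.e. requests in flight
busy=$(netstat -anp 2>/dev/null | grep 'php-fpm' | grep -v 'LISTEN' | wc -l)

echo "workers: $total  busy: $busy"
```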

So I modified the php-fpm process counts in php-fpm.conf accordingly, but it made no difference whether I turned these parameters up or down. Desperate!
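The parameters in question are the pm.* settings in php-fpm.conf (or the pool file). The values below are illustrative, not the article's:

```ini
; pool file, e.g. /etc/php-fpm.d/www.conf -- illustrative values
pm = dynamic
pm.max_children = 100      ; hard cap on worker processes
pm.start_servers = 20
pm.min_spare_servers = 10
pm.max_spare_servers = 30
```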

Then I set the php-fpm.conf log level to log_level = debug and found errors in the error_log file:

[29-Mar-2014 22:40:10] ERROR: failed to ptrace(PEEKDATA) pid 4276: Input/output error (5)
[29-Mar-2014 22:53:54] ERROR: failed to ptrace(PEEKDATA) pid 4319: Input/output error (5)
[29-Mar-2014 22:56:30] ERROR: failed to ptrace(PEEKDATA) pid 4342: Input/output error (5)

So I started to Google this error in turn and found an article (http://www.mamicode.com/info-detail-1488604.html). It says the slow-request log needs to be disabled: slowlog = /var/log/php-fpm/slow.log and request_slowlog_timeout = 15s. Only then did I realize that php-fpm also records slow requests while serving traffic. Opening the slow log file showed that every entry was caused by PHP requests to redis.
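Disabling the slow log means commenting out (or removing) these two lines in the php-fpm configuration and reloading php-fpm; the ptrace(PEEKDATA) errors come from the slow-request tracer, which is why they disappear with it:

```ini
; comment out both lines to stop slow-request logging
;slowlog = /var/log/php-fpm/slow.log
;request_slowlog_timeout = 15s
```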

That pinpointed the cause: when PHP requested data from redis, too many connections piled up and redis could no longer accept them. Because the business logic here is complex, the redis keys are built by concatenating several fields, and queries use fuzzy matching. All of this degraded redis's performance, so a large number of subsequent requests could not connect to redis. Since I was the one who had switched this code over to redis, I restored the original code that queries MySQL.

The project is currently running normally, but the CPU on each server is close to 100%. I worry that the CPUs will saturate and MySQL will not be able to withstand the connection load. Optimization will have to continue later!

Recommended learning: "PHP Video Tutorial"

The above is the detailed content of how to solve errors reported under high concurrency in PHP. For more information, please follow other related articles on the PHP Chinese website!
