I tested this:
ab -n 1000 -c 100 -w http://www.**.cn/index.php >>D:/1.html
Isn’t this concurrency very small?
It depends on what you call small. Let's make a rough estimate.
Take a peak concurrency of 100 on the homepage of an ordinary website, and assume your average response time is 200 ms; that works out to 500 qps at peak.
If the daily average is 1/4 of the peak (assuming traffic in the early hours is close to 0, which pulls the average down), the daily average is 125 qps.
One day's PV = 125 × 24 × 3600 = 10,800,000, i.e. about 10.8 million.
Over ten million PV a day is not a small site.
Of course, the real numbers cannot be worked out this way. The concurrency is not necessarily all on the homepage, and staticizing the homepage is a common way to reduce PHP concurrency. There are many concrete approaches, so I won't go into detail here.
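The back-of-envelope estimate above can be written out directly. The inputs (peak concurrency 100, 200 ms average response time, daily average at 1/4 of peak) are the answer's assumptions, not measurements:

```python
# Rough PV estimate from the answer's assumptions (not measured values).
PEAK_CONCURRENCY = 100
AVG_RESPONSE_S = 0.2            # 200 ms average response time

peak_qps = PEAK_CONCURRENCY / AVG_RESPONSE_S   # 100 in-flight / 0.2 s each = 500 qps
daily_avg_qps = peak_qps / 4                   # daily average assumed 1/4 of peak
daily_pv = daily_avg_qps * 24 * 3600           # requests over a full day

print(f"peak qps:      {peak_qps:.0f}")
print(f"daily avg qps: {daily_avg_qps:.0f}")
print(f"daily PV:      {daily_pv:,.0f}")       # 10,800,000, about 10.8 million
```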
I'm feeling my age. The answer from the 1st floor is very good:
1 day = 86,400 seconds
100 concurrent requests per second
= 8,640,000 requests in one day
That's 8.64 million hits a day.
The numbers look beautiful, but a stress test generally only counts when database operations are involved.
Without any database work, raw QPS on its own is meaningless; any language can push past 500.
And with distributed deployment there is effectively no upper limit at all.
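The arithmetic in the reply above is easy to check; sustaining 100 completed requests every second for a full day gives:

```python
# Sustained throughput over one day at a constant 100 requests/second.
SECONDS_PER_DAY = 24 * 60 * 60      # 86,400 seconds
REQUESTS_PER_SECOND = 100

requests_per_day = SECONDS_PER_DAY * REQUESTS_PER_SECOND
print(f"{requests_per_day:,}")      # 8,640,000 — i.e. 8.64 million, not 8.4
```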
The statement above is wrong: the data the questioner gave does not imply "100 concurrent requests per second".
Suppose that after the questioner hits Enter, ab reports
Complete requests: 1000
Failed requests: 999
Then the questioner has actually given us no information at all.
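This is the crux: an ab run only means something alongside its failure count. A minimal sketch that pulls the two counters out of ab's default plain-text report (note the questioner used `-w`, which emits HTML instead; the sample text below is hypothetical):

```python
import re

# Hypothetical excerpt of ApacheBench's default plain-text report.
ab_output = """\
Complete requests:      1000
Failed requests:        999
"""

def parse_ab(text: str) -> dict:
    """Extract request counters from an ab plain-text report."""
    counts = {}
    for label in ("Complete requests", "Failed requests"):
        m = re.search(rf"{label}:\s+(\d+)", text)
        if m:
            counts[label] = int(m.group(1))
    return counts

counts = parse_ab(ab_output)
failure_rate = counts["Failed requests"] / counts["Complete requests"]
print(f"failure rate: {failure_rate:.1%}")  # with 999/1000 failed, the run proves nothing
```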
For a topic owner like this, all I can say is that I've blocked him.