For a simple example, suppose you have a batch of tasks with the following characteristics:
1. Long execution time
2. You don't care about the execution results
If there are many such tasks and you use plain multi-threading (one thread per task), the CPU will eventually be "busy" just doing context switches among too many threads.
Since you don't care about the results, you can instead put all the tasks into a queue and have a thread pool execute them. If, say, only 4 tasks run at a time, you have only 4 threads globally: context switching stays cheap and tasks complete in an orderly fashion. The trade-off is that throughput may be somewhat lower than spawning a thread per task.
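The pattern above can be sketched as follows, a minimal fixed-size pool of 4 worker threads draining a shared task queue (the doubling step is a hypothetical stand-in for a long-running job), not any specific framework's implementation:

```python
import queue
import threading

NUM_WORKERS = 4          # only 4 threads globally, no matter how many tasks
tasks = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        task = tasks.get()
        if task is None:          # sentinel: no more work for this worker
            break
        with results_lock:
            results.append(task * 2)   # stand-in for the real long-running job
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for i in range(100):              # 100 tasks, but never more than 4 threads
    tasks.put(i)
for _ in threads:
    tasks.put(None)               # one sentinel per worker, queued after all tasks
for t in threads:
    t.join()

print(len(results))               # → 100
```

Because the queue is FIFO and the sentinels are enqueued last, every task is drained before any worker shuts down.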
For a real-world illustration of this difference, compare Tomcat or Apache httpd (thread-per-request) with nginx or Node (event-driven with a small number of threads).
Another reason is buffering: when the rate of incoming requests is unstable, the queue acts as a buffer, letting data back up in the queue first.
When you studied operating systems, did you learn the producer-consumer model? The producer puts items into a queue, and the consumer reads items from the queue to process them. Why split production and consumption into two threads? Once you understand that, you will understand why queues are used.
The consuming threads are the bottleneck; the queue lets pending work accumulate and be processed gradually.
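The textbook producer-consumer split can be sketched like this: the producer may burst faster than the consumer can keep up, and the queue absorbs the difference so the two threads stay decoupled (the item values here are arbitrary placeholders):

```python
import queue
import threading

q = queue.Queue()
consumed = []

def producer(n):
    for i in range(n):
        q.put(i)          # produce at whatever rate we like; the queue absorbs bursts
    q.put(None)           # sentinel: production finished

def consumer():
    while True:
        item = q.get()
        if item is None:
            break
        consumed.append(item)   # "consume" the item at our own pace

p = threading.Thread(target=producer, args=(10,))
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()
print(consumed)  # → [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Neither thread ever waits on the other directly; all coordination goes through the queue.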
Multithreading and queues are not the same thing.
Plain multithreading is unstable: you have to deal with memory safety, scheduling, and many other issues, which makes it hard to control.
With a queue, the execution rate and order stay within a controllable range, so it is well suited to act as a buffer zone~
You will appreciate this when you need to hold tens of gigabytes of data in memory.