I need to import Excel data into a database table, and the volume is fairly large: up to several hundred thousand rows per import. The data is not written to the DB directly but goes through an interface that can only process up to 1,000 records per call, so a full import takes a long time. My idea is to make this a task-based import: each upload creates an import task, all tasks go into a queue, and the tasks are completed one by one. What are the benefits of doing it this way, and what methods or suggestions do you have?
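For reference, here is a minimal sketch of what I have in mind, in PHP with PDO; the import_task table and its columns are made up for illustration. Each upload would become one task row, and a background worker would later feed the rows to the interface 1,000 at a time.

<?php
// Sketch of the "one task per upload" idea.
// Assumed (hypothetical) table:
//   import_task(id, file_path, total_rows, status, created_at)
// status moves through: pending -> running -> done / failed.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

function createImportTask(PDO $pdo, string $filePath, int $totalRows): int
{
    $stmt = $pdo->prepare(
        'INSERT INTO import_task (file_path, total_rows, status, created_at)
         VALUES (?, ?, "pending", NOW())'
    );
    $stmt->execute([$filePath, $totalRows]);
    return (int) $pdo->lastInsertId();
}

// The upload handler only registers the file; nothing is sent to the
// interface yet, so the web request returns immediately.
$taskId = createImportTask($pdo, '/data/uploads/orders.xlsx', 300000);
echo "queued import task #{$taskId}\n";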
Your idea is good, but PHP on its own is not well suited to this kind of long-running background work; I'd recommend implementing the worker in a language like Java.
On the PHP side, a request simply writes the task parameters into the database.
A background Java process then checks the database every 100 ms for pending tasks and processes any it finds (the same pattern is sketched below).
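The polling pattern described above, sketched in PHP for consistency with the rest of the thread (the worker could just as well be a Java service); import_task and callImportApi() are assumed names, not real APIs:

<?php
// Long-running worker: poll the task table and process pending tasks one by one.
set_time_limit(0);

$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

while (true) {
    $task = $pdo->query(
        'SELECT * FROM import_task WHERE status = "pending" ORDER BY id LIMIT 1'
    )->fetch(PDO::FETCH_ASSOC);

    if ($task === false) {
        usleep(100 * 1000);   // no pending task: sleep 100 ms, then poll again
        continue;
    }

    $pdo->prepare('UPDATE import_task SET status = "running" WHERE id = ?')
        ->execute([$task['id']]);

    // Read the spreadsheet and feed the interface 1,000 rows per call:
    // $rows = parseExcelRows($task['file_path']);   // hypothetical helper
    // foreach (array_chunk($rows, 1000) as $batch) {
    //     callImportApi($batch);                    // hypothetical interface call
    // }

    $pdo->prepare('UPDATE import_task SET status = "done" WHERE id = ?')
        ->execute([$task['id']]);
}

Because a single worker takes tasks in id order, the imports are completed one at a time, which matches the queue idea in the question.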
PHP's max_execution_time defaults to 30 seconds; you can add set_time_limit(99999999); (or set_time_limit(0) for no limit) at the top of the import script.
Hundreds of thousands of rows should finish importing in under a minute.
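A minimal illustration of that suggestion; the actual import logic is omitted:

<?php
// Lift PHP's limits for a long import run.
set_time_limit(0);                 // 0 = no time limit (a huge value like 99999999 also works)
ini_set('memory_limit', '512M');   // large spreadsheets may also need more memory
ignore_user_abort(true);           // keep running even if the browser disconnects

// ... parse the Excel file and push it to the interface here ...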
Crontab plus a task list is a very good fit for background asynchronous work in this scenario:
use MySQL's LOAD DATA INFILE to load the data into a temporary table quickly,
then have a PHP daemon (or a crontab-scheduled script) sync the data from that table to the interface in batches; a rough sketch follows.
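A rough sketch of that flow, assuming a hypothetical staging table import_staging with an auto-increment id and columns (name, amount, created_at), plus a callImportApi() stand-in for the real interface; the crontab line in the comment shows one way to schedule it:

<?php
// Run from cron, e.g. once a minute:
//   * * * * * php /var/www/scripts/sync_import.php >> /var/log/import.log 2>&1
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,   // needed for LOAD DATA LOCAL INFILE
]);

// 1) Bulk-load a CSV export of the spreadsheet into the staging table.
//    (LOAD DATA reads text files, so the .xlsx would be converted to CSV first.)
$pdo->exec(
    "LOAD DATA LOCAL INFILE '/data/uploads/orders.csv'
     INTO TABLE import_staging
     FIELDS TERMINATED BY ','
     LINES TERMINATED BY '\\n'
     (name, amount, created_at)"
);

// 2) Push the staged rows to the interface 1,000 at a time.
$offset = 0;
while (true) {
    $stmt = $pdo->prepare('SELECT * FROM import_staging ORDER BY id LIMIT 1000 OFFSET ?');
    $stmt->bindValue(1, $offset, PDO::PARAM_INT);
    $stmt->execute();
    $batch = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$batch) {
        break;                    // nothing left to send
    }

    // callImportApi($batch);     // hypothetical call to the 1,000-row interface
    $offset += count($batch);
}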
Have PHP assemble the rows into SQL and import them into MySQL directly, running in the background.
It is best to import in chunks rather than all at once; see the sketch below.
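A minimal sketch of that chunked approach, assuming a hypothetical target table orders with columns (name, amount); prepared placeholders are used instead of raw string splicing so values do not have to be escaped by hand:

<?php
// Insert in chunks of 1,000 rows instead of one giant statement.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$rows = [/* rows parsed from the Excel file, e.g. ['name' => 'a', 'amount' => 1] */];

foreach (array_chunk($rows, 1000) as $chunk) {
    $placeholders = [];
    $values = [];
    foreach ($chunk as $row) {
        $placeholders[] = '(?, ?)';
        $values[] = $row['name'];
        $values[] = $row['amount'];
    }

    // One multi-row INSERT per 1,000 rows keeps each statement small.
    $sql = 'INSERT INTO orders (name, amount) VALUES ' . implode(', ', $placeholders);
    $pdo->prepare($sql)->execute($values);
}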