A MySQL table contains a large amount of data, say 10 million rows occupying 5 GB, and new rows are continually inserted into it. How can I periodically update the status of the newly inserted rows efficiently?
For example: I insert 1,000 rows and want to update the status of those 1,000 rows every 10 minutes. How can I make this efficient?
Use something like a cron job.
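As a minimal sketch of the cron idea, here is a cron-like loop in Python. The `iterations` parameter exists only to make the sketch finite for demonstration; a real cron entry (or `iterations=None`) would run forever, and the job it calls is a placeholder for your status-update logic.

```python
import time

def run_periodically(job, interval_seconds, iterations=None):
    """Cron-like loop: call job() every interval_seconds.

    iterations=None runs forever, like a real cron entry;
    a finite value is useful for testing the sketch."""
    count = 0
    while iterations is None or count < iterations:
        job()  # e.g. a hypothetical update_new_rows() function
        count += 1
        if iterations is None or count < iterations:
            time.sleep(interval_seconds)
```

In practice you would more likely put the update script in a real crontab entry (`*/10 * * * * ...`) than keep a long-running loop alive.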
I'm not sure which status you mean.
If it is a field in that table, just update it by primary key.
The table is large, but update speed mainly depends on the table's indexes. If the updated field is not part of any index and the rows are located by primary key, the update should be very fast, even with tens of millions of rows.
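A minimal sketch of updating by primary key in batches. It uses Python's stdlib `sqlite3` as a self-contained stand-in for MySQL (with MySQL you would use a driver such as mysql-connector-python), and the table name `items` and columns `id`/`status` are assumptions:

```python
import sqlite3

# Assumed schema: 'items(id, status)' stands in for the real table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, status INTEGER)")
conn.executemany("INSERT INTO items (id, status) VALUES (?, 0)",
                 [(i,) for i in range(1, 1001)])

def update_status_by_pk(conn, ids, new_status, chunk=500):
    """Update rows strictly by primary key, in chunks, so each
    UPDATE touches only the listed rows via the PK index."""
    for start in range(0, len(ids), chunk):
        batch = ids[start:start + chunk]
        placeholders = ",".join("?" * len(batch))
        conn.execute(
            f"UPDATE items SET status = ? WHERE id IN ({placeholders})",
            [new_status, *batch],
        )
    conn.commit()

# Remember the ids of the 1,000 newly inserted rows at insert time,
# then update only those rows every 10 minutes.
new_ids = list(range(1, 1001))
update_status_by_pk(conn, new_ids, 1)
```

The key point is recording the new rows' primary keys when inserting, so the periodic update never has to search the 10-million-row table for them.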
If you instead have to query for the 1,000 new rows among the 10 million, that may indeed be slow; it depends on whether your query conditions are covered by an index.
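To illustrate the "index the query condition" point, here is a sketch that finds recently inserted rows through an index on an insert-time column. Again `sqlite3` stands in for MySQL, and the table name `items` and the `created_at` column are assumptions:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE items (
    id INTEGER PRIMARY KEY,
    status INTEGER,
    created_at INTEGER)""")  # insert time as epoch seconds

# Index the column used to find new rows, so the periodic query
# does not scan the whole table.
conn.execute("CREATE INDEX idx_items_created_at ON items (created_at)")

now = int(time.time())
# 10,000 old rows inserted a day ago, plus 1,000 fresh ones.
conn.executemany("INSERT INTO items VALUES (?, 0, ?)",
                 [(i, now - 86400) for i in range(1, 10001)])
conn.executemany("INSERT INTO items VALUES (?, 0, ?)",
                 [(i, now) for i in range(10001, 11001)])

# Rows inserted within the last 10 minutes; this range scan
# uses idx_items_created_at instead of a full table scan.
new_ids = [row[0] for row in conn.execute(
    "SELECT id FROM items WHERE created_at >= ?", (now - 600,))]
```

With MySQL you could confirm the index is used by prefixing the SELECT with `EXPLAIN`.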