
Detailed explanation of performance optimization of MySQL batch SQL insertion

coldplay.xixi
Release: 2020-12-21 17:43:29

For systems that handle large amounts of data, the problem is not only slow queries but also the long time it takes to load data into the database. Reporting systems in particular may spend several hours, or even more than ten hours, importing data every day, so optimizing insert performance is well worth the effort.

Inserting multiple rows with one SQL statement

A conventional approach issues one INSERT statement per row:

INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('1', 'userid_1', 'content_1', 1);
These rows can be merged into a single multi-row INSERT:

INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0), ('1', 'userid_1', 'content_1', 1);
• The main reason the merged statement executes faster is that it reduces the amount of log data (MySQL's binlog and InnoDB's transaction log) that has to be written, which lowers both the volume and the frequency of log flushes to disk.
• Merging statements also reduces the number of SQL statements that must be parsed and cuts down network transfer I/O.

Benchmark comparison: importing rows one statement at a time versus merging them into a single SQL statement.
[Figure: benchmark results for single-row inserts vs. a merged multi-row insert]

Performing inserts inside a transaction

START TRANSACTION;
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('1', 'userid_1', 'content_1', 1);
...
COMMIT;
• Wrapping the inserts in a transaction improves insert efficiency because MySQL creates a transaction internally for every insert operation anyway, and the actual insert work is carried out inside that transaction.
• Using an explicit transaction avoids the cost of creating and committing a separate transaction for every statement: all of the inserts are executed first and committed together.

Benchmark comparison: inserts executed without an explicit transaction versus inserts wrapped in a transaction.

[Figure: benchmark results with and without an explicit transaction]

Ordered insertion of data

Ordered insertion means sorting the records by primary key before inserting them, so that rows arrive in primary-key order.

Random (unordered) insertion:

INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('1', 'userid_1', 'content_1', 1);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('2', 'userid_2', 'content_2', 2);
Ordered insertion (sorted by primary key):

INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('1', 'userid_1', 'content_1', 1);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('2', 'userid_2', 'content_2', 2);
• Because the database has to maintain index data as rows are inserted, out-of-order records increase the cost of maintaining the index.

Consider the B+tree index used by InnoDB. If every inserted record lands at the end of the index, index positioning is very efficient and the index needs little adjustment. If records are inserted into the middle of the index, the B+tree has to split and merge pages, which consumes more computing resources and makes index positioning for the inserted records slower. Once the data volume is large, this leads to frequent disk operations.
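When the data cannot easily be pre-sorted in the application, a common way to get ordered insertion is to let the database assign the order itself. The sketch below is illustrative only; the article does not show the table definition, so the column types and the surrogate `id` key are assumptions.

-- Minimal sketch (assumed schema, not from the article): an AUTO_INCREMENT
-- primary key makes every new row land at the right-most leaf of InnoDB's
-- clustered index, so inserts behave like ordered inserts automatically.
CREATE TABLE `insert_table` (
    `id`       INT UNSIGNED NOT NULL AUTO_INCREMENT,
    `datetime` VARCHAR(32)  NOT NULL,
    `uid`      VARCHAR(32)  NOT NULL,
    `content`  VARCHAR(255) NOT NULL,
    `type`     INT          NOT NULL,
    PRIMARY KEY (`id`)
) ENGINE=InnoDB;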

Benchmark comparison: inserting randomly ordered data versus sequentially ordered data.

[Figure: benchmark results for random-order vs. ordered inserts]

Dropping indexes before inserting and rebuilding them after the insert completes
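The article gives no code for this step, so the following is only a minimal sketch. The secondary index name `idx_uid` is hypothetical, and only secondary indexes should be handled this way, not the primary key.

-- Drop the secondary index before the bulk load (index name is assumed) ...
ALTER TABLE `insert_table` DROP INDEX `idx_uid`;

-- ... run the bulk INSERT statements here ...

-- ... then rebuild the index once, after all rows have been loaded.
ALTER TABLE `insert_table` ADD INDEX `idx_uid` (`uid`);

Rebuilding the index once at the end avoids paying the index-maintenance cost for every inserted row.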

Comprehensive performance test

[Figure: comprehensive benchmark results for the combined optimization methods]

• The merge-data-plus-transaction method shows a clear performance improvement while the data volume is small, but performance drops sharply once the data volume becomes large. At that point the data exceeds the capacity of the InnoDB buffer pool, so each index lookup involves more disk reads and writes, and performance degrades quickly.
• The merge-data-plus-transaction method applied to ordered data still performs well when the data volume reaches tens of millions of rows: with ordered data, index positioning stays cheap and does not require frequent disk reads and writes, so a high insert rate can be maintained (a sketch of this combined method follows below).
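For reference, here is a minimal sketch of the combined method tested above: rows are sorted by primary key, merged into multi-row INSERT statements, and committed in a single transaction. The row values are illustrative.

START TRANSACTION;
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('0', 'userid_0', 'content_0', 0),
           ('1', 'userid_1', 'content_1', 1),
           ('2', 'userid_2', 'content_2', 2);
INSERT INTO `insert_table` (`datetime`, `uid`, `content`, `type`)
    VALUES ('3', 'userid_3', 'content_3', 3),
           ('4', 'userid_4', 'content_4', 4);
COMMIT;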

Notes

• SQL statements have a length limit, so a merged statement must stay within it. The limit can be changed through the max_allowed_packet configuration option; the default is 1M, and it was raised to 8M for the tests in this article (see the sketch after this list).
• Transactions need to be kept to a reasonable size; a transaction that is too large can hurt execution efficiency. MySQL has the innodb_log_buffer_size configuration option; once that value is exceeded, InnoDB data is flushed to disk and efficiency drops. A better approach is to commit the transaction before the data reaches that value (see the sketch after this list).
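As a minimal sketch of how both limits can be checked and worked around (exact values and required privileges depend on the deployment):

-- Check the current statement-size limit, then raise it so merged INSERTs fit.
-- 8M matches the value used in the tests above; SET GLOBAL requires privileges
-- and only affects new connections.
SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 8 * 1024 * 1024;

-- Check the redo log buffer size, then keep each transaction below it by
-- committing in batches instead of one huge transaction.
SHOW VARIABLES LIKE 'innodb_log_buffer_size';

START TRANSACTION;
-- ... first batch of merged INSERT statements ...
COMMIT;

START TRANSACTION;
-- ... next batch ...
COMMIT;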
