MySQL batch data entry methods: 1. Loop insertion; 2. Reduce connection overhead by concatenating one SQL statement; 3. Use a stored procedure; 4. Use MySQL LOCAL_INFILE.
Operating environment for this tutorial: Windows 7, MySQL 8.0.22, Dell G3 computer. These methods work on any brand of computer.
MySQL batch data entry methods:
Method 1: Loop insertion
This is the most common method. It is fine when the amount of data is not very large, but each insert costs a separate query and its share of database connection resources.
The general idea is as follows (this is pseudocode; adapt it to your own business logic or framework syntax):
for ($i = 1; $i <= 100; $i++) {
    $sql = 'insert...............';
    // query sql
}

foreach ($arr as $key => $value) {
    $sql = 'insert...............';
    // query sql
}

while ($i <= 100) {
    $sql = 'insert...............';
    // query sql
    $i++;
}
Because it is so common and not difficult, and it is not the main focus of this article, I won't say much more about it here.
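Still, for reference, here is a minimal runnable sketch of this pattern using a PDO prepared statement; the DSN, credentials, table `user` and its columns are assumptions made up for illustration, not part of the original article.

<?php
// Sketch of method 1: one prepared INSERT executed in a loop.
// DSN, credentials, table `user` and its columns are hypothetical.
$pdo = new PDO('mysql:dbname=test;host=127.0.0.1;port=3306', 'dbuser', 'dbpwd', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->prepare('INSERT INTO user (username, password) VALUES (?, ?)');

for ($i = 1; $i <= 100; $i++) {
    // One round trip per row: simple, but the per-query overhead adds up.
    $stmt->execute(['user' . $i, password_hash('secret' . $i, PASSWORD_DEFAULT)]);
}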
Method 2: Reduce connection overhead by concatenating one SQL statement
The pseudocode is as follows:
// Here we assume the keys of $arr match the database column names;
// most PHP frameworks design their database layer this way.
$arr_keys = array_keys($arr);
$sql = 'INSERT INTO tablename (' . implode(',', $arr_keys) . ') values';

// Repeat the two lines below for each row you want to insert:
$arr_values = array_values($arr);
$sql .= " ('" . implode("','", $arr_values) . "'),";

// Strip the trailing comma.
$sql = substr($sql, 0, -1);

// After concatenation the statement looks roughly like:
// INSERT INTO tablename (username, password) values ('xxx','xxx'),('xxx','xxx'),('xxx','xxx'),('xxx','xxx') .......
// query sql
Writing 10,000 ordinary rows this way is not a big problem unless the individual values are very long. It is enough for ordinary batch insertion, such as generating card numbers or random codes in bulk.
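As a safer variant of the same splicing idea, the sketch below builds one multi-row INSERT with placeholders instead of quoting values by hand; the table `card` and the shape of $rows are assumptions for illustration.

<?php
// Sketch of method 2 with placeholders: one statement, many rows,
// no manual quoting. Table `card` and $rows are hypothetical.
$pdo = new PDO('mysql:dbname=test;host=127.0.0.1;port=3306', 'dbuser', 'dbpwd');

$rows = [
    ['card_no' => 'A0001', 'random_code' => 'x1y2z3'],
    ['card_no' => 'A0002', 'random_code' => 'k4m5n6'],
    // ... more rows
];

$cols        = array_keys($rows[0]);
$placeholder = '(' . implode(',', array_fill(0, count($cols), '?')) . ')';
$sql = 'INSERT INTO card (' . implode(',', $cols) . ') VALUES '
     . implode(',', array_fill(0, count($rows), $placeholder));

// Flatten the row values in column order to match the placeholders.
$params = [];
foreach ($rows as $row) {
    foreach ($cols as $col) {
        $params[] = $row[$col];
    }
}

$pdo->prepare($sql)->execute($params);

When the batch grows very large, keep an eye on max_allowed_packet and the total number of placeholders, and split the rows into chunks if necessary.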
Method 3: Use stored procedures
I happen to be using this at the moment, so I am including the SQL here; fill in your own business logic as needed.
delimiter $$$
create procedure zqtest()
begin
    declare i int default 0;
    set i = 0;
    start transaction;
    while i < 80000 do
        -- your insert sql
        set i = i + 1;
    end while;
    commit;
end
$$$
delimiter ;

call zqtest();
This is just test code; define the specific parameters yourself.
Here I insert 80,000 rows at a time. Although that is not a huge number, each row carries a lot of data, with many varchar(4000) and text fields.
It takes 6.524s.
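For completeness, here is a self-contained sketch of the same pattern with a concrete INSERT in the loop body; the table test_data and its columns are made up for illustration and are not the table used for the timing above.

-- Sketch only: the looping stored procedure with a hypothetical table.
drop table if exists test_data;
create table test_data (
    id int auto_increment primary key,
    payload varchar(4000),
    note text
);

delimiter $$$
create procedure fill_test_data()
begin
    declare i int default 0;
    start transaction;
    while i < 80000 do
        insert into test_data (payload, note) values (concat('payload-', i), 'note');
        set i = i + 1;
    end while;
    commit;
end
$$$
delimiter ;

call fill_test_data();

Wrapping the whole loop in one transaction, as above, avoids committing 80,000 times.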
Method 4: Use MySQL LOCAL_INFILE
This is what I currently use, so I am also including the PDO code for your reference.
// Set up PDO with MYSQL_ATTR_LOCAL_INFILE enabled
// hs96.cn@gmail.com
public function pdo_local_info()
{
    global $system_dbserver;
    $dbname = 'hs96.cn@gmail.com';
    $ip     = 'hs96.cn@gmail.com';
    $user   = 'hs96.cn@gmail.com';
    $pwd    = 'hs96.cn@gmail.com';
    $dsn    = 'mysql:dbname=' . $dbname . ';host=' . $ip . ';port=3306';
    $options = [PDO::MYSQL_ATTR_LOCAL_INFILE => true];
    $db = new PDO($dsn, $user, $pwd, $options);
    return $db;
}

// The pseudocode is as follows
public function test()
{
    $arr_keys = array_keys($arr);
    $root_dir = $_SERVER["DOCUMENT_ROOT"] . '/';
    $my_file  = $root_dir . "hs96.cn@gmail.com/sql_cache/" . $order['OrderNo'] . '.sql';
    $fhandler = fopen($my_file, 'a+');
    if ($fhandler) {
        // One tab-separated line per row (tab is the default field terminator for LOAD DATA)
        $sql = implode("\t", $arr);
        $i = 1;
        while ($i <= 80000) {
            $i++;
            // Use "\n" so the file matches LOAD DATA's default line terminator
            fwrite($fhandler, $sql . "\n");
        }
        $sql  = "LOAD DATA local INFILE '" . $my_file . "' INTO TABLE ";
        $sql .= "tablename (" . implode(',', $arr_keys) . ")";
        $pdo = $this->pdo_local_info();
        $res = $pdo->exec($sql);
        if ($res === false) {
            // TODO: insert failed
        }
        @unlink($my_file);
    }
}
Each row again carries a large amount of data, with many varchar(4000) and text fields.
It takes 2.160s
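One usage note: besides enabling PDO::MYSQL_ATTR_LOCAL_INFILE on the client, the server must also permit local loading; in MySQL 8.0 the local_infile system variable is disabled by default. A quick check, to be adapted to your own privileges and security policy:

-- Check whether the server allows LOCAL loading and, with sufficient
-- privileges, turn it on.
SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;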
The above covers the basic needs; one million rows of data is not a big problem. Beyond that, the volume is large enough that you need to shard databases and tables, or insert through a queue.
The above is the detailed content of How to enter data in batches in mysql. For more information, please follow other related articles on the PHP Chinese website!