This article looks at how much data ThinkPHP's addAll() method can insert in one call, and how to avoid hitting that limit.
ThinkPHP models provide two insert methods: add() for a single row and addAll() for a batch:
$User = M("User"); // instantiate the User model
$data['name'] = 'ThinkPHP';
$data['email'] = 'ThinkPHP@gmail.com';
$User->add($data);
$dataList[] = array('name'=>'thinkphp','email'=>'thinkphp@gmail.com');
$dataList[] = array('name'=>'onethink','email'=>'onethink@gmail.com');
$User->addAll($dataList);
The addAll() method inserts rows in batches using MySQL's multi-row INSERT syntax:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
When the amount of data is large, prefer a batch insert over inserting rows one by one in a loop; otherwise the repeated round trips will overwhelm your database.
But if you simply collect all the data into one array and pass it to addAll(), the insert can still fail. Why?
The reason is MySQL's max_allowed_packet variable, which caps the size of a single SQL statement (in fact, of any client/server packet). Raise it in the MySQL configuration file:
max_allowed_packet = 100M
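If editing the configuration file is inconvenient, the value can also be inspected and, with sufficient privileges, raised at runtime; a hedged sketch (the 100M value mirrors the config line above and is illustrative, not a recommendation):

```sql
-- Check the current limit (reported in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for new connections only; a server restart resets it
-- unless the config file is also updated
SET GLOBAL max_allowed_packet = 104857600; -- 100M
```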
At the same time, cap the size of each batch in your application code; after all, you never know when the data will grow to millions of rows.
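One way to apply that cap is to split the array with PHP's array_chunk() and call addAll() once per slice. This is a minimal sketch, not part of ThinkPHP itself: the batch size of 1000 is an illustrative value you should tune, and the $User model call is assumed from the earlier example (commented out here so the snippet stands alone).

```php
<?php
// Build a large sample data set (5000 rows).
$dataList = array();
for ($i = 0; $i < 5000; $i++) {
    $dataList[] = array('name' => 'user' . $i, 'email' => 'user' . $i . '@example.com');
}

// Split into slices of at most 1000 rows each.
$batches = array_chunk($dataList, 1000);

foreach ($batches as $batch) {
    // Assumed ThinkPHP 3.x model from earlier: $User = M('User');
    // $User->addAll($batch); // one INSERT ... VALUES (...),(...) per slice
}

echo count($batches); // prints 5: five INSERT statements instead of one huge packet
```

Each slice becomes one multi-row INSERT that stays comfortably under max_allowed_packet, regardless of how large $dataList grows.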
That is all for this article; I hope it is helpful for your study. For more related content, please follow the PHP Chinese website!