Use INSERT with multiple value lists to batch insert, for example:
INSERT INTO test_table (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
But don't put too many rows in a single statement. For example, insert 100 rows per statement and run 1,000 statements, which gives 100,000 rows in total; that will not be very slow.
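The batching idea above can be sketched in a small script. This is a minimal sketch, not part of the original answer: the `build_batched_inserts` helper name and the sample row data are made up for illustration, while the table and column names (`test_table`, `a`, `b`, `c`) follow the example above.

```python
# Build multi-row INSERT statements, batch_size rows per statement,
# so 1,000 statements cover 100,000 rows.
def build_batched_inserts(rows, batch_size=100):
    """Yield one INSERT statement per batch of (a, b, c) integer rows."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ",".join("(%d,%d,%d)" % r for r in batch)
        yield "INSERT INTO test_table (a,b,c) VALUES" + values + ";"

# 1,000 sample rows -> 10 statements of 100 rows each
rows = [(i, i * 2, i * 3) for i in range(1000)]
statements = list(build_batched_inserts(rows))
print(len(statements))  # 10
```

Each yielded statement can be executed directly or written to a file for import; in real code the values should be escaped or passed as bound parameters rather than formatted into the string.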
1. It can certainly be done with SQL statements, but inserting that much data one row at a time will be very slow.
2. It can also be done with a client tool such as SQLyog; just import the data.
Here is what I summarized earlier:
-- Declare a stored procedure (think of it as a function)
delimiter ;;
create procedure myproc()
begin
    declare num int;
    set num = 1;
    while num < 10 do
        insert into user (id, `name`, sex)
        values (null, concat('name', num), 1);  -- null lets auto_increment fill id
        set num = num + 1;
    end while;
end;;
delimiter ;
-- Execute this function
call myproc();
-- View the inserted data
select * from user;
-- Drop the stored procedure
drop procedure myproc;
Is there any way other than SQL? Generate a .sql file and import it directly; 100,000 rows is not that much data.
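Generating the .sql file can be sketched as follows. This is a minimal sketch, not part of the original answer: the `write_sql_file` helper name, the file name, and the row values are assumptions, while the `user (id, name, sex)` columns follow the stored-procedure example above.

```python
# Write batched INSERT statements to a .sql file that can then be
# imported with the mysql client or a GUI tool such as Navicat/SQLyog.
def write_sql_file(path, row_count, batch_size=100):
    with open(path, "w") as f:
        for start in range(1, row_count + 1, batch_size):
            end = min(start + batch_size, row_count + 1)
            values = ",".join(
                "(null,'name%d',1)" % i for i in range(start, end)
            )
            f.write("INSERT INTO user (id, `name`, sex) VALUES%s;\n" % values)

# 100,000 rows -> a file of 1,000 INSERT statements
write_sql_file("bulk_insert.sql", 100000)
```

The resulting file can be loaded with something like `mysql dbname < bulk_insert.sql`.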
Navicat Import Wizard, you deserve it
