I downloaded a 200 MB SQL file from the database to prepare for testing, and I am looking for a way to import it. Tips: importing it with source from the CLI brings the server straight down, and I have personally confirmed that the SQLyog tool fails the same way. Also, I set max_allowed_packet=500M in my.ini, but it had no effect after restarting MySQL, which is also strange.
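One thing to check first (a sketch, assuming a Windows install since my.ini is mentioned): the server only reads max_allowed_packet from the [mysqld] section, so a value placed under [client] or [mysqldump] is silently ignored by the server, which would explain why restarting changed nothing.

# my.ini -- the server picks this variable up only from the [mysqld] section
[mysqld]
max_allowed_packet=500M

-- after restarting, verify in the mysql client:
SHOW VARIABLES LIKE 'max_allowed_packet';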
There is a tool that can split a large SQL dump into multiple files of a chosen size, such as 10 MB, with the table-structure statements in a separate file. You can search for it online; it is called SQLDumpSplitter.
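If you split the dump, the pieces then need to be imported in order. A minimal sketch (the file names structure.sql and part_*.sql are hypothetical; use whatever names the splitter produces):

mysql -u <user> -p <database> < structure.sql
for f in part_*.sql; do mysql -u <user> -p <database> < "$f"; done

On Windows cmd the equivalent loop is: for %f in (part_*.sql) do mysql -u <user> -p <database> < %f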
Could you export from the data source again and import it directly?

mysqldump -u <user> -p -P <port> <database> > dump.sql
mysql -u <user> -p -P <port> <database> < dump.sql
How fast is this?
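A large import is mostly index- and transaction-bound. A common speed-up (a sketch; session-scoped settings, only appropriate for a dump you trust) is to disable the per-row checks while sourcing:

-- inside the mysql client, before the import
SET foreign_key_checks = 0;
SET unique_checks = 0;
SET autocommit = 0;
SOURCE dump.sql;
COMMIT;
SET autocommit = 1;
SET unique_checks = 1;
SET foreign_key_checks = 1;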
Modify:
wait_timeout = 2880000
interactive_timeout = 2880000
max_allowed_packet = 20M  -- this parameter should not be set too large
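If editing my.ini seems to have no effect, the same variables can also be changed at runtime (a sketch; SET GLOBAL needs the SUPER privilege, or SYSTEM_VARIABLES_ADMIN on MySQL 8.0, and a new global max_allowed_packet only applies to connections opened after the change):

SET GLOBAL wait_timeout = 2880000;
SET GLOBAL interactive_timeout = 2880000;
SET GLOBAL max_allowed_packet = 20 * 1024 * 1024;  -- 20M
-- reconnect, then confirm the value actually changed:
SHOW VARIABLES LIKE 'max_allowed_packet';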