How to import LARGE sql files into mysql table
Question
I have a PHP script that parses XML files and creates a large SQL file that looks something like this:
INSERT IGNORE INTO table(field1,field2,field3...)
VALUES ("value1","value2",int1...),
("value1","value2",int1)...etc
This file adds up to over 20GB (I've tested on a 2.5GB file, but it fails too).
I've tried the following command:
mysql -u root -p table_name < /var/www/bigfile.sql
This works on smaller files, say around 50MB, but it doesn't work with a larger file.
I tried:
mysql> source /var/www/bigfile.sql
I also tried mysqlimport, but that won't even properly process my file.
I keep getting an error that says:
ERROR 2006 (HY000): MySQL server has gone away
This happens approx. 30 seconds after I start executing.
I set max_allowed_packet to 4GB, but when verifying it with SHOW VARIABLES it only shows 1GB.
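For reference, this is expected behavior rather than a configuration mistake: MySQL hard-caps max_allowed_packet at 1GB (1073741824 bytes), so any larger value is silently clamped. You can verify and set the ceiling like this:

```sql
-- max_allowed_packet is capped at 1GB (1073741824 bytes) by MySQL,
-- so a 4GB setting silently reads back as 1073741824.
SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 1073741824;  -- 1GB, the documented maximum
-- Reconnect afterwards: SET GLOBAL only affects new sessions.
```

Since a single 20GB INSERT can never fit under that 1GB packet limit, raising the variable alone cannot make the original file import.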
Is there a way to do this without wasting another 10 hours?
Recommended answer
Try splitting the file into multiple INSERT queries.
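One way to do the splitting is a small script like the sketch below. It assumes the dump keeps the exact layout shown in the question (one header line, then one "(...)" row tuple per line); a real dump whose string values contain embedded newlines or unbalanced parentheses would need a proper SQL-aware parser instead.

```python
def split_inserts(src_lines, rows_per_stmt=5000):
    """Yield smaller INSERT statements from one giant multi-row INSERT.

    Assumes the layout shown in the question: an INSERT header line,
    a VALUES line carrying the first row tuple, then one "(...)," row
    tuple per subsequent line.  This is a sketch, not a general parser.
    """
    header = src_lines[0].rstrip()          # "INSERT IGNORE INTO table(...)"
    rows = []
    for line in src_lines[1:]:
        row = line.strip().rstrip(",;")     # drop row separators/terminator
        if row.startswith("VALUES"):
            row = row[len("VALUES"):].strip().rstrip(",;")
        if row:
            rows.append(row)
    for i in range(0, len(rows), rows_per_stmt):
        chunk = rows[i:i + rows_per_stmt]
        yield header + "\nVALUES " + ",\n".join(chunk) + ";"
```

Each yielded statement can then be written to its own file (or streamed straight into the mysql client), keeping every statement comfortably under the packet limit. For a 20GB source file you would also want to read the input line by line rather than loading it all into memory.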