MySQL Large Table Import


Problem Description

I have a 1GB SQL text file that I'm importing into MySQL. I've used a 'large text file viewer' and can see it's a standard MySQL table export - it starts with a DROP TABLE, then a CREATE TABLE, and then the INSERTs. Note: I also have a CSV of the table.

I've used the following methods to import it:

  1. php.ini - upload_max_filesize, memory_limit and post_max_size (increased these, but it still said the file was too large - made the numbers very large, 10 zeros after the default values; see the php.ini sketch after this list).

  2. mysqlimport from the command line:

C:\wamp\bin\mysql\mysql5.1.53\bin>mysqlimport -h localhost -u root -p --fields-terminated-by='\t' --lines-terminated-by='\n' --local testing c:\temp\filename.csv

  3. source from within the mysql console:

source c:\temp\filename.sql
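
The php.ini directives from method 1 would look something like the following (the values shown are only illustrative; the asker's actual values aren't given):

; raise PHP's upload and memory limits for large imports through phpMyAdmin
upload_max_filesize = 2048M
post_max_size = 2048M
memory_limit = 2048M

Note that these settings only affect imports that go through PHP (e.g. phpMyAdmin uploads); they have no effect on mysqlimport or the mysql console.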

I can see in Task Manager that the second two methods seem to move the entire 1GB file into memory before they try to insert it into MySQL. I can see the process for each one slowly grow to over 1GB and then fail with an error.

Question: Is there a preferable and faster way to import a table? Is there a way that doesn't need to move the entire file into memory? This is the first time I've worked with such a large SQL table.

Thanks

Recommended Answer

Part of the problem with having such a huge export is (presumably) the redundancy in it. There are probably thousands of INSERT commands which, when loaded all at once, take up a huge chunk of memory.

If you have the exact same data as a CSV, what I would do is create a small PHP script that goes line by line through the CSV file, builds an INSERT SQL query dynamically, and then executes it. This should keep the memory footprint very low, since you keep replacing the values of your temporary variables (corresponding to the fields) after each insert.
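
As a rough illustration, a minimal sketch of such a script might look like the one below. The database name, table name, column names and the tab delimiter are assumptions made for the example (the delimiter matches the --fields-terminated-by='\t' used with mysqlimport above), not details taken from the question:

<?php
// Connection details and database name are placeholders.
$db = new mysqli('localhost', 'root', '', 'testing');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}

// Prepare the INSERT once; only the bound values change for each row.
// Table and column names are assumed for illustration.
$stmt = $db->prepare('INSERT INTO my_table (col_a, col_b, col_c) VALUES (?, ?, ?)');

// fgetcsv reads a single line per call, so the 1GB file is never held in memory.
$handle = fopen('c:/temp/filename.csv', 'r');
while (($row = fgetcsv($handle, 0, "\t")) !== false) {
    $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
    $stmt->execute();
}

fclose($handle);
$stmt->close();
$db->close();
?>

Wrapping batches of rows in a single transaction would speed the inserts up considerably, but even the plain loop above keeps memory usage to roughly one row at a time.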

I've linked below to the function I'd try to use. It's called fgetcsv. That page also contains some sample scripts created by other users, which I believe might be of use to you.

Good luck!

The fgetcsv PHP function, for reading a file line by line

