Import Large SQL File


Question

I am a student conducting some research which involves a sort of data mining. I have several volunteer "node" servers which gather and produce SQL files for me to import on my server and analyze.

The problem is, these are very big files, and I need a way to import them quickly. The network recently expanded, and now there just isn't enough throughput on the hard drive for the MySQL console to import them as they come in. And time is important: there is a deadline for the research, and I want to be actively gathering data for as much of the remaining time as possible, not waiting on a queue of files to be inserted.

I am wondering if there is a better way to import very large files - each one weighs in at about 100 MB. I've tried "\. myfile.sql", but that is incredibly slow. PHPMyAdmin won't take files that big.

Any ideas? Thanks!

Answer

Have you tried mysql -uYOU -p < myfile.sql?
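Not part of the original answer, but a commonly used complement when bulk-loading dumps: wrapping the file so MySQL skips autocommit, uniqueness, and foreign-key checks for the duration of the load. A sketch, assuming a dump file myfile.sql and placeholder credentials:

```shell
# Stand-in for one of the node-generated dumps (the real files are ~100 MB).
printf 'INSERT INTO t VALUES (1);\n' > myfile.sql

# Wrap the dump so per-row checks are deferred during the bulk load.
{
  echo 'SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;'
  cat myfile.sql
  echo 'COMMIT; SET unique_checks=1; SET foreign_key_checks=1;'
} > wrapped.sql

# Feed it to mysql when a server is available (credentials hypothetical):
# mysql -uYOU -p YOUR_DB < wrapped.sql
head -n 1 wrapped.sql
```

The speed-up mainly helps InnoDB tables, where checking uniqueness and foreign keys per inserted row is what dominates large imports.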

UPD:

Or even mysql -uYOU -p < myfile.sql &, if you have a short-lived remote console session.

UPD2:

But the most efficient way is to use mysqlimport, as PinnyM advised. This assumes name_same_as_table.txt is a text file with DOS-style line endings and tab-separated fields; the number and types of the fields must match those in the destination table.

mysqlimport -uYOU -p --lock-tables --lines-terminated-by="\r\n" --fields-terminated-by="\t" YOUR_DB name_same_as_table.txt
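As an illustration (the file name and values below are hypothetical), a file in the format mysqlimport expects can be produced like this; the tool derives the target table name from the file's base name:

```shell
# Each record: tab-separated fields terminated by DOS-style \r\n.
# The base name of the file must match the destination table.
printf '1\tAlice\r\n' >  name_same_as_table.txt
printf '2\tBob\r\n'   >> name_same_as_table.txt

# Hypothetical invocation (needs a running server and a matching table):
# mysqlimport -uYOU -p --lock-tables --lines-terminated-by="\r\n" \
#   --fields-terminated-by="\t" YOUR_DB name_same_as_table.txt
grep -c 'Alice' name_same_as_table.txt
```

mysqlimport is a command-line wrapper around LOAD DATA INFILE, which is why it avoids the per-statement parsing overhead of replaying a dump full of INSERTs.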

