Optimize massive MySQL INSERTs

Problem description

I've got an application which needs to run a daily script; the daily script consists of downloading a CSV file with 1,000,000 rows and inserting those rows into a table.

I host my application on Dreamhost. I created a while loop that goes through all the CSV's rows and performs an INSERT query for each one. The thing is that I get a "500 Internal Server Error". Even if I chop it up into 1,000 files with 1,000 rows each, I can't insert more than 40 or 50 thousand rows in the same loop.
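
A minimal sketch of the loop described above, assuming a Python script using mysql-connector-python (the actual language isn't stated) and placeholder table/column names:

import csv
import mysql.connector

# Placeholder credentials and names; the real script's details differ.
conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="mydb")
cur = conn.cursor()

with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the CSV header row
    for row in reader:
        # One INSERT statement (one client/server round trip) per row
        cur.execute(
            "INSERT INTO tbl_name (col_a, col_b, col_c) VALUES (%s, %s, %s)",
            row)

conn.commit()
cur.close()
conn.close()

Each iteration is a separate statement (and, with autocommit on, a separate transaction), which is why row-by-row loading of a million rows is slow and can easily exceed a shared host's execution limits.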

Is there any way that I could optimize the input? I'm also considering going with a dedicated server; what do you think?

Thanks!

Pedro

Recommended answer

Most databases have an optimized bulk-insertion process - MySQL's is the LOAD DATA INFILE syntax.

To load a CSV file, use:

LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'   -- comma-separated values, quoted with "
  LINES TERMINATED BY '\r\n'                 -- Windows-style CRLF line endings
  IGNORE 1 LINES;                            -- skip the header row
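
On shared hosting the downloaded CSV usually lives on the web host rather than on the MySQL server, in which case the LOCAL variant (LOAD DATA LOCAL INFILE) is needed and both server and client must have local_infile enabled. A sketch of issuing the statement from the daily script, again assuming Python with mysql-connector-python and the same placeholder names:

import mysql.connector

# allow_local_infile lets this client send a local file to the server;
# the server must also have the local_infile option enabled.
conn = mysql.connector.connect(host="mysql.example.com", user="user",
                               password="secret", database="mydb",
                               allow_local_infile=True)
cur = conn.cursor()
cur.execute(r"""
    LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE tbl_name
      FIELDS TERMINATED BY ',' ENCLOSED BY '"'
      LINES TERMINATED BY '\r\n'
      IGNORE 1 LINES
""")
conn.commit()
cur.close()
conn.close()

A single LOAD DATA statement avoids the per-row round trips and per-statement overhead of the loop above, which is why it is usually far faster than issuing individual INSERTs.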
