Importing large csv into mysql database


Problem Description





I'm having a really troublesome time trying to import a large CSV file into MySQL on localhost.

The CSV is about 55 MB and has about 750,000 rows.

I've now resorted to writing a script that parses the CSV and inserts the rows one by one.

Here's the code:

$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE) 
{
    // Read one comma-delimited CSV record at a time (lines up to 1000 bytes)
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) 
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) 
        {
            // Each field is itself pipe-delimited; split it into columns
            $arr = explode('|', $data[$c]);

            $postcode = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat = mysql_real_escape_string($arr[6]);
            $lng = mysql_real_escape_string($arr[7]);

            // One INSERT (and one server round trip) per record
            mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`) 
                         values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}

The problem is this is taking forever to execute... any solutions would be great.

Solution

You are reinventing the wheel. Check out the mysqlimport tool, which comes with MySQL. It is an efficient tool for importing CSV data files.

mysqlimport is a command-line interface for the LOAD DATA LOCAL INFILE SQL statement.

Either should run 10-20x faster than doing INSERT row by row.
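As a rough sketch of both approaches (the database name `mydb`, the credentials, and the comma delimiter are assumptions here; the table name and column list are taken from the script above, and `mysqlimport` derives the target table from the file name, hence the copy step):

```shell
# Option 1: mysqlimport -- the file must be named after the target table ("cities")
cp postal_codes.csv cities.csv
mysqlimport --local \
  --fields-terminated-by=',' \
  --fields-optionally-enclosed-by='"' \
  --columns='postcode,city_name,city_slug,prov_name,prov_slug,prov_abbr,lat,lng' \
  -u root -p mydb cities.csv

# Option 2: the equivalent LOAD DATA LOCAL INFILE statement via the mysql client
mysql -u root -p mydb <<'SQL'
LOAD DATA LOCAL INFILE 'postal_codes.csv'
INTO TABLE cities
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(postcode, city_name, city_slug, prov_name, prov_slug, prov_abbr, lat, lng);
SQL
```

Note that `LOCAL` requires `local_infile` to be enabled on both the client and the server, and you would still need to compute the `city_slug`/`prov_slug` columns (e.g. with `SET` clauses in `LOAD DATA`, or a follow-up `UPDATE`).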
