How to handle large CSV files to insert into mysql


Problem Description


I have a CSV file (saved as .txt) that I am currently parsing, but the file is about 350MB uncompressed. Zipped, it shows in the archive as 23MB. My system completely freezes when I try to parse the 350MB file. I store the lines in an array like this; the first row contains the headings.

$fh = fopen($inputFile, 'r');
$contents = fread($fh, filesize($inputFile)); // reads the whole file into memory
fclose($fh);
//$contents = str_replace('"','',$contents);

$fileLines = explode("\n", $contents); // split into lines; the first line holds the headings

Then I loop through each line and insert it into MySQL. Since the file is about 350MB, is there a way to parse it straight from the .zip file, like .zip_filename.txt, and would that even make a difference?

The file is too large to insert directly into mysql through the import method.
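On the .zip question above: PHP's zip:// stream wrapper can read an entry out of an archive without extracting it first, and combining it with line-by-line reading avoids loading the whole 350MB into memory. This is only a sketch, assuming the ext-zip extension is enabled; the archive and entry names are hypothetical, and the block builds a tiny demo archive so it is self-contained.

```php
<?php
// Sketch: stream a CSV entry directly out of a .zip archive via the
// zip:// wrapper, reading one row at a time instead of fread()ing the
// whole decompressed file. Assumes ext-zip is enabled; file names here
// are hypothetical examples.

// Build a small demo archive so the sketch is self-contained.
$zipPath = sys_get_temp_dir() . '/demo.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('data.txt', "col1,col2\nval1,val2\n");
$zip->close();

// Read the entry line by line without extracting it to disk.
$handle = fopen('zip://' . $zipPath . '#data.txt', 'r');
$rows = [];
while (($data = fgetcsv($handle, 1000, ',')) !== false) {
    $rows[] = $data;
}
fclose($handle);

print_r($rows[0]); // the first row holds the headings
```

Whether this helps depends on the bottleneck: decompression still happens on the fly, but the peak memory use drops from the whole file to one line.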

Solution

Use the built-in function fgetcsv, which reads the file one row at a time:

<?php
$row = 1;
if (($handle = fopen($inputFile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        echo "<p> $num fields in line $row: <br /></p>\n";
        $row++;
        for ($c=0; $c < $num; $c++) {
            echo $data[$c] . "<br />\n";
        }
    }
    fclose($handle);
}
?>

Also, use a multi-row insert if possible. Instead of running multiple queries:

insert into table (col1, col2) values("row1-col1", "row1-col2");
insert into table (col1, col2) values("row2-col1", "row2-col2");

Building one query like this is much quicker:

insert into table (col1, col2) 
values ("row1-col1", "row1-col2"),
       ("row2-col1", "row2-col2");
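One way to assemble a batched statement like the one above from parsed rows is to build a placeholder tuple per row and bind the values through a prepared statement. This is a sketch, not the answer's exact code; the table and column names are hypothetical, and in real use the returned SQL and parameters would go through e.g. PDO.

```php
<?php
// Sketch: build one multi-row INSERT with ? placeholders instead of one
// query per row. Table/column names are hypothetical; execute the result
// with a prepared statement rather than interpolating values.
function buildMultiInsert(string $table, array $cols, array $rows): array
{
    // One "(?, ?)" tuple per row, matching the column count.
    $tuple = '(' . implode(', ', array_fill(0, count($cols), '?')) . ')';
    $sql = sprintf(
        'insert into %s (%s) values %s',
        $table,
        implode(', ', $cols),
        implode(', ', array_fill(0, count($rows), $tuple))
    );
    // Flatten the rows into one flat parameter list for binding.
    $params = array_merge(...$rows);
    return [$sql, $params];
}

[$sql, $params] = buildMultiInsert('table', ['col1', 'col2'], [
    ['row1-col1', 'row1-col2'],
    ['row2-col1', 'row2-col2'],
]);
echo $sql, "\n";
// insert into table (col1, col2) values (?, ?), (?, ?)
```

Batching a few hundred rows per statement is usually a good trade-off between speed and `max_allowed_packet` limits.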

By the way, you can also load the file directly into MySQL:

load data local infile 'file.csv' into table table_name fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(col1, col2)

