How to export millions of rows from MySQL to CSV via PHP without exhausting memory?


Question

So I have this table:

mysql> DESCRIBE table;
+-------+------------------+------+-----+---------+----------------+
| Field | Type             | Null | Key | Default | Extra          |
+-------+------------------+------+-----+---------+----------------+
| id    | int(15) unsigned | NO   | PRI | NULL    | auto_increment |
| unid  | char(9)          | NO   | UNI | NULL    |                |
| rs    | varchar(255)     | NO   |     | NULL    |                |
+-------+------------------+------+-----+---------+----------------+
3 rows in set (0.00 sec)

It contains millions of rows:

mysql> SELECT COUNT(1) FROM table;
+----------+
| COUNT(1) |
+----------+
|  9435361 |
+----------+
1 row in set (0.00 sec)

I'm willing to export all rows in a .csv file (I'm using Symfony2.6). This file is meant to be stored on the server (not downloaded) and later on, read by PHP.

First attempt

I tried to make a huge request to select all at once (as per this blog post) but this has, despite the use of ->iterate(), led to Allowed memory size of 1073741824 bytes exhausted after having run for ~9s.

    ini_set('memory_limit', '1024M');
    ini_set('max_execution_time', -1);

    $results = $em
        ->getRepository('MyBundle:Entity')
        ->createQueryBuilder('e')
        ->getQuery()
        ->iterate();
    $handle = fopen('/path/to/csv/file', 'w');
    while (false !== ($row = $results->next())) {
        fputcsv($handle, $row[0]->toArray());
        $em->detach($row[0]);
    }
    fclose($handle);
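
One plausible reason ->iterate() did not help (an educated guess, not stated in the question): pdo_mysql buffers the whole result set client-side by default, so all rows are already in memory before Doctrine ever iterates them; in dev mode, Symfony's SQL logger also retains every executed query. Disabling buffered queries in the DBAL options is one lever — a sketch, assuming the pdo_mysql driver (1000 is the numeric value of the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY constant):

```yaml
# app/config/config.yml — sketch, assuming the pdo_mysql driver
doctrine:
    dbal:
        options:
            # 1000 === PDO::MYSQL_ATTR_USE_BUFFERED_QUERY
            1000: false
```

Caveat: with unbuffered queries, the result must be consumed fully before the same connection can execute another statement.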

Second attempt

I retrieved the total number of rows and then looped that many times, querying rows one by one. But after writing ~260K rows into the .csv file, PHP runs out of memory and throws the same error as above: Allowed memory size of 1073741824 bytes exhausted.

    ini_set('memory_limit', '1024M');
    ini_set('max_execution_time', -1);

    $total = (int) $em
        ->getRepository('MyBundle:Entity')
        ->countAll();
    $csv = '/path/to/csv/file';
    $handle = fopen($csv, 'w');
    for ($i = 1; $i <= $total; $i++)
    {
        $entity = $em->getRepository('MyBundle:Entity')->findOneById($i);
        fputcsv($handle, $entity->toArray());
        $em->detach($entity);
    }
    fclose($handle);
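
A variant of this attempt that keeps memory flat (a sketch under assumptions — DSN, credentials, chunk size and path are placeholders, not from the original question): skip the ORM entirely and stream fixed-size, keyset-paginated chunks with raw PDO, so memory is bounded by the chunk size rather than the full result set or per-row hydration.

```php
<?php
// Pure helper: write one chunk of rows as CSV and return the highest
// id seen, so the next query can resume from WHERE id > :last.
function writeChunk($handle, array $rows): int
{
    $lastId = 0;
    foreach ($rows as $row) {
        fputcsv($handle, $row);
        $lastId = (int) $row['id'];
    }
    return $lastId;
}

// Driver loop — keyset pagination (WHERE id > ? ORDER BY id) rather than
// LIMIT/OFFSET, which degrades as the offset grows:
// $pdo    = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// $stmt   = $pdo->prepare(
//     'SELECT id, unid, rs FROM `table` WHERE id > ? ORDER BY id LIMIT 10000'
// );
// $handle = fopen('/path/to/csv/file', 'w');
// $lastId = 0;
// do {
//     $stmt->execute([$lastId]);
//     $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
//     $lastId = writeChunk($handle, $rows);
// } while (count($rows) > 0);
// fclose($handle);
```

This also sidesteps the missing-id problem of findOneById($i): the loop advances by the ids that actually exist, not by assuming they are contiguous.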

Third attempt

I have thought of the use of the exec() function to run the MySQL command line that would export the table. However, my manager seems not to like this option.

So am I making a fool of myself thinking that dumping ~9.5M rows into a .csv file using PHP is even possible? Is there any other way I'm not yet aware of?

Thanks for your help on this one.

Answer

Rather than attempting to build the object tree, you could try selecting the result directly into a file: http://dev.mysql.com/doc/refman/5.7/en/select.html

Something like:

SELECT * INTO OUTFILE "c:/temp/mycsv.csv"
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY "\n"
FROM theTable;

This should leave the job to MySQL and bypass any PHP memory limitations.

As venca noted: in this case, the user the MySQL service runs as needs write permission to the directory in question.
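
Issued from PHP, the statement above could look like the following sketch. buildOutfileSql is a hypothetical helper (not part of the answer), and the connection details are placeholders. Note that the file is written by the MySQL server itself, so it must not already exist, and if secure_file_priv is set, the path must lie below that directory.

```php
<?php
// Hypothetical helper: build the INTO OUTFILE statement for a given
// table and *server-side* path. MySQL does not accept a bound parameter
// for the OUTFILE path, so it is embedded in the SQL string.
function buildOutfileSql(string $table, string $path): string
{
    return "SELECT * INTO OUTFILE '" . addslashes($path) . "'"
         . " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
         . " LINES TERMINATED BY '\\n'"
         . " FROM `" . $table . "`";
}

// Issuing it, e.g. through PDO (connection details are placeholders):
// $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// $pdo->exec(buildOutfileSql('table', '/var/lib/mysql-files/export.csv'));
```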
