Import large csv file using phpMyAdmin


Question

I have a big csv file, about 30 000 rows. I tried to import it from the terminal using LOAD DATA etc., as I found on Google, but it didn't work: the import ran, but my table ended up with 30 000 rows of NULL cells.

After that I tried phpMyAdmin, and there I found out that my csv was too big. I split it into 5 files using CSV Splitter. The import of the first file went fine, but when I tried to import the second one I got this error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 35 bytes) in C:\xampp\phpMyAdmin\libraries\import\csv.php on line 370

Sometimes I get a 1064 error instead.

Do you know why this happens and how I can solve it? Thank you.

Answer

Increase PHP's memory limit and maximum script execution time; the error occurs because you are running the MySQL import through the PHP server.

Check your php.ini file for these variables:

memory_limit
max_execution_time
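
For example, a minimal sketch of those settings in php.ini (512M and 300 seconds are only illustrative values, not recommendations; size them to your file and server):

memory_limit = 512M
max_execution_time = 300

Restart the web server (Apache, in a XAMPP setup) after editing php.ini so the new limits take effect.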

But anyway I would do it through the mysql client (terminal); check the MySQL docs:

LOAD DATA LOCAL INFILE '/path/to/your/csv/file/csv_file.csv'
INTO TABLE database_name.table_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
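
As a rough sketch of running this from the terminal (your_user, database_name and the file path are placeholders; the --local-infile=1 option enables LOCAL loading on the client side):

mysql --local-infile=1 -u your_user -p database_name

At the mysql> prompt, paste the LOAD DATA LOCAL INFILE statement above. If the server rejects it with a "command is not allowed" error, local_infile also has to be enabled on the server side.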

MySQL docs - LOAD DATA INFILE syntax
PHP docs - core php.ini directives
