PhpMyAdmin data import performance issues


Problem description


Originally, my question was related to the fact that PhpMyAdmin's SQL section wasn't working properly. As suggested in the comments, I realized that the amount of input was simply too large to handle. However, that didn't give me a workable way to deal with CSV files of this shape (in my case, 35 thousand record lines):

...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...

The Import option in PhpMyAdmin struggles just as much as the basic copy-paste input in the SQL section does. This time, as before, it runs for 5 minutes until the max execution time is hit, and then it stops. What is interesting, though, is that it adds about 6-7 thousand records to the table before stopping. So the input actually goes through and almost completes. I also tried halving the amount of data in the file, but nothing changed.

There is clearly something wrong now. It is pretty annoying to have to massage the data in a PHP script when a simple data import does not work.

Solution

Change your php upload max size.

Do you know where your php.ini file is?

First of all, try putting this file into your web root:

phpinfo.php

( see http://php.net/manual/en/function.phpinfo.php )

containing:

<?php

phpinfo();

?>

Then navigate to http://www.yoursite.com/phpinfo.php

Look for "php.ini".

To upload large files you need to raise max_execution_time, post_max_size, and upload_max_filesize (note that post_max_size must be at least as large as upload_max_filesize).
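As a sketch of what raised limits might look like in php.ini — the numbers below are illustrative assumptions, not values from the answer; tune them to your actual file sizes:

```ini
; Illustrative values only - adjust to your own uploads.
max_execution_time = 300      ; seconds a script may run
post_max_size = 64M           ; must be >= upload_max_filesize
upload_max_filesize = 64M     ; largest single uploaded file
```

Remember to restart your web server (or PHP-FPM) after editing php.ini so the new values take effect.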

Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.

EDIT:

Here is the query I use for the file import:

$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` FIELDS TERMINATED BY ',' OPTIONALLY
    ENCLOSED BY '\"' LINES TERMINATED BY '$nl'";

Where $file_name is the temporary file name from the PHP superglobal $_FILES, $table_name is the table already prepared for the import, and $nl is a variable for the CSV line endings (it defaults to Windows line endings, but I have an option to select Linux line endings).
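The query above can be sketched as a small helper. Note that build_load_query is a hypothetical name used here for illustration (the answer builds the string inline), and the escaping is a minimal precaution for the interpolated identifiers, not a full sanitizer:

```php
<?php
// Minimal sketch, assuming a helper name of our own choosing.
// $file_name would come from $_FILES['yourfield']['tmp_name'].
function build_load_query(string $file_name, string $table_name, string $nl): string
{
    // Double backticks in the table name and escape single quotes in the
    // file path so the interpolated values cannot break out of their quoting.
    $table = str_replace('`', '``', $table_name);
    $file  = str_replace("'", "\\'", $file_name);

    return "LOAD DATA LOCAL INFILE '$file' INTO TABLE `$table` "
         . "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
         . "LINES TERMINATED BY '$nl'";
}

// Windows line endings by default; pass "\n" for Linux-style files.
$query = build_load_query('/tmp/phpUpload123', 'prices', "\r\n");
```

Keep in mind that LOAD DATA LOCAL also has to be enabled on both the client and the server side, or MySQL will reject the statement.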

The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the csv to determine column types. After it determines appropriate column types, it creates the MySQL table to receive the data.
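The column-scanning step could look something like this minimal sketch. The function name infer_column_types and the chosen MySQL types are my assumptions — the answer doesn't show its actual heuristics — so this only distinguishes integers, decimals, and text:

```php
<?php
// Hedged sketch: given parsed CSV rows (arrays of strings), guess a MySQL
// column type for each column. Types and name are illustrative assumptions.
function infer_column_types(array $rows): array
{
    $types = [];
    $cols  = count($rows[0]);
    for ($i = 0; $i < $cols; $i++) {
        $type = 'INT';
        foreach ($rows as $row) {
            $v = $row[$i];
            if (!preg_match('/^-?\d+$/', $v)) {
                // Not an integer: try decimal, otherwise fall back to text.
                $type = preg_match('/^-?\d*\.\d+$/', $v) ? 'DECIMAL(12,4)' : 'VARCHAR(255)';
                if ($type === 'VARCHAR(255)') {
                    break; // text wins; no point scanning further rows
                }
            }
        }
        $types[] = $type;
    }
    return $types;
}
```

On the sample rows from the question this would classify the first, second, and fourth columns as integers and the third as a decimal.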

I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc). Then try the above query and see how fast it runs. I don't know how much of a factor the MySQL table definition is on speed.

Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
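For instance, a hypothetical schema matching the four-column sample rows (the table and column names here are mine, not from the question), with the index added only after the load finishes:

```sql
-- Hypothetical schema for the sample CSV; adjust names and types to your data.
CREATE TABLE prices (
  trade_date INT NOT NULL,
  tick INT NOT NULL,
  price DECIMAL(12,4) NOT NULL,
  volume INT NOT NULL
);

-- ... run LOAD DATA LOCAL INFILE here ...

-- Only now add the index, so it doesn't slow down the bulk load.
ALTER TABLE prices ADD INDEX idx_trade_date (trade_date);
```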
