Import large file on MySQL DB
Problem description
I want to run about 50,000 INSERT queries from a (.sql) file against a MySQL database. I see two options:

1- Import the (.sql) file directly. This fails with the following error:
"You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit."
2- Use PHP code to insert these queries in chunks read from the (.sql) file.
Here is my code:
<?php
// Configure DB connection
include "config.php";
// Read the whole file into an array, one statement per element
$file = file('country.txt');
// Set pointer & position variables
$position = 0;
$eof = 0;
while ($eof < sizeof($file))
{
    // Execute two statements per outer iteration
    for ($i = $position; $i < ($position + 2); $i++)
    {
        if ($i < sizeof($file))
        {
            // mysql_query() returns false on failure, so test the
            // return value itself; isset() would be true either way.
            $flag = mysql_query($file[$i]);
            if ($flag)
            {
                echo "Inserted successfully<br />";
                $position++;
            }
            else
            {
                echo mysql_error() . "<br>\n";
            }
        }
        else
        {
            echo "<br />End of file";
            break;
        }
    }
    $eof++;
}
?>
But a memory-size error still occurs, even though I raised the memory limit from 128M to 256M and even 512M.
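For reference, both the upload error from option 1 and the memory error here are governed by request limits in php.ini rather than by MySQL itself. The relevant directives (values below are illustrative, not recommendations) are:

```ini
; Illustrative php.ini settings; appropriate values depend on the server.
upload_max_filesize = 64M
post_max_size = 64M
memory_limit = 512M
max_execution_time = 300
```

Alternatively, with shell access to the server, `mysql -u username -p database_name < country.sql` streams the file through the command-line client and bypasses both the upload limit and PHP's memory limit entirely.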
Then I thought that if I could load a limited number of rows from the (.sql) file, say 1000 at a time, and execute the queries, it might be possible to import all the records from the file into the database.
But I don't know how to track the read position in the file from start to end, or how to update the start and end positions so that previously fetched rows are not fetched again from the .sql file.
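The position tracking asked about here can be done with ftell()/fseek(): remember the byte offset after each batch and seek back to it on the next pass. A minimal sketch with no database calls ('country.txt' stands in for the dump file; the batch driver and function name are illustrative):

```php
<?php
// Sketch: process a file in batches of 1000 lines, resuming from a
// saved byte offset so no line is read twice. Replace the commented
// line with the actual query execution.
function processBatch($path, $offset, $batchSize = 1000)
{
    $handle = fopen($path, 'r');
    fseek($handle, $offset);           // resume where the last batch stopped
    $count = 0;
    while ($count < $batchSize && ($line = fgets($handle)) !== false) {
        $line = trim($line);
        if ($line !== '') {
            // mysql_query($line);     // execute the statement here
            $count++;
        }
    }
    $offset = ftell($handle);          // remember position for the next batch
    fclose($handle);
    return array($count, $offset);
}

// Drive the batches until the file is exhausted.
$offset = 0;
do {
    list($count, $offset) = processBatch('country.txt', $offset);
    echo "Processed $count statements<br />";
} while ($count > 0);
```

Between batches the script can free memory or even exit and be re-invoked with the saved offset, which keeps peak memory usage at one batch regardless of file size.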
Solution: Here is the code you need, now prettified! =D
<?php
include('config.php');
$file = @fopen('country.txt', 'r');
if ($file)
{
    while (!feof($file))
    {
        // Read and execute one statement at a time instead of
        // loading the whole file into memory.
        $line = trim(fgets($file));
        // mysql_query() returns false on failure; isset() would be
        // true either way, so test the return value directly.
        $flag = mysql_query($line);
        if ($flag)
        {
            echo 'Inserted successfully<br />';
        }
        else
        {
            echo mysql_error() . '<br />';
        }
        flush();
    }
    fclose($file);
}
echo '<br />End of file';
?>
Basically it's a less greedy version of your code: instead of loading the whole file into memory, it reads and executes the SQL statements in small chunks (one line at a time).
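Note that the mysql_* functions used throughout this post were deprecated in PHP 5.5 and removed in PHP 7. A roughly equivalent sketch using mysqli (the connection parameters are placeholders, not values from the original post):

```php
<?php
// Sketch with mysqli; host, user, password and database name below
// are placeholders to be replaced with real credentials.
$db = new mysqli('localhost', 'user', 'password', 'database');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}

$file = fopen('country.txt', 'r');
if ($file) {
    while (($line = fgets($file)) !== false) {
        $line = trim($line);
        if ($line === '') {
            continue;                  // skip blank lines
        }
        if ($db->query($line)) {       // returns false on failure
            echo 'Inserted successfully<br />';
        } else {
            echo $db->error . '<br />';
        }
    }
    fclose($file);
}
echo '<br />End of file';
```

The structure is identical to the accepted answer; only the API calls change, so the one-line-at-a-time memory behavior is preserved.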