Problem in bulk load from a file into Cassandra using phpcassa
Question
I am trying to load 1000000 records from a text file into Cassandra using phpcassa. But halfway through the loading process I got the following error.
PHP Fatal error: Maximum execution time of 30 seconds exceeded in /usr/share/php/phpcassa/columnfamily.php on line 759
How do I increase the execution time? Do I have to change any parameter in columnfamily.php?
Please find my code below.
<?php
require_once('phpcassa/connection.php');
require_once('phpcassa/columnfamily.php');

try {
    $servers = array("127.0.0.1:9160");
    $pool = new ConnectionPool("Keyspace1", $servers);
    $column_family = new ColumnFamily($pool, 'Product');

    $cnt = 0;
    $files = fopen("dump.txt", "r");
    $starttime = microtime(true);

    while (!feof($files)) {
        $line = fgets($files);
        $split_line = explode(",", $line);
        // One insert per line; the original code looped over the columns
        // here, which wrote the same row count($split_line) times.
        $column_family->insert(
            "P" . $cnt,
            array(
                'code'  => $split_line[0],
                'pname' => $split_line[1],
                'price' => $split_line[2]
            )
        );
        $cnt += 1;
    }

    $endtime = microtime(true);
    fclose($files);
    $totaltime = $endtime - $starttime;
    echo "$cnt keys loaded in " . $totaltime . " seconds";
} catch (Exception $e) {
    echo 'Exception: ' . $e->getMessage();
}
?>
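Independent of the timeout, the loop above pays one Thrift round-trip per insert(), which is a large part of why a million rows take so long. phpcassa's ColumnFamily also exposes a batch_insert() method that takes a map of row key to columns; grouping rows before sending cuts the number of round-trips. The sketch below only builds and counts the batches (the actual batch_insert() call is commented out because it needs a running cluster, and you should verify the method against your phpcassa version):

```php
<?php
// Hedged sketch: queue rows locally and flush them in groups with
// batch_insert() (assumed signature: batch_insert(array $rows)).
$batch = array();
$batch_size = 100;   // rows per flush; tune for your cluster
$cnt = 0;

// Stand-in for the fgets() loop over dump.txt.
foreach (array("P100,Soap,2.50", "P200,Rice,4.00") as $line) {
    $split_line = explode(",", trim($line));
    $batch["P" . $cnt] = array(
        'code'  => $split_line[0],
        'pname' => $split_line[1],
        'price' => $split_line[2]
    );
    $cnt += 1;
    if (count($batch) >= $batch_size) {
        // $column_family->batch_insert($batch);  // needs a live cluster
        $batch = array();
    }
}
// After the loop, flush the remainder the same way.
echo count($batch);
```

With a batch size of 100, the million-row load becomes roughly 10,000 batched calls instead of 1,000,000 single inserts.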
Answer
Try adding this to the top of your script:

set_time_limit(0);
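For context, set_time_limit(0) removes the 30-second cap for the current request only; it does not edit php.ini. A minimal standalone check:

```php
<?php
// Lift the execution-time cap for this script run.
// 0 means "no limit"; the call must happen before the long loop starts.
set_time_limit(0);

// set_time_limit() updates the max_execution_time ini setting at runtime.
echo ini_get('max_execution_time');  // prints "0"
```

Alternatively, max_execution_time can be raised globally in php.ini, but the per-script call is safer for a one-off bulk load.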