Laravel: seeding a large SQL file
Problem description
A memory exhaustion error happens when I run my db seed script in production.
Below is my seed script.
class MembershipTableSeeder extends Seeder {

    public function run()
    {
        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}
So what I did was remove the memory limit in my seed script:
ini_set('memory_limit', '-1');
The problem now is that when I run the script, it prints the content of the SQL script to the terminal (which is very, very big).
Is there a good way of running an SQL dump inside my db seeds that doesn't consume much memory? What I do now is run a manual "mysql -uuser -p db < script.sql".
The problem happens because DB::unprepared also logs the query to the laravel.log file, performing many more actions in the background than you might think; that is where your memory exhaustion comes from. If you are not running in safe mode, I would stick to executing the console command like this:
exec("mysql -u ".\Config::get('database.connections.mysql.username')." -p".\Config::get('database.connections.mysql.password')." ".\Config::get('database.connections.mysql.database')." < script.sql");
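If you prefer to stay in PHP rather than shelling out, the dump can also be streamed statement by statement instead of being loaded whole with file_get_contents(), with query logging disabled so the log does not grow with the file. A minimal sketch (Laravel 4 style seeder, assuming each statement in members.sql ends with a semicolon at the end of a line):

```php
public function run()
{
    // Stop Laravel from buffering every executed query in memory.
    DB::disableQueryLog();

    $handle = fopen(app_path().'/database/seeds/members.sql', 'r');
    $statement = '';

    // Read the dump line by line so only one statement is held in memory.
    while (($line = fgets($handle)) !== false) {
        $statement .= $line;

        // Execute once a full statement has been accumulated.
        if (substr(rtrim($line), -1) === ';') {
            DB::unprepared($statement);
            $statement = '';
        }
    }

    fclose($handle);
}
```

This keeps memory usage roughly constant regardless of the dump size, though the semicolon-at-end-of-line heuristic assumes a plain mysqldump-style file without semicolons embedded inside multi-line string literals.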