How to handle a large UPDATE query in MySQL with Laravel


Question



Is there a way that I can update 100k records in one query so that the MySQL database keeps working smoothly?

Suppose there is a users table containing a hundred thousand records, and I have to update approximately fifty thousand of them. The IDs of the records to update (around fifty thousand) are stored in a CSV file.

1 - Will the query be OK, or would its size be too large? If there is any way to split it into smaller chunks, let me know.

2 - Considering the Laravel framework, is there any option to read part of the file rather than the whole file, to avoid excessive memory usage? I do not want to read the entire file at once; please suggest an approach.

Any suggestions are welcome!

Solution

If you're thinking of building a query like UPDATE users SET column = 'value' WHERE id = 1 OR id = 2 OR id = 3 ... OR id = 50000 or WHERE id IN (1, 2, 3, ..., 50000) then that will probably be too big. If you can make some logic to summarize that, it would shorten the query and speed things up on MySQL's end significantly. Maybe you could make it WHERE id >= 1 AND id <= 50000.
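
If the IDs really do form a contiguous range, that idea is a one-liner with Laravel's query builder. A minimal sketch, where users, column and 'value' are just the placeholder names from the example above:

use Illuminate\Support\Facades\DB;

// only valid when the 50,000 IDs really are the contiguous range 1..50000
DB::table('users')
    ->whereBetween('id', [1, 50000])
    ->update(['column' => 'value']);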

If that's not an option, you could do it in bursts. You're probably going to loop through the rows of the CSV file, build the query as a big WHERE id = 1 OR id = 2... query and every 100 rows or so (or 50 if that's still too big), run the query and start a new one for the next 50 IDs.
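
A rough sketch of that burst approach, assuming the ID sits in the first CSV column and reusing the placeholder table/column/value names from above (batch size 100, as in the text):

use Illuminate\Support\Facades\DB;

$file = fopen('/path/to/file.csv', 'r');
$ids = [];

while (($row = fgetcsv($file)) !== false) {
    $ids[] = (int) $row[0]; // assumes the ID is in the first CSV column

    // every 100 IDs, run one UPDATE ... WHERE id IN (...) and start a new batch
    if (count($ids) >= 100) {
        DB::table('users')->whereIn('id', $ids)->update(['column' => 'value']);
        $ids = [];
    }
}

// flush the last partial batch
if ($ids) {
    DB::table('users')->whereIn('id', $ids)->update(['column' => 'value']);
}

fclose($file);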

Or you could just run 50,000 single UPDATE queries on your database. Honestly, if the table makes proper use of indexes, running 50,000 queries should only take a few seconds on most modern web servers. Even the busiest servers should be able to handle that in under a minute.
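
If you take that single-query route, wrapping the loop in one transaction avoids a commit after every statement, which is usually the slow part. A sketch, assuming the IDs have already been read into an $ids array and using the same placeholder column/value:

use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($ids) {
    foreach ($ids as $id) {
        // one small, parameterized UPDATE per ID
        DB::update('UPDATE users SET column = ? WHERE id = ?', ['value', $id]);
    }
});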

As for reading a file in chunks, you can use PHP's basic file access functions for that:

$file = fopen('/path/to/file.csv', 'r');

// read one line at a time from the file (fgets reads up to the
// next newline character if you don't provide a number of bytes)
while (!feof($file)) {
    $line = fgets($file);

    // or, since it's a CSV file (use this instead of fgets, not both):
    $row = fgetcsv($file);
    // $row is now an array with all the CSV columns

    // do stuff with the line/row
}

// set the file pointer to 60 kb into the file
fseek($file, 60*1024);

// close the file
fclose($file);

This will not read the full file into memory. Not sure if Laravel has its own way of dealing with files, but this is how to do that in basic PHP.
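
If you want a more Laravel-flavoured way to combine the two ideas, a LazyCollection built around fgetcsv also reads the file row by row and lets you batch the IDs. A sketch under the same placeholder-name assumptions as above:

use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    $file = fopen('/path/to/file.csv', 'r');
    while (($row = fgetcsv($file)) !== false) {
        yield (int) $row[0]; // assumes the ID is in the first CSV column
    }
    fclose($file);
})
->chunk(100) // groups of 100 IDs, never the whole file in memory
->each(function ($ids) {
    DB::table('users')
        ->whereIn('id', $ids->all())
        ->update(['column' => 'value']);
});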
