Working with a large amount of data from MySQL


Problem Description

I am having trouble allowing users to download a large report from the database.

Users can browse the report on the website; I have set up a limit and pagination for the pages, so there is no problem there.

But now I am adding the functionality to download the whole report at once as a CSV file. I get a memory error saying I have reached the maximum memory allowed, and I don't have permission on the server to increase it. Does anybody know how to make this possible? So far, just running the query, which has 700k results, stops my script.

PS: I have searched around Stack Overflow and so far can't find an answer. I am using PHP and MySQL to get the data.

Thanks in advance.

Recommended Answer

Just like @symbcbean said, the process of creating a file and appending the data with each query offset is very slow. So I came up with an alternative that I thought I should share in case someone else is facing the same issue.
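For context, the slow approach being referred to is roughly the following: re-running the query in chunks with LIMIT/OFFSET and appending each chunk to the file. This is only a sketch; the table name, columns, credentials, and file path are hypothetical. The slowdown comes from MySQL having to scan past all skipped rows on every iteration, so later offsets get progressively more expensive:

```php
<?php
// Offset-based chunked export (the slow approach): each pass re-runs the
// query with a larger OFFSET, which MySQL must scan past every time.
$mysqli = new mysqli('localhost', 'user', 'pass', 'reports_db'); // hypothetical credentials
$out    = fopen('/tmp/report.csv', 'a');                         // hypothetical path
$chunk  = 10000;
$offset = 0;

do {
    $result = $mysqli->query("SELECT * FROM report_rows LIMIT $chunk OFFSET $offset"); // hypothetical table
    $rows   = 0;
    while ($row = $result->fetch_assoc()) {
        fputcsv($out, $row); // append one CSV line per row
        $rows++;
    }
    $result->free();
    $offset += $chunk;
} while ($rows === $chunk); // stop once a chunk comes back short

fclose($out);
```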

I have set up a cronjob that goes through the process of creating the file with all the necessary data in it at night, since the data only changes a few times a week. So now I've removed the overhead of generating a new CSV every time someone needs to access it; instead, I simply redirect to the pre-made file and the work is done!
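A minimal sketch of what such a nightly export script might look like, assuming a hypothetical `report_rows` table, credentials, and output path. Using an unbuffered query (`MYSQLI_USE_RESULT`) keeps memory flat because rows are streamed from the server one at a time instead of loading all 700k into PHP:

```php
<?php
// generate_report.php - run nightly from cron, e.g.:
//   0 2 * * * php /var/www/scripts/generate_report.php
// (schedule, paths, table name and credentials are hypothetical)
$mysqli = new mysqli('localhost', 'user', 'pass', 'reports_db');

// Write to a temp file first, then rename, so a download request
// never sees a half-written CSV.
$tmp = '/var/www/reports/report.csv.tmp';
$out = fopen($tmp, 'w');

// MYSQLI_USE_RESULT streams rows instead of buffering the whole
// result set in PHP memory.
$result = $mysqli->query('SELECT * FROM report_rows', MYSQLI_USE_RESULT);

$first = true;
while ($row = $result->fetch_assoc()) {
    if ($first) {
        fputcsv($out, array_keys($row)); // header row from column names
        $first = false;
    }
    fputcsv($out, $row);
}

$result->free();
fclose($out);
rename($tmp, '/var/www/reports/report.csv'); // atomic swap of the finished file
```

The download endpoint then only has to redirect to (or `readfile()`) the pre-built CSV, so no heavy query runs at request time.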

Thank you @Gaurav & @symbcbean!
