How to parse a large CSV file without timing out?

Question

I'm trying to parse a 50 MB .csv file. The file itself is fine, but I'm trying to get past the massive timeout issues involved. Everything is set up upload-wise: I can easily upload and re-open the file, but after the browser times out, I receive a 500 Internal Server Error.

My guess is I can save the file onto the server, open it, and keep a session value of the line I last dealt with. After a certain number of lines I reset the connection via a refresh and reopen the file at the line I left off. Is this a doable idea? The previous developer wrote a very inefficient MySQL class that controls the entire site, so I don't want to write my own class if I don't have to, and I don't want to mess with his.

TL;DR version: Is it efficient to save the last line I'm currently on in a CSV file of 38K product lines, reset the connection after every X rows, and pick up from where I left off? Or is there another way to parse a large CSV file without timing out?
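For reference, a minimal sketch of the chunk-and-resume idea described above might look like the following. It assumes the CSV has already been saved to the server; the file path, the CHUNK_SIZE constant, and the processRow() helper are all illustrative placeholders, not anything from the original code.

```php
<?php
// Sketch of the resume-by-offset idea from the question.
// Assumptions: the CSV is already on the server at $csvPath, sessions are
// available, and processRow() stands in for the real per-row import logic.

session_start();

const CHUNK_SIZE = 2000;                  // rows handled per request (arbitrary)
$csvPath = '/path/to/uploaded/products.csv';

// Placeholder for the real import logic.
function processRow(array $row): void
{
    // e.g. insert/update a product record here
}

$handle = fopen($csvPath, 'r');
if ($handle === false) {
    http_response_code(500);
    exit('Could not open CSV file');
}

// Resume from the byte offset stored on the previous request, if any.
$offset = $_SESSION['csv_offset'] ?? 0;
fseek($handle, $offset);

$rows = 0;
while ($rows < CHUNK_SIZE && ($row = fgetcsv($handle)) !== false) {
    processRow($row);
    $rows++;
}

if (feof($handle)) {
    // Finished: clear the saved position and stop refreshing.
    unset($_SESSION['csv_offset']);
    fclose($handle);
    echo 'Import complete';
} else {
    // Remember where we stopped and have the browser re-request this script.
    $_SESSION['csv_offset'] = ftell($handle);
    fclose($handle);
    header('Refresh: 1');                 // reload in 1 second to process the next chunk
    echo 'Processed another ' . CHUNK_SIZE . ' rows, continuing...';
}
```

This keeps each request short enough to stay under the browser timeout, at the cost of re-opening the file and seeking on every chunk.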

NOTE: It's the PHP script execution time. Currently, at 38K lines, it takes about 46 minutes and 5 seconds to run via the command line. It works correctly 100% of the time when I take it out of the browser, suggesting that it is a browser timeout. Chrome's timeout is not editable as far as Google has told me, and Firefox's timeout rarely works.

Answer

I suggest running PHP from the command line and setting it up as a cron job. That way you don't have to modify your code, there will be no timeout issues, and you can easily parse large CSV files.
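A rough sketch of what such a command-line import might look like is shown below. The script name import_csv.php, the paths, and the importRow() helper are hypothetical placeholders under this approach, not part of the original answer.

```php
<?php
// import_csv.php -- sketch of a command-line import, run outside the browser.
//
// Example crontab entry (runs nightly at 2:00 AM; all paths are placeholders):
//   0 2 * * * /usr/bin/php /var/www/scripts/import_csv.php >> /var/log/csv_import.log 2>&1

set_time_limit(0);                        // no execution-time limit (the CLI has none by default)

$csvPath = $argv[1] ?? '/path/to/products.csv';

// Placeholder for the real per-row import logic.
function importRow(array $row): void
{
    // real database insert/update goes here
}

$handle = fopen($csvPath, 'r');
if ($handle === false) {
    fwrite(STDERR, "Could not open {$csvPath}\n");
    exit(1);
}

$count = 0;
while (($row = fgetcsv($handle)) !== false) {
    importRow($row);
    $count++;
    if ($count % 1000 === 0) {
        echo "Imported {$count} rows\n";  // simple progress output for the cron log
    }
}

fclose($handle);
echo "Done: {$count} rows imported\n";
```

Because the script never runs inside a web request, neither the browser nor the web server can cut it off, and the 46-minute run time stops being a problem.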

Please check this link.
