What can be the maximum size for the $_SESSION?


Problem Description

I am importing a CSV file with more than 5,000 records in it. What I am currently doing is getting all the file contents as an array and saving them to the database one by one. But in case of script failure, the whole process will run again, and if I start checking them one by one from the database it will use lots of queries, so I thought to keep the imported values in the session temporarily.

Is it good practice to keep that many records in the session, or is there another way to do this?

Thanks.

Recommended Answer

If you have to do this task in stages (and there are a couple of suggestions here to improve the way you do things in a single pass), don't hold the CSV file in $_SESSION... that's pointless overhead, because you already have the CSV file on disk anyway, and it just adds a lot of serialization/unserialization overhead to the process as the session data is written.
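As a rough illustration of the single-pass idea, the sketch below streams the file with fgetcsv() and wraps the inserts in one transaction, so a failure leaves nothing half-imported. The DSN, credentials, file name, table and column names are placeholders, not details from the question:

```php
<?php
// Single-pass sketch: stream the CSV from disk and insert inside one
// transaction. Connection details, table and column names are assumed.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO records (col_a, col_b) VALUES (?, ?)');

$fh = fopen('import.csv', 'r');   // hypothetical file name
$pdo->beginTransaction();
try {
    while (($row = fgetcsv($fh)) !== false) {
        $stmt->execute([$row[0], $row[1]]);
    }
    $pdo->commit();               // all rows imported, or...
} catch (Exception $e) {
    $pdo->rollBack();             // ...none at all on failure
    throw $e;
}
fclose($fh);
```

With the whole import inside one transaction, a crash simply rolls everything back, and the script can be rerun from scratch without leaving duplicate rows behind.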

You're processing the CSV records one at a time, so keep a count of how many you've successfully processed in $_SESSION. If the script times out or barfs, restart and read how many you've already processed so you know where in the file to pick up again.
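A minimal sketch of that checkpoint approach, assuming PHP 7+ (for the ?? operator), a hypothetical saveRowToDatabase() helper and file name; only a row counter ever lives in the session, never the CSV data itself:

```php
<?php
session_start();

// Rows already imported by a previous (failed) run; 0 on a fresh start.
$offset = $_SESSION['csv_rows_done'] ?? 0;

$fh = fopen('import.csv', 'r');   // hypothetical file name

// Skip the records that were already processed last time.
for ($i = 0; $i < $offset; $i++) {
    fgetcsv($fh);
}

while (($row = fgetcsv($fh)) !== false) {
    saveRowToDatabase($row);                // hypothetical insert helper
    $_SESSION['csv_rows_done'] = ++$offset; // checkpoint after each row
}

fclose($fh);
unset($_SESSION['csv_rows_done']);          // import finished: clear it
```

Since PHP normally flushes session data during request shutdown, the counter typically survives a timeout, so the next run resumes at the right row instead of re-querying the database for every record.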

