Python Pickling Dictionary EOFError


Problem Description



I have several scripts running on a server which pickle and unpickle various dictionaries. They all use the same basic code for pickling, as shown below:

import pickle

SellerDict = open('/home/hostadl/SellerDictkm', 'rb')
SellerDictionarykm = pickle.load(SellerDict)
SellerDict.close()

SellerDict = open('/home/hostadl/SellerDictkm', 'wb')
pickle.dump(SellerDictionarykm, SellerDict)
SellerDict.close()

All the scripts run fine except for one of them. The one that has issues goes to various websites, scrapes data, and stores it in a dictionary. This code runs all day long pickling and unpickling dictionaries and stops at midnight. A cronjob then starts it again the next morning. This script can run for weeks without a problem, but about once a month it dies with an EOFError when it tries to open a dictionary. The size of the dictionaries is usually about 80 MB. I even tried adding SellerDict.flush() before SellerDict.close() when pickling the data, to make sure everything was being flushed.
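One way to rule out a partially written file as the culprit (a sketch of my own, not from the original post; the function name and the temp-file convention here are made up) is to combine the flush with an fsync and an atomic rename, so a crash or midnight shutdown mid-write can never leave a truncated pickle behind:

```python
import os
import pickle

def save_dict_atomically(d, path):
    """Pickle d to a temp file, force it to disk, then atomically
    replace the target file. Readers always see either the old
    complete pickle or the new complete pickle, never a torn one."""
    tmp_path = path + '.tmp'
    with open(tmp_path, 'wb') as f:
        pickle.dump(d, f)
        f.flush()              # push Python's buffer to the OS
        os.fsync(f.fileno())   # push the OS buffer to the disk
    os.replace(tmp_path, path)  # atomic rename on POSIX
```

flush() alone only empties Python's userspace buffer; without the fsync and rename, the OS can still lose the tail of the file if the process is killed at the wrong moment.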

Any ideas what could be causing this? Python is pretty solid, so I don't think it is due to the size of the file. Since the code runs fine for a long time before dying, I'm led to believe that maybe something being saved in the dictionary is causing this issue, but I have no idea.

Also, if you know of a better way to save dictionaries than pickle, I am open to options. Like I said earlier, the dictionaries are constantly being opened and closed. Just for clarification, only one program uses a given dictionary, so the issue is not caused by several programs trying to access the same dictionary.
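One commonly suggested alternative for this kind of workload (a hedged sketch; `shelve` is in the standard library, but the path and keys below are invented for illustration) is `shelve`, which backs a dict-like object with an on-disk database, so individual keys can be read and written without unpickling the entire 80 MB dictionary into RAM on every open:

```python
import shelve

# Each key is pickled and stored individually on disk, so the
# whole dictionary never has to be rewritten (or held in memory)
# at once -- unlike pickling one giant dict.
with shelve.open('/tmp/seller_shelf') as db:
    db['item1'] = {'price': 10.0}
    db['item2'] = {'price': 12.5}

# Reopen later and touch only the key you need.
with shelve.open('/tmp/seller_shelf') as db:
    price = db['item1']['price']
```

The trade-off is that values are only written back when assigned (or with `writeback=True`, at a memory cost), so in-place mutation of a stored value needs care.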

UPDATE:

Here is the traceback that I have from a log file.

Traceback (most recent call last):
  File "/home/hostadl/CompileRecentPosts.py", line 782, in <module>
    main()
  File "/home/hostadl/CompileRecentPosts.py", line 585, in main
    SellerDictionarykm=pickle.load(SellerDict)
EOFError

Solution

So this actually turned out to be a memory issue. When the computer ran out of RAM and tried to unpickle or load the data, the process would fail with this EOFError. I increased the RAM on the computer and it was never an issue again.
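Given that diagnosis, a defensive wrapper around the load (my own sketch, not from the answer; `load_dict` and the backup-file convention are hypothetical) can at least keep the script alive when a load fails, by falling back to the last known-good copy instead of crashing:

```python
import pickle

def load_dict(path, backup_path=None):
    """Unpickle the dict at path; if the file is truncated or
    corrupt (EOFError / UnpicklingError), fall back to an
    optional backup copy rather than killing the process."""
    try:
        with open(path, 'rb') as f:
            return pickle.load(f)
    except (EOFError, pickle.UnpicklingError):
        if backup_path is not None:
            with open(backup_path, 'rb') as f:
                return pickle.load(f)
        raise
```

This doesn't fix the underlying memory pressure, but it turns an unexplained monthly crash into a recoverable event you can log and investigate.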

Thanks for all the comments and help.
