How to handle Out of memory with Python


Problem description

I have huge dictionaries that I manipulate. More than 10 million words are hashed. It is too slow, and sometimes it runs out of memory.

Is there a better way to handle these huge data structures?

Recommended answer

Yes. It's called a database. Since a dictionary was working for you (aside from memory concerns), I would suppose that an SQLite database would work fine for you. You can use sqlite3 quite easily, and it is very well documented.
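For illustration, here is a minimal sketch of using sqlite3 as an on-disk replacement for the in-memory dictionary; the file name "words.db" and the table schema are assumptions, not something from the original question.

```python
import sqlite3

# A minimal sketch: store word -> value pairs in an on-disk SQLite table
# instead of an in-memory dict. "words.db" and the schema are illustrative.
conn = sqlite3.connect("words.db")
conn.execute("CREATE TABLE IF NOT EXISTS words (word TEXT PRIMARY KEY, value TEXT)")

# Equivalent of d["example"] = "42"
conn.execute("INSERT OR REPLACE INTO words (word, value) VALUES (?, ?)",
             ("example", "42"))
conn.commit()

# Equivalent of d.get("example")
row = conn.execute("SELECT value FROM words WHERE word = ?", ("example",)).fetchone()
print(row[0] if row else None)

conn.close()
```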

Of course, this will only be a good solution if you can represent the values as something like JSON, or if you are willing to trust pickled data from a local file. Maybe you should post details about what you have in the values of the dictionary. (I'm assuming the keys are words; if not, please correct me.)
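As a hypothetical example of the JSON option, structured values can be serialized with json.dumps before storing and restored with json.loads when reading; the value shape below is an assumption.

```python
import json
import sqlite3

# Hypothetical example: structured values serialized as JSON text in SQLite.
conn = sqlite3.connect("words.db")
conn.execute("CREATE TABLE IF NOT EXISTS words (word TEXT PRIMARY KEY, value TEXT)")

value = {"count": 3, "positions": [4, 17, 92]}          # assumed value shape
conn.execute("INSERT OR REPLACE INTO words (word, value) VALUES (?, ?)",
             ("example", json.dumps(value)))
conn.commit()

row = conn.execute("SELECT value FROM words WHERE word = ?", ("example",)).fetchone()
restored = json.loads(row[0]) if row else None          # back to a Python object
conn.close()
```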

You might also want to look at not generating the whole dictionary and only processing it in chunks. This may not be practical in your particular use case (unfortunately it often isn't with the sort of thing dictionaries are used for), but if you can think of a way, it may be worth redesigning your algorithm to allow it.
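A rough sketch of that idea, assuming the words come from a text file ("corpus.txt" and the per-chunk work below are placeholders): read a bounded number of words at a time, process the chunk, and discard it before reading the next one.

```python
def iter_chunks(path, chunk_size=100_000):
    """Yield lists of at most chunk_size words, keeping only one chunk in memory."""
    chunk = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            for word in line.split():
                chunk.append(word)
                if len(chunk) == chunk_size:
                    yield chunk
                    chunk = []
    if chunk:
        yield chunk

for chunk in iter_chunks("corpus.txt"):
    partial = {word: len(word) for word in chunk}   # placeholder per-chunk work
    # ... merge or persist 'partial' (for example into the SQLite table above) ...
```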
