Could not reserve memory block, import json error in Python


Question

import pandas as pd

with open(r'data.json') as f:
    df = pd.read_json(f, encoding='utf-8')

I'm getting a "Could not reserve memory block" error. The JSON file is 300 MB. Is there any limit on how much memory a running Python program can reserve? My PC has 8 GB of RAM and runs Windows 10.
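The traceback below shows a 32-bit Python build (`Python36-32`, `MSC v.1900 32 bit`), which on Windows caps the process address space at roughly 2 GB regardless of installed RAM, so the parse can fail even with memory apparently free. A minimal sketch to check which build is running:

```python
import struct
import sys

# Pointer size reveals whether this interpreter is a 32- or 64-bit build.
# A 32-bit process is limited to ~2 GB of address space on Windows,
# no matter how much physical RAM the machine has.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python; 64-bit build: {sys.maxsize > 2**32}")
```

If this reports a 32-bit build, installing the 64-bit Python distribution lifts the address-space cap.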

loading of json file into df
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2018.1.4\helpers\pydev\pydev_run_in_console.py", line 52, in run_file
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2018.1.4\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/Users/Beorn/PycharmProjects/project_0/projekt/test.py", line 7, in <module>
    df = pd.read_json(f, encoding='utf-8')
  File "C:\Users\Beorn\AppData\Local\Programs\Python\Python36-32\lib\site-packages\pandas\io\json\json.py", line 422, in read_json
    result = json_reader.read()
  File "C:\Users\Beorn\AppData\Local\Programs\Python\Python36-32\lib\site-packages\pandas\io\json\json.py", line 529, in read
    obj = self._get_object_parser(self.data)
  File "C:\Users\Beorn\AppData\Local\Programs\Python\Python36-32\lib\site-packages\pandas\io\json\json.py", line 546, in _get_object_parser
    obj = FrameParser(json, **kwargs).parse()
  File "C:\Users\Beorn\AppData\Local\Programs\Python\Python36-32\lib\site-packages\pandas\io\json\json.py", line 638, in parse
    self._parse_no_numpy()
  File "C:\Users\Beorn\AppData\Local\Programs\Python\Python36-32\lib\site-packages\pandas\io\json\json.py", line 853, in _parse_no_numpy
    loads(json, precise_float=self.precise_float), dtype=None)
ValueError: Could not reserve memory block
PyDev console: starting.
Python 3.6.6 (v3.6.6:4cf1f54eb7, Jun 27 2018, 02:47:15) [MSC v.1900 32 bit (Intel)] on win32

Answer

So after reading plenty of posts and solutions, I decided to just reduce the file size by getting rid of useless data. Maybe you'll find this useful. By the way, I read somewhere that you need at least 25 times more memory than the size of your JSON file, so in my case I needed more than 8 GB.

import json

with open('data.json', 'r') as data_file:
    data = json.load(data_file)

print(data.keys())    # inspect the top-level keys
del data['author']    # drop the key that isn't needed

with open('datav2.json', 'w') as data_file:
    json.dump(data, data_file)    # dump returns None, so no assignment
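If the file happens to be newline-delimited JSON (one record per line), another option is to let pandas parse it in chunks instead of all at once, which keeps only `chunksize` rows in memory at a time. This is a sketch, not the original poster's approach; the sample file and column names here are made up for illustration:

```python
import json

import pandas as pd

# Build a tiny newline-delimited JSON file for demonstration; with a real
# 300 MB export you would point read_json at your own .jsonl file instead.
records = [{"id": i, "value": i * 2} for i in range(10)]
with open("sample.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# lines=True + chunksize returns an iterator of small DataFrames.
chunks = pd.read_json("sample.jsonl", lines=True, chunksize=4)
df = pd.concat(chunks, ignore_index=True)
print(df.shape)  # (10, 2)
```

Note this only works for line-delimited JSON; a single large JSON object, like the one in the question, still has to be loaded and trimmed as shown above.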
