How to concatenate multiple pandas.DataFrames without running into MemoryError


Question

I have three DataFrames that I'm trying to concatenate.

concat_df = pd.concat([df1, df2, df3])

This results in a MemoryError. How can I resolve this?

Note that most of the existing similar questions are about MemoryErrors occurring when reading large files. I don't have that problem. I have already read my files into DataFrames. I just can't concatenate that data.

Answer

I'm grateful to the community for their answers. However, in my case, I found that the problem was actually that I was using 32-bit Python.

Windows defines memory limits for 32-bit and 64-bit processes. For a 32-bit process, the limit is only 2 GB of address space. So even if your machine has more than 2 GB of RAM, and even if you're running a 64-bit OS, a 32-bit process is still limited to just 2 GB; in my case that process was Python.
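If you're not sure which build you're running, you can check the interpreter's pointer size using only the standard library (a quick diagnostic sketch, not part of the original answer):

```python
import struct
import sys

# The size of a C pointer tells you the interpreter's bitness:
# 4 bytes (32 bits) on a 32-bit build, 8 bytes (64 bits) on 64-bit.
bits = struct.calcsize("P") * 8
print(f"Running {bits}-bit Python")

# sys.maxsize gives the same answer: 2**31 - 1 on a 32-bit build,
# 2**63 - 1 on a 64-bit build.
is_64bit = sys.maxsize > 2**32
print("64-bit" if is_64bit else "32-bit")
```

If this reports 32-bit while your OS is 64-bit, the ~2 GB cap applies to the Python process regardless of installed RAM.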

I upgraded to 64 bit Python, and haven't had a memory error since then!

Other relevant questions are: Python 32-bit memory limits on 64-bit Windows; Should I use Python 32-bit or Python 64-bit; Why is this numpy array too big to load?
