How to transform a huge CSV into SQLite using Pandas?
Question
I have a huge table (about 60 GB) in the form of an archived CSV file. I want to transform it into an SQLite file.
What I'm doing currently:
import pandas
import sqlite3
cnx = sqlite3.connect('db.sqlite')
df = pandas.read_csv('db.gz', compression='gzip')
df.to_sql('table_name', cnx)
It works fine for smaller files, but with the huge files I have memory problems. The problem is that pandas reads the whole table into memory (RAM) and then saves it into the SQLite file.
Is there an elegant solution to this problem?
Recommended answer
This is going to be problematic with pandas due to the file's size. Is there any reason you can't use the csv module and just iterate through the file?
Basic idea (untested):
import gzip
import csv
import sqlite3

# Open the gzipped file in text mode ('rt') so csv.reader receives
# strings rather than bytes on Python 3.
with gzip.open('db.gz', 'rt', newline='') as f, sqlite3.connect('db.sqlite') as cnx:
    reader = csv.reader(f)
    c = cnx.cursor()
    # Use one "?" placeholder per column of your table.
    c.executemany('insert into table_name values (?,?,...)', reader)
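If you want to stay with pandas, its read_csv also accepts a chunksize parameter that returns an iterator of DataFrames instead of loading the whole file, and each chunk can be appended to the SQLite table. A minimal sketch (file names and chunk size are illustrative; tune chunksize to your memory budget):

```python
import gzip
import sqlite3

import pandas as pd

# Build a small gzipped CSV so the sketch is self-contained
# (this stands in for the real 60 GB file).
with gzip.open('demo.gz', 'wt', newline='') as f:
    f.write('a,b\n')
    for i in range(10):
        f.write(f'{i},{i * 2}\n')

cnx = sqlite3.connect('demo.sqlite')

# chunksize=4 makes read_csv yield DataFrames of up to 4 rows each;
# if_exists='append' adds every chunk to the same table.
for chunk in pd.read_csv('demo.gz', compression='gzip', chunksize=4):
    chunk.to_sql('table_name', cnx, if_exists='append', index=False)

count = cnx.execute('select count(*) from table_name').fetchone()[0]
print(count)  # 10
cnx.close()
```

This keeps only one chunk in RAM at a time, so peak memory is bounded by chunksize rather than by the file size.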