Fastest way to write large CSV with Python
Question
I want to write some random sample data to a CSV file until it is 1 GB big. The following code is working:
import numpy as np
import uuid
import csv
import os

outfile = 'data.csv'
outsize = 1024  # MB

with open(outfile, 'ab') as csvfile:
    wtr = csv.writer(csvfile)
    while (os.path.getsize(outfile) // 1024**2) < outsize:
        wtr.writerow(['%s,%.6f,%.6f,%i' % (uuid.uuid4(), np.random.random() * 50,
                                           np.random.random() * 50, np.random.randint(1000))])
How can I make it faster?
Answer
Removing all the unnecessary stuff makes it faster and easier to understand. The main win is keeping a running byte counter in memory instead of calling os.path.getsize on the file after every single row:
import random
import uuid

outfile = 'data.csv'
outsize = 1024 * 1024 * 1024  # 1 GB

# Text mode ('a'); the original 'ab' only works with str on Python 2.
with open(outfile, 'a') as csvfile:
    size = 0
    while size < outsize:
        txt = '%s,%.6f,%.6f,%i\n' % (uuid.uuid4(), random.random() * 50,
                                     random.random() * 50, random.randrange(1000))
        size += len(txt)
        csvfile.write(txt)
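If you want to squeeze out a bit more, you can also cut the number of write() calls by buffering many rows and writing them in one go. Below is a minimal sketch of that idea; the function name write_random_csv and the batch parameter are illustrative, not from the original answer, and the demo uses a small target size rather than the full 1 GB:

```python
import random
import uuid


def write_random_csv(outfile, outsize, batch=10000):
    """Write random rows until roughly `outsize` bytes have been produced,
    joining `batch` rows per write() call to reduce per-call overhead."""
    size = 0
    with open(outfile, 'w') as f:
        while size < outsize:
            rows = []
            for _ in range(batch):
                txt = '%s,%.6f,%.6f,%i\n' % (uuid.uuid4(),
                                             random.random() * 50,
                                             random.random() * 50,
                                             random.randrange(1000))
                rows.append(txt)
                size += len(txt)
                if size >= outsize:
                    break
            f.write(''.join(rows))  # one syscall-ish write for the whole batch
    return size


# Small target for demonstration; use 1024**3 for a real 1 GB run.
written = write_random_csv('demo.csv', 64 * 1024)
```

The loop may overshoot the target by less than one batch of rows, which is usually acceptable for generating test data.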