How to insert huge CSV file at once into SQL Server in python?


Problem description

I have a large CSV file and I want to insert it all at once, instead of row by row. This is my code:

import csv
import time

import pypyodbc

con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

csfile = open('out2.csv', 'r')
csv_data = csv.reader(csfile)

for row in csv_data:
    try:
        # plain row-by-row INSERT; BULK INSERT is a separate file-based
        # T-SQL statement and cannot be combined with a VALUES clause
        cur.execute("INSERT INTO Table_name (Attribute, error, msg, Value, Success, TotalCount, SerialNo) "
                    "VALUES (?, ?, ?, ?, ?, ?, ?)", row)
    except Exception:
        time.sleep(60)

csfile.close()
con.commit()
cur.close()
con.close()
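
As an aside, the BULK INSERT that the original statement was reaching for is a standalone T-SQL command: it makes SQL Server read a file directly, in one shot, and takes no VALUES clause. A minimal sketch of issuing it from Python, assuming the Table_name from the question, a hypothetical file path, and that the SQL Server machine itself can read the file and you have bulk-load permissions:

import pypyodbc

con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

# BULK INSERT runs server-side, so this path is resolved on the SQL Server
# host, not on the client (hypothetical path)
cur.execute("BULK INSERT Table_name "
            "FROM 'C:\\data\\out2.csv' "
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')")

con.commit()
con.close()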

Recommended answer

It really depends on your system resources. You can store the CSV file in memory and then insert it into the database, but if the CSV file is larger than your RAM you will run into problems. You can save each row of the csv file as an element in a Python list. Here is my code:

import csv

csvRows = []
csvFileObj = open('yourfile.csv', 'r')
readerObj = csv.reader(csvFileObj)
for row in readerObj:
    # pick out (and convert, if needed) the columns you want to insert
    csvRows.append(tuple(row))
csvFileObj.close()

After that, read the elements of the list and insert them into your DB. I don't think there is a direct way to insert all CSV rows into the SQL database at once; you need some preprocessing. A sketch of that final step is below.
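
A minimal sketch of that insert step, continuing from the csvRows list built above and reusing the connection settings and Table_name columns from the question (both are assumptions about your schema). pypyodbc follows the Python DB-API, so cursor.executemany can submit the whole list in one call:

import pypyodbc

con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

# executemany binds every tuple in csvRows against one parameterized
# INSERT instead of calling cur.execute once per row
cur.executemany("INSERT INTO Table_name (Attribute, error, msg, Value, Success, TotalCount, SerialNo) "
                "VALUES (?, ?, ?, ?, ?, ?, ?)",
                csvRows)

con.commit()
cur.close()
con.close()

This still sends the rows over the connection in one batch per call; if you switch to the C-based pyodbc driver, setting cursor.fast_executemany = True speeds this up considerably for large lists.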
