Bulk upsert with SQLAlchemy
Question
I am working on bulk upserting lots of data into PostgreSQL with SQLAlchemy 1.1.0b, and I'm running into duplicate key errors.
from sqlalchemy import create_engine, MetaData
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.automap import automap_base

engine = create_engine("postgresql+pygresql://" + uname + ":" + passw + "@" + url)

# Reflectively load the database.
metadata = MetaData()
metadata.reflect(bind=engine)

Session = sessionmaker(autocommit=True, autoflush=True)
Session.configure(bind=engine)
session = Session()

base = automap_base(metadata=metadata)
base.prepare(engine, reflect=True)

table_name = "arbitrary_table_name"  # this will always be arbitrary
mapped_table = getattr(base.classes, table_name)

# col and col2 exist in the table.
chunks = [[{"col": "val"}, {"col2": "val2"}], [{"col": "val"}, {"col2": "val3"}]]

for chunk in chunks:
    session.bulk_insert_mappings(mapped_table, chunk)
    session.commit()
When I run this, I get:
sqlalchemy.exc.IntegrityError: (pg.IntegrityError) ERROR: duplicate key value violates unique constraint <constraint>
I can't seem to properly instantiate the mapped_table as a Table() object, either.
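As an aside, since the metadata has already been reflected, a Table object can usually be looked up in metadata.tables rather than constructed by hand. A minimal sketch (using an in-memory SQLite engine and a throwaway table purely for illustration):

```python
from sqlalchemy import MetaData, create_engine, text

# In-memory SQLite stands in for the PostgreSQL engine in the question.
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text(
        "CREATE TABLE arbitrary_table_name (id INTEGER PRIMARY KEY, col TEXT)"))

metadata = MetaData()
metadata.reflect(bind=engine)

# Reflected tables are keyed by name in metadata.tables --
# no need to instantiate Table() manually.
table = metadata.tables["arbitrary_table_name"]
```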
I'm working with time series data, so I'm grabbing data in bulk with some overlap in time ranges. I want to do a bulk upsert to ensure data consistency.
What's the best way to do a bulk upsert with a large data set? I know PostgreSQL supports upserts now, but I'm not sure how to do this in SQLAlchemy.
Answer
来自https://stackoverflow.com/a/26018934/465974
After I found this command, I was able to perform upserts, but it is worth mentioning that this operation is slow for a bulk "upsert".
The alternative is to get a list of the primary keys you would like to upsert, and query the database for any matching ids:
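That alternative can be sketched as: select the keys that already exist, partition the incoming rows into updates and inserts, then use the ORM's bulk methods for each half. The model and data below are hypothetical stand-ins for the automapped class in the question:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Point(Base):  # illustrative model standing in for the automapped class
    __tablename__ = "points"
    id = Column(Integer, primary_key=True)
    value = Column(String)

def split_rows(session, model, rows, pk="id"):
    """Partition rows into (to_update, to_insert) by checking existing keys."""
    col = getattr(model, pk)
    existing = {key for (key,) in
                session.query(col).filter(col.in_([r[pk] for r in rows]))}
    return ([r for r in rows if r[pk] in existing],
            [r for r in rows if r[pk] not in existing])

engine = create_engine("sqlite://")  # in-memory stand-in for PostgreSQL
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# Seed one existing row, then upsert a chunk that overlaps it.
session.bulk_insert_mappings(Point, [{"id": 1, "value": "old"}])
session.commit()

chunk = [{"id": 1, "value": "new"}, {"id": 2, "value": "fresh"}]
to_update, to_insert = split_rows(session, Point, chunk)
session.bulk_update_mappings(Point, to_update)
session.bulk_insert_mappings(Point, to_insert)
session.commit()
```

This costs an extra SELECT per chunk but keeps the fast bulk_insert_mappings/bulk_update_mappings paths; it is only safe if no other writer can insert the same keys between the SELECT and the commit.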