SQLAlchemy - bulk insert ignore: "Duplicate entry"
Question
I have a table named user_data, with the columns id and user_id as a unique key. I want to import some history data into this table. I use the bulk_insert_mappings method to batch-insert the data, but it fails with the error below:
IntegrityError: (pymysql.err.IntegrityError) (1062, u"Duplicate entry '1-1234' for key 'idx_on_id_and_user_id'")
How can I ignore this error and discard the duplicate data during the batch insert?
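If the duplicates come from within the batch itself, one way to avoid the error entirely (a minimal sketch, assuming the mappings are plain dicts keyed like the table columns) is to de-duplicate on the (id, user_id) unique key before calling bulk_insert_mappings. Note this does not help with rows that already exist in the table.

```python
# Sketch: drop in-batch duplicates on the (id, user_id) unique key
# before handing the list to bulk_insert_mappings.
# dedupe_on_key and the sample batch are illustrative names, not from the question.
def dedupe_on_key(mappings):
    seen = set()
    unique = []
    for m in mappings:
        key = (m["id"], m["user_id"])   # the columns in idx_on_id_and_user_id
        if key not in seen:
            seen.add(key)
            unique.append(m)
    return unique

batch = [{"id": 1, "user_id": 1234},
         {"id": 1, "user_id": 1234},   # duplicate within the batch
         {"id": 2, "user_id": 5678}]

deduped = dedupe_on_key(batch)
print(deduped)  # [{'id': 1, 'user_id': 1234}, {'id': 2, 'user_id': 5678}]
```

After de-duplicating, the remaining list can be passed to session.bulk_insert_mappings as before.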
Answer
You should handle every error. But if you really want to ignore all of them, you can't do a true bulk insert: sometimes there are integrity errors in the actual data you are importing, so you have to insert row by row and skip the failures. I would only use this in one-off scripts.
from sqlalchemy.exc import IntegrityError

for item in dict_list:
    try:
        # merge() inserts the row, or updates it if it already exists
        session.merge(orm(**item))
        session.commit()
    except IntegrityError:
        # discard the duplicate row and keep going
        session.rollback()
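An alternative that keeps the insert batched is to push the "ignore duplicates" behavior into the database. MySQL supports INSERT IGNORE, which SQLAlchemy Core can emit via Insert.prefix_with("IGNORE"). The sketch below demonstrates the same idea on an in-memory SQLite database (whose equivalent prefix is "OR IGNORE"); the table layout and unique-key name mirror the question, but the pk surrogate column is an assumption.

```python
# Sketch: database-level "insert ignore" with SQLAlchemy Core.
# On MySQL the prefix would be "IGNORE"; SQLite (used here so the
# example is self-contained) spells it "OR IGNORE".
from sqlalchemy import (create_engine, MetaData, Table, Column,
                        Integer, UniqueConstraint, insert, select, func)

engine = create_engine("sqlite://")
metadata = MetaData()
user_data = Table(
    "user_data", metadata,
    Column("pk", Integer, primary_key=True, autoincrement=True),  # assumed surrogate key
    Column("id", Integer),
    Column("user_id", Integer),
    UniqueConstraint("id", "user_id", name="idx_on_id_and_user_id"),
)
metadata.create_all(engine)

rows = [{"id": 1, "user_id": 1234},
        {"id": 1, "user_id": 1234},   # duplicate entry '1-1234', silently skipped
        {"id": 2, "user_id": 5678}]

with engine.begin() as conn:
    # On MySQL: insert(user_data).prefix_with("IGNORE")
    conn.execute(insert(user_data).prefix_with("OR IGNORE"), rows)
    count = conn.execute(select(func.count()).select_from(user_data)).scalar()

print(count)  # 2 -- the duplicate row was discarded, no IntegrityError raised
```

Unlike the row-by-row merge loop, this stays a single multi-row statement, so it is much faster for large history imports; the trade-off is that the duplicate-handling is dialect-specific rather than portable ORM code.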