Ignore duplicates when importing from CSV


Question

I'm using a PostgreSQL database, and after creating my tables I have to populate them from a CSV file. However, the CSV file is corrupted: it violates the primary key rule, so the database throws an error and I'm unable to populate the table. Any ideas how to tell the database to ignore the duplicates when importing from CSV? Writing a script to remove them from the CSV file is not acceptable. Any workarounds are welcome too. Thank you! :)

Recommended answer

In PostgreSQL, duplicate rows are not permitted if they violate a unique constraint. I think your best option is to import the CSV file into a temporary table that has no constraints, delete the duplicate values from it, and finally insert from this temporary table into your final table.
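The steps above can be sketched in SQL. This is a minimal sketch, not a definitive recipe: the target table `users`, its key column `id`, and the CSV path are all hypothetical placeholders for your own schema and file.

```sql
-- Staging table with the same columns as the target, but no constraints.
CREATE TEMP TABLE users_staging (LIKE users INCLUDING DEFAULTS);

-- Load the raw CSV, duplicates and all (path and format are assumptions).
COPY users_staging FROM '/tmp/users.csv' WITH (FORMAT csv, HEADER true);

-- Move one row per primary-key value into the real table.
-- DISTINCT ON keeps an arbitrary row from each group of duplicates;
-- add an ORDER BY after the FROM clause if you need a specific one.
INSERT INTO users
SELECT DISTINCT ON (id) *
FROM users_staging;

DROP TABLE users_staging;
```

Note that `COPY ... FROM` reads the file on the database server; if the CSV lives on your client machine, the psql `\copy` meta-command performs the same import from the client side.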

