Insert/Update large amount of data (3mio+) to SQL Server


Question

Hello

I have to import weekly a file into our database. The columns from the file do not match the database, so I have to parse the file and build the query. From this point I can insert the data.

But the next problem is, if a record with same id already exists, I have to update, else I have to delete.

What's the best practice to get a fast import? It currently takes something like 1-2 minutes per 10,000 records, which is far too long.

Thanks for any suggestions, ideas, etc.

Answer

That sort of response time is quite ambitious for a bulk insert that requires an integrity check and a decision as to whether an update or insert will occur, but probably the most efficient method would be to put all of your logic in a stored procedure.
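As a sketch of that approach (all table, column, and file names here are hypothetical), the parsed file could be bulk-loaded into a staging table first, and the exists-or-not logic then handled in a single set-based `MERGE` inside the stored procedure, rather than issuing one round trip per record:

```sql
-- Sketch only: staging/target table names and columns are assumptions.
-- Step 1: bulk-load the parsed weekly file into a staging table.
-- BULK INSERT (or bcp / SqlBulkCopy from client code) is far faster
-- than executing one INSERT statement per row.
BULK INSERT dbo.ImportStaging
FROM 'C:\imports\weekly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Step 2: one MERGE covers all three cases in a single scan:
-- update rows whose id already exists, insert new ids, and delete
-- target rows that no longer appear in the file.
MERGE dbo.TargetTable AS t
USING dbo.ImportStaging AS s
    ON t.Id = s.Id
WHEN MATCHED THEN
    UPDATE SET t.Col1 = s.Col1,
               t.Col2 = s.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Col1, Col2)
    VALUES (s.Id, s.Col1, s.Col2)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

The key design choice is moving from row-at-a-time processing to one bulk load plus one set-based statement; at 3 million rows, indexing the staging table's `Id` column before the `MERGE` also helps the join.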


