Fastest import of a CSV to a database table

Problem description

I have implemented an import feature that takes data from a CSV file in an ASP.NET application. The file size can vary from a few KB to a maximum of 10 MB.

However, when the file contains more than about 50,000 records, the import takes around 20 minutes, which is far too long. I need to import around 300,000 records within a timespan of 2-3 minutes.

I know that import speed also depends on the physical memory of the DB server. Currently I create insert scripts in bulk and execute them. I also know that SqlBulkCopy would be another option, but in my case it is not just inserts of products that take place: there are also updates and deletes, driven by a field called "FUNCTION CODE" that decides whether to insert, update, or delete each row.
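For illustration, here is a minimal sketch of how such a file might be parsed into a DataTable ready for bulk loading. The column names (FunctionCode, ProductId, Name, Price) and the CSV layout are assumptions, since the question does not show the actual schema:

using System;
using System.Data;
using System.IO;

// Load the CSV into a DataTable so it can be bulk-copied later.
// The columns below are hypothetical; adjust to the real file layout.
static DataTable LoadCsv(string path)
{
    var table = new DataTable();
    table.Columns.Add("FunctionCode", typeof(string)); // I/U/D action flag
    table.Columns.Add("ProductId",    typeof(int));
    table.Columns.Add("Name",         typeof(string));
    table.Columns.Add("Price",        typeof(decimal));

    using (var reader = new StreamReader(path))
    {
        string line;
        bool isHeader = true;
        while ((line = reader.ReadLine()) != null)
        {
            if (isHeader) { isHeader = false; continue; } // skip header row
            // Naive split; a production import should use a CSV parser
            // that handles quoted fields and embedded commas.
            var fields = line.Split(',');
            table.Rows.Add(fields[0],
                           int.Parse(fields[1]),
                           fields[2],
                           decimal.Parse(fields[3]));
        }
    }
    return table;
}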

Any suggestions on how to go about this would be greatly appreciated.

One approach would be to implement multiple threads that carry out the processing simultaneously, but I have never implemented threading to date, so I am not aware of the complications I would incur by doing so.

Thanks & Regards,
Francis P.

Recommended answer

SqlBulkCopy is definitely going to be the fastest. I would approach this by bulk-copying the data into a temporary (staging) table in the database. Once the data is in the staging table, you can use SQL to merge/insert/delete accordingly.
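As a rough sketch of that approach (reusing the DataTable from the parsing sketch above, and assuming hypothetical table names dbo.StagingProducts and dbo.Products; adjust to your schema):

using System.Data;
using System.Data.SqlClient;

// Bulk-copy the parsed rows into a staging table, then apply one
// set-based MERGE driven by the FUNCTION CODE column.
static void ImportCsv(DataTable rows, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // 1. Stream all rows into the staging table in a single operation.
        //    (The staging table should be emptied before each import run.)
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.StagingProducts";
            bulk.BatchSize = 5000; // tune against your server
            bulk.WriteToServer(rows);
        }

        // 2. Merge staged rows into the target table; the action per row
        //    is decided by FunctionCode ('I' insert, 'U' update, 'D' delete).
        const string merge = @"
MERGE dbo.Products AS target
USING dbo.StagingProducts AS source
    ON target.ProductId = source.ProductId
WHEN MATCHED AND source.FunctionCode = 'D' THEN
    DELETE
WHEN MATCHED AND source.FunctionCode = 'U' THEN
    UPDATE SET target.Name = source.Name, target.Price = source.Price
WHEN NOT MATCHED BY TARGET AND source.FunctionCode = 'I' THEN
    INSERT (ProductId, Name, Price)
    VALUES (source.ProductId, source.Name, source.Price);";

        using (var cmd = new SqlCommand(merge, conn))
        {
            cmd.CommandTimeout = 300; // large merges can take a while
            cmd.ExecuteNonQuery();
        }
    }
}

This keeps the per-row logic on the database side as a single set-based statement, which is usually far faster than issuing hundreds of thousands of individual INSERT/UPDATE/DELETE commands from the application.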
