How do I optimize my VB.NET csv import

Problem Description

Hi Guys

I would like to know the best way to handle imports of csv files with large numbers of rows (between 200,000 and 1,200,000 records).

Initially, when I built the module, the files only ran to around 20,000 records.

I am currently using a TextFieldParser and then, while reading each record, adding the record to an object and adding the object to the database.
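For reference, a minimal sketch of that read loop, assuming a comma-delimited file and Microsoft.VisualBasic.FileIO.TextFieldParser; the file path and the per-row handling are placeholders, not the actual module:

Imports Microsoft.VisualBasic.FileIO

Module CsvReadSketch
    ' Minimal sketch: read a delimited file row by row with TextFieldParser.
    Sub ReadCsv(filePath As String)
        Using parser As New TextFieldParser(filePath)
            parser.TextFieldType = FieldType.Delimited
            parser.SetDelimiters(",")
            parser.HasFieldsEnclosedInQuotes = True

            While Not parser.EndOfData
                Dim fields As String() = parser.ReadFields()
                ' ... build the record object from fields and save it (placeholder) ...
            End While
        End Using
    End Sub
End Module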

The issue is that with the larger imports we get now, an import takes 15-20 minutes, which I now need to optimize.

I am using an Oracle database.

What I have tried:

What I have now tried is reading the records into a DataTable and then doing a bulk insert of the DataTable. I would just like to know if there is a better way to handle the bulk imports.
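A sketch of that DataTable route, assuming ODP.NET's OracleBulkCopy class is available (Oracle.DataAccess.Client); the connection string, destination table name, batch size and timeout below are placeholders:

Imports System.Data
Imports Oracle.DataAccess.Client

Module BulkCopySketch
    ' Hedged sketch of a DataTable bulk insert with ODP.NET's OracleBulkCopy.
    Sub BulkInsert(table As DataTable, connectionString As String)
        Using conn As New OracleConnection(connectionString)
            conn.Open()
            Using bulk As New OracleBulkCopy(conn)
                bulk.DestinationTableName = "IMPORT_TARGET" ' placeholder table name
                bulk.BatchSize = 10000                      ' write in chunks
                bulk.BulkCopyTimeout = 600                  ' seconds
                bulk.WriteToServer(table)
            End Using
        End Using
    End Sub
End Module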

Recommended Answer

Look at my recent article. It should be quite simple to convert to VB.Net.

CSV File Parser[^]

You could also just put it into a DLL assembly and call it from your VB app.

As far as bulk inserts go, I would write a stored procedure and submit the records one at a time. That would be faster.
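A hedged sketch of that suggestion from the VB.NET side; the stored procedure name P_INSERT_RECORD and its parameters are hypothetical, not from the article:

Imports System.Collections.Generic
Imports System.Data
Imports Oracle.DataAccess.Client

Module StoredProcSketch
    ' Hedged sketch of the stored-procedure route: one record per call.
    Sub InsertRecords(rows As IEnumerable(Of String()), connectionString As String)
        Using conn As New OracleConnection(connectionString)
            conn.Open()
            Using cmd As New OracleCommand("P_INSERT_RECORD", conn)
                cmd.CommandType = CommandType.StoredProcedure
                Dim pField1 As OracleParameter = cmd.Parameters.Add("p_field1", OracleDbType.Varchar2)
                Dim pField2 As OracleParameter = cmd.Parameters.Add("p_field2", OracleDbType.Varchar2)

                For Each row As String() In rows
                    pField1.Value = row(0)
                    pField2.Value = row(1)
                    cmd.ExecuteNonQuery() ' submit one record at a time, as suggested
                Next
            End Using
        End Using
    End Sub
End Module

For row counts in the 200,000-1,200,000 range it may also be worth benchmarking ODP.NET array binding (the OracleCommand.ArrayBindCount property), which submits many parameter sets to the same command in a single round trip.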

