Read large excel file data in small batches


Problem Description


I have an Excel file that contains about 15 lakh (1.5 million) records. My intention is to read the first 50,000 records from the file into my DataTable (here a C# DataTable, i.e. dt), do some processing on those records, then read the next 50,000 records, and so on until I have fetched all the records from the file (only 50,000 rows at a time).
Because if I read all the data (15 lakh records) and hold it in a DataTable or DataSet, my app will definitely get slow and may crash while processing those rows.

Currently I am fetching all the records using the code below:

using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [" + sheetName + "]", conn))
{
    using (OleDbDataAdapter da = new OleDbDataAdapter())
    {
        da.SelectCommand = cmd;
        try
        {
            da.Fill(dt);
            dt.TableName = sheetName.Replace("$", "");
        }
        catch (Exception ex)
        {
            ErrSheetName = ErrSheetName + "," + sheetName.Replace("$", "");
        }
    }
}





But I want the data in small batches of 50,000. Any ideas, any suggestions?
Please share.

Recommended Answer



I found that a google [^] search has some fairly specific results that answer your precise question.
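One approach that comes up in such searches is the `DbDataAdapter.Fill(startRecord, maxRecords, ...)` overload, which keeps only one page of rows in memory at a time. A minimal sketch under assumptions not stated in the question: an already-open OleDb connection, and a hypothetical `ProcessBatch` method standing in for the per-batch processing. Note that the provider still reads past the skipped rows on each call, so total I/O grows with the number of pages; only memory use is bounded.

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class PagedExcelReader
{
    // Pure helper: how many pages of size batchSize cover totalRows.
    public static int BatchCount(int totalRows, int batchSize)
        => (totalRows + batchSize - 1) / batchSize;

    // Sketch: page through the sheet 50,000 rows at a time.
    static void ReadInBatches(OleDbConnection conn, string sheetName, int batchSize)
    {
        using (var cmd = new OleDbCommand("SELECT * FROM [" + sheetName + "]", conn))
        using (var da = new OleDbDataAdapter(cmd))
        {
            int start = 0;
            while (true)
            {
                var dt = new DataTable(sheetName.Replace("$", ""));
                // Fill(startRecord, maxRecords, tables): only this page is held in memory.
                int fetched = da.Fill(start, batchSize, dt);
                if (fetched == 0) break;
                ProcessBatch(dt);               // hypothetical per-batch processing
                if (fetched < batchSize) break; // last, partial page
                start += fetched;
            }
        }
    }

    static void ProcessBatch(DataTable dt) { /* work on dt.Rows here */ }
}
```

For the asker's 15 lakh rows and a batch size of 50,000, this loop runs 30 times, each iteration discarding the previous DataTable before filling the next.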

Personally, I would load all the data into a database and do the processing in the database rather than in memory; it will be a lot faster and more robust.
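A minimal sketch of that suggestion, assuming SQL Server as the target; the connection strings, staging table name, and column layout here are all hypothetical and would need to match your environment. The sheet is read once, streamed to the server with SqlBulkCopy, and the heavy processing then happens as set-based SQL rather than row-by-row in C#:

```csharp
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class ExcelToDatabase
{
    static void LoadIntoSqlServer(string excelConnStr, string sheetName, string sqlConnStr)
    {
        var dt = new DataTable();
        using (var conn = new OleDbConnection(excelConnStr))
        using (var da = new OleDbDataAdapter("SELECT * FROM [" + sheetName + "]", conn))
        {
            da.Fill(dt); // one-time read; processing happens in SQL, not in this DataTable
        }

        using (var bulk = new SqlBulkCopy(sqlConnStr))
        {
            bulk.DestinationTableName = "dbo.ExcelStaging"; // hypothetical staging table
            bulk.BatchSize = 50000;                         // stream rows to the server in batches
            bulk.WriteToServer(dt);
        }
        // From here, run set-based SQL (UPDATE/MERGE/aggregates) against dbo.ExcelStaging.
    }
}
```

SqlBulkCopy's BatchSize means rows are committed to the server in chunks, so the transfer itself respects the 50,000-row batching the asker wants.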

