System Resources are exceeded error for Excel with more than 800,000 records


Problem Description

Hi, I am working on a migration project where data is exported from an Oracle DB to Excel sheets. The extraction tool is separate; it uses an Oracle Data Reader and successfully extracts around 1 million records to Excel, splitting anything beyond that across multiple sheets in the same workbook (Sheet1, Sheet2, etc.). I need to read the entire Excel data and store it in a DataTable for my ingestion step. This works fine below about 800,000 rows, but when I try to read the data from Excel using an OleDbDataAdapter or OleDbDataReader, it fails once the row count exceeds 800,000 and gives the error "System resources exceeded." The server is otherwise idle and has 64 GB of RAM.

The extraction is a separate tool that fetches data from the Oracle DB using an Oracle Data Reader. My tool is different: it needs to read the Excel data in one pass and store it in a DataTable for XML preparation. Find the code snippet below; I have tried both a DataAdapter and a DataReader.

using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = "SELECT * FROM [" + sheet + "]";

    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        dtDocExcelData.Load(reader);

        LogWriter.LogWrite("No of Files/docs to process for the File : " + inputFilePath
            + " Sheet Rows Count : " + dtDocExcelData.Rows.Count);
    } // the using blocks dispose the reader and command, clearing the resources
}

Visual Studio is the 2013 version and the code is in C#. The Excel version is also the latest, 2016 (.XLSX).

Please check the code above and let me know where it is failing to load the bulk data from Excel. Is there any other way to resolve this?

Please provide any alternative solution to overcome this. The project goes live by month end and the UAT phase is failing, so please suggest an alternative at the earliest.

Awaiting your valuable suggestions. Thanks in advance.

Regards,
Venky

Recommended Answer

Hello,

Rather than reading all the table data in at once, get the record count, divide it into a fixed batch size, e.g. 1,000 rows (or whatever number suits you, such as 10,000), read those records, add them to Excel, and continue until all data has been exported.
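To make the batching concrete, here is a minimal sketch of the arithmetic (shown in Python purely for illustration; the row count and batch size are example values, not taken from the question):

```python
def batch_windows(total_rows, batch_size):
    """Yield (start, end) row ranges, end exclusive, that cover total_rows
    in fixed-size batches; the last window may be smaller."""
    for start in range(0, total_rows, batch_size):
        yield (start, min(start + batch_size, total_rows))

# e.g. 800,000 rows in batches of 250,000 -> four windows,
# the last one covering the remaining 50,000 rows
windows = list(batch_windows(800_000, 250_000))
```

Each window can then be fetched and processed independently, so memory stays bounded by the batch size rather than the total row count.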

You can stay with a DataReader to get data into a DataTable, iterate the DataReader without a DataTable, or use a free library such as SpreadsheetLight; see the following article for a starting point. The relevant SpreadsheetLight method is ImportDataTable(RowIndex, ColumnIndex, DataTable, IncludeHeader), so when iterating the data in chunks as described above you can import each chunk after the last row written so far; the GetWorksheetStatistics method will provide the last used row of the worksheet the data is being imported into.

The approach above should not hit the stated error, because you are working in chunks rather than loading all records at once.
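The chunked loop itself can be sketched language-neutrally (shown here in Python, with a plain iterator standing in for the data reader and process_chunk as a hypothetical callback; this is an illustration of the pattern, not the OleDb API):

```python
def read_in_chunks(reader, chunk_size, process_chunk):
    """Drain an iterator of rows, handing off fixed-size chunks so that
    only one chunk is held in memory at a time."""
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) == chunk_size:
            process_chunk(chunk)
            chunk = []
    if chunk:               # flush the final, possibly partial, chunk
        process_chunk(chunk)

# usage with a stand-in "reader": record how many rows each chunk held
seen = []
read_in_chunks(iter(range(10)), 4, lambda c: seen.append(len(c)))
# seen == [4, 4, 2]
```

In the C# case, process_chunk would correspond to writing the current batch of rows out (to the DataTable, the XML, or the worksheet) and then discarding them before reading the next batch.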

