Failed to write large amount of data to stream


Problem Description

When I try to write a very large amount of data (a list with 300,000 rows and more) to a memory stream using CsvHelper, it throws the exception "System.IO.IOException: Stream was too long.".
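The limit is easy to hit with a bare MemoryStream; here is a minimal repro sketch (illustrative only, not my production code, and it needs an x64 process with a few GB of free memory):

using System;
using System.IO;

class Repro
{
    static void Main()
    {
        using (var ms = new MemoryStream())
        {
            var chunk = new byte[64 * 1024 * 1024]; // 64MB of zeros per write

            try
            {
                // MemoryStream is backed by a single Int32-indexed byte array,
                // so it throws once its length would exceed Int32.MaxValue (~2GB).
                while (true) ms.Write(chunk, 0, chunk.Length);
            }
            catch (IOException ex)
            {
                Console.WriteLine(ex.Message); // "Stream was too long."
            }
        }
    }
}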

The data class is rather big and has ~30 properties, so each record in the file has ~30 columns.

This is the actual writing code where the exception is thrown (by the way, this code is based on that answer by the CsvHelper library's author):

using (var memoryStream = new MemoryStream())
{
    using (var streamWriter = new StreamWriter(memoryStream, encoding ?? Encoding.ASCII))
    {
        var csvWriter = new CsvWriter(streamWriter, GetConfiguration(delimiter, mappingClassType, mappingActions));
        csvWriter.WriteRecords(data); // data is IEnumerable<T> and has more than 300k records

        streamWriter.Flush(); // push buffered characters into the MemoryStream before reading it
        return memoryStream.ToArray(); // copies the whole buffer, so peak memory is ~2x the CSV size
    }
}

Then I save the resulting byte array into the file.

File.WriteAllBytes(filePath, resultedBytesArray); 

Please note that the same code works well when I write 100,000 records to the file (in that case the file is about 1GB in size). By the way, my goal is to write more than 600,000 data records.

This is the relevant part of the stack trace related to this issue.

Stream was too long.|System.IO.IOException: Stream was too long.
at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count) 
at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder) 
at System.IO.StreamWriter.Write(Char[] buffer, Int32 index, Int32 count) 
at CsvHelper.CsvWriter.NextRecord() in C:\Users\Josh\Projects\CsvHelper\src\CsvHelper\CsvWriter.cs:line 290 
at CsvHelper.CsvWriter.WriteRecords(IEnumerable records) in C:\Users\Josh\Projects\CsvHelper\src\CsvHelper\CsvWriter.cs:line 490 
at FileExport.Csv.CsvDocument.Create[T](IEnumerable`1 data, String delimiter, Encoding encoding, Type mappingClassType, IDictionary`2 mappingActions) in d:\Dev\DrugDevExport\FileExport\Csv\CsvDocument.cs:line 33 

As far as I can tell, the basic way to achieve my goal and avoid that issue is to split the list of data into several parts and then concatenate them together (as sketched below), but is there any fairly obvious and easy solution that doesn't require significant code refactoring (like increasing the default stream/buffer size, etc.)?
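For illustration, a minimal sketch of that splitting idea (WriteInParts, the createCsvBytes delegate, and the batch size are hypothetical names for this sketch, not part of my actual code):

using System;
using System.Collections.Generic;
using System.IO;

public static class CsvPartWriter
{
    // Hypothetical sketch: feed the data to a byte-producing CSV routine
    // (e.g. the MemoryStream-based method above) in fixed-size batches and
    // append each batch to one output file, keeping every batch well under
    // the ~2GB MemoryStream limit.
    public static void WriteInParts<T>(IEnumerable<T> data, string filePath,
                                       Func<IEnumerable<T>, byte[]> createCsvBytes,
                                       int batchSize = 100000)
    {
        using (var output = File.Create(filePath))
        {
            var batch = new List<T>(batchSize);
            foreach (var record in data)
            {
                batch.Add(record);
                if (batch.Count == batchSize)
                {
                    var bytes = createCsvBytes(batch);
                    output.Write(bytes, 0, bytes.Length);
                    batch.Clear();
                }
            }
            if (batch.Count > 0) // flush the final partial batch
            {
                var bytes = createCsvBytes(batch);
                output.Write(bytes, 0, bytes.Length);
            }
        }
    }
}

Note that naive concatenation would repeat the CSV header once per batch, so the header would have to be suppressed for every batch except the first.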

Also keep in mind that I've already applied two possible solutions to prevent the "Out Of Memory" exception (the config for the first one is sketched after this list):

  • got rid of the 2GB limitation for objects (from https://stackoverflow.com/a/20912869); yes, I'm running on an x64 OS with 32GB of RAM
  • set the "Platform target" to x64 in the build settings section (from https://stackoverflow.com/a/22592876)
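For reference, I believe the first item corresponds to the gcAllowVeryLargeObjects runtime setting in App.config (my assumption about how the linked answer is applied):

<configuration>
  <runtime>
    <!-- Allows single objects (arrays) larger than 2GB on 64-bit targets. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

As far as I can tell, though, this does not help here: MemoryStream indexes its single backing byte array with Int32, so the stream still cannot grow past about 2GB.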

Thanks in advance.

Recommended Answer

Many thanks to Spender; as he mentioned in a comment below the question, the problem was fixed by replacing MemoryStream with FileStream and writing the data directly into the file.

In my case it was absolutely pointless to write the data to a MemoryStream and then copy it into the file again for no reason. Thanks again to him for opening my eyes to that fact.

My fixed code is below.

using (var fileStream = File.Create(path))
{
    using (var streamWriter = new StreamWriter(fileStream, encoding ?? Encoding.ASCII))
    {
        var csvWriter = new CsvWriter(streamWriter, GetConfiguration(delimiter, mappingClassType, mappingActions));
        csvWriter.WriteRecords(data); // records stream to disk incrementally, so memory use stays bounded
    }
}

Now it works with any amount of input data.
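For background (my reading, not something stated in the original comment): MemoryStream keeps everything in a single Int32-indexed byte array, so it caps out near 2GB no matter how much RAM the machine has, while FileStream only holds a small write buffer in memory. That buffer can also be tuned explicitly; a sketch assuming the same path, encoding, and GetConfiguration helper as above:

using (var fileStream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None, bufferSize: 64 * 1024))
{
    using (var streamWriter = new StreamWriter(fileStream, encoding ?? Encoding.ASCII))
    {
        var csvWriter = new CsvWriter(streamWriter, GetConfiguration(delimiter, mappingClassType, mappingActions));
        csvWriter.WriteRecords(data); // only ~64KB is buffered in memory at any time
    }
}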
