File Stream to Byte Array and Array Split


Question




Hi,

1. Which is the better way to process files: FileStream or MemoryStream?
2. I think a byte array decreases processing speed. Is this true?

I need your valuable inputs to make the following code more efficient.

Thanks in advance.

// Split the array into multiple arrays and store them in a list for analysis.
private void Split()
{
    byte[] content = ReadFile("test.txt");
    List<byte[]> record = new List<byte[]>();
    int splitLength = 130;
    for (int i = 0; i < content.Length; i += splitLength)
    {
        // Allocate a fresh chunk each iteration; reusing one buffer would
        // make every list entry reference the same array. The last chunk
        // may be shorter than splitLength.
        int count = Math.Min(splitLength, content.Length - i);
        byte[] splitResult = new byte[splitLength];
        Array.Copy(content, i, splitResult, 0, count);
        record.Add(splitResult);
    }
    content = null;
}
// Returns the byte array of the given file.
// Is this the right way to do it? I need more efficient code.
// Which is better: FileStream or MemoryStream?
private byte[] ReadFile(string filePath)
{
    using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
    {
        int length = Convert.ToInt32(fs.Length);
        byte[] data = new byte[length];
        int offset = 0;
        // Read may return fewer bytes than requested, so loop until done.
        while (offset < length)
        {
            int read = fs.Read(data, offset, length - offset);
            if (read == 0) break;
            offset += read;
        }
        return data;
    }
}
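As a point of comparison for a whole-file read like the one above, the .NET framework's File.ReadAllBytes does the open/read/close in a single call and handles partial reads internally. A minimal self-contained sketch (the temp-file path and the 300-byte sample content are illustrative assumptions, not from the question):

```csharp
using System;
using System.IO;

class ReadAllBytesDemo
{
    static void Main()
    {
        // Create a small sample file so the sketch runs on its own.
        string path = Path.Combine(Path.GetTempPath(), "test.txt");
        File.WriteAllText(path, new string('A', 300));

        // One call: opens the file, reads everything, and closes it.
        byte[] content = File.ReadAllBytes(path);
        Console.WriteLine(content.Length); // 300
    }
}
```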






Regards,
Sowraaj

Recommended Answers


Sowraaj,

The size of your file is just as important as the way that you read it in. When you're dealing with smaller files, you can read the whole file in and not worry too much about the method. However, when you're dealing with much larger files, say 10 MB and up, what you're doing becomes a concern. It may be best to read them in incrementally: say, 100 KB at a time; process that bit of the file, then read in the next 100 KB, discarding the previous contents from memory.

byte[] data = new byte[1024 * 100];
int bytesRead;
while ((bytesRead = fs.Read(data, 0, data.Length)) > 0)
{
    // Process the first bytesRead bytes of data here.
}
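Wrapped into a complete, runnable form, the incremental read might look like the sketch below. ProcessChunk is a hypothetical stand-in for whatever analysis the caller does, and the 250 KB sample file exists only to make the sketch self-contained:

```csharp
using System;
using System.IO;

class ChunkedReader
{
    // Hypothetical placeholder for the caller's per-chunk analysis step.
    static void ProcessChunk(byte[] buffer, int count)
    {
        Console.WriteLine($"Processing {count} bytes");
    }

    static void Main()
    {
        // Write a sample file so the sketch runs on its own.
        string path = Path.Combine(Path.GetTempPath(), "chunk-demo.bin");
        File.WriteAllBytes(path, new byte[250 * 1024]); // 250 KB

        byte[] buffer = new byte[100 * 1024]; // 100 KB at a time
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                ProcessChunk(buffer, read);
            }
        }
    }
}
```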





Hogan


If you are trying to work out which of these will be quickest, I think you'd be surprised how much faster the second one is - it does only one memory allocation and a fill, whereas the first will do a lot of memory allocations, which can be seriously slow.


Time them: use the Stopwatch class and try it. (Have a look at this: Counting lines in a string[^] - most of it shows the timing effects of different ways of doing the same thing.)
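The Stopwatch suggestion can be sketched like this, timing many small allocations against one large allocation. The 10 MB total is an arbitrary test size chosen for illustration, not a figure from the answer:

```csharp
using System;
using System.Diagnostics;

class TimingDemo
{
    static void Main()
    {
        const int total = 10 * 1024 * 1024; // 10 MB, arbitrary test size
        const int chunk = 130;              // chunk size from the question's Split

        // Many small allocations, as a chunk-per-iteration loop does.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < total / chunk; i++)
        {
            byte[] piece = new byte[chunk];
        }
        sw.Stop();
        Console.WriteLine($"Many small allocations: {sw.ElapsedMilliseconds} ms");

        // One allocation and fill.
        sw.Restart();
        byte[] all = new byte[total];
        sw.Stop();
        Console.WriteLine($"One allocation: {sw.ElapsedMilliseconds} ms");
    }
}
```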



1. I use MemoryStream unless I plan to save the file to the disk, in which case I go with FileStream. However, if you are going from a stream to bytes, MemoryStream is very helpful.

2. I've completed an application that needed this and I didn't notice any speed reduction.

/// <summary>
///     Reads a Stream and returns its contents as a byte array.
/// </summary>
/// <param name="input"> Stream </param>
/// <returns> byte[] </returns>
public static byte[] ReadFully(Stream input)
{
    using (var ms = new MemoryStream())
    {
        input.CopyTo(ms);
        return ms.ToArray();
    }
}




This is the best solution I found for my project after scouring the web for stream-to-byte-array. Three lines.

An array is already split, so how do you want it "split"? Give me some detail.

