How to avoid OutOfMemoryException while loading large chunks of data from a database?


Problem description

Say we have a table with a large text field containing jpg files' binary data. The task is to get those files from the database onto disk. So at first I decided to do the following:

MyDataContext dc = new MyDataContext();
// A single DataContext tracks every entity it materializes,
// so each loaded row (and its blob) stays referenced in memory.
foreach (ImageTable t in dc.ImageTable.OrderBy(i => i.Id))
{
    using (StreamWriter writer = new StreamWriter(
        new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew),
        Encoding.GetEncoding(1251)))
    {
        // Data is the text field holding the jpg bytes.
        writer.Write(t.Data);
    }
}

But once the table had about 20 thousand rows, after a while I got an OutOfMemoryException.

In the end, to avoid loading all the rows into one DataContext, I did the following:

MyDataContext dc = new MyDataContext();
// The first query fetches only the ids; each row is then loaded
// and released together with its own short-lived context.
foreach (int id in dc.ImageTable.OrderBy(i => i.Id).Select(i => i.Id))
{
    using (MyDataContext _dc = new MyDataContext())
    {
        ImageTable t = _dc.ImageTable.FirstOrDefault(i => i.Id == id);
        using (StreamWriter writer = new StreamWriter(
            new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew),
            Encoding.GetEncoding(1251)))
        {
            writer.Write(t.Data);
        }
    }
}

So each row is loaded by a separate DataContext and the memory problem is gone! But surely it's not the best way to do the task.
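As an aside, a middle ground between the two snippets above would be to page through the ids in fixed-size batches, one short-lived DataContext per batch, so the per-row query overhead drops while memory stays bounded. A rough sketch; the batch size of 500 and the List-based paging are illustrative choices, not from the original post:

List<int> ids;
using (MyDataContext dc = new MyDataContext())
{
    // Fetch all ids up front; plain ints are cheap to hold in memory.
    ids = dc.ImageTable.OrderBy(i => i.Id).Select(i => i.Id).ToList();
}

const int batchSize = 500; // illustrative value
for (int offset = 0; offset < ids.Count; offset += batchSize)
{
    List<int> batch = ids.Skip(offset).Take(batchSize).ToList();
    using (MyDataContext dc = new MyDataContext())
    {
        // Contains() on a local list is translated to an SQL IN clause,
        // so one query loads the whole batch.
        foreach (ImageTable t in dc.ImageTable.Where(i => batch.Contains(i.Id)))
        {
            using (StreamWriter writer = new StreamWriter(
                new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew),
                Encoding.GetEncoding(1251)))
            {
                writer.Write(t.Data);
            }
        }
    }
}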

Can anybody give any advice?

Recommended answer

You could try turning off object tracking on the data context.
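Presumably this refers to LINQ to SQL's object tracking; the original answer is cut off at this point. ObjectTrackingEnabled is a standard property of System.Data.Linq.DataContext: when set to false before the first query runs, the context stops caching the entities it materializes, so the first snippet from the question can stream all rows through a single context. A minimal sketch along those lines:

using (MyDataContext dc = new MyDataContext())
{
    // Must be set before any query executes; afterwards the context
    // no longer keeps a reference to each materialized entity.
    dc.ObjectTrackingEnabled = false;

    foreach (ImageTable t in dc.ImageTable.OrderBy(i => i.Id))
    {
        using (StreamWriter writer = new StreamWriter(
            new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew),
            Encoding.GetEncoding(1251)))
        {
            writer.Write(t.Data);
        }
    }
}

Note that turning tracking off makes the context effectively read-only and also disables deferred loading, which is fine here since only scalar columns are read.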
