How to deal with large result sets with Linq to Entities?


Question

The entity query I display on my site is fairly complex. It uses paging, so I never pull down more than 50 records at a time for display.

But I also want to give the user the option to export the full results to Excel or some other file format.

My concern is the potentially large number of records that would all be loaded into memory at once.

Is there a way to process a LINQ result set one record at a time, the way you can with a data reader, so that only one record is ever really kept in memory at a time?

I have seen suggestions that if you enumerate over a LINQ query with a foreach loop, the records will not all be read into memory at once and will not overwhelm the server.

Does anyone have a link to something I could read on this?

Any help would be appreciated.

Thanks

Solution


Set MergeOption to MergeOption.NoTracking on the query (since this is a read-only operation). If you are using the same ObjectContext for saving other data, detach the objects from the context.
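A minimal sketch of the NoTracking setup, assuming an ObjectContext subclass named MyEntities with an Orders entity set (both hypothetical names; in EF4, an ObjectSet&lt;T&gt; derives from ObjectQuery&lt;T&gt;, which exposes the MergeOption property):

```csharp
using System.Data.Objects;

using (var context = new MyEntities())   // hypothetical ObjectContext subclass
{
    // MergeOption is set on the ObjectQuery; NoTracking skips the
    // change-tracking copy the context would otherwise keep per entity.
    ObjectQuery<Order> query = context.Orders;
    query.MergeOption = MergeOption.NoTracking;

    foreach (var order in query)
    {
        // read-only work with the current record
    }
}
```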

How to detach:

foreach (var entity in query)
{
    // do something with entity
    objectContext.Detach(entity);
}

Edit: If you are using the NoTracking option, there is no need to detach.

Edit2: I wrote to Matt Warren about this scenario, and am posting the relevant private correspondence here with his approval:

The results from SQL server may not even be all produced by the server yet. The query has started on the server and the first batch of results are transferred to the client, but no more are produced (or they are cached on the server) until the client requests to continue reading them. This is what is called ‘firehose cursor’ mode, or sometimes referred to as streaming. The server is sending them as fast as it can, and the client is reading them as fast as it can (your code), but there is a data transfer protocol underneath that requires acknowledgement from the client to continue sending more data.

Since IQueryable inherits from IEnumerable, I believe the underlying query sent to the server would be the same. However, when we do an IEnumerable.ToList(), the data reader used by the underlying connection starts populating objects, those objects all get loaded into the app domain, and the application might run out of memory, because none of these objects can be disposed until the whole list is built.

When you use foreach over an IEnumerable, the data reader reads the SQL result set one record at a time; each object is created, processed, and then disposed. The underlying connection might receive data in chunks and might not send an acknowledgement back to SQL Server until all the chunks are read. Hence you will not run into an 'out of memory' exception.
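Putting the pieces together for the export scenario from the question, a hedged sketch (again assuming the hypothetical MyEntities context and Orders entity set): stream the rows with foreach instead of materializing them with ToList(), writing each record out as it arrives so that only the current entity is held in memory.

```csharp
using System.Data.Objects;
using System.IO;

using (var context = new MyEntities())                 // hypothetical context
using (var writer = new StreamWriter("export.csv"))
{
    ObjectQuery<Order> query = context.Orders;         // hypothetical entity set
    query.MergeOption = MergeOption.NoTracking;        // read-only, no tracking

    // query.ToList() here would pull every row into memory first;
    // enumerating instead streams them through the data reader one at a time.
    foreach (var order in query)
    {
        writer.WriteLine("{0},{1}", order.Id, order.Total);
    }
}
```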

Edit3:

When your query is running, you can actually open SQL Server's "Activity Monitor" and see the query with a Task State of SUSPENDED and a Wait Type of ASYNC_NETWORK_IO, which indicates that the result is sitting in SQL Server's network buffer. You can read more about it here and here.
