Symfony2 / Doctrine make $statement->execute() not "buffer" all values


Problem description

I've got a basic codeset like this (inside a controller):

$sql = 'select * from someLargeTable limit 1000';
$em = $this->getDoctrine()->getManager();
$conn = $em->getConnection();
$statement = $conn->prepare($sql);
$statement->execute();

My difficulty is that when the resultset is only a few records, the memory usage is not that bad. I echoed some debugging information before and after running the $statement->execute(); part of the code, and found for my implementation that I have the following:

pre-execute... rowCount :: 0 memory: 49.614 MB
post-execute... rowCount :: 1000 memory: 50.917 MB

Moving up from 1,000 records to 10k, the difference in memory usage grows to about 13 MB:

pre-execute... rowCount :: 0 memory: 49.614 MB
post-execute... rowCount :: 10000 memory: 62.521 MB

Eventually, retrieving around 50k records I get close to my maximum memory allocation:

pre-execute... rowCount :: 0 memory: 49.614 MB
post-execute... rowCount :: 50000 memory: 114.096 MB

With this implementation, there is no way I could write a controller (or even a command, for that matter) that would let me retrieve a CSV of the data. Sure, 50k+ entries sounds like a lot and begs the question why, but that's not the issue.

My ultimate question is: is it possible to tell the DBAL/Connection or DBAL/Statement, when executing, to buffer the data inside SQL rather than in PHP in its entirety? For instance, if I have 10 million rows, to only send the first, say, 10k rows to PHP... let me loop through them by way of $statement->fetch(), and when the cursor gets to the end of the 10k, truncate the array and fetch the next 10k from the DB?
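The consumption pattern described above would look roughly like this (a sketch only; it assumes the $conn DBAL connection from the controller code above and the placeholder table someLargeTable, and it needs a live MySQL connection to run):

```php
<?php
// Sketch of the desired usage: iterate the result cursor row by row
// instead of materializing the whole result set in PHP.
// $conn is the Doctrine DBAL connection prepared earlier in the controller.
$statement = $conn->prepare('select * from someLargeTable');
$statement->execute();

$out = fopen('php://output', 'w');
while ($row = $statement->fetch(\PDO::FETCH_ASSOC)) {
    fputcsv($out, $row); // stream each row to the CSV output immediately
}
fclose($out);
```

With a buffered driver, however, execute() alone already pulls the full result set into PHP memory, which is exactly the behavior measured above.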

Answer

I just ran into the same problem and wanted to share a possible solution. Chances are your DBAL uses the PDO library with PDO::MYSQL_ATTR_USE_BUFFERED_QUERY set to true, which means all the results of your query are fetched from the MySQL side and buffered into memory by PDO even though you never call $statement->fetchAll(). To fix this, we just need to set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false, but DBAL does not give us a way to do so: its PDO connection class is protected, with no public method to retrieve it, and it does not let us call setAttribute on the PDO connection.

So, in such situations, I just use my own PDO connection to save memory and speed things up. You can easily instantiate one with your doctrine db parameters like this:

$dbal_conn = $this->getDoctrine()->getManager()->getConnection();
$params = $dbal_conn->getParams();
$pdo_conn = new \PDO(
  'mysql:dbname='.$dbal_conn->getDatabase().';unix_socket='.$params['unix_socket'],
  $dbal_conn->getUsername(),
  $dbal_conn->getPassword()
);
// Disable client-side buffering so rows are streamed from MySQL on fetch()
$pdo_conn->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

I am using unix sockets but IP host addresses can also be easily used.
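Putting the unbuffered connection to use for the asker's CSV export could then look along these lines (a sketch; the table name and output path are placeholders, it assumes the $pdo_conn built above, and it requires a live MySQL server):

```php
<?php
// With buffering disabled, execute() returns immediately and rows are
// streamed from MySQL as fetch() is called, keeping PHP memory flat.
$statement = $pdo_conn->prepare('select * from someLargeTable');
$statement->execute();

$fh = fopen('/tmp/export.csv', 'w'); // placeholder output path
while ($row = $statement->fetch(\PDO::FETCH_ASSOC)) {
    fputcsv($fh, $row);
}
fclose($fh);

// Note: with an unbuffered query, this connection cannot run another
// statement until all rows are fetched or the cursor is closed.
$statement->closeCursor();
```

The trade-off is that the connection is tied up until the cursor is drained, which is why unbuffered queries suit one-shot exports like this rather than general request handling.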
