Memory leaks on postgresql server after upgrade to Rails 4


Problem description


We are experiencing a strange problem with a Rails application on Heroku. Just after migrating from Rails 3.2.17 to Rails 4.0.3, our postgresql server shows an unbounded increase in memory usage, and then returns the following error on every request:

ERROR: out of memory
DETAIL: Failed on request of size xxx

Just after releasing the application with Rails 4, postgresql memory starts to increase.

As you can see in the screenshot below, it increases from 500 MB to more than 3.5 GB in 3 hours.

Simultaneously, commits per second doubled. It went from 120 commits per second:

to 280 commits per second:

It is worth noting that when we restart the application, memory goes down to a normal value of 600 MB before climbing back above 3 GB a few hours later (at which point every SQL request shows the 'out of memory' error). It is as if killing the ActiveRecord connections released memory on the postgresql server.

We may well have a memory leak somewhere. However:

  • It was working very well with Rails 3.2. Maybe this problem is a conjunction of the changes we made to adapt our code to Rails 4 and the Rails 4 code itself.
  • The increase in the number of commits per second just after the Rails 4 upgrade seems very odd.

Our stack is:

  • Heroku, x2 dynos
  • Postgresql, Ika plan on Heroku
  • Unicorn, 3 workers per instance
  • Rails 4.0.3
  • Redis cache
  • Noteworthy gems: Delayed Job (4.0.0), Active Admin (on master branch), Comfortable Mexican Sofa (1.11.2)

Nothing seems really fancy in our code.

Our postgresql config is:

  • work_mem: 100MB
  • shared_buffers: 1464MB
  • max_connections: 500
  • maintenance_work_mem: 64MB

Has anyone experienced such behaviour when switching to Rails 4? I am looking for ideas on how to reproduce it as well.

All help is very welcome.

Thanks in advance.

Solution

I don't know which is better: answering my own question or updating it... so I chose to answer. Please let me know if updating would have been better.

We finally found the problem. Since version 3.1, Rails has used prepared statements for simple requests like User.find(id). Version 4.0 added prepared statements for requests on associations (has_many, belongs_to, has_one). For example, the following code:

class User < ActiveRecord::Base
  has_many :addresses
end

user.addresses

generates the request

SELECT "addresses".* FROM "addresses" WHERE "addresses"."user_id" = $1  [["user_id", 1]]

The problem is that Rails only adds prepared statement variables for foreign keys (here user_id). If you use a custom SQL condition like

user.addresses.where("moved_at < ?", Time.now - 3.month) 

it will not add a variable to the prepared statement for moved_at. So a new prepared statement is generated every time the request is called with a different value. Rails handles prepared statements with a pool of max size 1000.
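A minimal, database-free Ruby sketch of the mechanism described above (the cache here is a hypothetical stand-in for the per-connection statement pool, not the actual Rails internals): a cache keyed by SQL text grows with every distinct inlined value, while a query with a real placeholder stays at one entry.

```ruby
require 'set'

# Hypothetical stand-in for a per-connection prepared statement cache:
# entries are keyed by the SQL text itself.
cache = Set.new

3.times do |i|
  t = Time.now - i * 60
  # Rails 4 inlines values for non-foreign-key conditions, so each
  # distinct timestamp produces a distinct SQL string -> a new entry.
  cache << "SELECT * FROM addresses WHERE moved_at < '#{t}'"
end

# The foreign key stays a $1 placeholder, so all calls share one entry.
cache << "SELECT * FROM addresses WHERE user_id = $1"

puts cache.size  # => 4 (three inlined variants plus one shared statement)
```

With real bind parameters, the moved_at queries would likewise collapse to a single cache entry.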

However, postgresql prepared statements are not shared across connections, so within one or two hours each connection has 1000 prepared statements. Some of them are very big. This leads to very high memory consumption on the postgresql server.
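The answer above identifies the cause but does not say which fix was applied. A workaround commonly used for this problem (an assumption on our part, not part of the original answer) is to disable prepared statements for the postgresql adapter in config/database.yml:

```yaml
# config/database.yml -- sketch, assuming a standard production config.
# prepared_statements: false stops the per-connection statement buildup,
# at the cost of postgresql re-planning each query.
production:
  adapter: postgresql
  prepared_statements: false
```

To watch the buildup on a live connection, `SELECT name, statement FROM pg_prepared_statements;` lists the statements cached by the current session.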
