Tensorflow leaks 1280 bytes with each session opened and closed?


Problem description

It seems that every Tensorflow session I open and close consumes 1280 bytes of GPU memory, which are not released until the python kernel terminates.

To reproduce, save the following python script as memory_test.py:

import tensorflow as tf
import sys
n_Iterations=int(sys.argv[1])
def open_and_close_session():
   with tf.Session() as sess:
      pass
for _ in range(n_Iterations):
   open_and_close_session()
with tf.Session() as sess:
   print("bytes used=",sess.run(tf.contrib.memory_stats.BytesInUse()))

Then run it from the command line with different numbers of iterations:

  • python memory_test.py 0 yields bytes used= 1280
  • python memory_test.py 1 yields bytes used= 2560
  • python memory_test.py 10 yields bytes used= 14080
  • python memory_test.py 100 yields bytes used= 129280
  • python memory_test.py 1000 yields bytes used= 1281280

The math is easy - each session opened and closed leaks 1280 bytes. I tested this script on two different ubuntu 17.10 workstations with tensorflow-gpu 1.6 and 1.7, and on different NVIDIA GPUs.
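The linear relationship in the measurements above can be checked directly. Counting the final session that runs BytesInUse itself, the reported figure is 1280 × (n + 1) bytes; the helper name below is mine, not part of the original script:

```python
# Expected BytesInUse after n looped sessions, plus the one final session
# that performs the measurement: each session leaves 1280 bytes behind.
def expected_bytes(n_iterations):
    return 1280 * (n_iterations + 1)

for n in (0, 1, 10, 100, 1000):
    print(n, expected_bytes(n))
# Matches the measured values: 1280, 2560, 14080, 129280, 1281280.
```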

Am I missing some explicit garbage collection, or is this a Tensorflow bug?

Edit: Note that, unlike the case described in this question, I add nothing to the default global graph within the loop, unless the tf.Session() objects themselves 'count'. If that is the case, how can one delete them? Neither tf.reset_default_graph() nor using with tf.Graph().as_default(), tf.Session() as sess: helps.


Solution

Turning my comment into an answer:

I can reproduce this behavior. I guess you should create an issue on the GitHub issue tracker. TF uses its own allocator mechanism, and the documentation of the session object clearly states that close()

Calling this method frees all resources associated with the session.

Which is apparently not the case here. However, even the 1281280 bytes could potentially be reused from the memory pool by a consecutive session.

So the answer is: it seems to be a bug (even in the recent '1.8.0-rc0' version of TensorFlow) -- either in close() or in the memory_stats implementation.
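Since the question notes that the memory is only released once the python kernel terminates, one pragmatic workaround is to run each session inside a short-lived child process, so the OS reclaims all of its GPU memory on exit. This is a hypothetical sketch, not part of the original answer; a placeholder value stands in for the actual tensorflow session work:

```python
import multiprocessing as mp

def run_session(result_queue):
    # Hypothetical worker: in the real workaround you would import
    # tensorflow here, open the tf.Session(), run the computation, and
    # put the result on the queue. Everything the session allocated is
    # reclaimed by the OS when this child process exits.
    result = 1280  # placeholder for a sess.run(...) output
    result_queue.put(result)

if __name__ == "__main__":
    queue = mp.Queue()
    worker = mp.Process(target=run_session, args=(queue,))
    worker.start()
    result = queue.get()  # read before join to avoid blocking on a full queue
    worker.join()
    print("result from isolated process:", result)
```

The parent process never touches the GPU, so repeated invocations do not accumulate the per-session 1280 bytes.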

