What is the right batch normalization function in Tensorflow?


Question

In tensorflow 1.4, I found two functions that do batch normalization, and they look the same:

  1. tf.layers.batch_normalization (link)
  2. tf.contrib.layers.batch_norm (link)

Which function should I use? Which one is more stable?

Answer

Just to add to the list, there are several more ways to do batch norm in tensorflow:

  • tf.nn.batch_normalization is a low-level op. The caller is responsible for handling the mean and variance tensors themselves.
  • tf.nn.fused_batch_norm is another low-level op, similar to the previous one. The difference is that it is optimized for 4D input tensors, which is the usual case in convolutional neural networks; tf.nn.batch_normalization accepts tensors of any rank greater than 1.
  • tf.layers.batch_normalization is a high-level wrapper over the previous ops. The biggest difference is that it takes care of creating and managing the running mean and variance tensors, and calls a fast fused op when possible. Usually, this should be your default choice.
  • tf.contrib.layers.batch_norm is the early implementation of batch norm, from before it graduated to the core API (i.e., tf.layers). Its use is not recommended because it may be dropped in future releases.
  • tf.nn.batch_norm_with_global_normalization is another deprecated op. Currently, it delegates the call to tf.nn.batch_normalization, but it is likely to be dropped in the future.
  • Finally, there is also the Keras layer keras.layers.BatchNormalization, which, in the case of the tensorflow backend, invokes tf.nn.batch_normalization.
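To make the low-level/high-level distinction concrete, here is a minimal NumPy sketch of the arithmetic that tf.nn.batch_normalization performs: y = gamma * (x - mean) / sqrt(var + eps) + beta. With the low-level op, the caller must supply the mean and variance (batch statistics during training, running averages at inference); the function name and toy data below are illustrative, not from the original post.

```python
import numpy as np

def batch_norm(x, gamma, beta, mean, var, eps=1e-3):
    """Normalize x with the given statistics, then scale and shift.

    This mirrors what tf.nn.batch_normalization computes; the caller
    is responsible for providing mean and var, exactly as the answer
    describes for the low-level op.
    """
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Toy batch of 4 samples with 2 features each.
x = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0],
              [4.0, 8.0]])
mean = x.mean(axis=0)   # per-feature batch mean
var = x.var(axis=0)     # per-feature batch variance
y = batch_norm(x, gamma=1.0, beta=0.0, mean=mean, var=var)

# After normalization, each feature has (approximately) zero mean
# and unit variance (eps keeps it slightly below 1).
print(np.allclose(y.mean(axis=0), 0.0, atol=1e-6))
print(np.allclose(y.var(axis=0), 1.0, atol=1e-2))
```

The high-level wrappers (tf.layers.batch_normalization and keras.layers.BatchNormalization) do this bookkeeping for you; note that in TF 1.x graph mode the running-statistics update ops are placed in the tf.GraphKeys.UPDATE_OPS collection and must be run alongside the training op.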

