Where to apply batch normalization on standard CNNs


Question

I have the following architecture:

Conv1
Relu1
Pooling1
Conv2
Relu2
Pooling3
FullyConnect1
FullyConnect2

My question is, where do I apply batch normalization? And what would be the best function to do this in TensorFlow?
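As a concrete reading of the layer list above, here is a minimal NumPy forward pass. All sizes are made up for illustration (a 28x28 input, 3x3 kernels, 2x2 pooling, single channels); none of them are given in the question:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive 'valid' 2-D convolution: x is (H, W), w is (kH, kW)."""
    kh, kw = w.shape
    h, wd = x.shape
    out = np.empty((h - kh + 1, wd - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    """Non-overlapping k x k max pooling (edges truncated)."""
    h, w = x.shape
    x = x[:h - h % k, :w - w % k]
    return x.reshape(h // k, k, w // k, k).max(axis=(1, 3))

x = rng.standard_normal((28, 28))
x = max_pool(relu(conv2d(x, rng.standard_normal((3, 3)))))  # Conv1 / Relu1 / Pooling1
x = max_pool(relu(conv2d(x, rng.standard_normal((3, 3)))))  # Conv2 / Relu2 / Pooling3 (question's naming)
x = relu(x.reshape(-1) @ rng.standard_normal((25, 10)))     # FullyConnect1
logits = x @ rng.standard_normal((10, 2))                   # FullyConnect2
print(logits.shape)
```

The shapes work out to 26x26 after Conv1, 13x13 after Pooling1, 11x11 after Conv2, 5x5 after the second pooling, then a flattened vector of 25 into the dense layers.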

Answer

The original batch-norm paper prescribes using batch norm before the ReLU activation. But there is evidence that it's probably better to use batch norm after the activation. Here's a comment on Keras GitHub by Francois Chollet:

... I can guarantee that recent code written by Christian [Szegedy] applies relu before BN. It is still occasionally a topic of debate, though.

To your second question: in TensorFlow, you can use the high-level tf.layers.batch_normalization function or the low-level tf.nn.batch_normalization.
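Both TensorFlow functions ultimately compute the same normalization: y = gamma * (x - mean) / sqrt(var + eps) + beta. Here is a NumPy sketch of that training-time computation, applied after the activation as the advice above suggests (the 8x4 batch, gamma=1, beta=0, and eps=1e-3 are illustrative choices, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, gamma, beta, eps=1e-3):
    # Normalize each feature over the batch axis, as BN does at training time.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

pre_act = rng.standard_normal((8, 4))   # a batch of 8 examples, 4 features
activated = np.maximum(pre_act, 0.0)    # ReLU first...
out = batch_norm(activated, gamma=np.ones(4), beta=np.zeros(4))  # ...then batch norm

# With gamma=1 and beta=0, each feature is standardized across the batch.
print(out.mean(axis=0))  # approximately zero per feature
```

Note that at inference time the batch statistics are replaced by running averages collected during training; tf.layers.batch_normalization handles that bookkeeping for you (via its `training` argument), whereas with tf.nn.batch_normalization you supply the mean and variance yourself.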

