What is the difference between tf.keras.layers versus tf.layers?
Problem Description
What is the difference between tf.keras.layers versus tf.layers?
E.g. both of them have Conv2d; do they provide different outputs?
Is there any benefit if you mix them (something like a tf.keras.layers.Conv2d in one hidden layer and, in the next, tf.layers.max_pooling2d)?
Recommended Answer
Since TensorFlow 1.12, tf.layers are merely wrappers around tf.keras.layers.
Some examples:
Convolutional tf.layers just inherit from the convolutional tf.keras.layers; see the source code here:
@tf_export('layers.Conv2D')
class Conv2D(keras_layers.Conv2D, base.Layer):
The same is true for all core tf.layers, e.g.:
@tf_export('layers.Dense')
class Dense(keras_layers.Dense, base.Layer):
With the integration of Keras into TensorFlow, it would make little sense to maintain several different layer implementations. tf.keras is becoming the de facto high-level API for TensorFlow, therefore tf.layers are now just wrappers around tf.keras.layers.