Pytorch equivalent features in tensorflow?
Question
I was recently reading some PyTorch code and came across the loss.backward() and optimizer.step() functions. Is there an equivalent of these in TensorFlow/Keras?
Recommended answer
The equivalent of loss.backward() in TensorFlow is tf.GradientTape(). TensorFlow provides the tf.GradientTape API for automatic differentiation, i.e. computing the gradient of a computation with respect to its input variables. TensorFlow "records" all operations executed inside the context of a tf.GradientTape onto a "tape". It then uses that tape and the gradients associated with each recorded operation to compute the gradients of the "recorded" computation using reverse-mode differentiation.
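A minimal sketch of the idea, using a simple scalar function (the variable and function here are just illustrative):

```python
import tensorflow as tf

# A trainable variable whose gradient we want.
x = tf.Variable(3.0)

# Operations executed inside the tape's context are recorded.
with tf.GradientTape() as tape:
    y = x * x  # y = x^2

# Reverse-mode differentiation over the recorded operations:
# dy/dx = 2x = 6.0 at x = 3.0
dy_dx = tape.gradient(y, x)
print(float(dy_dx))  # → 6.0
```

This plays the role of loss.backward() in PyTorch: the forward pass is recorded, and tape.gradient() runs the backward pass.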
The equivalent of optimizer.step() in TensorFlow is minimize(), which minimizes the loss by updating the variable list. Calling minimize() takes care of both computing the gradients and applying them to the variables.
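A small sketch of minimize() in action, assuming a Keras SGD optimizer and a toy scalar loss (the variable, learning rate, and loss are illustrative):

```python
import tensorflow as tf

w = tf.Variable(2.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

# When the loss is passed as a callable, minimize() computes the
# gradients itself and applies the update in one call.
loss_fn = lambda: w * w

opt.minimize(loss_fn, var_list=[w])

# One SGD step: w <- w - lr * dL/dw = 2.0 - 0.1 * 4.0 = 1.6
print(float(w))  # → 1.6
```

This single call corresponds to the loss.backward() followed by optimizer.step() pair in PyTorch.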
If you want to process the gradients before applying them, you can instead use the optimizer in three steps:
1. Compute the gradients with tf.GradientTape.
2. Process the gradients as you wish.
3. Apply the processed gradients with apply_gradients().
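The three steps above can be sketched as follows, using gradient clipping as one example of a processing step (the clipping norm and toy loss are illustrative choices, not from the original answer):

```python
import tensorflow as tf

w = tf.Variable(2.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

# Step 1: compute the gradients with tf.GradientTape.
with tf.GradientTape() as tape:
    loss = w * w

grads = tape.gradient(loss, [w])

# Step 2: process the gradients as you wish (here: clip by norm).
processed = [tf.clip_by_norm(g, 1.0) for g in grads]

# Step 3: apply the processed gradients with apply_gradients().
opt.apply_gradients(zip(processed, [w]))

# Raw gradient is 4.0, clipped to 1.0, so w <- 2.0 - 0.1 * 1.0 = 1.9
print(float(w))  # → 1.9
```

This mirrors the manual PyTorch pattern of loss.backward(), modifying param.grad, then optimizer.step().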
Hope this answers your question. Happy Learning.