Where can I have a look at TensorFlow gradient descent main loop?
Problem Description
(Sorry if this sounds a bit naive)
I want to have a look at the meat of the TensorFlow implementation of GradientDescent, and see for myself how it handles termination conditions, step-size adaptiveness, etc. I traced the code down to training_ops.apply_gradient_descent, but I can't find the implementation :(
Recommended Answer
The TensorFlow Optimizer interface (which GradientDescentOptimizer implements) defines only a single step of minimization. Termination conditions and step-size adjustment are implemented by the user. In the MNIST for Beginners tutorial, the termination condition is "stop after 1000 steps," which you can see in the for i in range(1000) loop.
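The division of labor described above can be sketched in plain Python: the "optimizer" supplies only a one-step update, and the surrounding loop (termination, step count) belongs to user code. This is a minimal illustration with a toy objective, not TensorFlow's actual code; the names grad and minimize_step are made up for the example.

```python
def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2.
    return 2.0 * (w - 3.0)

def minimize_step(w, lr=0.1):
    # One minimization step -- all the optimizer interface defines.
    return w - lr * grad(w)

w = 0.0
for i in range(1000):   # termination condition lives in user code,
    w = minimize_step(w)  # exactly like the MNIST tutorial's loop

print(round(w, 6))  # converges to the minimum at w = 3.0
```

Swapping the fixed-count loop for a while loop on the gradient norm would give a convergence-based stopping rule, again entirely on the user's side.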
apply_gradient_descent(a, b, c) is a fused op that multiplies c by b and subtracts the result from a (the standard update var = var - alpha * delta). There are some extra levels of indirection to go from the Python wrapper to the C++ implementation, detailed in the Adding a New Op HowTo, but as a shortcut you can usually find the C++ implementation by converting the name from snake_case to CamelCase and searching for that, so ApplyGradientDescent in this case. That leads to the implementation in tensorflow/core/kernels/training_ops.cc.
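What that fused kernel computes can be sketched in NumPy. This is a hypothetical stand-in for the C++ kernel, not its actual code; it assumes the standard in-place update var = var - alpha * delta.

```python
import numpy as np

def apply_gradient_descent(var, alpha, delta):
    # In-place fused update, mirroring the op's semantics:
    # var <- var - alpha * delta
    var -= alpha * delta
    return var

var = np.array([1.0, 2.0, 3.0])    # current parameter values
delta = np.array([0.5, 0.5, 0.5])  # gradient for this step
apply_gradient_descent(var, 0.1, delta)
print(var)  # [0.95 1.95 2.95]
```

Fusing the multiply and subtract into one op avoids materializing the intermediate alpha * delta tensor, which is why the kernel exists as a single operation rather than two.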