Loop in tensorflow


Problem description

I changed my question to explain my issue better:

I have a function output_image = my_func(x), where x should have shape (1, 4, 4, 1).

Please help me to fix the error in this part:

out = tf.Variable(tf.zeros([1, 4, 4, 3]))
index = tf.constant(0)
def condition(index):
    return tf.less(index, tf.subtract(tf.shape(x)[3], 1))
def body(index):
    out[:, :, :, index].assign(my_func(x[:, :, :, index]))
    return tf.add(index, 1), out
out = tf.while_loop(condition, body, [index])

ValueError: The two structures don't have the same nested structure. First structure: type=list str=[] Second structure: type=list str=[<tf.Tensor 'while_10/Add_3:0' shape=() dtype=int32>, <tf.Variable 'Variable_2:0' shape=(1, 4, 4, 3) dtype=float32_ref>] More specifically: The two structures don't have the same number of elements. First structure: type=list str=[<tf.Tensor 'while_10/Identity:0' shape=() dtype=int32>]. Second structure: type=list str=[<tf.Tensor 'while_10/Add_3:0' shape=() dtype=int32>, <tf.Variable 'Variable_2:0' shape=(1, 4, 4, 3) dtype=float32_ref>]

I tested my code and I can get result from out = my_func(x[:, :, :, i]) with different values for i and also while_loop works when I comment the line out[:, :, :, index].assign(my_func(x[:, :, :, index])). Something is wrong in that line.

Answer

I understand that there is no for-loop and so on, just while. Why?

According to Implementation of Control Flow in TensorFlow:

They should fit well with the dataflow model of TensorFlow, and should be amenable to parallel and distributed execution and automatic differentiation.

I think distributed data flow graphs and Automatic differentiation across devices could have been the constraints leading to the introduction of very few such loop primitives.

There are several diagrams in that document which distributed computing experts can understand better. A more thorough explanation is beyond me.
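As for the ValueError itself: it arises because loop_vars contains only index, while body returns two values; cond and body must accept and return the same structure of loop variables. Below is a minimal, structure-consistent sketch (TF 2.x, run eagerly). It is an illustration under assumptions, not the asker's real code: my_func is stubbed as a doubling op, the slice is written back with a one-hot mask instead of Variable.assign, and the loop visits every channel.

```python
import tensorflow as tf

x = tf.random.normal([1, 4, 4, 3])

def my_func(channel):
    # Hypothetical per-channel op; the real my_func is not shown in the question.
    return channel * 2.0

index = tf.constant(0)
out = tf.zeros([1, 4, 4, 3])

def condition(index, out):
    # Both loop variables must appear in the signature, even if one is unused.
    return tf.less(index, tf.shape(x)[3])

def body(index, out):
    updated = my_func(x[:, :, :, index])            # shape (1, 4, 4)
    mask = tf.reshape(tf.one_hot(index, 3), [1, 1, 1, 3])
    # Write the slice back by masking, avoiding Variable.assign inside the loop.
    out = out * (1.0 - mask) + updated[..., tf.newaxis] * mask
    return tf.add(index, 1), out

# loop_vars now matches what condition consumes and body returns.
index, out = tf.while_loop(condition, body, [index, out])
```

The key change from the question's snippet is that [index, out] is passed as loop_vars and both functions are written over that same pair, which is exactly the "same nested structure" the error message demands.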

