How to fix "OperatorNotAllowedInGraphError" error in TensorFlow 2.0

Question

I'm learning TensorFlow 2.0 from the official tutorials. I can understand the result of the code below.

def square_if_positive(x):
  return [i ** 2 if i > 0 else i for i in x]
square_if_positive(range(-5, 5))

# result
[-5, -4, -3, -2, -1, 0, 1, 4, 9, 16]

But if I change the input to a tensor instead of a Python range, like this:

@tf.function
def square_if_positive(x):
  return [i ** 2 if i > 0 else i for i in x]
square_if_positive(tf.range(-5, 5))

I get the following error:

OperatorNotAllowedInGraphError            Traceback (most recent call last)
<ipython-input-39-6c17f29a3443> in <module>
      2 def square_if_positive(x):
      3     return [i**2 if i > 0 else i for i in x]
----> 4 square_if_positive(tf.range(10))
      5 # measure_graph_size(square_if_positive, range(10))

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/def_function.py in __call__(self, *args, **kwds)
    437     # This is the first call of __call__, so we have to initialize.
    438     initializer_map = {}
--> 439     self._initialize(args, kwds, add_initializers_to=initializer_map)
    440     if self._created_variables:
    441       try:

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/def_function.py in _initialize(self, args, kwds, add_initializers_to)
    380     self._concrete_stateful_fn = (
    381         self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
--> 382             *args, **kwds))
    383 
    384     def invalid_creator_scope(*unused_args, **unused_kwds):

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/function.py in _get_concrete_function_internal_garbage_collected(self, *args, **kwargs)
   1793     if self.input_signature:
   1794       args, kwargs = None, None
-> 1795     graph_function, _, _ = self._maybe_define_function(args, kwargs)
   1796     return graph_function
   1797 

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/function.py in _maybe_define_function(self, args, kwargs)
   2093         graph_function = self._function_cache.primary.get(cache_key, None)
   2094         if graph_function is None:
-> 2095           graph_function = self._create_graph_function(args, kwargs)
   2096           self._function_cache.primary[cache_key] = graph_function
   2097         return graph_function, args, kwargs

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   1984             arg_names=arg_names,
   1985             override_flat_arg_shapes=override_flat_arg_shapes,
-> 1986             capture_by_value=self._capture_by_value),
   1987         self._function_attributes,
   1988         # Tell the ConcreteFunction to clean up its graph once it goes out of

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    851                                           converted_func)
    852 
--> 853       func_outputs = python_func(*func_args, **func_kwargs)
    854 
    855       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/eager/def_function.py in wrapped_fn(*args, **kwds)
    323         # __wrapped__ allows AutoGraph to swap in a converted function. We give
    324         # the function a weak reference to itself to avoid a reference cycle.
--> 325         return weak_wrapped_fn().__wrapped__(*args, **kwds)
    326     weak_wrapped_fn = weakref.ref(wrapped_fn)
    327 

~/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/framework/func_graph.py in wrapper(*args, **kwargs)
    841           except Exception as e:  # pylint:disable=broad-except
    842             if hasattr(e, "ag_error_metadata"):
--> 843               raise e.ag_error_metadata.to_exception(type(e))
    844             else:
    845               raise

OperatorNotAllowedInGraphError: in converted code:

    <ipython-input-37-6c17f29a3443>:3 square_if_positive  *
        return [i**2 if i > 0 else i for i in x]
    /Users/zhangpan/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/framework/ops.py:547 __iter__
        self._disallow_iteration()
    /Users/zhangpan/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/framework/ops.py:540 _disallow_iteration
        self._disallow_when_autograph_enabled("iterating over `tf.Tensor`")
    /Users/zhangpan/tf2_workspace/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/framework/ops.py:518 _disallow_when_autograph_enabled
        " decorating it directly with @tf.function.".format(task))

    OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed: AutoGraph did not convert this function. Try decorating it directly with @tf.function.

I can't find any documentation about this error. I don't think the real reason is that "iterating over tf.Tensor is not allowed", because I can write it like this:

@tf.function
def square_if_positive(x):
    for i in x:
        if i>0:
            tf.print(i**2)
        else:
            tf.print(i)
square_if_positive(tf.range(10))

I iterate over the tensor in the code above, and it works.

So my question is: what is the real reason for this error? Any suggestions would help me. I really can't understand this error even though I have read a lot of material.

Answer

The root cause is that AutoGraph doesn't yet support list comprehensions (primarily because it's difficult to determine the dtype of the result in all cases).

As a workaround, you can use tf.map_fn for the comprehension:

return tf.map_fn(lambda i: i ** 2 if i > 0 else i, x)
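
For context, here is a minimal end-to-end sketch of how this workaround could slot into the original function. Note this is an illustrative variant rather than the exact line above: the Python ternary is written with tf.where so the per-element branch stays a tensor operation inside the traced graph.

import tensorflow as tf

@tf.function
def square_if_positive(x):
    # map_fn applies the element-wise function to each scalar of x;
    # tf.where selects i ** 2 where i > 0 and i otherwise, without
    # evaluating a tf.Tensor as a Python bool.
    return tf.map_fn(lambda i: tf.where(i > 0, i ** 2, i), x)

square_if_positive(tf.range(-5, 5))
# expected result: [-5, -4, -3, -2, -1, 0, 1, 4, 9, 16]

For a simple element-wise case like this one, the same result can also be computed without map_fn at all, e.g. tf.where(x > 0, x ** 2, x) applied to the whole tensor.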

For more information, please take a look at this issue.
