Running `defun` in graph-mode


Problem description

I see code like this in TF:

from tensorflow.python.eager import function

...

class _PerDeviceGenerator(dataset_ops.DatasetV2):
  """A `dummy` generator dataset."""

  def __init__(self, shard_num, multi_device_iterator_resource, incarnation_id,
               source_device, element_spec):

    ...

    # TODO(b/124254153): Enable autograph once the overhead is low enough.
    @function.defun(autograph=False)  # Pure graph code.
    def _remote_init_func():
      return functional_ops.remote_call(
          target=source_device,
          args=init_func_concrete.captured_inputs,
          Tout=[dtypes.string],
          f=init_func_concrete)

    self._init_func = _remote_init_func._get_concrete_function_internal()  # pylint: disable=protected-access

    ...

    variant_tensor = gen_dataset_ops.generator_dataset(
        self._init_captured_args,
        self._next_captured_args,
        self._finalize_captured_args,
        init_func=self._init_func,
        next_func=self._next_func,
        finalize_func=self._finalize_func,
        **self._flat_structure)
    super(_PerDeviceGenerator, self).__init__(variant_tensor)

(This code snippet is from TF 1.15.0.)

I'm trying to understand the code.

More specifically, I wonder about defun here. I thought defun is for eager mode.

But here, this code seems to be used for both eager mode and graph mode. Or is that wrong, and this works only on eager mode? (But below, there is MultiDeviceIterator, which has checks like if context.executing_eagerly() and later uses _PerDeviceGenerator for both eager and graph mode. Or is that broken as well for graph mode? Why the check executing_eagerly then?)

What does defun do in graph mode?

And is that _get_concrete_function_internal some internal API?

Recommended answer

Both defun and _get_concrete_function_internal are internal APIs. You should prefer using tf.function whenever possible (or def_function.function when working on internal code). defun is an old API that largely duplicates function, and will likely be removed in the future.
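
For comparison, here is a minimal sketch of the public-API version of the same pattern, assuming TF 1.15 or later; the function name and body below are made up for illustration, and get_concrete_function is the public counterpart of _get_concrete_function_internal:

import tensorflow as tf

# Public-API analogue of the internal pattern above: tf.function replaces
# function.defun, and get_concrete_function() replaces
# _get_concrete_function_internal(). The body is a placeholder, not TF source.
@tf.function(autograph=False)
def _init_func_sketch():
  return tf.constant("initialized")

# Tracing yields a ConcreteFunction with a fixed input signature, which can
# then be passed around (e.g. wired into a dataset op) or called directly.
init_concrete = _init_func_sketch.get_concrete_function()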

That said, defun is not just for eager mode (neither is function). They both create a TF function, and in graph mode there are "function call" ops that let you call these functions. In eager mode, it just lets you call a TF graph. But in graph mode they let you reduce the size of the graph, in the same way you reduce the size of normal code by factoring common code into functions.
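
As an illustration of the graph-mode behaviour described above (a sketch assuming TF 1.15 with the v1 compat APIs, not code from the snippet in the question): calling a tf.function inside a graph emits a single function-call op instead of inlining the body, and the resulting graph runs in an ordinary session:

import tensorflow as tf

@tf.function(autograph=False)
def square_plus_one(x):
  return x * x + 1.0

g = tf.Graph()
with g.as_default():
  x = tf.compat.v1.placeholder(tf.float32, shape=[])
  # Calling the tf.function here does not inline its body; it registers the
  # traced function in the graph's function library and adds a single
  # function-call op (a PartitionedCall-style op) to `g`.
  y = square_plus_one(x)

with tf.compat.v1.Session(graph=g) as sess:
  print(sess.run(y, feed_dict={x: 3.0}))  # 10.0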

