Converting model to tflite with SELECT_TF_OPS cannot convert ops HashTableV2 + others


Question

I'm trying to convert openimages_v4/ssd/mobilenet_v2 to tflite using the following code, as suggested here:

import tensorflow as tf
MODEL_DIR = 'openimages_v4_ssd_mobilenet_v2_1'
SIGNATURE_KEYS = ['default']
SIGNATURE_TAGS = set()
saved_model = tf.saved_model.load(MODEL_DIR, tags=SIGNATURE_TAGS)
tf.saved_model.save(saved_model, 'new_model_path', signatures=saved_model.signatures)
converter = tf.lite.TFLiteConverter.from_saved_model('new_model_path', signature_keys=SIGNATURE_KEYS, tags=['serve'])
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()

But it gives this error:

<unknown>:0: error: failed while converting: 'main': Ops that need custom implementation (enabled via setting the -emit-custom-ops flag):
    tf.HashTableV2 {container = "", device = "", key_dtype = i64, shared_name = "hub_input/index_to_string_1_load_0_3", use_node_name_sharing = true, value_dtype = !tf.string}
    tf.HashTableV2 {container = "", device = "", key_dtype = i64, shared_name = "hub_input/index_to_string_load_0_2", use_node_name_sharing = true, value_dtype = !tf.string}
    tf.LookupTableFindV2 {device = ""}
    tf.LookupTableImportV2 {device = ""}

I was able to get past these errors by adding:

 converter.allow_custom_ops = True

But according to this github issue post from April 13, 2020:

Removed AddHashtableOps support in Python temporarily. However, you can still add this to an interpreter in C++.

Is that still the case? Also, for the code snippet in an earlier comment on that same issue showing how to use the tflite model, what should be imported in order to use the interpreter_wrapper?

Answer

Hashtable ops are custom ops in TFLite, so you will need converter.allow_custom_ops = True in order to convert your model.

The comment you mention is no longer valid. You can use AddHashtableOps in C++ or HashtableOpsRegisterer in Python.

# Note: InterpreterWithCustomOps lives in tensorflow.lite.python.interpreter,
# not under tf.lite directly. HashtableOpsRegisterer still has to come from
# your TF build's hashtable-ops wrapper; its import path varies by TF version.
import tensorflow as tf
from tensorflow.lite.python import interpreter as interpreter_wrapper

model_interpreter = interpreter_wrapper.InterpreterWithCustomOps(
    model_content=tflite_model, custom_op_registerers=[HashtableOpsRegisterer])
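Once constructed, the interpreter is driven like any other TFLite interpreter: allocate tensors, set the input, invoke, read the outputs. Below is a minimal, self-contained sketch of that loop; it uses a tiny stand-in Keras model and the plain tf.lite.Interpreter so it runs anywhere, but in practice you would pass your converted model_content to InterpreterWithCustomOps as above.

```python
import numpy as np
import tensorflow as tf

# Illustrative stand-in model, so the snippet is self-contained; replace
# with your real tflite_model and InterpreterWithCustomOps in practice.
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(2)])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one input tensor, run, then read one tensor per declared output
# (for the SSD model those would be boxes, scores, labels, etc.).
interpreter.set_tensor(input_details[0]['index'],
                       np.zeros((1, 3), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]['index'])  # shape (1, 2)
```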
