Hard-swish for TFLite

Problem description

I have a custom neural network written in Tensorflow.Keras and apply the hard-swish function as the activation (as used in the MobileNetV3 paper):

Implementation:

import tensorflow as tf

def swish(x):
    return x * tf.nn.relu6(x + 3) / 6  # hard-swish: x * ReLU6(x+3) / 6
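
Such a callable can be passed directly to a Keras layer as its activation. A minimal sketch of the wiring (the toy layer shape is illustrative, not taken from the original network):

import tensorflow as tf

# Illustrative toy model; `swish` is the function defined above.
inputs = tf.keras.Input(shape=(224, 224, 3))
outputs = tf.keras.layers.Conv2D(32, 3, activation=swish)(inputs)
model = tf.keras.Model(inputs, outputs)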

I am running quantization-aware training and write a protobuf file at the end. Then, I use this command to convert to tflite (and finally deploy it on the EdgeTPU):

tflite_convert --output_file test.tflite --graph_def_file=test.pb --inference_type=QUANTIZED_UINT8 --input_arrays=input_1 --output_arrays=conv2d_3/Sigmoid --mean_values=0 --std_dev_values=255 --default_ranges_min=0 --default_ranges_max=6

This works perfectly when I am not dividing by 6; however, when dividing by 6 I get this error:

Unimplemented: this graph contains an operator of type Div for which the quantized form is not yet implemented.
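
One quick way to see which ops actually ended up in the frozen graph is to list the op types before converting; a small sketch, assuming the test.pb file from the command above (the division typically shows up as a Div or RealDiv node):

import tensorflow as tf

# Load the frozen graph and print the distinct op types it contains.
graph_def = tf.compat.v1.GraphDef()
with open("test.pb", "rb") as f:
    graph_def.ParseFromString(f.read())
print(sorted({node.op for node in graph_def.node}))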

I am using TF 1.14 to train and the latest TF 1.15 nightly build to convert to TFLite; I am struggling to get TF 2.x to work because of some strange HDF5 incompatibilities, but it would be great if someone knows how to circumvent this issue... Thanks!

Answer

Since it is a division by a constant, you can just multiply by (a close approximation of) the reciprocal:

def swish(x):
    # Multiplying by ~1/6 gives the converter a Mul (quantizable)
    # instead of a Div (not yet implemented in quantized form).
    return x * tf.nn.relu6(x + 3) * 0.16666667
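
To confirm that the approximation is harmless, you can compare the two variants numerically; a small sanity check in plain NumPy (the input range is arbitrary):

import numpy as np

x = np.linspace(-10.0, 10.0, 1001).astype(np.float32)
relu6 = np.minimum(np.maximum(x + 3.0, 0.0), 6.0)  # same as tf.nn.relu6(x + 3)
exact = x * relu6 / 6.0
approx = x * relu6 * 0.16666667
# The difference is on the order of float32 rounding error,
# far below one uint8 quantization step.
print(np.max(np.abs(exact - approx)))

Because the reciprocal is baked in as a constant at graph-construction time, no Div op appears in the exported graph at all.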
