Tensorflow Lite model is giving wrong output

Problem description

I am developing a deep learning model for regression prediction. I created a tflite model, but its predictions differ from the original model's and are completely wrong. Here is my process:

I trained my model with Keras:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(100, input_dim=x.shape[1], activation='relu'))  # Hidden 1
model.add(Dense(50, activation='relu'))                         # Hidden 2
model.add(Dense(1))                                             # Output
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(x, y, verbose=0, epochs=500)

And saved my model as an h5 file:

model.save("keras_model.h5")

Then I converted the h5 file to tflite format with TocoConverter:

converter = tf.contrib.lite.TocoConverter.from_keras_model_file("keras_model.h5")
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)

When I test both files with the same input, the original Keras model gives reasonable output, but the converted model gives unreasonable output.

import numpy as np
import tensorflow as tf

# Load the TFLite model and allocate tensors.
interpreter = tf.contrib.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test model on random input data.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], input_data)

interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]['index'])
print(input_data)
print(output_data)

# Original model testing
from keras.models import load_model
model2 = load_model("keras_model.h5")
pred = model2.predict(x)
print(pred)

The output is like this:

[[10. 10. 10. 10. 10. 10.]]  # input_data
[[-1.4308803]]               # tflite output (meaningless)
[[335.0276]]                 # keras model output
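
For a like-for-like check, here is a minimal comparison sketch that feeds one identical row to both models. It assumes x is the same NumPy feature matrix used for training, and the helper name compare_models is just for illustration:

import numpy as np
import tensorflow as tf
from keras.models import load_model

def compare_models(tflite_path, keras_path, sample):
    # Run the same row through the Keras model...
    keras_model = load_model(keras_path)
    keras_out = keras_model.predict(sample)

    # ...and through the TFLite interpreter.
    interpreter = tf.contrib.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    interpreter.set_tensor(input_details[0]['index'], sample)
    interpreter.invoke()
    tflite_out = interpreter.get_tensor(output_details[0]['index'])

    print("keras :", keras_out)
    print("tflite:", tflite_out)

# One row from the training data, as float32 (assumes `x` is available).
sample = x[:1].astype(np.float32)
compare_models("converted_model.tflite", "keras_model.h5", sample)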

Why does this problem occur?

Accepted answer

Finally, I found a solution by converting the Keras model to a frozen graph with this code snippet. I copied that Python file into the TensorFlow Scripts folder, copied the Keras model file into the same folder, and created a folder called "frozen". Then I ran this command:

py cerasconvert.py keras_model.h5 frozen/ freeze_graph
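
The cerasconvert.py script itself is not shown here; as a rough idea, a minimal sketch of what such a Keras-to-frozen-graph conversion typically does (assuming TensorFlow 1.x with standalone Keras, matching the tf.contrib.lite calls above) could look like this:

import tensorflow as tf
from tensorflow.python.framework import graph_util
from keras import backend as K
from keras.models import load_model

K.set_learning_phase(0)              # inference mode, no dropout/batch-norm updates
model = load_model("keras_model.h5")
sess = K.get_session()

# Replace variables with constants and keep only the sub-graph feeding the output.
frozen_graph = graph_util.convert_variables_to_constants(
    sess, sess.graph_def, [model.output.op.name])
tf.train.write_graph(frozen_graph, "frozen", "frozen.pb", as_text=False)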

Then I converted the newly created .pb file to tflite format:

import tensorflow as tf
import numpy as np

graph_def_file = "frozen/frozen.pb"
input_arrays = ["dense_1_input_1"]
output_arrays = ["dense_3_1/BiasAdd"]

converter = tf.contrib.lite.TocoConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
open("frozen/converted.tflite", "wb").write(tflite_model)

Now my tflite model's prediction accuracy is very high.
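
Re-running the same comparison against the new file confirms this (again assuming x and the illustrative compare_models helper from above):

compare_models("frozen/converted.tflite", "keras_model.h5", x[:1].astype(np.float32))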
