Save TensorFlowJS MobileNet + KNN to TFLite


Problem description

I have trained a KNN classifier on top of MobileNet logits using TensorFlowJS.

I want to know how I can export the combined MobileNet + KNN result to a TFLite model.

import * as tf from '@tensorflow/tfjs';
import * as mobilenet from '@tensorflow-models/mobilenet';
import * as knnClassifier from '@tensorflow-models/knn-classifier';

const knn = knnClassifier.create();
const net = await mobilenet.load();

const handleTrain = (imgEl, label) => {
  const image = tf.browser.fromPixels(imgEl);
  // Get the intermediate MobileNet activation (embedding) for the image
  const activation = net.infer(image, true);
  // Add the embedding as a training example for the given label
  knn.addExample(activation, label);
}
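
For reference, a minimal sketch (assuming the same net and knn as above) of how the trained classifier would be queried with predictClass:

const handlePredict = async (imgEl) => {
  const image = tf.browser.fromPixels(imgEl);
  // Same MobileNet embedding used during training
  const activation = net.infer(image, true);
  // predictClass resolves to { classIndex, label, confidences }
  const result = await knn.predictClass(activation);
  return result.label;
}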

Recommended answer

1. Save the model

Save the model. This example saves the file to the native file system; if you need to save it somewhere else, check the documentation.

await model.save('file:///path/to/my-model');

After this step you should have a JSON file and one or more binary weight files.
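
If the model is being saved from a browser rather than Node.js, a sketch of the alternative save target mentioned above (assuming the same model object) is the downloads:// handler, which triggers a file download instead of writing to disk:

await model.save('downloads://my-model');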

2. Convert from the TFJS format to the SavedModel format

tfjs_model.json is the path to the model.json file you got from the previous step, and saved_model is the path where the SavedModel format will be written.
You can read more about using the TensorFlowJS Converter in its documentation.

tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras_saved_model tfjs_model.json saved_model

3. Convert from the SavedModel format to the TFLite format

Converting from the SavedModel format is the recommended way to obtain a TFLite model, as described in the documentation.

import tensorflow as tf

# Convert the model from the SavedModel directory produced in step 2
saved_model_dir = 'saved_model'
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()

# Save the model.
with open('model.tflite', 'wb') as f:
  f.write(tflite_model)
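
As a quick sanity check, a minimal sketch that loads the exported file back with the TFLite interpreter (assuming it was written as model.tflite above):

import tensorflow as tf

# Load the converted model and allocate its tensors
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

# Inspect the expected input and output tensor shapes
print(interpreter.get_input_details())
print(interpreter.get_output_details())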
