loadFrozenModel does not work with local files


Problem Description

I need help with async/await.

I'm currently working through https://github.com/tensorflow/tfjs-converter .

I'm stumped at this part of the code (loading my Python-converted saved JS model for use in the browser):

import * as tf from '@tensorflow/tfjs';
import {loadFrozenModel} from '@tensorflow/tfjs-converter';

/*1st model loader*/
const MODEL_URL = './model/web_model.pb';
const WEIGHTS_URL = './model/weights_manifest.json';
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);

/*2nd model execution in browser*/
const cat = document.getElementById('cat');
model.execute({input: tf.fromPixels(cat)});

I noticed it's using ES6 (import/export) and ES2017 (async/await), so I've used Babel with babel-preset-env, babel-polyfill and babel-plugin-transform-runtime. I used webpack but switched over to Parcel as my bundler (as suggested by the tensorflow.js devs). In both bundlers I keep getting an error that the await should be wrapped in an async function, so I wrapped the first part of the code in an async function, hoping to get a Promise.

async function loadMod(){

const MODEL_URL = './model/web_model.pb';
const WEIGHTS_URL = './model/weights_manifest.json';
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);

} 

loadMod();
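For reference, a Babel 6-era setup with the packages mentioned above would typically be configured with a .babelrc along these lines (a sketch; exact preset options vary by version, and babel-polyfill is imported once at the top of the entry file rather than listed here):

 {
   "presets": ["env"],
   "plugins": ["transform-runtime"]
 }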

Now both bundlers say that 'await is a reserved word'. The VS Code ESLint extension says that loadMod() returns a void Promise (so did the promise fail or get rejected?). Am I wrong to reference the JavaScript model files using a relative path? Do I have to serve the ML model from the cloud, or can it come from a relative local path?


Any suggestions would be much appreciated. Thanks!

Recommended Answer

tf.loadFrozenModel uses fetch under the hood. fetch is used to get a file served by a server and cannot be used with local files unless those files are served by a server. See this answer for more.
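As an illustration, fetch resolves a relative path like the one in the question against the page's own URL, so it only works when the page itself is served over HTTP (a sketch):

 // resolved against the page URL; when the page is opened via file://,
 // the request is blocked rather than read directly from disk
 fetch('./model/web_model.pb')
   .then(res => res.arrayBuffer())
   .then(buf => console.log('fetched', buf.byteLength, 'bytes'))
   .catch(err => console.error('fetch failed:', err));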

For loadFrozenModel to work with local files, those files need to be served by a server. One can use http-server to serve the model topology and its weights.

 # install the http-server module
 npm install http-server -g

 # cd to the directory containing the model files
 # launch the server to serve the static files of the model topology and weights
 http-server -c1 --cors .

 // load the model in a js script
 (async () => {
   ...
   const model = await tf.loadFrozenModel('http://localhost:8080/tensorflowjs_model.pb', 'http://localhost:8080/weights_manifest.json');
 })();
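Putting this together with the code from the question, the full flow might look like the sketch below once the files are served. It assumes the file names served above, that the input placeholder is named input (as in the question), and that the model has a single output tensor:

 import * as tf from '@tensorflow/tfjs';
 import {loadFrozenModel} from '@tensorflow/tfjs-converter';

 const MODEL_URL = 'http://localhost:8080/tensorflowjs_model.pb';
 const WEIGHTS_URL = 'http://localhost:8080/weights_manifest.json';

 (async () => {
   const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
   const cat = document.getElementById('cat');
   // 'input' must match the placeholder name used when the model was exported
   const output = model.execute({input: tf.fromPixels(cat)});
   output.print();
 })();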

