Running the Firebase ML Vision API calls inside a Compute/Isolate function


Problem description

I have been playing around with Firebase ML Vision (https://pub.dartlang.org/packages/firebase_ml_vision) and have so far integrated text recognition into my application. My issue is that my UI is a live camera feed, and every time I call Firebase ML Vision I see a lag/freeze of about 1-2 seconds that affects the UI. I managed to narrow this down to this line of code from the Firebase ML API:

// Awaiting the detector here is what introduces the 1-2 second freeze.
final results =
    await detector.detectInImage(visionImage) ?? <dynamic>[];
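
For context, this call is driven from the camera plugin's image stream, roughly like the sketch below (the names _controller and _buildVisionImage are placeholders, not my actual code); every frame that reaches the handler triggers a detection over the plugin's platform channel on the main isolate, which is where the freeze shows up:

_controller.startImageStream((CameraImage frame) async {
  // Building the FirebaseVisionImage from the raw frame is omitted here.
  final FirebaseVisionImage visionImage = _buildVisionImage(frame);

  // This call crosses the plugin's platform channel on the main isolate,
  // which is where the 1-2 second freeze shows up.
  final results = await detector.detectInImage(visionImage) ?? <dynamic>[];

  // ... update the text overlay with `results` ...
});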

I have since tried to move the whole call to the API, from sending the capture from the camera feed through to the FirebaseVisionDetector object, into a compute/isolate, but to no avail: I got errors along the lines of (if I remember right) the routine having to be called from the main isolate. I can successfully implement a compute function without this API, so I have a feeling it has something to do with the fact that it is an external package.
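
For reference, the attempt looked roughly like the sketch below (the names are placeholders, and the detector factory may differ between package versions); the call fails with an error about the routine having to be run from the main isolate:

import 'package:flutter/foundation.dart' show compute;
import 'package:firebase_ml_vision/firebase_ml_vision.dart';

// Placeholder top-level entry point for compute(); not code from the question.
Future<List<dynamic>> detectText(FirebaseVisionImage visionImage) async {
  final detector = FirebaseVision.instance.textDetector();
  return await detector.detectInImage(visionImage) ?? <dynamic>[];
}

// At the call site:
// final results = await compute(detectText, visionImage);
// ^ Fails: the plugin's platform channel is only set up on the main isolate,
//   so the compute() isolate cannot service the detectInImage call.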

Any help would be appreciated, as I am looking for smooth UI interaction and transitions within the app.

Answer

Using platform channels from isolates other than the main isolate is known to have issues:

https://github.com/flutter/flutter/issues/13937
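
Until that issue is resolved, a common workaround is to keep the detection on the main isolate and simply throttle it, for example by dropping camera frames while a detection is still in flight. A minimal sketch, assuming frames come from the camera package's image stream (_isDetecting, _detector and _buildVisionImage are illustrative names, not part of the plugin API):

bool _isDetecting = false;

void _onFrame(CameraImage frame) {
  // Drop frames while a detection is still running instead of queuing them up.
  if (_isDetecting) return;
  _isDetecting = true;

  final FirebaseVisionImage visionImage = _buildVisionImage(frame); // placeholder helper

  _detector.detectInImage(visionImage).then((dynamic results) {
    // ... paint the recognised text over the camera preview ...
  }).whenComplete(() {
    _isDetecting = false;
  });
}

This keeps each expensive call confined to one frame at a time, which usually reduces the visible stutter even though the work itself still runs on the main isolate.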
