Sklearn Model (Python) with NodeJS (Express): how to connect both?


Problem description

I have a web server using NodeJS - Express, and I have a Scikit-Learn (machine learning) model pickled (dumped) on the same machine.

What I need is to demonstrate the model by sending/receiving data between it and the server. I want to load the model on startup of the web server and keep "listening" for data inputs. When it receives data, it should execute a prediction and send the result back.

I am relatively new to Python. From what I've seen, I could use a "Child Process" to execute that. I also saw some modules that run Python scripts from Node.

The problem is that I want to load the model once and keep it alive for as long as the server is running. I don't want to reload the model on every request because of its size. What is the best way to do that?

The idea is to run everything on an AWS machine.

Thanks.

Recommended answer

My recommendation: write a simple Python web service (personally, I recommend Flask) and deploy your ML model behind it. Then you can easily send requests from your Node back-end to the Python web service. You won't have a problem with the initial model loading: it is done once at app startup, and then you're good to go.
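
For illustration, a minimal sketch of such a Flask service could look like the following. The file name model.pkl, the /predict route, and the JSON input shape are assumptions chosen for the example, not details from the original question:

```python
# app.py -- minimal sketch of a Flask service wrapping a pickled sklearn model.
# The model path, route name, and input format are illustrative assumptions.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the model once, at startup, so it stays in memory for the
# lifetime of the server process.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body such as {"features": [[5.1, 3.5, 1.4, 0.2]]}.
    payload = request.get_json(force=True)
    prediction = model.predict(payload["features"])
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

From the Node/Express side you would then POST the feature values to the service (for example to http://localhost:5000/predict with axios or node-fetch) and forward the JSON response to your client.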

DO NOT GO FOR SCRIPT EXECUTIONS AND CHILD PROCESSES!!! I wrote it in bold-italic all caps just to be sure you wouldn't do that. Believe me... it can potentially go very, very south, with zombie processes left over after job termination and other issues. Let's just say it's not the standard way to do that.

You also need to think about multi-request handling. I think Flask's development server handles concurrent requests by default now (it serves them in threads), but for production you would typically run the app behind a proper WSGI server.
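
As a sketch of that last point, here is one way to serve the same app behind a multi-threaded WSGI server, assuming the app module from the sketch above; waitress is used purely as an example, and gunicorn or uWSGI would work just as well:

```python
# serve.py -- sketch of running the Flask app behind a production WSGI server.
# "app" is assumed to be the Flask application object from app.py above.
from waitress import serve

from app import app

# Serve the app with a small thread pool so several predictions
# can be handled concurrently.
serve(app, host="0.0.0.0", port=5000, threads=4)
```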

I am just giving you general hints, because your question was stated in general terms.

