Tensorflow Serving - Stateful LSTM


Question

Is there a canonical way to maintain a stateful LSTM, etc. with Tensorflow Serving?

Using the Tensorflow API directly this is straightforward - but I'm not certain how best to accomplish persisting LSTM state between calls after exporting the model to Serving.

Are there any examples out there which accomplish the above? The samples within the repo are very basic.

Answer

From Martin Wicke on the TF mailing list:

"There's no good integration for stateful models in the model server yet. As you noted, it basically assumes models are pure functions. We're working on this, and you should see this functionality appear eventually, but it's too far out to promise a time. So in the meantime, you can write a simple wrapper which keeps state on the server (and assigns some sort of session ID which is passed around in requests), or you can write your own server which maintains the TensorFlow session state (and similarly returns some session ID). The latter is more performant. Both will require some sort of garbage collection/session timeout logic."
