Streaming API vs REST API?

Question

The canonical example here is Twitter's API. I understand conceptually how the REST API works: essentially it's just a query to their server for your particular request, to which you then receive a response (JSON, XML, etc.). Great.

However, I'm not exactly sure how a streaming API works behind the scenes. I understand how to consume it: for example, with Twitter you listen for a response, then listen for data events on that response, in which the tweets arrive in chunks. You build up the chunks in a string buffer and wait for a line feed, which signifies the end of a Tweet. But what are they doing to make this work?
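For concreteness, here is a minimal sketch of that buffering pattern using Node's built-in http module. The host, path, and newline-delimited JSON format are assumptions for illustration, not Twitter's actual API:

```javascript
// Minimal consumer sketch (hypothetical endpoint, not Twitter's real API):
// connect to a streaming URL, buffer incoming chunks, and emit one complete
// message every time a line feed arrives.
const http = require('http');

const req = http.request({ host: 'example.com', path: '/stream', method: 'GET' }, (res) => {
  let buffer = '';

  res.on('data', (chunk) => {
    buffer += chunk.toString('utf8');

    // A '\n' marks the end of one message; any trailing partial message
    // stays in the buffer until the next chunk arrives.
    let newlineIndex;
    while ((newlineIndex = buffer.indexOf('\n')) !== -1) {
      const message = buffer.slice(0, newlineIndex).trim();
      buffer = buffer.slice(newlineIndex + 1);
      if (message.length > 0) {
        console.log('received:', message);
      }
    }
  });

  res.on('end', () => console.log('stream closed by server'));
});

req.on('error', (err) => console.error('request failed:', err));
req.end();
```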

Let's say I had a bunch of data and I wanted to set up a streaming API locally for other people on the net to consume (just like Twitter). How is this done, and with what technologies? Is this something Node.js could handle? I'm just trying to wrap my head around what they are doing to make this thing work.

Answer

Twitter's streaming API is essentially a long-running request that's left open; data is pushed into it as and when it becomes available.
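A minimal Node.js sketch of that idea might look like the following. The /stream path, the JSON payload, the port, and the one-second timer are all stand-ins for illustration; the point is only that the response is never ended and each write reaches the client as a chunk:

```javascript
// Sketch of a long-running streaming response in Node.js (illustrative only).
// The response is never ended; each res.write() is flushed to the client as
// an HTTP chunk, so the client sees data as soon as it is "available".
const http = require('http');

const server = http.createServer((req, res) => {
  if (req.url !== '/stream') {
    res.writeHead(404);
    res.end();
    return;
  }

  res.writeHead(200, { 'Content-Type': 'application/json' });

  // Simulate "data becoming available" by pushing a message every second.
  const timer = setInterval(() => {
    res.write(JSON.stringify({ time: Date.now() }) + '\n');
  }, 1000);

  // Stop pushing once the client disconnects.
  req.on('close', () => clearInterval(timer));
});

server.listen(8080, () => console.log('streaming on http://localhost:8080/stream'));
```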

The repercussion of that is that the server has to be able to deal with lots of concurrent open HTTP connections (one per client). A lot of existing servers don't manage that well: for example, Java servlet engines assign one thread per request, which can (a) get quite expensive and (b) quickly hit the normal max-threads setting and prevent subsequent connections.

As you guessed, the Node.js model fits the idea of a streaming connection much better than, say, a servlet model does. Both requests and responses are exposed as streams in Node.js, but they don't occupy an entire thread or process, which means you can continue pushing data into a stream for as long as it remains open without tying up excessive resources (although this is subjective). In theory you could have a lot of concurrent open responses connected to a single process and only write to each one when necessary.
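As a rough sketch of that "many open responses, one process" pattern (the names clients and broadcast, the timer, and the payload are all illustrative, not a real broadcast framework):

```javascript
// Sketch: remember every open response in one process and write to all of
// them only when there is something to say.
const http = require('http');

const clients = [];

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  clients.push(res);

  // Forget the response once the client goes away.
  req.on('close', () => {
    const index = clients.indexOf(res);
    if (index !== -1) clients.splice(index, 1);
  });
});

// Push a newline-delimited message to every currently open response.
function broadcast(message) {
  const line = JSON.stringify(message) + '\n';
  for (const res of clients) {
    res.write(line);
  }
}

// Stand-in for a real data source (e.g. new tweets arriving).
setInterval(() => broadcast({ tick: Date.now(), listeners: clients.length }), 2000);

server.listen(8080, () => console.log('broadcasting on port 8080'));
```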

If you haven't looked at them already, the HTTP docs for Node.js might be useful.

I'd also take a look at technoweenie's Twitter client to see what the consumer end of that API looks like with Node.js, the stream() function in particular.
