Streaming API vs REST API?


Problem description

The canonical example here is Twitter's API. I understand conceptually how the REST API works: essentially it's just a query to their server for your particular request, to which you then receive a response (JSON, XML, etc.). Great.

However, I'm not exactly sure how a streaming API works behind the scenes. I understand how to consume it: with Twitter, for example, you listen for a response, then listen for data events on that response, in which the tweets come in chunks. You build up the chunks in a string buffer and wait for a line feed, which signifies the end of a tweet. But what are they doing to make this work?
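The buffering technique described above can be sketched as a small standalone function (hypothetical names; not Twitter's actual client code). Chunks arriving on the socket may split a message anywhere, so you accumulate them and emit a message only when the delimiting line feed shows up.

```javascript
// Returns a feed() function that buffers chunks and calls onLine()
// once per complete newline-terminated message.
function makeLineBuffer(onLine) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    let idx;
    while ((idx = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 1);
      if (line.trim().length > 0) onLine(line); // ignore keep-alive blank lines
    }
  };
}

// Usage: wire feed() to a response stream's 'data' events.
const tweets = [];
const feed = makeLineBuffer((line) => tweets.push(JSON.parse(line)));
feed('{"text":"hel');            // partial chunk: nothing emitted yet
feed('lo"}\n{"text":"world"}\n'); // completes two messages
console.log(tweets.length); // 2
```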

Let's say I had a bunch of data and I wanted to set up a streaming API locally for other people on the net to consume (just like Twitter). How is this done, and with what technologies? Is this something Node.js could handle? I'm just trying to wrap my head around what they're doing to make this thing work.

Recommended answer

Twitter's streaming API is essentially a long-running request that's left open; data is pushed into it as and when it becomes available.

The repercussion of that is that the server has to be able to deal with lots of concurrent open HTTP connections (one per client). A lot of existing servers don't manage that well; for example, Java servlet engines assign one thread per request, which can (a) get quite expensive and (b) quickly hit the normal max-threads setting and prevent subsequent connections.

As you guessed, the Node.js model fits the idea of a streaming connection much better than, say, the servlet model does. Requests and responses are exposed as streams in Node.js, but they don't occupy an entire thread or process, which means you can keep pushing data into a stream for as long as it remains open without tying up excessive resources (although this is subjective). In theory you could have a lot of concurrent open responses connected to a single process and write to each one only when necessary.

If you haven't looked at them already, the HTTP docs for Node.js might be useful.

I'd also take a look at technoweenie's Twitter client to see what the consumer end of that API looks like with Node.js, the stream() function in particular.

