Best practice for rate limiting users of a REST API?


Problem description

I am putting together a REST API and, as I'm unsure how it will scale or what the demand for it will be, I'd like to be able to rate limit its use as well as temporarily refuse requests when the box is over capacity or in some kind of slashdotted scenario.

I'd also like to be able to gracefully bring the service down temporarily (while giving clients results that indicate the main service is offline for a bit) when/if I need to scale the service by adding more capacity.

Are there any best practices for this kind of thing? The implementation is Rails with MySQL.

Recommended answer

This is all done with an outer web server that listens to the world (I recommend nginx or lighttpd).

Regarding rate limits, nginx can enforce them, e.g. 50 requests per minute per IP; everything over that gets a 503 page, which you can customize.
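As a sketch, nginx's stock `limit_req` module can express that per-IP policy (the zone name, burst size, and `app_servers` upstream here are illustrative, not from the original answer):

```nginx
http {
    # Track clients by IP address; 10 MB of shared state,
    # each client allowed 50 requests per minute.
    limit_req_zone $binary_remote_addr zone=api_limit:10m rate=50r/m;

    server {
        listen 80;

        location /api/ {
            # Permit short bursts of up to 10 extra requests;
            # anything beyond that is rejected with a 503.
            limit_req zone=api_limit burst=10 nodelay;
            limit_req_status 503;
            proxy_pass http://app_servers;
        }
    }
}
```

The 503 response body itself can be customized with an `error_page 503 ...` directive.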

Regarding expected temporary downtime, in the Rails world this is done via a special maintenance.html page. There is some kind of automation that creates or symlinks that file when the Rails app servers go down. I'd recommend relying not on the file's presence, but on the actual availability of the app server.
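A common sketch of that file-based pattern in nginx (the `system/maintenance.html` path follows the Capistrano convention; the paths and upstream name are assumptions):

```nginx
server {
    listen 80;
    root /var/www/app/public;

    # If deploy tooling has symlinked a maintenance page into place,
    # serve it for every request with a 503 so clients know the
    # outage is temporary and expected.
    if (-f $document_root/system/maintenance.html) {
        return 503;
    }

    error_page 503 @maintenance;
    location @maintenance {
        rewrite ^ /system/maintenance.html break;
    }

    location / {
        proxy_pass http://app_servers;
    }
}
```

Checking the app server's actual availability instead would mean letting `proxy_pass` fail over via `error_page` rather than testing for the file.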

But really, you are able to start/stop services without losing any connections at all. That is, you can run a separate instance of the app server on a different UNIX socket/IP port and have the balancer (nginx/lighty/haproxy) use that new instance too. Then you shut down the old instance and all clients are served by only the new one; no connections are lost. Of course, this scenario is not always possible: it depends on the type of change you introduced in the new version.
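That cutover can be sketched as an nginx upstream block (socket paths are illustrative): start the new instance on its own socket, reload nginx, then mark the old one `down` and reload again.

```nginx
upstream app_servers {
    # Old instance: mark it "down" once the new one is taking traffic;
    # `nginx -s reload` applies the change without dropping
    # in-flight connections.
    server unix:/var/run/app_old.sock down;

    # New instance, started on its own socket before the cutover.
    server unix:/var/run/app_new.sock;
}
```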

haproxy is a balancer-only solution. It can balance requests to the app servers in your farm extremely efficiently.
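A minimal haproxy sketch of such a farm (frontend/backend names, addresses, and ports are all illustrative):

```
frontend api_in
    bind *:80
    default_backend app_farm

backend app_farm
    balance roundrobin
    # Health checks automatically drop dead app servers from rotation.
    server app1 127.0.0.1:3001 check
    server app2 127.0.0.1:3002 check
```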

For quite a big service, you end up with something like:


  • api.domain resolves round-robin to N balancers

  • each balancer proxies requests to M web servers for static content and to P app servers for dynamic content. Oh well, your REST API doesn't have static files, does it?

For quite a small service (under 2K rps), all the balancing is done inside one or two web servers.

