Multiple nodejs workers in docker
I'm very new to docker and productionizing nodejs web apps. However, after some reading I've determined that a good setup would be:
- nginx container serving static files, ssl, proxying nodejs requests
- nodejs container
- postgresql container
However, I'm now trying to tackle scalability. Seeing as you can define multiple upstream servers behind a proxy_pass in an nginx config, could you not spin up a duplicate nodejs container (exactly the same, but exposing a different port) and effectively "load balance" your web app? Is this a good architecture?
Also, how would this affect database writes? Are there race conditions I need to specifically architect for? Any guidance would be appreciated.
Yes, it's possible to use Nginx to load balance requests across different instances of your Node.js services. Each Node.js instance could be running in a different Docker container. Increasing the scalability of your configuration is as easy as starting another Docker container and ensuring it's registered in the Nginx config. (Depending on how often you have to update the Nginx config, a variety of tools/frameworks are available to do this last step automatically.)
For example, below is an Nginx configuration to load balance incoming requests across different Node.js services. In our case, we have multiple Node.js services running on the same machine, but it's perfectly possible to use Docker containers instead.
File /etc/nginx/sites-enabled/apps:
upstream apps-cluster {
    least_conn;
    server localhost:8081;
    server localhost:8082;
    server localhost:8083;
    keepalive 512;
}

server {
    listen 8080;

    location / {
        proxy_next_upstream error timeout http_500 http_502 http_503 http_504;
        proxy_set_header Connection "";
        proxy_http_version 1.1;
        proxy_pass http://apps-cluster;
    }

    access_log off;
}
Despite running multiple instances of your Node.js services, your database should not be negatively affected. PostgreSQL handles many concurrent open connections without trouble, and its locking and transaction machinery serializes conflicting writes for you. From a developer's point of view, the code for running 1 Node.js service is the same as for running x Node.js services: the same read-modify-write sequences that need a transaction (or a single atomic statement) with many instances already needed one with a single instance.