How to start asyncio server on remote server with Python?


Problem description

I have a virtual Linux server available with 8 cores, 32 GB of RAM, and 1 TB of storage. It is meant to be the development environment (test and production are the same); this is what I could get from IT. The server can only be reached through so-called jump servers, via PuTTY or direct TCP/IP ports (SSH is a must).

The application I am working on starts several processes via multiprocessing. Every process starts its own asyncio event loop, and in some cases an asyncio socket server as well. Basically it is a low-level data streaming and processing application (unfortunately no Kafka or similar technology is available yet). The live application runs forever, with no or only limited user interaction (it reads, processes, and writes data).
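A minimal sketch of this per-process setup, assuming one asyncio socket server per worker process (the port number and the line-echo handler are illustrative placeholders, not the real application logic):

```python
import asyncio
import multiprocessing

PORT = 8765  # illustrative port; pick any free one

async def handle(reader, writer):
    # Placeholder protocol: read one line, echo it back upper-cased.
    data = await reader.readline()
    writer.write(data.upper())
    await writer.drain()
    writer.close()
    await writer.wait_closed()

def serve(port, started):
    # Each worker process runs its own event loop and asyncio socket server.
    async def main():
        server = await asyncio.start_server(handle, "127.0.0.1", port)
        started.set()  # tell the parent we are listening
        async with server:
            await server.serve_forever()  # a live worker runs forever
    asyncio.run(main())

def query(port, line):
    # Small client used here only to exercise the server.
    async def ask():
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        writer.write(line)
        await writer.drain()
        reply = await reader.readline()
        writer.close()
        await writer.wait_closed()
        return reply
    return asyncio.run(ask())

if __name__ == "__main__":
    started = multiprocessing.Event()
    proc = multiprocessing.Process(target=serve, args=(PORT, started), daemon=True)
    proc.start()
    started.wait()
    print(query(PORT, b"stream me\n"))  # -> b'STREAM ME\n'
    proc.terminate()
```

Starting one such process per core (with distinct ports) reproduces the multi-process, one-event-loop-per-process layout described above.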

I assume IPython is an option for this, but (and maybe I am wrong) I think it starts a new kernel per client request, whereas I need to start new processes from the main code without user interaction. If so, it could be an option for monitoring the application, gathering data from it, and sending new user commands to the main module, but I am not sure how to run the processes and asyncio servers remotely.

I would like to understand how this can be done in the given environment. I do not know where to start or what the alternatives are. And I do not understand IPython properly; its documentation is not obvious to me yet.

Please help me out! Thank you in advance!

Answer

After lots of research and learning I arrived at a possible solution in our "sandbox" environment. First, I had to split the problem into several sub-problems:

  • "Remote" development
  • Parallelization
  • Scheduling and executing the parallel code
  • Data sharing between these "engines"
  • Controlling these "engines"

Let's look at each in detail:

  • Remote development means you write your code on your laptop, but the code must be executed on the remote server. The easy answer is Jupyter Notebook (or an equivalent solution); it has several trade-offs and other solutions exist, but it was the fastest to deploy and use and had the fewest dependencies, maintenance requirements, etc.
  • Parallelization: I had several challenges with the IPython kernel when working with multiprocessing, so every piece of code that must run in parallel is written in a separate Jupyter Notebook. Within a single notebook I can still use an event loop to get asynchronous behaviour.
  • Executing the parallel code: there are several options I will use:
    • ipyparallel - a "workaround" for multiprocessing
    • papermill - execute notebooks with parameters from the command line (optional)
    • the %%writefile magic command in Jupyter Notebook - creates importable modules
    • an OS task scheduler such as cron
    • async with event loops
    • No option yet: Docker, multiprocessing, multithreading, cloud (AWS, Azure, Google...)
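As an illustration of the "%%writefile → importable module" idea: in a notebook, a cell beginning with %%writefile engine.py is saved to disk and can then be imported from other notebooks. The sketch below simulates that outside Jupyter; the module name engine and the value-doubling step are made-up placeholders:

```python
import os
import sys

# In Jupyter the %%writefile magic would save the cell body to engine.py;
# here we write the file by hand so the sketch runs outside a notebook.
source = '''
import asyncio

async def process(values):
    # Placeholder processing step: double each value.
    await asyncio.sleep(0)
    return [v * 2 for v in values]

def run(values):
    # Entry point other notebooks can call after "import engine".
    return asyncio.run(process(values))
'''

with open("engine.py", "w") as f:
    f.write(source)

sys.path.insert(0, os.getcwd())  # make the freshly written module importable
import engine

print(engine.run([1, 2, 3]))  # -> [2, 4, 6]
```

This is how the per-notebook "engines" can share common code: each parallel notebook imports the same generated module instead of copy-pasting it.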

