Running Heroku background tasks with only 1 web dyno and 0 worker dynos


Problem Description


I have a Python Flask app on Heroku that serves web pages but also allows certain tasks to be launched which I believe would be best structured as background tasks. As such, I've followed the Heroku rq tutorial to set up background tasks. My Procfile looks like this:

    web: python app.py
    worker: python worker.py

However, my processes are currently scaled web=1 worker=0. Given that this background process won't be run very often, it doesn't seem sensible to me to provision an entire dyno for it and then pay the $34/month for something that small.
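For reference, that scaling is set with the Heroku CLI. A quick sketch of the commands involved (run from the app's repository, assuming the default Heroku remote):

    $ heroku ps:scale web=1 worker=0   # one web dyno, no worker dynos
    $ heroku ps                        # confirm which dynos are running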

Question:

• If I leave the worker process declared in my Procfile but keep the scaling at web=1 worker=0, will my queued processes eventually be run on my available web dyno? Or will the queued processes never run?
• If the queued processes will never run, is there another way to do this short of, for example, using twisted in my web app to run the tasks asynchronously?

Additional Information

worker.py looks like this:

    import os
    import redis
    from rq import Worker, Queue, Connection

    # Queues this worker listens on, in priority order.
    listen = ['high', 'default', 'low']

    # Fall back to a local Redis instance when REDISTOGO_URL is not set.
    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')

    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            worker = Worker(map(Queue, listen))
            worker.work()

The logic in the main app that enqueues a process looks like this:

    from rq import Queue
    from worker import conn
    q = Queue(connection=conn)
    
    q.enqueue(myfunction, myargument)
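Here `myfunction` and `myargument` stand in for whatever task is actually being offloaded. As a minimal, hypothetical sketch (the module name `tasks.py` and the function below are illustrative, not from the original question), the enqueued function just needs to live in a module that both the web process and the worker can import:

    # tasks.py -- hypothetical example of a job suitable for enqueueing
    import requests

    def count_words_at_url(url):
        # Download the page and return its word count.
        resp = requests.get(url)
        return len(resp.text.split())

The app would then call q.enqueue(count_words_at_url, 'http://example.com'), and the worker picks the job up from Redis.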
    

Solution

    $ cat Procfile
    web: bin/web
    
    $ cat bin/web
    python app.py &
    python worker.py
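In other words, the worker is folded into the single web dyno: bin/web starts the Flask app in the background and then runs the rq worker in the foreground of that same dyno, so no separate worker dyno is needed. One practical detail (my addition, not part of the original answer): the wrapper script has to be executable for Heroku to run it, e.g.

    $ chmod a+x bin/web
    $ git add bin/web
    $ git commit -m "Make bin/web executable"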
    
