Running Heroku background tasks with only 1 web dyno and 0 worker dynos
Problem Description
I have a Python Flask app on Heroku that serves web pages but also allows certain tasks to be launched, which I believe would be best structured as background tasks. As such, I've followed the Heroku rq tutorial to set up background tasks. My Procfile looks like this:

web: python app.py
worker: python worker.py

However, my processes are currently scaled at web=1 worker=0. Given that this background process won't be run very often, it doesn't seem sensible to me to provision an entire dyno for it and then pay $34/month for something that small.
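For reference, that scaling would be set with the standard Heroku CLI (the exact command the poster used is not shown; this is the usual one):

```shell
# One web dyno, no worker dynos
heroku ps:scale web=1 worker=0
```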
Question: If I leave the worker process declared in my Procfile but keep the scaling at web=1 worker=0, will my queued processes eventually be run on my available web dyno? Or will the queued processes never run? And if they will never run, should I instead use something like twisted in my web app to run the tasks asynchronously?
Additional Information

worker.py looks like this:

import os
import redis
from rq import Worker, Queue, Connection
listen = ['high', 'default', 'low']
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)
if __name__ == '__main__':
with Connection(conn):
worker = Worker(map(Queue, listen))
worker.work()
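Worker(map(Queue, listen)) builds one Queue per name in listen, and the order of that list is the worker's priority order: on each poll it takes the first job found in the highest-priority non-empty queue. A rough pure-Python sketch of that behaviour (a simplified stand-in, not rq's actual implementation):

```python
from collections import deque

# Simplified stand-in for how an rq worker treats its queue list: scan the
# queues in the order given and take the first job found, so 'high' jobs
# always run before 'default' or 'low' ones.
listen = ['high', 'default', 'low']
queues = {name: deque() for name in listen}

def enqueue(queue_name, job):
    queues[queue_name].append(job)

def next_job():
    for name in listen:          # priority order: high, default, low
        if queues[name]:
            return queues[name].popleft()
    return None                  # nothing queued; a real worker would keep polling

enqueue('low', 'cleanup')
enqueue('high', 'send_report')
assert next_job() == 'send_report'  # picked first despite being enqueued later
assert next_job() == 'cleanup'
```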
The logic in the main app that enqueues a process looks like this:

from rq import Queue
from worker import conn
q = Queue(connection=conn)
q.enqueue(myfunction, myargument)
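The original post doesn't show myfunction. Note that rq workers import the job by its dotted path, so it must be a plain module-level function that both the web and worker processes can import. As a purely hypothetical illustration (the module and function names are mine):

```python
# utils.py -- hypothetical module holding a background job. The worker
# imports it by dotted path ('utils.count_words'), so it must live at
# module level in code shared by the web and worker processes; defining
# it inside app.py's __main__ block would not work.

def count_words(text):
    # Stand-in for slow work (API calls, report generation, etc.).
    return len(text.split())
```

It would then be enqueued from the Flask app as q.enqueue(count_words, "some long text").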
Answer

The Procfile can point the web process at a small script that starts both programs on the single web dyno, running the app in the background and the worker in the foreground:

$ cat Procfile
web: bin/web

$ cat bin/web
python app.py &
python worker.py
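One caveat with the bin/web trick: the dyno's lifetime is tied only to the foreground worker.py, so if the backgrounded web process dies, nothing restarts it. A hypothetical Python supervisor (run_both.py, my name, not part of the original post) sketches one way to tie both lifetimes together:

```python
# run_both.py -- hypothetical replacement for bin/web. It supervises both
# commands from one parent process, so when the first command (the web app)
# exits, the whole dyno exits and Heroku restarts it -- something a plain
# `python app.py &` cannot guarantee.
import subprocess

def run_both(cmds):
    procs = [subprocess.Popen(cmd) for cmd in cmds]
    try:
        # Block on the first command; once it exits, fall through
        # and stop any remaining children.
        rc = procs[0].wait()
    finally:
        for p in procs:
            p.terminate()
    return rc
```

The Procfile would then read web: python run_both.py, with something like sys.exit(run_both([["python", "app.py"], ["python", "worker.py"]])) under an if __name__ == '__main__': guard.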