Flask-Socketio not emitting from external RQ process


Problem description

I'm running a Flask server that connects to an iOS client with Flask-SocketIO. The server has to process some complicated data, and since that takes a while, I do it in a background job using Redis Queue (RQ).

Communication normally works fine, but I need to emit to the client and write to the database once the job finishes, and I am trying to do that from the job function (if there were a way to let the app know when the job is finished, the app could handle all communication in one place).

To do this, I start a new SocketIO instance in the job and connect it to the Redis queue, but I think I am doing it the wrong way.

It doesn't crash, but the client isn't receiving anything.

Here is my code:

tasks.py

# This is the job
import os

from flask_socketio import SocketIO

def engine(path, id):
    result = process(path)  # process() is the slow computation, defined elsewhere
    print(result)
    # Write-only SocketIO instance: no server, just the Redis message queue
    socket = SocketIO(message_queue=os.environ.get('REDIS_URL'))
    socket.emit('info', result)

events.py

from flask import current_app, request

from app import socketio  # the SocketIO() instance created in __init__.py
# queue_dir is defined elsewhere in the app

def launch_task(name, description, *args, **kwargs):
    rq_job = current_app.task_queue.enqueue('app.tasks.' + name,
                                            *args, **kwargs)
    return rq_job.get_id()

@socketio.on('File')
def got_file(file):
    print("GOT FILE")
    print(file[0])
    name = file[0] + ".csv"
    path = queue_dir + name
    data = file[1]
    with open(path, "w") as f:
        f.write(data)
    print(path)
    launch_task("engine", "test", path, request.sid)

__init__.py

import rq
from flask import Flask
from flask_socketio import SocketIO
from redis import Redis

from config import Config  # the app's config module

socketio = SocketIO()

def create_app(debug=False, config_class=Config):
    app = Flask(__name__)
    app.debug = debug
    app.config.from_object(config_class)

    app.redis = Redis.from_url(app.config['REDIS_URL'])
    app.task_queue = rq.Queue('alg-tasks', connection=app.redis)

    from .main import main as main_blueprint
    app.register_blueprint(main_blueprint)

    socketio.init_app(app)
    return app

events.py handles all communication and launches the worker.

I think my arguments are wrong when instantiating SocketIO, but I don't know... there are still a lot of things I don't understand about SocketIO and background jobs.

Thanks in advance!

Recommended answer

On the app, you have to initialize your SocketIO object with the app and the message queue:

socketio.init_app(app, message_queue=os.environ.get('REDIS_URL'))

On your RQ worker you are doing it right; only the message queue is used:

socket = SocketIO(message_queue=os.environ.get('REDIS_URL'))

But creating a new SocketIO instance each time you emit is a waste of resources; you should create a global instance that can be reused across the multiple tasks handled by the worker.
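A minimal sketch of that pattern for the worker side (`get_socketio` and its `factory` parameter are illustrative names for this sketch, not part of the Flask-SocketIO API):

```python
import os

# Module-level cache: the worker builds the SocketIO client once, and
# every task it runs afterwards reuses the same instance.
_socketio = None

def get_socketio(factory=None):
    """Return a shared, write-only SocketIO client, creating it on first use.

    `factory` is a hypothetical hook that lets tests substitute a stub;
    by default the client connects only to the Redis message queue,
    exactly as in the original task.
    """
    global _socketio
    if _socketio is None:
        if factory is None:
            from flask_socketio import SocketIO  # deferred: paid once per worker
            factory = lambda: SocketIO(message_queue=os.environ.get('REDIS_URL'))
        _socketio = factory()
    return _socketio
```

`engine()` would then call `get_socketio().emit('info', result)` instead of constructing a fresh `SocketIO` on every run.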
