Celery & RabbitMQ running as docker containers: Received unregistered task of type '...'



I am relatively new to docker, celery and rabbitMQ.

In our project we currently have the following setup: 1 physical host with multiple docker containers running:

1x rabbitmq:3-management container

# pull image from docker hub and install
docker pull rabbitmq:3-management
# run docker image
docker run -d -e RABBITMQ_NODENAME=my-rabbit --name some-rabbit -p 8080:15672 -p 5672:5672 rabbitmq:3-management

1x celery container

# pull docker image from docker hub
docker pull celery
# run celery container
docker run --link some-rabbit:rabbit --name some-celery -d celery

(there are some more containers, but they should not have to do anything with the problem)

Task File

To get to know celery and rabbitmq a bit, I created a tasks.py file on the physical host:

from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://guest:guest@172.17.0.81/')

@app.task(name='tasks.add')
def add(x, y):
    return x + y
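A side note on the explicit name='tasks.add': by default Celery derives a task's name from its module path and function name, so the sender and the worker have to agree on how the module is imported; pinning the name explicitly avoids that mismatch. A rough stand-in for the default-naming idea (not Celery's actual code):

```python
def default_task_name(fn):
    # Roughly: default task names are "<module path>.<function name>".
    return '{0}.{1}'.format(fn.__module__, fn.__qualname__)

def add(x, y):
    return x + y

print(default_task_name(add))  # e.g. "tasks.add" when defined in tasks.py
```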

The whole setup seems to be working quite fine actually. So when I open a python shell in the directory where tasks.py is located and run

>>> from tasks import add
>>> add.delay(4,4)

The task gets queued and is directly pulled by the celery worker.

However, according to the logs, the celery worker does not know the tasks module:

$ docker logs some-celery


[2015-04-08 11:25:24,669: ERROR/MainProcess] Received unregistered task of type 'tasks.add'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.

The full contents of the message body was:
{'callbacks': None, 'timelimit': (None, None), 'retries': 0, 'id': '2b5dc209-3c41-4a8d-8efe-ed450d537e56', 'args': (4, 4), 'eta': None, 'utc': True, 'taskset': None, 'task': 'tasks.add', 'errbacks': None, 'kwargs': {}, 'chord': None, 'expires': None} (256b)
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/site-packages/celery/worker/consumer.py", line 455, in on_task_received
    strategies[name](message, body,
KeyError: 'tasks.add'

So the problem obviously seems to be that the celery workers in the celery container do not know the tasks module. As I am not a docker specialist, I wanted to ask: how would I best get the tasks module into the celery container?
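The KeyError at the bottom of that traceback is the mechanism behind the error message: the worker keeps a registry mapping task names to handlers, and that registry is only populated as a side effect of importing the modules that define the tasks. A toy stand-in (not Celery's actual code) to illustrate:

```python
# Toy registry illustrating why an un-imported task module causes
# "Received unregistered task": registration happens at import time.
registry = {}

def task(name):
    def decorator(fn):
        registry[name] = fn  # the decorator registers the task when its module is imported
        return fn
    return decorator

@task(name='tasks.add')
def add(x, y):
    return x + y

print('tasks.add' in registry)      # True once the defining module was imported
print('tasks.unknown' in registry)  # False -> a worker would raise KeyError here
```

Since the celery container never imports tasks.py, 'tasks.add' is simply absent from its registry, even though the message itself arrives fine.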

Any help is appreciated :)


EDIT 4/8/2015, 21:05:

Thanks to Isowen for the answer. Just for completeness here is what I did:

Let's assume my tasks.py is located on my local machine in /home/platzhersh/celerystuff. Now I created a celeryconfig.py in the same directory with the following content:

CELERY_IMPORTS = ('tasks',)
CELERY_IGNORE_RESULT = False
CELERY_RESULT_BACKEND = 'amqp'
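One small Python aside on the CELERY_IMPORTS value: ('tasks') is only a parenthesized string, not a one-element tuple; the trailing comma is what makes a tuple. Celery appears tolerant of a bare string for this setting, but the distinction is an easy source of bugs elsewhere:

```python
# ('tasks') is just a parenthesized string -- the parentheses do nothing.
# The trailing comma is what makes a one-element tuple.
a = ('tasks')
b = ('tasks',)
print(type(a).__name__)  # str
print(type(b).__name__)  # tuple
```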

As mentioned by Isowen, celery searches the container's /home/user for the task and config files. So we mount /home/platzhersh/celerystuff into the container when starting it:

docker run -v /home/platzhersh/celerystuff:/home/user --link some-rabbit:rabbit --name some-celery -d celery

This did the trick for me. Hope this helps some other people with similar problems. I'll now try to expand that solution by putting the tasks also in a separate docker container.

Solution

As you suspect, the issue is that the celery worker does not know the tasks module. There are two things you need to do:

  1. Get your tasks definitions "into" the docker container.
  2. Configure the celery worker to load those task definitions.

For Item (1), the easiest way is probably to use a "Docker Volume" to mount a host directory of your code onto the celery docker instance. Something like:

docker run --link some-rabbit:rabbit -v /path/to/host/code:/home/user --name some-celery -d celery 

Where /path/to/host/code is your host path, and /home/user is the path to mount it on the instance. Why /home/user in this case? Because the Dockerfile for the celery image defines the working directory (WORKDIR) as /home/user.

(Note: Another way to accomplish Item (1) would be to build a custom docker image with the code "built in", but I will leave that as an exercise for the reader.)
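For completeness, a minimal sketch of that custom-image route might look like the following (untested; it assumes the same celery base image and that tasks.py and celeryconfig.py sit next to the Dockerfile):

```dockerfile
# Hypothetical Dockerfile: bake the task code into the image instead of mounting it.
FROM celery
# The base image's WORKDIR is /home/user, which is where the worker looks for code.
COPY tasks.py celeryconfig.py /home/user/
```

Build and run it in place of the stock image, e.g. `docker build -t my-celery .` followed by `docker run --link some-rabbit:rabbit --name some-celery -d my-celery` (the image tag my-celery is just an illustrative name).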

For Item (2), you need to create a celery configuration file that imports the tasks file. This is a more general issue, so I will point to a previous stackoverflow answer: Celery Received unregistered task of type (run example)
