Celery import and SQS connection issue


Problem description

I'm trying to follow the documentation to get started with celery, but I'm running into hard-to-debug problems with the sample code. I can't tell whether I'm hitting two sides of the same problem or two separate problems. I can make a connection to the SQS queue through the shell, but not with django. I also don't know how that behavior relates to the problems importing Celery vs. importing task.

The getting-started guide is here:
http://celery.github.com/celery/get-started/first-steps-with-celery.html#running-the-celery-worker-server

which shows the code

from celery import Celery

This code works if I run it from a python shell; however, if I do the same inside my django project, in tasks.py in eclipse, I get an error: Unresolved Import: Celery.

There is a separate guide for django here: http://celery.github.com/celery/django/first-steps-with-django.html, which instead uses

from celery import task

which resolves fine. However, when I continue the tutorial and call

add.delay(2, 2)

I get a connection failure, and it looks like it may still be trying to use rabbitmq instead of the SQS broker I have my django project set up to use (which works: I can see the SQS queues from amazon's web interface, and I can make the connection if I do everything from the shell using from celery import Celery). Here is the stack trace, in case it is relevant:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/celery/app/task.py", line 343, in delay
    return self.apply_async(args, kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/task.py", line 458, in apply_async
    with app.producer_or_acquire(producer) as P:
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/usr/local/lib/python2.7/dist-packages/celery/app/base.py", line 247, in producer_or_acquire
    with self.amqp.producer_pool.acquire(block=True) as producer:
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 705, in acquire
    R = self.prepare(R)
  File "/usr/local/lib/python2.7/dist-packages/kombu/pools.py", line 54, in prepare
    p = p()
  File "/usr/local/lib/python2.7/dist-packages/kombu/pools.py", line 45, in <lambda>
    return lambda: self.create_producer()
  File "/usr/local/lib/python2.7/dist-packages/kombu/pools.py", line 42, in create_producer
    return self.Producer(self._acquire_connection())
  File "/usr/local/lib/python2.7/dist-packages/celery/app/amqp.py", line 160, in __init__
    super(TaskProducer, self).__init__(channel, exchange, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/kombu/messaging.py", line 83, in __init__
    self.revive(self.channel)
  File "/usr/local/lib/python2.7/dist-packages/kombu/messaging.py", line 174, in revive
    channel = self.channel = maybe_channel(channel)
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 879, in maybe_channel
    return channel.default_channel
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 617, in default_channel
    self.connection
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 610, in connection
    self._connection = self._establish_connection()
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 569, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/amqplib.py", line 279, in establish_connection
    connect_timeout=conninfo.connect_timeout)
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/amqplib.py", line 90, in __init__
    super(Connection, self).__init__(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/amqplib/client_0_8/connection.py", line 129, in __init__
    self.transport = create_transport(host, connect_timeout, ssl)
  File "/usr/local/lib/python2.7/dist-packages/amqplib/client_0_8/transport.py", line 281, in create_transport
    return TCPTransport(host, connect_timeout)
  File "/usr/local/lib/python2.7/dist-packages/amqplib/client_0_8/transport.py", line 85, in __init__
    raise socket.error, msg
socket.error: [Errno 111] Connection refused

In settings.py, I have BROKER_URL correctly configured with my SQS url (and no forward slashes in the secret key, which apparently has been a problem in the past).
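On that last point: if the secret key ever does contain a `/` or `+`, percent-encoding it is the usual workaround rather than rotating the key. A minimal sketch with made-up credentials (both key values below are hypothetical):

```python
from urllib.parse import quote  # urllib.quote on the python 2.7 of the era

# Hypothetical AWS credentials -- the secret contains a '/' that would
# otherwise be read as a path separator inside the broker URL.
aws_access_key = "AKIAEXAMPLEKEY"
aws_secret_key = "abc/def+ghi"

# Percent-encode both parts so kombu parses the sqs:// URL correctly;
# safe="" forces '/' and '+' to be encoded as %2F and %2B.
BROKER_URL = "sqs://{}:{}@".format(
    quote(aws_access_key, safe=""),
    quote(aws_secret_key, safe=""),
)

print(BROKER_URL)
```

With the encoding in place, the slash survives as `%2F` instead of splitting the URL.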

So:


  1. Why does from celery import Celery work from a python shell, but not inside the django project in eclipse?

  2. Why does following the instructions in the django tutorial result in a connection-refused error (does the amqplib reference mean it is trying to use rabbitmq instead of SQS)?


Answer

How do you call the task? Are you using manage.py shell?
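The distinction matters: manage.py shell sets DJANGO_SETTINGS_MODULE before anything else runs, while a plain python shell usually does not, so django-celery never finds your settings and celery falls back to its default amqp://guest@localhost:5672// broker, which matches the connection-refused trace above. A quick check:

```python
import os

# If this prints '<not set>', django-celery cannot load settings.py, and
# celery falls back to its default amqp broker on localhost:5672 -- the
# exact "Connection refused" failure shown in the traceback.
settings_module = os.environ.get("DJANGO_SETTINGS_MODULE", "<not set>")
print("DJANGO_SETTINGS_MODULE =", settings_module)
```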

Did you add import djcelery; djcelery.setup_loader() to the top of your settings.py?
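For reference, the relevant settings.py lines look roughly like this in the django-celery era (a sketch, not verbatim; the region and queue prefix below are made up):

```python
# settings.py -- django-celery-era configuration sketch
import djcelery
djcelery.setup_loader()  # must run before celery looks for its config

# SQS broker; credentials can go in the URL (percent-encoded) or come
# from the usual AWS environment variables / boto configuration.
BROKER_URL = "sqs://"
BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",           # hypothetical region
    "queue_name_prefix": "myapp-",   # hypothetical prefix
}

INSTALLED_APPS = (
    # ... your apps ...
    "djcelery",
)
```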

The APIs for celery and django-celery are different right now because django-celery is lagging behind. Celery 3.1 will support Django out of the box, so the new API can then be used everywhere.

The eclipse behavior is interesting. Is it possible that Eclipse uses static analysis to find the symbols in a module? In that case, does it help to add the following to the celery/__init__.py file:

__all__ = ['Celery']
