python - Connecting and Saving Data With Redis Inside Celery Task


I have an object that saves data to Redis. It needs to block as little as possible, so I've decided to use Celery to offload the work. When I call .save() on the object outside of Celery, it connects to Redis and stores the data fine. However, when I do the exact same thing from a Celery task, it looks like it runs, but there is no connection to Redis, no exception, no error output, and nothing gets saved to the Redis server. I've replicated the problem with the small bit of code below. test.py:

from celery.decorators import task
import redis

class A(object):
    def __init__(self):
        print "init"

    def save(self):
        self.r = self.connect()
        self.r.set('foo', 'bar')
        print "saved"

    def connect(self):
        return redis.Redis(host="localhost", port=6379)

a = A()

@task
def something(a):
    a.save()

Here is the Python console output:

>>> from test import *
init
>>> a
<test.A object at 0x1010e3c10>
>>> result = something.delay(a)
>>> result.ready()
True
>>> result.successful()
True

And here is the celeryd output:

[2010-11-15 12:05:33,672: INFO/MainProcess] Got task from broker: test.something[d1d71ee5-7206-4fa7-844c-04445fd8bead]
[2010-11-15 12:05:33,688: WARNING/PoolWorker-2] saved
[2010-11-15 12:05:33,694: INFO/MainProcess] Task test.something[d1d71ee5-7206-4fa7-844c-04445fd8bead] succeeded in 0.00637984275818s: None
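
For anyone reproducing this, a quick way to confirm whether the key actually made it into Redis after the task reports success is to read it back with the same connection settings save() uses (a simple sketch, assuming the default database on localhost:6379):

import redis

# Hypothetical verification snippet: connect the same way save() does
# and read back the key the task was supposed to set.
r = redis.Redis(host="localhost", port=6379)
print r.get('foo')   # None if nothing was written, 'bar' if the task really saved it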

Any help would be awesome! I've replicated the issue on multiple computers and multiple Python versions.

It turned out the problem was being caused by a misconfiguration in celeryconfig.py: CELERY_IMPORTS needed to include the task module. Resolved.
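
For reference, a minimal celeryconfig.py along these lines would do it (a sketch only: the broker settings are placeholders for whatever broker is in use, the relevant line is CELERY_IMPORTS, and celeryd needs a restart so it re-reads the config):

# celeryconfig.py -- minimal sketch; the broker values below are example
# placeholders, the actual fix is the CELERY_IMPORTS line.
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
BROKER_VHOST = "/"

CELERY_RESULT_BACKEND = "amqp"

# Make celeryd import test.py so it registers and runs the same
# 'something' task that the console submits.
CELERY_IMPORTS = ("test",)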

