django-celery

How to create a task with Celery in Django? OOP issue?

How to create a task with Celery in Django? OOP issue? Question: I'm trying to set up a task with Django and Celery. The Celery and Django configuration is okay, nothing to report on that side. However, I have a problem, I think, with the way my code is written in OOP. I can't locate where the problem …
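Since the excerpt is cut off, here is a minimal sketch of one common way to combine OOP code with a Celery task in Django: keep the logic in an ordinary class and expose it through a thin module-level @shared_task. The ReportBuilder class and build_report task are illustrative names, not taken from the question.

from celery import shared_task


class ReportBuilder:
    # Ordinary Python class; Celery never sees it directly.
    def __init__(self, user_id):
        self.user_id = user_id

    def build(self):
        return f'report for user {self.user_id}'


@shared_task
def build_report(user_id):
    # The task stays a thin, serializable entry point; the OOP code lives in
    # the class, which avoids passing non-picklable objects to .delay().
    return ReportBuilder(user_id).build()

Called from a view as build_report.delay(request.user.id), only the integer crosses the broker, which sidesteps most serialization problems with class instances.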

Total answers: 1

Patch Django EmailMultiAlternatives send() in a Celery Task so that an exception is raised

Patch Django EmailMultiAlternatives send() in a Celery Task so that an exception is raised Question: I want to test a Celery task by raising an SMTPException when sending an email. With the following code, located in my_app.mailer.tasks: from django.core.mail import EmailMultiAlternatives @app.task(bind=True) def send_mail(self): subject, from_email, to = 'hello', '[email protected]', '[email protected]' text_content = 'This …
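A sketch of how such a test is often written, assuming the module path my_app.mailer.tasks from the excerpt; the test class name and the exact patch target are illustrative.

from smtplib import SMTPException
from unittest import mock

from django.test import TestCase

from my_app.mailer.tasks import send_mail


class SendMailTaskTests(TestCase):
    @mock.patch('my_app.mailer.tasks.EmailMultiAlternatives.send',
                side_effect=SMTPException('boom'))
    def test_send_failure_raises(self, mocked_send):
        # Run the task eagerly in-process so the patched send() is hit and
        # the exception propagates to the test.
        with self.assertRaises(SMTPException):
            send_mail.apply(throw=True)
        mocked_send.assert_called_once()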

Total answers: 2

Celery crontab to schedule the 1st of the month, quarterly

Celery crontab to schedule the 1st of the month, quarterly Question: I have a celery task which executes quarterly, on the 1st of the month. How can my month_of_year be written? { 'task': 'mytask', 'schedule': crontab(day_of_month='1', month_of_year='') }, Asked By: jimmy || Source Answers: Use month_of_year='*/3' to run every third month { 'task': …
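For reference, a beat entry along these lines should fire at midnight on the 1st of January, April, July and October; the entry key is a placeholder, while 'mytask' is kept from the excerpt.

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'quarterly-task': {
        'task': 'mytask',
        'schedule': crontab(minute=0, hour=0,
                            day_of_month='1', month_of_year='*/3'),
    },
}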

Total answers: 1

Celery scheduler not performing the task

Celery scheduler not performing the task Question: I was trying to use Celery to query an external API at a regular frequency and update the database in my Django project with the new data. Celery schedules the task correctly and sends it to the Celery worker, but it never executes anything. Here is my celery.py …
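The excerpt stops before the celery.py, so, as a point of comparison, here is the minimal Django wiring this kind of setup usually needs (the project name proj is assumed). Note that beat only schedules tasks, so a separate worker process has to be running to actually execute them.

# celery.py in the Django project package; run both processes, e.g.:
#   celery -A proj beat -l info      (sends the scheduled tasks)
#   celery -A proj worker -l info    (executes them)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()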

Total answers: 1

Add n tasks to the Celery queue and wait for the results

Add n tasks to the Celery queue and wait for the results Question: I'd like to add several jobs to the Celery queue and wait for the results. I have many ideas about how I would accomplish this using some type of shared storage (memcached, Redis, a database, etc.), but I think it is something Celery could handle automatically, …
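Celery's group primitive covers exactly this case; a sketch, assuming a configured result backend and a hypothetical process_item task:

from celery import group

from myapp.tasks import process_item   # hypothetical task


def run_batch(items):
    job = group(process_item.s(item) for item in items)
    result = job.apply_async()
    # Blocks until every subtask has finished and returns their results in
    # order; avoid calling get() from inside another task (use a chord there).
    return result.get()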

Total answers: 3

Celery auto reload on ANY changes

Celery auto reload on ANY changes Question: I want Celery to reload itself automatically when there are changes to the modules in CELERY_IMPORTS in settings.py. I tried giving it the parent modules so that it would detect changes even in child modules, but it did not detect changes in child modules. That made me understand that the detection is not done …
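If Celery's own reloading does not pick up changes in child modules, one common workaround is to restart the worker from the outside with the third-party watchdog package (its watchmedo auto-restart command wraps the same idea). The sketch below is illustrative; the project name proj is assumed.

import subprocess

from watchdog.events import PatternMatchingEventHandler
from watchdog.observers import Observer

WORKER_CMD = ['celery', '-A', 'proj', 'worker', '-l', 'info']


class RestartWorker(PatternMatchingEventHandler):
    def __init__(self):
        super().__init__(patterns=['*.py'], ignore_directories=True)
        self.proc = subprocess.Popen(WORKER_CMD)

    def on_any_event(self, event):
        # Any change to any .py file, at any depth, restarts the worker
        # (no debouncing; kept deliberately minimal).
        self.proc.terminate()
        self.proc.wait()
        self.proc = subprocess.Popen(WORKER_CMD)


if __name__ == '__main__':
    observer = Observer()
    observer.schedule(RestartWorker(), path='.', recursive=True)
    observer.start()
    observer.join()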

Total answers: 8

Global variable with Django and Celery

Global variable with Django and Celery Question: I have code like this: wl_data = {} def set_wl_data(): global wl_data wl_data = get_watchlist_data() def get_wl_data(scripcodes): # Filtering Data result = {scripcode: detail for scripcode, detail in wl_data.iteritems() if int(scripcode) in scripcodes or scripcode in scripcodes} return result I am running this as a Django project, I …
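The usual catch here is that the Django process and the Celery worker are separate processes, so a module-level dict updated in one is never visible in the other. A sketch of the common fix, replacing the global with shared storage such as Django's cache framework (get_watchlist_data() is the asker's existing helper):

from django.core.cache import cache


def set_wl_data():
    # Store the watchlist where both the worker and the web process can see it.
    cache.set('wl_data', get_watchlist_data(), timeout=None)


def get_wl_data(scripcodes):
    wl_data = cache.get('wl_data', {})
    return {scripcode: detail for scripcode, detail in wl_data.items()
            if int(scripcode) in scripcodes or scripcode in scripcodes}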

Total answers: 2

Why would running scheduled tasks with Celery be preferable over crontab?

Why would running scheduled tasks with Celery be preferable over crontab? Question: Consider that Celery is already part of the stack, to run task queues (i.e. it is not being added just for running crons, which would seem like overkill IMHO). How can its "periodic tasks" feature be beneficial as a replacement for crontab? …

Total answers: 2

How to send periodic tasks to a specific queue in Celery

How to send periodic tasks to a specific queue in Celery Question: By default Celery sends all tasks to the 'celery' queue, but you can change this behavior by adding an extra parameter: @task(queue='celery_periodic') def recalc_last_hour(): log.debug('sending new task') recalc_hour.delay(datetime(2013, 1, 1, 2)) # for example Scheduler settings: CELERYBEAT_SCHEDULE = { 'installer_recalc_hour': { 'task': 'stats.installer.tasks.recalc_last_hour', 'schedule': 15 # …
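Completing the idea in the excerpt: beat schedule entries also accept an "options" dict that is passed through to apply_async, so the periodic task can be pinned to the celery_periodic queue in the settings as well (the schedule value 15 is kept from the excerpt; a plain number is interpreted as seconds).

CELERYBEAT_SCHEDULE = {
    'installer_recalc_hour': {
        'task': 'stats.installer.tasks.recalc_last_hour',
        'schedule': 15,  # seconds
        'options': {'queue': 'celery_periodic'},
    },
}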

Total answers: 3

celery – chaining groups and subtasks. -> out of order execution

celery – chaining groups and subtasks. -> out of order execution Question: When I have something like the following: group1 = group(task1.si(), task1.si(), task1.si()) group2 = group(task2.si(), task2.si(), task2.si()) workflow = chain(group1, group2, task3.si()) The intuitive interpretation is that task3 should only execute after all tasks in group2 have finished. In reality, task 3 …
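One way to force the intended ordering, as a sketch (assuming a reasonably recent Celery, since the behaviour of nested canvases has varied between versions), is to make the barriers explicit with chords; task1, task2 and task3 are the tasks from the question.

from celery import chord, group

workflow = chord(
    group(task1.si(), task1.si(), task1.si()),          # run group1 first
    chord(group(task2.si(), task2.si(), task2.si()),    # then group2
          task3.si()),                                  # then task3 as the callback
)
result = workflow.apply_async()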

Total answers: 2