How to run a function periodically with Flask and Celery?

Question:

I have a flask app that roughly looks like this:

import json

from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def foo():
    data = json.loads(request.data)
    # do some stuff

    return "OK"

In addition, I would like to run a function every ten seconds from that script. I don’t want to use sleep for that. I have the following Celery script alongside it:

from celery import Celery
from datetime import timedelta
celery = Celery(__name__)

CELERYBEAT_SCHEDULE = {
    'add-every-10-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=10)
    },
}



@celery.task(name='tasks.add')
def hello():
    app.logger.info('run my function')

The script runs fine, but the logger.info call is never executed. What am I missing?

Asked By: ustroetz


Answers:

Do you have a Celery worker and Celery beat running? Scheduled tasks are handled by beat, which queues the task when its schedule comes due. The worker then actually crunches the numbers and executes your task.

celery worker --app myproject --loglevel=info
celery beat --app myproject

Your task, however, looks like it’s calling the Flask app’s logger. In the worker you probably don’t have the Flask application around (since it runs in another process). Try using a normal Python logger for the demo task.

Answered By: tuomur

A Celery task by default runs outside of the Flask app context and thus won’t have access to the Flask app instance. However, it’s very easy to create the Flask app context while running a task by using the app_context method of the Flask app object.

from celery import Celery
from flask import Flask

app = Flask(__name__)
celery = Celery(app.name)

@celery.task
def task():
    with app.app_context():
        app.logger.info('running my task')

This article by Miguel Grinberg is a very good place to get a primer on the basics of using Celery in a Flask application.

Answered By: Param Prabhakar

Well, Celery beat can be embedded in a regular Celery worker as well, with the -B parameter in your command.

celery -A myproject worker --loglevel=info -B

This is only recommended for the development environment. For production, you should run beat and the Celery workers separately, as the documentation mentions; otherwise your periodic task will run more than once.

Answered By: Sabuhi Shukurov

First, install Redis on your machine and check that it is running. Then install the Python dependencies:

  1. celery
  2. redis
  3. flask

folder structure

  • project
    • app
      • __init__.py
      • task.py
    • main.py

Write task.py:

from celery import Celery
from celery.schedules import crontab
from celery.utils.log import get_task_logger

from app import app

logger = get_task_logger(__name__)
def make_celery(app):
    #Celery configuration
    app.config['CELERY_BROKER_URL'] = 'redis://127.0.0.1:6379'
    app.config['CELERY_RESULT_BACKEND'] = 'db+postgresql://user:[email protected]:5432/mydatabase'
    app.config['CELERY_RESULT_EXTENDED']=True
    app.config['CELERYBEAT_SCHEDULE'] = {
        # Executes every minute
        'periodic_task-every-minute': {
            'task': 'periodic_task',
            'schedule': crontab(minute="*")
        }
    }


    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task
    class ContextTask(TaskBase):
        abstract = True
        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery
celery = make_celery(app)

@celery.task(name="periodic_task", bind=True)
def testing(self):
    # append a line to a file on each run
    with open("../myfile.txt", "a") as file1:
        # write a newline character, then the text
        file1.write("\n")
        file1.write("Today")
    print("Running")
    logger.info("Hello! from periodic task")
    return "Done"

Write __init__.py:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SECRET_KEY'] = '7c09ebc8801a0ce8fb82b3d2ec51b4db'
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///site.db'
db = SQLAlchemy(app)

Commands to run Celery beat and the worker:

celery -A app.task.celery beat
celery -A app.task.celery worker --loglevel=info
Answered By: Gaurav Lokhande