
Nothing shown in django admin results, results only in flower #102

Closed
GabLeRoux opened this issue Jun 20, 2019 · 18 comments

Comments
@GabLeRoux

Hello there,
I just gave this a try and I can't figure out why I get no results in the admin.

Here's what I tried:

  1. Generate a project from django cookiecutter so I get Celery with Redis already set up.
  2. Install django-celery-email, add it to installed apps and set settings.EMAIL_BACKEND.
  3. Install django-celery-results, add it to installed apps and set it up as described in the documentation.

Here are exact commands to reproduce the issue:

git clone git@github.com:GabLeRoux/django-celery-results-with-redis-example.git
cd django-celery-results-with-redis-example
# checkout the specific commit of the project at the time of creation of this issue
git checkout ace6fcd5ad0099d1054bd42d275f974ef99b81cf
docker-compose up -d
docker-compose logs -f &
# wait for everything to be up, then send a test email that will actually be queued with celery
docker-compose run --rm django python manage.py sendtestemail --admin

Email is successfully sent and does go through the queue using celery. Open a shell and verify the data:

docker-compose run --rm django python manage.py shell_plus
TaskResult.objects.all()

Out[1]: <QuerySet []>

It's empty, that's the problem 😢

The result is shown in flower correctly:

[screenshot: Flower showing the task result]

The email is definitely sent correctly too:

[screenshot: the test email received]

The result backend doesn't seem to be used correctly or there's something I'm missing here:

[screenshot: empty Task results list in the django admin]

Related settings

In [2]: settings.CELERY_BROKER_URL
Out[2]: 'redis://redis:6379/0'

In [3]: settings.CELERY_RESULT_BACKEND
Out[3]: 'django-db'

In [4]: settings.EMAIL_BACKEND
Out[4]: 'djcelery_email.backends.CeleryEmailBackend'

In [5]: settings.INSTALLED_APPS
Out[5]:
['django.contrib.auth',
 'django.contrib.contenttypes',
 'django.contrib.sessions',
 'django.contrib.sites',
 'django.contrib.messages',
 'django.contrib.staticfiles',
 'django.contrib.admin',
 'crispy_forms',
 'allauth',
 'allauth.account',
 'allauth.socialaccount',
 'rest_framework',
 'django_celery_beat',
 'django_celery_results',
 'djcelery_email',
 'example.users.apps.UsersConfig',
 'debug_toolbar',
 'django_extensions']
@GabLeRoux GabLeRoux changed the title Nothing shown django admin results Nothing shown in django admin results, results only in flower Jun 20, 2019
@PaszaVonPomiot

I confirm the issue. It shows me the results in Flower but not in django-celery-results. The fix was to add the -E switch to the celery worker. This is not clearly described in the docs and should be added.

systemd exec command would look like this:

ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
  -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
  --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} -E ${CELERYD_OPTS}'
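For reference, the same behaviour can be enabled in the Celery app configuration instead of on the command line. A minimal sketch, assuming a Celery app defined in a `proj/celery.py` module (the app name is illustrative):

```python
from celery import Celery

app = Celery("proj")

# Equivalent to passing -E (--task-events) on the worker command line:
# the worker emits task events that monitoring tools such as Flower consume.
app.conf.worker_send_task_events = True
# Optionally also emit a task-sent event when a task message is published.
app.conf.task_send_sent_event = True
```

With this in the app config, the systemd unit no longer needs `-E` in `CELERYD_OPTS`.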

@larsrei

larsrei commented Sep 26, 2019

Hi,
I have the same problem: the task results in the table are missing. I am not using Flower, only Django, django-celery-results and a database (tried SQLite and MariaDB).

# CELERY STUFF

CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Berlin'
DJANGO_CELERY_RESULTS_TASK_ID_MAX_LENGTH=191

INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'modules',
'guardian',
'rest_framework',
'rest_framework.authtoken',
'django_celery_results',
'rangefilter',
'import_export',
]

The table was migrated without problems. The celery tasks run and work correctly, but no entries appear in the database table. I tested with local SQLite and MariaDB.
I also read about the -E switch, but I am not sure what you mean by it. I only start the celery worker in my project.

best regards,
Lars

@PaszaVonPomiot

PaszaVonPomiot commented Sep 26, 2019 via email

@larsrei

larsrei commented Oct 18, 2019

I tested it on several systems (Ubuntu 19.04, RHEL (GNU/Linux 3.10.0-1062.1.2.el7.x86_64)) with different databases (SQLite, MariaDB), but always the same problem: the celery result is not written to the database. The -E switch is not helping me.
Any tips on how I can debug further?

@PaszaVonPomiot

PaszaVonPomiot commented Oct 18, 2019 via email

@kowalej

kowalej commented Apr 4, 2020

@larsrei - not sure how you were running Celery, but I had the same problem when testing; it was due to not running the Celery worker - I was only running the scheduler.

First run:
celery -A <proj> worker
Then:
celery -A <proj> beat -l info

@larsrei

larsrei commented Apr 6, 2020

Hi @kowalej,
in our development environment we run celery locally; there we only run the worker, like your first command. In production we need both beat and worker; there we use supervisor.
Development is on Ubuntu, production on RHEL. It's not working on either, and at the moment we have no ideas for debugging.

@johnthealy3

Put the following in settings.py and django-celery-results should work, with one TaskResult per email:

CELERY_EMAIL_TASK_CONFIG = {'ignore_result': False}
CELERY_EMAIL_CHUNK_SIZE = 1

It took me a long time to figure this out even though it is (abstractly) mentioned in the README for django-celery-email.
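To illustrate what that setting changes: `ignore_result` controls whether a task's result is written to the result backend at all, so a fire-and-forget task never produces a TaskResult row. A hedged sketch (the task names are hypothetical, not from django-celery-email):

```python
from celery import shared_task

# With ignore_result=True no result is stored, so django-celery-results
# has nothing to show in the admin for this task.
@shared_task(ignore_result=True)
def send_email_fire_and_forget():
    return "not stored"

# With ignore_result=False the return value is written to
# CELERY_RESULT_BACKEND and appears as a TaskResult row.
@shared_task(ignore_result=False)
def send_email_tracked():
    return "stored"
```

CELERY_EMAIL_TASK_CONFIG is passed through to django-celery-email's internal task, which is why `{'ignore_result': False}` makes its sends show up.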

@tzalistar

Hello everyone,

I had a similar scenario with an app deployed on Kubernetes, in my case using celery with redis as the broker. My problem was in my configuration (it almost always is 😄), where I had used redis as both the broker and the result backend.

Below is my celery config from system.py; maybe this can help someone with a similar issue.

This is the old, non-working config.

CELERY_BROKER_URL = os.getenv('REDISADDR','redis://localhost:6379')
CELERY_RESULT_BACKEND = os.getenv('REDISADDR','redis://localhost:6379')
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = os.getenv('TIME_ZONE','Europe/Athens')

This is the working config.

CELERY_BROKER_URL = os.getenv('REDISADDR','redis://localhost:6379')
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = os.getenv('TIME_ZONE','Europe/Athens')
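A side note on the config above: `os.getenv` only falls back to its second argument when the environment variable is unset, which is why the localhost default applies only outside the cluster. A quick stdlib-only check (`REDISADDR` is the variable name used above):

```python
import os

# Unset: the default broker URL is returned.
os.environ.pop("REDISADDR", None)
broker = os.getenv("REDISADDR", "redis://localhost:6379")
print(broker)  # redis://localhost:6379

# Set (e.g. injected by Kubernetes): the environment value wins.
os.environ["REDISADDR"] = "redis://redis:6379/0"
broker = os.getenv("REDISADDR", "redis://localhost:6379")
print(broker)  # redis://redis:6379/0
```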

@auvipy
Member

auvipy commented Feb 16, 2021

> This is the working config.
>
> CELERY_BROKER_URL = os.getenv('REDISADDR','redis://localhost:6379')
> CELERY_RESULT_BACKEND = 'django-db'
> CELERY_ACCEPT_CONTENT = ['application/json']
> CELERY_TASK_SERIALIZER = 'json'
> CELERY_RESULT_SERIALIZER = 'json'
> CELERY_TIMEZONE = os.getenv('TIME_ZONE','Europe/Athens')

Based on this, I think we should close this issue.

@auvipy auvipy closed this as completed Feb 16, 2021
@auvipy
Member

auvipy commented Feb 16, 2021

> I confirm the issue. It shows me the results in Flower but not in django-celery-results. The fix was to add the -E switch to the celery worker. This is not clearly described in the docs and should be added.
>
> systemd exec command would look like this:
>
> ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
>   -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
>   --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} -E ${CELERYD_OPTS}'

Has this been contributed to the celery docs?

@PaszaVonPomiot

I haven't added anything to the docs. I think someone more knowledgeable in celery should review this before anything is added.

@MatejMijoski

> I confirm the issue. It shows me the results in Flower but not in django-celery-results. The fix was to add the -E switch to the celery worker. This is not clearly described in the docs and should be added.
>
> systemd exec command would look like this:
>
> ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
>   -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
>   --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL} -E ${CELERYD_OPTS}'

I can also confirm that adding -E enabled django-celery-results to capture the events and show them in the admin tables.

@chrispijo

For me too, adding the -E flag turned out to make the results appear in the backend database.

@vladox

vladox commented Jan 11, 2022

What about the CELERY_CACHE_BACKEND setting? Isn't that supposed to be used when using Redis as CELERY_BROKER_URL?

@auvipy auvipy pinned this issue Jan 12, 2022
@auvipy
Member

auvipy commented Jan 12, 2022

> What about the CELERY_CACHE_BACKEND setting? Isn't that supposed to be used when using Redis as CELERY_BROKER_URL?

Can you please elaborate?

@APouzi

APouzi commented Mar 19, 2022

> 'django-db'

I'm sorry, I am a total noob with Linux systems and I am using Bash on Windows. I tried putting this into my venv and got back "bash: -c: command not found"; I ran "which bash", got "/usr/bin/bash", changed those variables to match, and I am still getting this issue. It doesn't work outside of the venv either.
I can't seem to find where I am supposed to put this to fix the issue.

@leemurus

leemurus commented Jan 8, 2023

> Hello everyone,
>
> I had a similar scenario with an app deployed on Kubernetes, in my case using celery with redis as the broker. My problem was in my configuration (it almost always is 😄), where I had used redis as both the broker and the result backend.
>
> Below is my celery config from system.py; maybe this can help someone with a similar issue.
>
> This is the old, non-working config.
>
> CELERY_BROKER_URL = os.getenv('REDISADDR','redis://localhost:6379')
> CELERY_RESULT_BACKEND = os.getenv('REDISADDR','redis://localhost:6379')
> CELERY_ACCEPT_CONTENT = ['application/json']
> CELERY_TASK_SERIALIZER = 'json'
> CELERY_RESULT_SERIALIZER = 'json'
> CELERY_TIMEZONE = os.getenv('TIME_ZONE','Europe/Athens')
>
> This is the working config.
>
> CELERY_BROKER_URL = os.getenv('REDISADDR','redis://localhost:6379')
> CELERY_RESULT_BACKEND = 'django-db'
> CELERY_ACCEPT_CONTENT = ['application/json']
> CELERY_TASK_SERIALIZER = 'json'
> CELERY_RESULT_SERIALIZER = 'json'
> CELERY_TIMEZONE = os.getenv('TIME_ZONE','Europe/Athens')

Yeah, the main point is that CELERY_RESULT_BACKEND = 'django-db'. Both the celery worker and the web process should use 'django-db' as the result backend. Sample of my project: https://github.com/leemurus/crawler
