Job aborts failing #405
Comments
I might have identified the potential issue. It has to do with … That means if I have 5 tasks for a worker, and I schedule all 5, then the following won't happen unless a task gets completed:

Currently, the easiest way would be to keep hold of the task during the … What could be a better solution? Also, is there something else I might be missing?
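The saturation scenario described above can be sketched with a plain asyncio semaphore. This is a hypothetical model (the semaphore, task names, and timings are mine, not arq's actual internals): a worker whose 5 slots are all busy, so anything that also needs a slot cannot run until one of the tasks completes.

```python
import asyncio


async def main() -> tuple[list[str], list[str]]:
    # Hypothetical stand-in for a worker's limit of 5 concurrent tasks.
    max_jobs = asyncio.Semaphore(5)
    started: list[str] = []
    followed_up: list[str] = []

    async def run_task(name: str) -> None:
        async with max_jobs:           # occupies one worker slot
            started.append(name)
            await asyncio.sleep(0.05)  # simulates the long-running job

    async def follow_up(name: str) -> None:
        # This also needs a slot, so while all 5 slots are busy it
        # cannot proceed -- it only runs once some task completes.
        async with max_jobs:
            followed_up.append(name)

    tasks = [asyncio.create_task(run_task(f"task-{i}")) for i in range(5)]
    await asyncio.sleep(0)  # let all 5 tasks grab their slots
    waiter = asyncio.create_task(follow_up("task-0"))
    await asyncio.sleep(0.01)
    assert len(started) == 5 and not followed_up  # stuck behind the 5 tasks
    await asyncio.gather(*tasks, waiter)          # a slot frees up
    return started, followed_up


started, followed_up = asyncio.run(main())
```

Under this model the follow-up only runs after a slot frees, which matches the observation that nothing further happens until a task completes.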
Thanks, mate. I'm a bit AFK for a few weeks due to IRL stuff.

No worries @JonasKs. Take care.
Sounds good 😊 No, not really. In short, you need a non-password-protected Redis instance running on the default port for the tests to run, and pre-commit installed. Clone the project, write … If you use Docker, this docker-compose will do:

```yaml
version: '3.8'

services:
  redis:
    container_name: arq_redis
    image: redis:7-alpine
    ports:
      - '127.0.0.1:6379:6379'
    restart: always
```
I have attempted to solve the issue. Please let me know if any clarification or further information is needed. I can brainstorm the problem further in case there is a better solution. P.S.: Suggestions for improving the PR are also welcome.
I have set up a sample project: arq-scale-test. I love arq, but somehow task abort is failing for me. Any help or insights would be appreciated.

I have a simple long-running task that sleeps for 60 seconds. 5 tasks are queued and started in `app.py`. In `checks.py`, I attempt to check the status of a task and stop it. I am successfully able to retrieve the task status, but when calling `await job.abort()`, it just hangs. The task actually completes within the 60 seconds, and then I get job aborted as `False`.

Adding the code below as well:
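For illustration (this is not the reporter's code), the hang-versus-`False` behaviour can be reproduced with plain asyncio: a cooperative job acknowledges cancellation quickly, while a job that is never actually interrupted makes a bounded abort report `False`. `asyncio.wait_for` here is only a stand-in for bounding the wait; arq's `Job.abort()` also accepts a `timeout` argument in recent versions (worth checking against your installed version) so the call cannot hang indefinitely.

```python
import asyncio


async def cooperative_job() -> str:
    # Stand-in for the 60-second job from the report (shortened here);
    # it stops as soon as it is cancelled.
    await asyncio.sleep(0.2)
    return "done"


async def stubborn_job() -> str:
    # A job that swallows the first cancellation and keeps running,
    # analogous to an abort that never takes effect in time.
    try:
        await asyncio.sleep(0.2)
    except asyncio.CancelledError:
        await asyncio.sleep(0.2)
    return "done"


async def abort_with_timeout(task: "asyncio.Task[str]", timeout: float) -> bool:
    # Request cancellation, then wait a bounded time for it to take
    # effect -- mirroring `await job.abort(timeout=...)` in spirit.
    task.cancel()
    try:
        await asyncio.wait_for(task, timeout)
    except asyncio.CancelledError:
        return True   # the job stopped in response to the abort
    except asyncio.TimeoutError:
        return False  # the job did not stop within the deadline
    return False      # the job finished normally instead of aborting


async def main() -> tuple[bool, bool]:
    t1 = asyncio.create_task(cooperative_job())
    t2 = asyncio.create_task(stubborn_job())
    await asyncio.sleep(0.01)  # let both jobs start running
    ok = await abort_with_timeout(t1, timeout=0.1)
    stuck = await abort_with_timeout(t2, timeout=0.1)
    return ok, stuck


aborted_ok, aborted_stuck = asyncio.run(main())
```

The second case mirrors the symptom in this issue: the abort call never succeeds, the job runs to completion anyway, and the abort ultimately reports `False`.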