
test_mp_pool_worker_no_daemon hangs after dependencies or CI env update #5712

Closed · crusaderky opened this issue Jan 26, 2022 · 5 comments · Fixed by #5716

@crusaderky (Collaborator)

Overnight, with no changes to the distributed code, test_mp_pool_worker_no_daemon started hanging deterministically on Linux.

Last successful run: https://github.com/dask/distributed/runs/4944060002
First failed run: https://github.com/dask/distributed/runs/4949990263

The diff of mamba list on Python 3.8 doesn't show anything that catches the eye:

< coverage                  6.2              py38h497a2fe_0    conda-forge
> coverage                  6.3              py38h497a2fe_0    conda-forge

< distributed               2022.1.0+20.g682a7b1d           dev_0    <develop>
> distributed               2022.1.0+21.ge04c5627           dev_0    <develop>

< notebook                  6.4.7              pyha770c72_0    conda-forge
> notebook                  6.4.8              pyha770c72_0    conda-forge

< prometheus_client         0.12.0             pyhd8ed1ab_0    conda-forge
> prometheus_client         0.13.0             pyhd8ed1ab_0    conda-forge

Note that the diff on Python 3.9 is a lot noisier due to a major change in pytorch, which is, however, unrelated to this issue.
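
A minimal bisection sketch, assuming a dev install of distributed with pytest and pytest-cov available (the exact test location and CI invocation are not shown in this thread): install each suspect package version from the diff in turn and re-run the hanging test under coverage, as CI does.

# Hypothetical commands; -k selects the test by name without assuming its file,
# and --cov mirrors the coverage measurement the CI jobs run with.
pip install "coverage==6.2"
python -m pytest distributed -k test_mp_pool_worker_no_daemon --cov=distributed -x
pip install "coverage==6.3"
python -m pytest distributed -k test_mp_pool_worker_no_daemon --cov=distributed -x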

@jrbourbeau (Member)

Just observed this on macOS as well (see this Python 3.7 CI build)

@jcrist (Member) commented Jan 26, 2022

I haven't been able to reproduce this issue locally; perhaps it's something to do with the CI environment itself?

@graingert (Member)

Looks like distributed/tests/test_asyncprocess.py::test_terminate_after_stop has also begun flaking.

@graingert (Member)

Looks like it might be this: nedbat/coveragepy#1307

@graingert (Member)

nedbat/coveragepy#1310
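
As a stopgap until a coverage release addressing the hang in nedbat/coveragepy#1307 lands, one hypothetical workaround is to pin coverage away from 6.3 in the CI environment; the command below is illustrative only and not necessarily what #5716 ends up doing.

# Illustrative pin back to the last known-good version from the environment diff above.
mamba install "coverage<6.3"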

graingert added a commit to graingert/distributed that referenced this issue Jan 27, 2022
@graingert graingert mentioned this issue Jan 27, 2022