
Failing tests on distributed>2023.9.2 #1265

Open
pentschev opened this issue Oct 26, 2023 · 3 comments
pentschev (Member) commented Oct 26, 2023

Two tests are currently failing after removing the distributed==2023.9.2 pin:

FAILED tests/test_local_cuda_cluster.py::test_pre_import_not_found - Failed: DID NOT RAISE <class 'RuntimeError'>
FAILED tests/test_local_cuda_cluster.py::test_death_timeout_raises - Failed: DID NOT RAISE <class 'asyncio.exceptions.TimeoutError'>

We need to bisect to find the source of these regressions, but for now we're xfailing them so the unpinning can go through.
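As a minimal sketch (names and the reason string are illustrative, not the actual Dask-CUDA change), the two tests can be marked as expected failures with pytest's `xfail` marker while the regression is bisected:

```python
import pytest

# Hypothetical sketch: mark the regressed tests as expected failures so the
# unpin can merge while the regression is bisected upstream.
xfail_unpinned = pytest.mark.xfail(
    reason="regression after removing the distributed==2023.9.2 pin",
    strict=False,  # with strict=False the tests simply pass again once fixed
)

@xfail_unpinned
def test_pre_import_not_found():
    # Placeholder body; the real test asserts that a RuntimeError is raised
    # when a worker pre-import module cannot be found.
    raise RuntimeError("pre-import module not found")
```

With `strict=False`, an unexpected pass is reported as `XPASS` rather than a failure, so no follow-up change is needed once upstream fixes land.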

wence- (Contributor) commented Oct 27, 2023

FAILED tests/test_local_cuda_cluster.py::test_pre_import_not_found

This one seems to be because scaling up a SpecCluster now swallows any errors raised during scale-up (dask/distributed#8309).
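A minimal sketch of the suspected failure mode (this is not Distributed's actual code): if scale-up now catches worker-startup errors internally instead of letting them propagate, a test that expects the error reports `DID NOT RAISE`.

```python
import asyncio

async def start_worker():
    # Stand-in for a worker that fails during startup, e.g. because a
    # pre-import module cannot be found.
    raise RuntimeError("pre-import module not found")

async def scale_up(swallow_errors: bool):
    try:
        await start_worker()
    except RuntimeError:
        if not swallow_errors:
            raise  # old behavior: the failure propagates to the test
        # new behavior: the error is swallowed and scale-up "succeeds"

def error_propagates(swallow_errors: bool) -> bool:
    # Mirrors what pytest.raises observes: did the expected error reach us?
    try:
        asyncio.run(scale_up(swallow_errors))
    except RuntimeError:
        return True
    return False
```

With `swallow_errors=True` the caller never sees the `RuntimeError`, which matches the `DID NOT RAISE` failures above.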

wence- (Contributor) commented Oct 27, 2023

FAILED tests/test_local_cuda_cluster.py::test_death_timeout_raises

This is the same cause.

pentschev (Member, Author) commented

Thanks for digging into that, @wence-.

I'm sure this isn't the first time this has been problematic. It's definitely not something we need to do right now, but I wonder whether we should do a pass over Dask-CUDA's tests, identify those that could be generalized, and submit them as part of Distributed's test suite to prevent regressions there.
