
Add python 3.11 to GPU CI matrix #8598

Merged — 1 commit, Mar 22, 2024

Conversation

charlesbluca
Member

With rapidsai/build-planning#3, we're now publishing RAPIDS packages for Python 3.11 — we should test against those packages here (and in dask/dask, but probably once dask/dask#11010 is in).
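For context, a change like this usually amounts to one new entry in the CI matrix. A hypothetical sketch follows — the file layout, job name, labels, and matrix keys here are illustrative only, not the actual dask/distributed gpuCI configuration:

```yaml
# Illustrative only: adding Python 3.11 to a GPU test matrix.
# The real dask/distributed GPU CI config may differ in structure and keys.
jobs:
  gpu-ci:
    strategy:
      matrix:
        python-version: ["3.9", "3.10", "3.11"]  # "3.11" is the new entry
    runs-on: [self-hosted, gpu]
    steps:
      - uses: actions/checkout@v4
      - name: Run GPU tests
        run: pytest -m gpu
```

With a matrix like this, CI fans out one GPU job per listed Python version, so the new 3.11 packages get exercised on every PR alongside the existing versions.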

@mrocklin
Member

Quick question: I don't think we often see people engaging with the GPU CI on Dask PRs. Is that correct? Are you all actively tracking each PR that fails GPU CI?

If not, I propose that we stop running GPU CI on these PRs and that instead you all run a nightly test or something. Thoughts?

It probably doesn't make sense to keep running GPU CI on every PR if no one is looking at the results.

@rjzamora
Member

@mrocklin - Before the query-planning switch we were absolutely engaging on gpuCI in dask. It was a great way for us to stay on top of breaking changes.

For the past few weeks we have been working to get dask-expr working with rapids (many small things are still broken, but we are getting much closer). I was nervous that you would propose removing it, because it has been quite valuable to us over the past year.

During the maintainers sync, I have mentioned a few times that rapids is trying not to slow down the legacy->dask-expr transition, and so it is temporarily okay for gpuCI to be broken.

Contributor

Unit Test Results

See test report for an extended history of previous test failures. This is useful for diagnosing flaky tests.

- 28 files (−1), 28 suites (−1), 10h 43m 34s ⏱️ (−20m 45s)
- 4,056 tests (±0): 3,935 ✅ (−3), 109 💤 (±0), 11 ❌ (+2), 1 🔥 (+1)
- 53,492 runs (−1,410): 51,069 ✅ (−1,343), 2,347 💤 (−62), 75 ❌ (−6), 1 🔥 (+1)

For more details on these failures and errors, see this check.

Results for commit 5030ab1. ± Comparison against base commit abd1539.

@mrocklin
Member

Ah ok, I just misunderstood the situation. I was seeing GPUCI broken for much of the last couple of years without NVIDIA folks responding on PRs when things were off. Maybe you all are actually aware of what is going on though and just operating in the background? Or maybe I'm just not as attentive as I could be here and missed stuff. My apologies.

@rjzamora
Member

Maybe you all are actually aware of what is going on though and just operating in the background?

I think it's fair that you think we are ignoring things. We do often work in the background, but I can/should develop a more visible process on our end. I appreciate the nudge.

@pentschev
Member

I think it's fair that you think we are ignoring things. We do often work in the background, but I can/should develop a more visible process on our end. I appreciate the nudge.

I don't think it's fair to say we are "ignoring" things: normally folks like Florian and James are nice enough to nudge us when gpuCI is broken for some reason, and then we act to fix it. It is true, though, that we don't watch every PR, which "in a perfect world" is precisely what we would hope for — ideally it is always green, and we catch anything that breaks GPU usage, whether the change occurs on Dask/Distributed's end or in some other dependency used in the context of Dask/Distributed.

Member

@jrbourbeau left a comment


Thanks @charlesbluca!

Just noting the failures here are unrelated and possibly already fixed (xref #8591). I'm testing over in #8599 to see if main is broken or not.

@jrbourbeau jrbourbeau merged commit e72d2b2 into dask:main Mar 22, 2024
24 of 37 checks passed
5 participants