
Collect and unify pytest results across different runs #6

Open
akihironitta opened this issue Jul 15, 2022 · 2 comments
Labels
ci/cd (Continuous integration and delivery), enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

@akihironitta
Contributor

🚀 Feature

Create a workflow that collects and merges pytest results from CI runs across different operating systems, accelerators, and software versions.

With this feature implemented, we will be able to see a merged list of all test cases that succeeded, failed, or were skipped across all CI configurations.

Motivation

We have tests running across different operating systems, accelerators, and software versions, but currently each CI run holds only its own results, which makes it almost impossible to monitor which tests are running or skipped across all such configurations.

Because of this lack of observability, we ran into an issue a while ago where none of the Horovod tests had been running for a long period of time in the PL repo.

Pitch

To be explored.
(I guess we could somehow utilise https://github.com/pytest-dev/pytest-reportlog; see the rough sketch below.)
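
Not a definitive design, just a minimal sketch assuming pytest-reportlog: each CI job would run `pytest --report-log=report-<config>.jsonl` and upload that file as an artifact, and a small script could then recover every test's outcome from the JSON-lines output (the file and function names below are hypothetical):

```python
import json

def read_outcomes(path):
    """Map each test nodeid to its outcome from one report-log file."""
    outcomes = {}
    with open(path) as f:
        for line in f:
            entry = json.loads(line)
            # pytest-reportlog writes one JSON object per line; "TestReport"
            # entries carry the nodeid, the phase ("when") and the outcome.
            if entry.get("$report_type") != "TestReport":
                continue
            # Pass/fail is reported in the "call" phase; marker-based skips
            # only produce a "setup" report with outcome "skipped".
            if entry["when"] == "call" or entry["outcome"] == "skipped":
                outcomes[entry["nodeid"]] = entry["outcome"]
    return outcomes

# e.g. produced by: pytest --report-log=report-linux-cpu-py3.10.jsonl
print(read_outcomes("report-linux-cpu-py3.10.jsonl"))
```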

Alternatives

To be explored.

Additional context

Codecov automatically merges coverage results uploaded from different CI runs: https://app.codecov.io/gh/Lightning-AI/lightning/
AFAIK, the coverage results don't hold any per-test pytest outcomes, so we need to find another way to collect each test case's status from the different CI settings.
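
As a rough illustration of what such "another way" could look like (all paths and names are hypothetical, assuming each CI configuration uploads a pytest-reportlog file named `report-<config>.jsonl` and the artifacts are downloaded into one directory), the per-configuration outcomes could be merged into a single matrix and tests that are skipped everywhere flagged:

```python
import json
from collections import defaultdict
from pathlib import Path

def merge_reports(directory):
    """Build {nodeid: {config: outcome}} from the downloaded report logs."""
    matrix = defaultdict(dict)
    for path in Path(directory).glob("report-*.jsonl"):
        config = path.stem.removeprefix("report-")
        with open(path) as f:
            for line in f:
                entry = json.loads(line)
                if entry.get("$report_type") != "TestReport":
                    continue
                if entry["when"] == "call" or entry["outcome"] == "skipped":
                    matrix[entry["nodeid"]][config] = entry["outcome"]
    return matrix

matrix = merge_reports("artifacts")  # hypothetical directory of CI artifacts
never_run = [
    nodeid for nodeid, results in matrix.items()
    if all(outcome == "skipped" for outcome in results.values())
]
print(f"{len(never_run)} tests are skipped in every CI configuration")
```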

Open for any suggestions 💜

@akihironitta added the enhancement (New feature or request) and ci/cd (Continuous integration and delivery) labels on Jul 15, 2022
@akihironitta self-assigned this on Jul 15, 2022
@akihironitta
Contributor Author

This will enable us to easily see whether tests skipped in a certain CI run (e.g. GPU CI) are covered by other CI configurations. Context: Lightning-AI/pytorch-lightning#13651

@akihironitta
Contributor Author

Here's another case where sklearn-related tests have been skipped silently: Lightning-AI/pytorch-lightning#15311

I'll check and modify akihironitta/playground#8 again to see if it's reliable enough to use in our CI.

@Borda added the help wanted (Extra attention is needed) label on Mar 27, 2024