
upload json-formatted coverage reports #518

Closed
mattip wants to merge 3 commits

Conversation

mattip commented Jun 14, 2022

As per this comment, add an upload of JSON-formatted coverage test reports so that this tool, https://carreau.github.io/pytest-json-report-viewer/, can be used to analyze the time spent in tests.
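
The actual workflow change isn't visible in this thread, but conceptually it boils down to two steps: run pytest with the pytest-json-report plugin enabled and upload the resulting JSON file as a build artifact. The sketch below is illustrative only; the step names, report filename, and artifact name are assumptions, not taken from the PR.

```yaml
# Illustrative sketch only -- step names, paths, and the artifact name are assumptions.
- name: Run tests with a JSON test report
  run: |
    pip install pytest-json-report
    pytest --json-report --json-report-file=report.json
- name: Upload the JSON test report
  uses: actions/upload-artifact@v3
  with:
    name: json-report
    path: report.json
```

The uploaded report.json can then be downloaded from the workflow run and loaded into the viewer linked above.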

mattip commented Jun 16, 2022

The PR workflow needs approval to run

BeyondEvil (Contributor) commented:

Sorry about that, I hadn't gotten a notification for this. 🙇

mattip commented Jul 2, 2022

It seems something is off with all the Windows runs. Any ideas?

BeyondEvil (Contributor) commented:

Unfortunately, no. Searching for the error just brings up old issues that don't seem relevant.

And I'm not set up on any Windows machines to try and debug. :(

BeyondEvil (Contributor) commented:

Hey @mattip

Sorry for the delay, but I finally got the Windows tests to behave: #522

As soon as that's merged, I would really appreciate it if you would resume looking into this. 🙇

mattip commented Jul 17, 2022

Rebased off master to pull in the fixes for Windows. The CI run needs approval to start.

BeyondEvil (Contributor) commented:

Note that the Windows pypy-38 job is deactivated, in case you want that to be part of the report.

mattip commented Jul 17, 2022

I needed to add --durations=0 to actually get the test durations output. Duh. In any case, this seems to work: downloading the artifact, unzipping it, and uploading the files to https://carreau.github.io/pytest-json-report-viewer/ is starting to show something.
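
In workflow terms, that amounts to adding --durations=0 to the pytest invocation alongside the JSON-report flags, roughly as in this sketch (again illustrative, not the actual diff):

```yaml
# Illustrative sketch only; the flags come from pytest and pytest-json-report,
# everything else (step name, report path) is assumed.
- name: Run tests with a JSON test report
  run: pytest --durations=0 --json-report --json-report-file=report.json
```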

mattip commented Jul 17, 2022

Could you trigger the CI when you get a chance?

mattip commented Jul 18, 2022

This is working; there was an issue in the viewer, which is on its way to being fixed. I am not sure yet of the added value over looking at the durations report, though (here for py37mac and here for pypy37mac). For instance, testing/test_pytest_html.py::TestHTML::test_collect_error (why are there two?) clocks in at 37 seconds on PyPy and 0.45 seconds on CPython.

BeyondEvil (Contributor) commented:

> For instance, testing/test_pytest_html.py::TestHTML::test_collect_error (why are there two?) clocks in at 37 seconds on PyPy and 0.45 seconds on CPython.

There are several tests that are reported twice.

I wonder if it's related to: #508

Regardless, I have no clue why that happens.

The most interesting question is: does the test really run twice, or is it just reported twice? 🤔

BeyondEvil (Contributor) commented:

Care to revisit this, @mattip?

If not, feel free to close. 🙇

mattip commented Nov 4, 2023

I will continue this at some point in the weekly CI runs of PyPy + pytest-html, so I will close this. Thanks.
