Coverage report in coveralls seems incorrect #1736
One more note: in my project, I have had other developers raise the same issue. This happens in TypeScript with karma-coverage as well, so it is not a pytest-specific issue. Other info that may be helpful:
Hi, @AbdealiLoKo. Thanks for the detailed description and the link. Always helpful. 🙏 One more thing that would help, though, is a link to the source file that's showing this branch coverage on or around Lines 516-518. Can you please share that so I can examine it more closely? I'll do my best to find it in the meantime. In general, though, your observation seems totally accurate:
My initial thoughts / questions:
To answer your follow-up comments / questions:
Yeah, that's no good. Please share any additional instances that your colleagues have shared and I'll check them like this one.
Good to know. I will be interested in examining both Coveralls coverage reports (JSON) and, perhaps, compare them to the original reports in their original formats if it comes to that.
That's perfectly fine, but in case you weren't aware, our official integrations have received major upgrades over the past year, and our Universal Coverage Reporter now underlies them. Generally speaking, if you can use an official integration, there are many benefits to doing so.
Perfect. Try starting with the standard usage example, which is a two (2) line configuration in default mode, or the parallel build usage example if you're running parallel builds.
Then you probably are running parallel builds. Shouldn't be an issue. Let me know if you need any help updating your Coveralls integration(s). The other benefit of using the official integration, BTW, as applies in this case with your use of the Coveralls GitHub Action, is that you can use one integration for multiple languages / report formats and thereby standardize (not to mention simplify) your Coveralls integrations.
Thanks for the reply, @afinetooth. Answers to the initial questions:
Some replies to the other comments:
Might need to be updated for new users
I was doubtful that I could create an MRE. Original issue - Minimal Reproducible Example:
I tried removing the
I don't know if I can download the coverage files from Coveralls ... Update: I tried running my tests on a single machine, and my Coveralls coverage increased from 86% to 89% without any change in source or tests. Here is the new build with all my tests running in one GitHub worker:
This seems to indicate there is some issue in how Coveralls is "merging" the multiple coverage files that are being sent to it.
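To make the merging concern concrete, here is an illustrative sketch (emphatically not Coveralls' actual algorithm) of why combining per-line hit data from parallel runs changes the computed percentage: a line counts as covered in the merged result if any run hit it, so the merged total can legitimately differ from any single worker's total.

```python
# Illustrative sketch only -- NOT Coveralls' actual merge algorithm.
# Two parallel test runs each produce per-line hit counts for the same file.

def merge_hits(run_a, run_b):
    """Combine per-line hit counts from two runs by summing them."""
    merged = dict(run_a)
    for line, hits in run_b.items():
        merged[line] = merged.get(line, 0) + hits
    return merged

def percent_covered(hits):
    """A line is covered if it was hit at least once."""
    covered = sum(1 for h in hits.values() if h > 0)
    return 100.0 * covered / len(hits)

run_a = {10: 1, 11: 0, 12: 0}   # worker 1 hit line 10 only
run_b = {10: 0, 11: 2, 12: 0}   # worker 2 hit line 11 only

print(percent_covered(run_a))              # one worker alone: ~33.3
print(percent_covered(merge_hits(run_a, run_b)))  # merged: ~66.7
```

If a merge step mishandles either the hit counts or the branch records, the merged percentage can move in either direction, which is consistent with the 86% to 89% shift observed above.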
@AbdealiLoKo thanks for the updates. Apologies for the delay. As we discussed in email, since you have a paid subscription, you can get support by directly emailing support@coveralls.io, or by scheduling a live support call. This board is for open-source users and meant for community response, though we do try to sweep new issues here at least once a week. We have talked about some of these things elsewhere, but let me reply here as well for consistency and to fill any gaps:
Got that and replied. You'll want to use support@coveralls.io going forward. 🙏
Great. Good to know.
Got it. Have forwarded to support@coveralls.io and you will get a reply from there.
Absolutely, and this is another reason it's best to use support@coveralls.io: paid subscriptions are for private repos, and private repos may be sensitive. So while it's fine to leave a question here, we recommend also following up in email to support@coveralls.io with any other details that could be sensitive.
Excellent. Thanks for clarifying.
Yes, and in your case, the format will be
Yes! Thanks so much for catching that, we'll update that. (v1 only supported
Also wanting to reply in full here to cover any gaps:
First of all, thanks so much for taking the time to reproduce an example. In the future, you can avoid creating public repos just to generate MREs to share in this forum: feel free to exchange original examples from, or MREs created in, private repos in email (support@coveralls.io). But again, thanks for going to the effort here. It's very helpful to see the example free of context. And your insights are spot on. I think this could be related to coverage report merging.
OK, just to clarify:
Now, here's an observation, and a discovery while examining your second example, which I think is a likely factor:
So again, just to clarify:
But we have a difference here per the observation and discovery I mention above:
Three (3) comments on that:
(Again, these examples are very helpful, and interesting, so thanks again.) Update (WIP...)
Right now, that's not something we make available to end users (we are considering it), but we do store them and I can access them as an admin of Coveralls. (As I said, we are considering making those available, but the feature would require careful UX design because of all the different permutations, and, at this point, that work has not begun.) But thank you for storing your original reports as artifacts! I suspect we will need to compare the Coveralls JSON versions to the original versions to see if the originals were misread by the Coverage Reporter, whether or not that entailed a merge operation.
Fascinating. First of all, I did not realize that your two jobs were parallelized test runs. That's correct, though, right? It took me a minute to realize that you are splitting the tests for a single test suite across two manually-defined jobs in your workflow, by using the split conditions:
And that, for this third example, you simply removed the first split condition. (I am more used to seeing parallelization declared as an input option on a single job, which tells CI to split the job's tests in the background.) In any case, this should not be a problem, since Coveralls will receive as many coverage reports as you want to send and will merge them according to your instructions, the most common instruction being to give all reports you want merged the same flag-name. So, another observation and two (2) recommendations on this:
So, in your case, I would recommend keeping your two "parallelized" jobs the same, and just making sure you give each the same flag-name.
Note that I'm simply giving each upload job the same flag-name. Also note that I'm encouraging the removal of
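For reference, a sketch of what a two-job parallel setup might look like with the official Coveralls GitHub Action. The job names, test-split commands, and flag-name value are placeholders (not taken from this thread); the `parallel`, `flag-name`, and `parallel-finished` inputs are the action's documented options for parallel builds.

```yaml
# Hypothetical sketch -- adapt names, commands, and flag-name to your workflow.
jobs:
  test-split-1:
    runs-on: ubuntu-latest
    steps:
      # ... checkout, run your first split of the test suite ...
      - uses: coverallsapp/github-action@v2
        with:
          parallel: true
          flag-name: unit-tests   # placeholder value

  test-split-2:
    runs-on: ubuntu-latest
    steps:
      # ... checkout, run the other split of the test suite ...
      - uses: coverallsapp/github-action@v2
        with:
          parallel: true
          flag-name: unit-tests   # placeholder value

  coveralls-finish:
    needs: [test-split-1, test-split-2]
    runs-on: ubuntu-latest
    steps:
      # Closes the parallel build so Coveralls computes the merged total.
      - uses: coverallsapp/github-action@v2
        with:
          parallel-finished: true
```

The finish job is what tells Coveralls that all parallel uploads for the build have arrived and the merged report can be finalized.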
Just didn't want to leave this unreplied to:
That said, my message above should explain why you are probably seeing this sort of thing and how you can avoid it going forward, which is basically to follow the two (2) best practice recommendations above (Recommendation 1 | Recommendation 2).
@AbdealiLoKo I just wanted to summarize the above, since it represents a lot of reading, which I'm sorry for, but which I hope addresses all the details you shared, in the same spirit in which you shared those super helpful examples. This also aligns with the analysis and recommendations I gave your colleague @indiVar0508, who also posted in this forum here (and previously here), and whom I also met in a live support call about these very same issues (though using different examples).

Summary

I noticed two (2) antipatterns in your Coveralls config / setup and made two (2) best practice recommendations that I think should resolve the issues you're having. That said, your examples did show some unexpected results that, even in the context of the "antipatterns," still seem like they could be bugs. In other words, Coveralls should probably handle the antipattern cases more gracefully, or else raise an exception, so I have reported those issues internally.

Antipatterns
Recommendations
Conclusions

This support request has been very instructive on our end, and your examples have been very helpful in uncovering some unexpected use cases that are, I'm sure, more common than we would have expected. Given that, there are a couple of learnings here for us, as well as a couple of action items:
Thanks! 🙏
Antipattern 1: Understood
Understood. I am basically thinking of any scenario where someone is parallelizing test runs for performance. Probably not common for your use case (or CI), but for those using parallelism, do this for uploads that will happen
I'll close this for now. Just re-open if it becomes relevant again (or start a new issue). 🙏
@AbdealiLoKo I wanted to circle back to this issue, even though it was closed, by me, after making the best practice recommendations above. I have since discovered more about the root cause here (after working on a similar issue) and would like to give you some additional direction to make sure you aren't still seeing issues like this. First, let me revisit the best practice recommendations, which still stand (I've condensed them here):
So, I think that, given those recommendations, you were going to stop converting your
I'd love to know if you're still seeing the issue. Thanks.
I went with Option 1. It did not solve the issue:
I tried Option 2 now, and it seems that with Cobertura the issue does not exist anymore.
Note that even trying with different flag-names did not give the same issue. So, the culprit seems to be with lcov files only (sigh).
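Since suspicion falls on the lcov tracefiles, it may help to recall how lcov encodes branch data: each branch is a `BRDA:<line>,<block>,<branch>,<taken>` record, where `<taken>` is `-` if the branch was never evaluated. A minimal reading sketch (the record values below are hypothetical, not taken from the actual reports):

```python
# Minimal sketch of reading branch records (BRDA lines) from an lcov
# tracefile. Format per the lcov tracefile spec:
#   BRDA:<line>,<block>,<branch>,<taken>
# where <taken> is "-" if the branch was never evaluated.

def parse_brda(record):
    assert record.startswith("BRDA:")
    lineno, block, branch, taken = record[len("BRDA:"):].split(",")
    return {
        "line": int(lineno),
        "block": int(block),
        "branch": int(branch),
        "taken": None if taken == "-" else int(taken),
    }

print(parse_brda("BRDA:18,0,0,1"))  # a branch on line 18, taken once
print(parse_brda("BRDA:18,0,1,-"))  # its sibling branch, never evaluated
```

If the tool that produced the lcov file emitted a BRDA record for a line that is not actually a branch point (such as a bare `raise`), any consumer parsing the file faithfully would report a partial branch there.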
@AbdealiLoKo regarding this attempt:
I see that Coverage Reporter processed two coverage reports, each one covering all files in the project. And each contained branch coverage data; in particular, each marked Line 18 as being a branch. So this suggests the original coverage report(s) contained that branch data.

Maybe we can look into the original report to verify. Are you able to share the original report? (Your config, AFAIK, should have instructed Coverage Reporter to only parse your lcov file.) Did you happen to parallelize your test suite in CI? (Such that we would have received two uploads of data from your test suite?)

Update: I don't recall what your break-up of duties is there, but both seem to be sending coverage data for all files in the project.
Regarding your second attempt:
It's clear you're using the Cobertura format now. And the two (2) coverage reports show no branch coverage for Line 18. So, either there is an issue with the original lcov report(s), or with our parsing of them. I assume the issue is with our parsing. 🙏 🙏
Here are the two reports being generated for myapp1 and myapp2:
I am seeing cases where the coverage in coveralls does not match what I expected.
I have a test (which is passing in my CI) that goes something like this:
This is the coverage report I see in the coveralls UI:
Something was fishy here because a `raise` statement is not a branching statement. So why is Coveralls telling me that some branches were not tested?

So, I ran the test case with `pytest --cov --cov-report=html` on my laptop. This is the coverage report I see in htmlcov:
According to this, that line is fully covered.
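The original test snippet did not survive the page extraction, so here is a hypothetical stand-in for the pattern described: a fully exercised `raise` line, which is a plain statement rather than a branch point, yet was shown on Coveralls as a partially covered branch. All names here are invented for illustration.

```python
# Hypothetical stand-in for the test described above (the original snippet
# was lost in extraction). The `raise` line is a plain statement, not a
# branch; only the `if` above it is a branch point.

def lookup(value):
    if value is None:
        # A bare raise site: there is nothing here to branch on.
        raise ValueError("value is required")
    return value

def test_lookup_raises():
    try:
        lookup(None)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError")

def test_lookup_ok():
    assert lookup("x") == "x"

test_lookup_raises()
test_lookup_ok()
print("ok")  # prints: ok
```

With both tests passing, coverage.py's branch analysis marks the `if` as fully exercised (both outcomes taken) and the `raise` line as an ordinary covered statement, matching the htmlcov result described above.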
If you'd like to check the issue - you can access it at https://coveralls.io/builds/63824265