Option to write out all individual test case results to json output file #365
Comments
Can you provide an example of the JSON you would like to add to the output? Why do you want to upload it to the Checks API (per airbytehq/airbyte#18004)? You could upload the JSON file as an artifact and retrieve it from there. Or is it that checks are easier to retrieve for a commit than artifacts? I can see why this information is valuable, but I'd prefer to add it to the file only, since that content can become massive.
Sorry if it wasn't clear from the wording of the issue. The ask is to add the test case results to the output file only when an optional parameter is specified, not to modify what gets uploaded to the Checks API. 100% agree that this info is very often way too big to fit in a lot of use cases; in my use case the output of the "cases" object easily exceeded 1 million bytes (both 'summary' and 'text' in the Check Runs API have a limit of ~65k bytes). I think the "cases" object as it currently exists would probably be more than sufficient if it were added to the output JSON file with little to no modification.
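For context, a minimal sketch of the size check behind that reasoning (65,535 bytes is the documented limit for a check run's `output.summary` and `output.text`; the helper name here is hypothetical, not part of the action):

```python
import json

# Documented GitHub Check Runs API limit for output.summary / output.text.
CHECKS_TEXT_LIMIT = 65535


def fits_in_checks_api(cases: dict) -> bool:
    """Return True if the serialized cases object fits within the Checks API text limit."""
    return len(json.dumps(cases).encode("utf-8")) <= CHECKS_TEXT_LIMIT
```

A cases object serializing to ~1 million bytes is an order of magnitude past that limit, which is why writing the full per-case results only to the output file makes sense.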
PR with the aforementioned approach: #366
Then we are on the same page. Sounds good to add this to the JSON file!
@cpdeethree Btw., can you please remove your forked action from the GitHub Marketplace, or clearly state that it is a fork: https://github.com/marketplace/actions/publish-test-results-with-json-text |
@EnricoMi delisted!
Appreciated!
I am quite interested in individual test case results for passed, failed, and skipped tests (vs. just failed). It would be nice to somehow expose the option to access individual test case results in the output JSON (the contents of this object):
https://github.com/EnricoMi/publish-unit-test-result-action/blob/master/python/publish/publisher.py#L296
I'm happy to submit a PR that, for instance, adds a cleaned-up version of that object to the output JSON when an optional parameter is supplied to the action, if that makes sense.
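As a rough sketch of that idea (the flag name, the helper, and the shape of the cases object are assumptions for illustration, not the action's actual API):

```python
import json
from typing import Optional


def write_results_file(path: str, summary: dict, cases: Optional[list],
                       include_cases: bool) -> None:
    """Write the summary JSON, optionally embedding per-test-case results.

    `include_cases` stands in for a hypothetical optional action input;
    only when it is set do the (potentially huge) individual
    pass/fail/skip results land in the output file.
    """
    output = dict(summary)
    if include_cases and cases is not None:
        output["cases"] = cases
    with open(path, "w", encoding="utf-8") as f:
        json.dump(output, f, indent=2)
```

The point of gating on the flag is that the default output stays small, while opting in gives consumers the full per-case detail without touching what gets sent to the Checks API.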