
[Conformance] Allow for reporting results before fully achieving conformance #3036

Open
robscott opened this issue Apr 29, 2024 · 3 comments


@robscott (Member)

What would you like to be added:
A way to display "in progress" or "partial" conformance.

Why this is needed:
In #3021 and #3025 we ran into situations where an implementation was not able to support a core feature in a conformance profile. Even though that means the implementation is not yet conformant, it is likely still helpful for the broader ecosystem if implementations can report their current status, especially given the upcoming work to display results in #2874. This could also help other implementations report their current status even if they haven't reached 100% support of core features yet.

Note: This is meant to be the start of a discussion; it is not ready to be worked on yet.

@youngnick (Contributor)

I think it might be helpful, for this and other unforeseen problems with conformance reports, to have a spot in the conformance suite and the resulting output that allows the suite to say "there are issues with this report that cause it not to be conformant, for some reason not covered by the suite".

I think that, in general, we should encourage implementations to submit partial conformance results, or other results that say "Our implementation is working on conformance but it's not done yet". This allows for:

  • the Gateway API community to know who's working on implementations, whenever the implementation is ready to tell us
  • implementors to show clear progress toward conformance to their business

Using an override feature like this would require:

  • agreement between Gateway API maintainers and implementation maintainers
  • clarity on what this means (currently, it will mean that your implementation won't show up in the upcoming comparison matrix; more importantly, it will unblock implementations that do us the service of finding edge cases in our conformance testing, letting them get something in to record their efforts)

In terms of more concrete changes, we would need to add (a rough sketch follows the list):

  • a stanza to record that the report does not pass, even though the tests would indicate it should
  • a reason field to indicate why
  • a link to a GitHub issue that tracks what we are doing about this (so the implementation doesn't end up stuck in limbo)
  • an expiry date for the status (so that we all have a deadline).
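
A rough YAML sketch of what such a stanza might look like; every field name below is hypothetical, and none of them exist in the ConformanceReport API today:

```yaml
# Hypothetical stanza, for discussion only; none of these fields
# exist in the ConformanceReport API today.
resultOverride:
  result: failure    # the report should be treated as non-conformant
  reason: "reproduction instructions rely on annotations"
  # GitHub issue tracking what we are doing about it:
  issue: "https://github.com/kubernetes-sigs/gateway-api/issues/<tracking-issue>"
  expires: "2025-01-31"    # example date; the override must be resolved by then
```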

Again, just a quick proposal for discussion, not intended to be binding.

@mlavacca (Member)

After discussing #3021 and #3025, I agree that it makes sense to allow implementations to submit partial reports, for all the good reasons @youngnick listed as well.

When it comes to the report API, the suite already sets the partial result automatically when some tests don't pass:

```yaml
profiles:
- core:
    result: partial
    statistics:
      Passed: 28
      Skipped: 1
      Failed: 0
  # ...
```

If we need to include some information on the profile to allow implementations to say "I am not supporting this or that because the suite lacks functionality that I need", we already have the summary field:

```go
// Summary is a human-readable message intended for end-users to understand
// the overall status at a glance.
Summary string `json:"summary"`
```

That field is currently filled in with an automatic message produced by the suite, but it could be leveraged by the implementation to state why the report is not complete, and even to link GitHub issues.
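
For illustration, a hand-written summary on a partial report could look like the sketch below; the message text and the issue URL are made up:

```yaml
profiles:
- core:
    result: partial
    statistics:
      Passed: 28
      Skipped: 1
      Failed: 0
  # Hypothetical hand-written summary replacing the suite's automatic message:
  summary: "1 test skipped: the suite cannot exercise a feature we rely on;
    tracked in https://github.com/kubernetes-sigs/gateway-api/issues/<tracking-issue>"
```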

As for allowing implementations to submit partial results, what's currently missing is the explicit allowance to do so in our docs, as we currently say that partial reports are not allowed:

> - Test result: in order to have a report valid to be accepted, all the profiles
>   need to have the `result` fields (core and extended) set to `success`. It means
>   that all the core conformance tests have been successfully run as well as all
>   the tests related to the supported extended features. No reports with partial
>   or failing results can be accepted.

One thing I think we should pay attention to: we should require implementations to always document the current status of their support level in their README. If there are any workarounds the project needs because of implementation or suite limitations, they should be listed there, along with the related issues.

@youngnick (Contributor)

What I meant was that, when there's something the suite doesn't catch that renders the result incomplete (like the use of annotations in #3021, which means the reproduction instructions are not runnable), there needs to be a way in the conformance YAML for implementation maintainers to say "although we're passing conformance, something else is wrong, and the report doesn't count". I don't think this is the last time this will happen.
