
[FEATURE] Rework checks API #1689

Open
Stranger6667 opened this issue Feb 12, 2023 · 9 comments
Assignees
Labels
Core: Checks (What issues Schemathesis can find) · Difficulty: Hard (Complex, needs deep understanding) · Priority: Medium (Planned for regular releases) · Status: Needs Design (Issue requires more design work) · Type: Feature (New functionalities or enhancements)

Comments

@Stranger6667
Member

Stranger6667 commented Feb 12, 2023

There have been some ideas on how the checks API could look in order to let users build much more flexible checks.

The first change is to introduce a way to run checks not only in the scope of a single case & response but also in the scope of all cases & responses.

Then add a context argument, as it exists in hooks, targets, etc.

And filters, so checks could be applied only to specific endpoints. It could also be possible to filter by response status or other properties.

Based on this comment:

@schemathesis.checks.response
def my_response_check(context, case, response):
    # Run on every case-response pair
    ...


@schemathesis.checks.operation
def my_operation_check(context, cases, responses):
    # Run once after all test cases for a specific operation have been executed
    ...


@schemathesis.checks.suite
def my_suite_check(context, cases, responses):
    ...



@schemathesis.checks.response.apply_to(method="POST", path="/users/", status_code=[201, 400])
def custom_check(context, case, response):
    ...
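A minimal, self-contained sketch of how such a filterable check registry could work. All names here (`response_check`, `CheckEntry`, `run_checks`) are hypothetical illustrations of the proposed design, not the actual Schemathesis implementation, and cases/responses are plain dicts for brevity:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class CheckEntry:
    """A registered check plus the filters that restrict where it runs."""

    func: Callable
    method: Optional[str] = None
    path: Optional[str] = None
    status_codes: Optional[List[int]] = None

    def matches(self, method: str, path: str, status_code: int) -> bool:
        # A check runs only when every configured filter matches;
        # a filter left as None matches everything.
        if self.method is not None and self.method != method:
            return False
        if self.path is not None and self.path != path:
            return False
        if self.status_codes is not None and status_code not in self.status_codes:
            return False
        return True


_RESPONSE_CHECKS: List[CheckEntry] = []


def response_check(method=None, path=None, status_code=None):
    """Register a case/response check, optionally restricted by filters."""

    def decorator(func):
        codes = [status_code] if isinstance(status_code, int) else status_code
        _RESPONSE_CHECKS.append(CheckEntry(func, method, path, codes))
        return func

    return decorator


def run_checks(context, case, response):
    """Run every registered check whose filters match the given pair."""
    for entry in _RESPONSE_CHECKS:
        if entry.matches(case["method"], case["path"], response["status_code"]):
            entry.func(context, case, response)
```

With this shape, `@schemathesis.checks.response.apply_to(...)` from the proposal above would simply populate the filter fields of the registry entry, and the runner would skip non-matching case/response pairs.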
@Stranger6667 Stranger6667 added Status: Needs Triage Requires initial assessment to categorize and prioritize Type: Feature New functionalities or enhancements labels Feb 12, 2023
@Stranger6667 Stranger6667 added this to the Schemathesis 4.0 milestone Feb 12, 2023
@Stranger6667 Stranger6667 self-assigned this Feb 12, 2023
@Stranger6667 Stranger6667 modified the milestones: 4.0, 3.22 Oct 12, 2023
@Stranger6667 Stranger6667 added Priority: Medium Planned for regular releases Difficulty: Hard Complex, needs deep understanding Core: Checks What issues Schemathesis can find Status: Needs Design Issue requires more design work and removed Status: Needs Triage Requires initial assessment to categorize and prioritize labels Oct 12, 2023
@IvanRibakov

Hi, I've arrived here from the #1147 thread, as we're currently experiencing the same false-positive validation results when migrating from Dredd to Schemathesis. We're using Schemathesis only to validate the schema of the responses, and as such CLI-based usage is more than enough for us. Is the new Checks API going to be usable/configurable via the CLI, or perhaps via some custom OpenAPI schema tags?

@Stranger6667
Member Author

Hi! I was thinking about exposing some checks in the contrib subpackage so they are available from the CLI. Additionally, I want to make some current checks configurable (e.g. status codes to ignore). Could you briefly describe what kind of problem you’d like to solve with configurable Schemathesis checks?

@IvanRibakov

Could you briefly describe what kind of problem you’d like to solve with configurable Schemathesis checks?

As mentioned, we're using Schemathesis to validate response schema correctness based on the OpenAPI document. In the OpenAPI document we include examples that generally illustrate the "happy path" (2XX) responses, but we also document some of the possible failure response codes. We are not interested in triggering those failure codes - they are there purely for documentation purposes. Assuming we can achieve that somehow, we would want to treat every non-2XX response as a test failure (even though it may be one of the documented response codes for the endpoint).
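For illustration, the behaviour described above could be approximated today with a custom check that fails on any non-2XX response. This is a hedged sketch: `no_error_responses` is a hypothetical name, the check body is plain Python, and a stand-in response object is used here instead of a live Schemathesis run (in a real hooks module the function would be decorated with `@schemathesis.check`):

```python
from types import SimpleNamespace


def no_error_responses(response, case) -> None:
    # Treat every non-2XX status as a test failure, even if that
    # status code is documented in the OpenAPI document for this
    # operation. Checks signal failure by raising AssertionError.
    if not 200 <= response.status_code < 300:
        raise AssertionError(
            f"Unexpected non-2XX response: {response.status_code}"
        )


# Quick demonstration with stand-in response objects:
no_error_responses(SimpleNamespace(status_code=204), case=None)  # passes silently
```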

@Stranger6667
Member Author

Thank you! I will take it into account :) I’ll likely work on it in January

@IvanRibakov

Also, it seems that currently checks (@schemathesis.check) do not run on dependent operation tests (those mentioned in the OpenAPI links: section). It would be great if the new Checks API did not discriminate against dependent operation tests.

@Stranger6667
Member Author

Oh, these should be passed explicitly at the moment, but I think it should not be an issue to use all registered checks there if no explicit set of checks is provided. I’d say it is an oversight; I’ll take a look at it during this week

@Stranger6667
Member Author

@IvanRibakov Can you please show how you discovered this behavior? I cannot reproduce it with the CLI or pytest integrations for stateful testing:

hooks.py

import schemathesis

SHOWN = False


@schemathesis.check
def my_check(response, case) -> None:
    global SHOWN
    if not SHOWN:
        SHOWN = True
        print("CUSTOM CHECK")

Running against the built-in test app (./test_server.sh 8081 --spec=openapi3)

⇒  SCHEMATHESIS_HOOKS=hooks st run http://127.0.0.1:8081/schema.yaml -c all -E users
====================================================================================== Schemathesis test session starts ======================================================================================
Schema location: http://127.0.0.1:8081/schema.yaml
Base URL: http://127.0.0.1:8081/api
Specification version: Open API 3.0.2
Random seed: 24312450841532103631291903625221961034
Workers: 1
Collected API operations: 3
Collected API links: 3

POST /api/users/ CUSTOM CHECK
.                                                                                                                                                                                      [ 33%]
    -> GET /api/users/{user_id} .                                                                                                                                                                       [ 50%]
        -> PATCH /api/users/{user_id} .                                                                                                                                                                 [ 60%]
    -> PATCH /api/users/{user_id} .                                                                                                                                                                     [ 66%]
GET /api/users/{user_id} .                                                                                                                                                                              [ 83%]
PATCH /api/users/{user_id} .                                                                                                                                                                            [100%]

================================================================================================== SUMMARY ===================================================================================================

Performed checks:
    not_a_server_error                              600 / 600 passed          PASSED 
    status_code_conformance                         600 / 600 passed          PASSED 
    content_type_conformance                        600 / 600 passed          PASSED 
    response_headers_conformance                    600 / 600 passed          PASSED 
    response_schema_conformance                     600 / 600 passed          PASSED 
    my_check                                        600 / 600 passed          PASSED 

Tip: Use the `--report` CLI option to visualize test results via Schemathesis.io.
We run additional conformance checks on reports from public repos.

============================================================================================= 6 passed in 4.05s ==============================================================================================

Running pytest against the following file also shows that the check is executed:

import schemathesis


SHOWN = False


@schemathesis.check
def my_check(response, case) -> None:
    global SHOWN
    if not SHOWN:
        SHOWN = True
        print("CUSTOM CHECK")


schema = schemathesis.from_uri("http://0.0.0.0:8081/schema.yaml")

APIWorkflow = schema.as_state_machine()
TestAPI = APIWorkflow.TestCase

@Stranger6667
Member Author

Please disregard the code samples I provided in the previous comment; the check does not really show whether the cases are coming from stateful transitions

@Stranger6667
Copy link
Member Author

Not sure if it was something you've observed, but in the CLI all case instances that get to checks have their source set to None, which would indicate that they are not derived from stateful operations. However, due to the way stateful testing works in the CLI, even derived cases have source=None. My long-term plan is to migrate the CLI to proper state machines that have all these things in place, but I don't have a good idea of how to display stateful testing results in the CLI :(
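The `source` attribute discussed above can be illustrated with a simplified, hypothetical model (not the real Schemathesis `Case` class): a derived case keeps a reference to the case it was built from via an OpenAPI link, while a directly generated case has `source=None`.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Case:
    """Simplified stand-in for a generated test case."""

    operation: str
    # The case this one was derived from via an OpenAPI link;
    # None means the case was generated directly from the schema.
    source: Optional["Case"] = None


def is_derived(case: Case) -> bool:
    # The behaviour described in the comment above: in the CLI,
    # this currently returns False even for derived cases,
    # because their source is not populated.
    return case.source is not None
```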

@Stranger6667 Stranger6667 removed this from the 3.24 milestone Jan 21, 2024