Before branching day: integration tests #3227
Conversation
[APPROVALNOTIFIER] This PR is APPROVED. This pull request has been approved by: danilo-gemoli. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files.
Approvers can indicate their approval by writing /approve in a comment.
LGTM, but I am not an expert in the field of integration tests in our repo. I am also not sure if we really need them for such tiny tools.
/test e2e
@danilo-gemoli Can you elaborate more on why we need all of those? We need the actual and expected data to already exist instead of generating temporary ones.
Sure. I've added three integration tests, along with their scripts. Each script sets up the whole test, and each test has its test cases located in a dedicated folder. What follows is an example of execution.
Actual and expected data are already there; nothing is generated on the fly.
@danilo-gemoli The only temporary files we should generate are the actual data. The test should have input and expected data, and a script to compare the expected with the actual data.
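A minimal sketch of that shape, assuming a hypothetical entry point (`generate_output` stands in for the real tool, and none of these paths are the PR's actual ones): input and expected data are checked in, and the only temporary files are the generated actual data.

```shell
#!/usr/bin/env bash
# Sketch only: "generate_output" is a placeholder for the real tool.
set -euo pipefail

generate_output() {  # a real tool would transform the input here
  mkdir -p "$2" && cp "$1"/*.yaml "$2"/
}

input="$(mktemp -d)"     # in the real test these live in the repo
expected="$(mktemp -d)"  # in the real test these live in the repo
actual="$(mktemp -d)"    # temporary: the generated actual data

echo 'releases: {}' > "$input/config.yaml"
echo 'releases: {}' > "$expected/config.yaml"

generate_output "$input" "$actual"
# a non-empty diff between expected and actual fails the test
diff -Naupr "$expected" "$actual" && echo PASS
```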
Folder structure (like
danilo-gemoli force-pushed from ac02382 to 005d338
@droslean I have changed it a little bit in accordance with your advice. Not even the folder structure is generated on the fly anymore. Thanks!
@@ -0,0 +1,37 @@
releases:
There is no need to separate this file into a folder named test1. We need to have all the input files in one place, like we have them in the release repo. To extend to different cases, you need to add more configs that include those cases. The tool should run only once and generate the desired output.
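The single-run shape being suggested might look roughly like this (all names are hypothetical; `run_tool_once` stands in for the real tool): every input config sits in one place, the tool runs exactly once, and it produces one output tree to compare.

```shell
# Hypothetical single-run shape: all input configs live in one place,
# the tool runs exactly once and produces one output tree to compare.
tmp="$(mktemp -d)"
mkdir -p "$tmp/inputs" "$tmp/out"
printf 'releases: {}\n' > "$tmp/inputs/case-a.yaml"
printf 'releases: {candidate: true}\n' > "$tmp/inputs/case-b.yaml"

run_tool_once() { cat "$1"/*.yaml > "$2"/merged.yaml; }  # stand-in for the real tool
run_tool_once "$tmp/inputs" "$tmp/out"

# both cases are covered by the one invocation
count="$(grep -c '^releases' "$tmp/out/merged.yaml")"
echo "covered $count cases in a single run"
```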
These test cases are much like the ones you can find in the cluster-init integration tests here: test/integration/cluster-init. I can adhere to the same naming conventions (input and expected instead of data and want) and rename test1, test2, ... testN to something more meaningful, but other than that, both have test cases split into several folders, each of which recreates part of the openshift/release repository folder structure.
If you take a look at test/integration/cluster-init.sh you'll notice that the cluster-init tool runs more than once.
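The per-test layout being described can be sketched like this; the test1, data, and want names come from the discussion above, while config.yaml is just a placeholder file:

```shell
# Sketch of the layout under discussion: each testN folder holds its
# inputs under data/ and its expected output under want/, recreating
# part of the openshift/release folder structure.
root="$(mktemp -d)"
mkdir -p "$root/test1/data" "$root/test1/want"
touch "$root/test1/data/config.yaml" "$root/test1/want/config.yaml"
(cd "$root" && find . -type f | sort)
# prints:
#   ./test1/data/config.yaml
#   ./test1/want/config.yaml
```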
danilo-gemoli force-pushed from 005d338 to a2d8835
@danilo-gemoli: The following test failed, say /retest to rerun all failed tests:
Full PR test history. Your PR dashboard. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository. I understand the commands that are listed here.
Issues go stale after 90d of inactivity. Mark the issue as fresh by commenting /remove-lifecycle stale. If this issue is safe to close now please do so with /close. /lifecycle stale
Stale issues rot after 30d of inactivity. Mark the issue as fresh by commenting /remove-lifecycle rotten. If this issue is safe to close now please do so with /close. /lifecycle rotten
Rotten issues close after 30d of inactivity. Reopen the issue by commenting /reopen. /close
@openshift-bot: Closed this PR. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
These integration tests cover the tools developed in:
Run them locally with:
/cc @jmguzik @droslean