
Enable testing with py311 #51

Merged
merged 3 commits into from Jul 16, 2022

Conversation

ssbarnea
Member

No description provided.

@BeyondEvil
Contributor

Any particular reason this is in draft?

@ssbarnea
Member Author

Because I did not want to bother you with it until I get it to pass.

@BeyondEvil
Contributor

Oh, I didn’t even notice it wasn’t passing. 😂

@ssbarnea ssbarnea marked this pull request as ready for review July 15, 2022 16:03
@ssbarnea
Member Author

You need to be careful with jobs altering GHA, because it was failing to load the workflow and only the pre-commit.ci action ran. Because this repo is not configured to require any of the jobs from this pipeline to pass, it reported green... with only linting passing.

I know how to fix that nicely too, with a summarizing job (we call it check), and marking only that one as required in the repository config, as you do not want to edit branch protection every time a job name is changed. Example: https://github.com/ansible/ansible-lint/blob/main/.github/workflows/tox.yml#L255-L280
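A rough sketch of that summarizing-job pattern (job names here are illustrative, and the alls-green action is one common way to turn the `needs` results into a single pass/fail gate; see the linked ansible-lint workflow for a real instance):

```yaml
# Gate job: mark only "check" as required in branch protection, so
# renaming or adding matrix jobs never requires touching the settings.
check:
  if: always()            # run even when needed jobs fail or are skipped
  needs:                  # illustrative job names
    - lint
    - test
  runs-on: ubuntu-latest
  steps:
    - name: Decide whether the needed jobs succeeded or failed
      uses: re-actors/alls-green@release/v1
      with:
        jobs: ${{ toJSON(needs) }}
```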

@BeyondEvil
Contributor

You need to be careful with jobs altering GHA, because it was failing to load the workflow and only the pre-commit.ci action ran. Because this repo is not configured to require any of the jobs from this pipeline to pass, it reported green... with only linting passing.

I know how to fix that nicely too, with a summarizing job (we call it check), and marking only that one as required in the repository config, as you do not want to edit branch protection every time a job name is changed. Example: https://github.com/ansible/ansible-lint/blob/main/.github/workflows/tox.yml#L255-L280

Good point!

Yeah, that's a nice solution!

@ssbarnea
Member Author

@BeyondEvil that is ready AFAIK. BTW, if you need help with maintenance chores on this one or other pytest-dev projects, let me know; I would not mind helping.

Maybe we can start using some reusable pipeline across some projects, thus minimizing the amount of maintenance we have to do on GHA workflows. I already know that it should be possible to make even the list of Python versions relatively dynamic, so we could, for example, add a new Python version and have it run on all projects using that reusable pipeline.

I will first try to implement this for my team, but if this sounds appealing, maybe we can minimize the amount of maintenance needed to keep CI/CD pipelines working on pytest plugins in general. In fact, I have a similar use case: I need to nurture 15+ molecule plugins, and I got bored of making the same boring changes on all of them.
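A hedged sketch of the reusable-pipeline idea, with a dynamic Python version list (the repository name, input name, and defaults below are all hypothetical):

```yaml
# In the shared repo (hypothetical "some-org/shared-workflows"):
# declare the workflow callable and accept the version list as input.
on:
  workflow_call:
    inputs:
      python-versions:
        type: string                    # a JSON list, parsed with fromJSON
        default: '["3.9", "3.10", "3.11"]'
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python: ${{ fromJSON(inputs.python-versions) }}
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python }}
---
# In each plugin repo: delegate CI to the shared workflow.
jobs:
  ci:
    uses: some-org/shared-workflows/.github/workflows/tox.yml@main
    with:
      python-versions: '["3.10", "3.11"]'
```

Adding a new Python version would then be a one-line change in the shared repo (or in each caller's input), rather than an edit in every plugin.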

@BeyondEvil
Contributor

@BeyondEvil that is ready AFAIK. BTW, if you need help with maintenance chores on this one or other pytest-dev projects, let me know; I would not mind helping.

I maintain a couple of plugins:

  • variables
  • metadata
  • base-url
  • selenium
  • html

The maintenance burden on the first three is very low.

The fourth one, selenium, is I would say medium.

The fifth and last one, html, has a very high maintenance burden.

So if there's any of them I would very much appreciate your help with, it's that one.

In fact, for the past year myself and a frontend friend of mine have been working on a complete rebuild of that plugin. For multiple reasons:

  • Get rid of the deprecated py dependency.
  • Move most if not all frontend stuff to JS, where it belongs, keeping Python as a "pure" backend.
  • Lower the maintenance burden.

There's a branch called next-gen where this work is taking place. HOWEVER, that branch is not up to date, as most of the changes are on a branch on my fork.

I plan to merge that during this weekend, I just have to build up the energy for all the conflicts. 😅

Once that is merged and cleaned up, we're going to be at about a 90% feature parity with the state of v3 (legacy).

Hopefully, this rebuild will also solve a lot of reported issues and the idea is that it will be easier to add new features.

I will let you know once the merge is done and at that point I would very much appreciate you reviewing the code. You can do it now if you want, just keep in mind that it's not really ready for review yet.

Maybe we can start using some reusable pipeline across some projects, thus minimizing the amount of maintenance we have to do on GHA workflows. I already know that it should be possible to make even the list of Python versions relatively dynamic, so we could, for example, add a new Python version and have it run on all projects using that reusable pipeline.

While that sounds awesome, depending on the effort, it might not be worth it. But if you implement it, I won't stop you. 😅

However, something I would love for all of my plugins is basically a state where once a PR is merged, it is automatically released.

I use https://github.com/semantic-release/semantic-release professionally, and it works really well. You only need a little bit of enforcement of commit messages, and the rest is handled by semantic-release.
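For the merge-means-release setup described above, a minimal semantic-release configuration could look roughly like this (a sketch, not this repo's actual setup; the branch name is illustrative, and the three plugins listed are part of semantic-release's defaults):

```yaml
# Hypothetical minimal .releaserc.yaml; semantic-release also accepts
# JSON or a "release" key in package.json.
branches:
  - master
plugins:
  - "@semantic-release/commit-analyzer"          # commit type -> release type
  - "@semantic-release/release-notes-generator"  # builds the changelog text
  - "@semantic-release/github"                   # publishes the GitHub release
```

Run from CI on every merge, it inspects the commit messages since the last release, decides whether a patch/minor/major release is due, and tags and publishes it automatically.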

I will first try to implement this for my team, but if this sounds appealing, maybe we can minimize the amount of maintenance needed to keep CI/CD pipelines working on pytest plugins in general. In fact, I have a similar use case: I need to nurture 15+ molecule plugins, and I got bored of making the same boring changes on all of them.

Sounds good! 👍

Please review this: pytest-dev/pytest-html#522

@ssbarnea

@@ -11,12 +11,14 @@ on:

jobs:
  test:
-    runs-on: ${{ matrix.os }}
+    runs-on: ${{ matrix.os || 'ubuntu-latest' }}
Contributor

Could you add a comment explaining why this is necessary?

Member Author

In theory the fallback value should not be needed, as we already have all values defined for the "os" dimension, but to my surprise, without it GHA failed to load the workflow, complaining about the empty value inside runs-on. Adding a default sorts out this issue. It is not really used by any job in the matrix, but if you forget to add os on one "include", it will make use of it.
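The forget-to-add-os case described above can be sketched like this (a hypothetical matrix, not this PR's actual one; the standalone "include" entry creates a new combination with no "os" key, so the `||` default kicks in):

```yaml
# Hypothetical matrix: the "include" entry below has no "os" key,
# so runs-on falls back to ubuntu-latest via the || default.
test:
  runs-on: ${{ matrix.os || 'ubuntu-latest' }}
  strategy:
    matrix:
      os: [ubuntu-latest, windows-latest]
      python-version: ["3.10", "3.11"]
      include:
        - python-version: "3.12-dev"   # new combination, no "os" defined
          experimental: true
```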

Comment on lines 19 to 21
os:
- ubuntu-latest
- windows-latest
Contributor

I think reverting this change is better than having to add:

runs-on: ${{ matrix.os || 'ubuntu-latest' }}

Member Author

That is the second place where I raised my brows while testing the pipeline: GHA complained about os at this line, and making it an explicit list fixed the issue. It is a bit perplexing, because I know that part of the YAML specification well enough; both representations are valid and should load the same. Still, one failed and the other did not.

You probably know that GitHub does not use a standard YAML loading library; they have their own implementation, one that does not even support anchors. I would not be surprised if this is another issue related to that.

PS: || works even for undefined values; if the value on the left side is not defined, it will pick the second one.

You are free to try other variants, including modifying this PR.
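For reference, the two representations under discussion, which the YAML specification treats as identical (the thread does not show the failing variant, so the flow-style form below is an assumption about what the earlier revision looked like):

```yaml
# Flow style sequence (assumed earlier form):
os: [ubuntu-latest, windows-latest]
---
# Block style sequence (the explicit list that GHA accepted in this PR):
os:
  - ubuntu-latest
  - windows-latest
```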

Co-authored-by: Jim Brännlund <jimbrannlund@fastmail.com>
@ssbarnea
Member Author

I will help with pytest-html: it is the only one from that list that I already use; in fact, fixing pytest-metadata only happened because the html one uses it. I am already core on html, so we should be able to move faster with changes. I already knew about "py" and a few other historical tech debts, but never had time to tackle them; that was not the number-one priority on the list of libraries I (try to) nurture. Still, I am quite interested in the reporting part of pytest, including GHA integration, combining results, and displaying them. At some point I was wondering if we could make a JS app that loads the ZIPs that the GHA archive creates and displays the reports in the browser, so the user does not need to download reports themselves. Setting up a web host is problematic for many projects, as nobody provides S3-like storage for free, and I am almost sure that GitHub will not enable anything more than ZIPs for archives. That is by design (cost and security).

@BeyondEvil BeyondEvil self-requested a review July 16, 2022 16:08
Contributor

@BeyondEvil BeyondEvil left a comment

Explicit list seemed to work! 👍

@BeyondEvil BeyondEvil merged commit a29a465 into pytest-dev:master Jul 16, 2022