Replace Azure Pipelines with GitHub Actions #690
Conversation
I forgot to link this. It relates to #689.

I'm not sure why, but GitHub Actions isn't running on the PR. I'm guessing that it's a policy thing in GitHub, since there are no workflows in the main repo. You can see a build on my fork here: https://github.com/sirosen/webargs/actions/runs/1837835283 I'm not sure what else to do in terms of validating the work?

I also am not sure how we want to handle the required check, which seems to be stuck. It may be waiting for Azure Pipelines. Maybe wait to turn that check off until we're ready to merge this?
I hesitated to introduce check-jsonschema, as it is my own work, but I find it useful.
No problem. You can leave it.
I originally tried this with the GitHub reusable workflows feature, but a reusable workflow can't be run under a matrix config.
OK. Never mind, we can duplicate this in all repos.
Regarding the build matrix, I tend to limit the number of builds. I generally do Linux only and only test oldest and most recent Python versions.
I'd call the twine action "release" or a word related to the goal, not the tool.
Turn on ReadTheDocs PR builds (Settings > Advanced > PR builds).
We need a PyPI token from someone with push access.
I don't have the credentials for RTD and PyPI. I believe only @sloria does.
> I'm not sure why, but GitHub Actions isn't running on the PR. I'm guessing that it's a policy thing in GitHub, since there are no workflows in the main repo.

> I also am not sure how we want to handle the required check which seems to be stuck. It may be waiting for Azure Pipelines. Maybe wait to turn that check off until we're ready to merge this?

This is unclear to me. AFAIR, it worked in my branch when I set this up in flask-smorest.
I checked the repo settings quickly and everything seems to be enabled.
@sloria, do you think you could create an API token for each repo (webargs, apispec, perhaps marshmallow) and store it as a repo secret?
What about cutting it down to just Linux, but all supported Pythons? Since we don't touch the filesystem or other very platform-specific things, we probably don't need to check macOS and Windows, but I'm less comfortable skipping Python versions.
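For context, a Linux-only matrix across all supported interpreters might be sketched like this (the version list, job name, and tox invocation are illustrative assumptions, not the final config):

```yaml
# Illustrative sketch only: Linux-only matrix over several Python
# versions; the version list here is an assumption.
jobs:
  tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.7", "3.8", "3.9", "3.10"]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - run: python -m pip install tox
      # 'tox -e py' selects the env matching the active interpreter
      - run: python -m tox -e py
```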
👍 When I set up the publishing job, it will be a "release" or "publish" step. I just got set up with test-pypi creds as well.
I've added the skip for pre-commit.ci.
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: python -m pip install tox
- run: python -m tox -e mypy
don't we already run mypy in pre-commit?
We do, and it's great for catching most things. I added mypy to the tox+azure setup a while back when it passed under pre-commit but mypy src/ failed.

I now think of the pre-commit run as a way of letting mypy catch most errors at commit-time, but tox -e mypy as "the true mypy run". I don't know exactly how right or wrong that is, but it's how I reacted to mypy src/ and mypy src/**/*.py being different.
Ah, that's odd that those return different results.

I wonder if it makes sense to use a local hook in pre-commit so that there's parity between CI and local pre-commit runs, as in https://github.com/sloria/environs/blob/cbbe3ae9d970cbf4c495358077b6048e0f2476c8/.pre-commit-config.yaml#L22-L30

Also, it probably doesn't make sense to run mypy twice in CI, so keep this line and add

ci:
  # mypy gets run in GitHub actions
  skip: [mypy]

to .pre-commit-config.yaml.
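A local mypy hook along those lines might be sketched like this (the hook settings, entry point, and dependency list are assumptions for illustration, not the project's actual config):

```yaml
# Hypothetical local hook in .pre-commit-config.yaml; entry and
# additional_dependencies are illustrative, not the real config.
- repo: local
  hooks:
    - id: mypy
      name: mypy
      language: python
      entry: mypy src/
      pass_filenames: false
      additional_dependencies: [mypy]
```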
Not sure this is relevant, but I have a project in which I don't want all pre-commit hooks to be run in CI, so I invoke them one by one in tox.ini:
[testenv:lint]
deps =
pre-commit
skip_install = true
commands =
pre-commit run pyupgrade --all-files --show-diff-on-failure
pre-commit run black --all-files --show-diff-on-failure
pre-commit run flake8 --all-files --show-diff-on-failure
I was reluctant at first, but then realized that running all hooks was a shortcut; we might not always want to do that, and calling them explicitly makes sense.
i think we do want all the hooks to run in pre-commit.ci though right? mypy is just a special case
mypy is definitely a (very?) special case. e.g. black, flake8, etc. don't need app dependencies installed, but mypy does in order to pick up typing info from libraries. That's why I prefer mypy under tox, since it does the full install before running.

The more I've dealt with it, the more I find mypy under pre-commit weird. I might also experiment with a local hook which runs... tox -e mypy! I don't know yet. It's a bit unclear to me right now.
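A minimal sketch of such a tox env, matching the tox -e mypy invocation discussed above (the deps and paths are assumptions): because tox installs the package and its dependencies into the env before running commands, mypy can see the libraries' typing info.

```ini
# Sketch of a [testenv:mypy] section; tox installs the package first,
# so mypy picks up type information from installed dependencies.
[testenv:mypy]
deps = mypy
commands = mypy src/
```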
I'll add the skip for pre-commit.ci, but I don't want to hold up the basic switchover to GitHub Actions on details, so I'd like to punt on some of this stuff.
🙌 Similarly, I'll use this as soon as I get a chance (hopefully this evening) to work more on the build here.

Another related thing: I wrote a build which would push to testpypi (visible here), but then I couldn't figure out what should trigger it, so I have that sitting.
@sirosen can we have that run when a tag gets pushed?
(Sorry about slow action on this -- absolutely swamped with work and personal stuff.)

The issue is that testpypi is a way for us to test things before we release, so we don't want to trigger it off of the same tagging that drives the production pypi build+publish. I'd like to be able to do things like trigger a testpypi publish job (maybe that's manual?) to check that a change to the CI pipeline works.

I'll continue to think about this, and I'll let that testpypi token stick around in the repo secrets so that it's available for us. In the meantime, I'll add a commit here to set up the publish job to run off of all tags, since we only tag releases anyway, similar to what @lafrech has on flask-smorest.
Force-pushed from df3fed5 to d337052
While writing down the build steps to publish, I had an idea about testpypi, so I went ahead and set it up. I think that's explicit enough, and lets us use the "exact" same workflow for both cases. d337052 should explain everything.
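That "one workflow, two destinations" idea, routed by tag name, could be sketched roughly like this (the secret names, action versions, and exact steps are assumptions, not the merged workflow):

```yaml
# Sketch: a single publish workflow triggered by any tag; testpypi-*
# tags go to TestPyPI, all other tags to PyPI. Secret names are assumed.
on:
  push:
    tags: ["*"]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
      - run: python -m pip install build twine
      # build sdist + wheel, then sanity-check package metadata
      - run: python -m build
      - run: python -m twine check dist/*
      - if: startsWith(github.ref_name, 'testpypi-')
        run: python -m twine upload --repository testpypi dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.TEST_PYPI_TOKEN }}
      - if: "!startsWith(github.ref_name, 'testpypi-')"
        run: python -m twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
```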
This is looking good! @sirosen Are you blocked on anything on my end?
Nope, I think this should be good to merge! I'm happy to listen to feedback on it if -- for example -- we want to remove the testpypi tag build or make any other changes. But I'm pretty happy with the work as is, and we can iterate on it more later if we need changes.
Nothing blocking as far as I can tell. Some follow-ups to note:
- Evaluate reusable workflows so that we can reduce duplication in config across repos (similar to what we're doing with Azure Pipelines templates)
- Remove testpypi releasing once we've tested that it works
- run: python -m pip install tox
- run: python -m tox -e py${{ matrix.tox-factor }}

# this duplicates pre-commit.ci, so only run it on tags
thought (non-blocking): this is probably unnecessary, but i don't feel strongly about removing it.
When I added it, my thought was "this is probably unnecessary, but it's easier to add it than to worry about ever having a release tag where linting fails". I don't feel strongly about keeping it. 😉
I probably won't change it today though, since I want to leave this PR untouched for at least a few hours before merging.
I'm good with this.
I generally do only latest and oldest, mainly because of carbon footprint vs. low benefit. But fine for me in any case. I won't have time to review this thoroughly, so I'm leaving this to you guys. It can always be iterated upon later. @sloria do you think you can create the token?
@lafrech you're already an owner for apispec in pypi. i went ahead and added the token.
Ah, ah. Weird. I thought I had double-checked. Indeed, I own apispec and webargs. Thanks. I just invited you to own flask-smorest, just so that I'm not sole owner.
Primarily: a matrix build for main testing, with parameters for Python version and added tox factors. Additionally, add mypy and `twine check` builds, and set up check-jsonschema to validate workflow files against the schemastore schema. Include readthedocs config validation as an added bonus.
And add cron to GitHub Actions.
python.version should be a string.
On any new tag, the following additional steps will now run:
- lint-pre-release (tox -e lint): duplicates pre-commit.ci, but allows us to check that linting is passing before publishing
- python -m build: build an sdist and wheel using pypa's 'build'
- twine check dist/*: use twine to confirm that package metadata is all good, up to whatever validation 'twine check' implements. It doesn't *guarantee* a good/valid package, but it can catch some issues
- IF the tag is of the form 'testpypi-*': publish to testpypi
- IF the tag is NOT of the form 'testpypi-*': publish to pypi

Also, mark mypy to skip in pre-commit.ci.
I just did a final rebase to clean up and pick up the build fix we just merged.

I plan to merge this today in the evening (US Eastern time) when I have time to poke around repo config and make sure everything is tidy. I want to make sure we're requiring the GH Actions builds and not requiring Azure Pipelines.

For the future todo list, I want to make sure not to lose something from the original comment here:

> I want this because it's so easy to break docs and not notice.
- twine check
- check-jsonschema to pre-commit config

I originally tried this with the GitHub reusable workflows feature, but a reusable workflow can't be run under a matrix config.
Here's a screenshot of the actions pane in a build on my fork.
The matrix is slightly bigger than what we had in Azure Pipelines, but we can adjust it more going forward.
Two more things I'd like us to do on the CI checklist:
I hesitated to introduce check-jsonschema, as it is my own work, but I find it useful. It catches a wide class of invalid workflow files when tweaking builds. If there's any concern about pulling it in, I'll drop it from the config, as it's not a big deal.
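Wiring check-jsonschema into pre-commit might look roughly like this (the repo URL reflects that it lived under the author's account at the time; the rev pin is a placeholder assumption, not a recommended version):

```yaml
# Sketch: check-jsonschema hooks validating GitHub workflow files and
# the ReadTheDocs config; pin rev to a real release before use.
- repo: https://github.com/sirosen/check-jsonschema
  rev: 0.10.2
  hooks:
    - id: check-github-workflows
    - id: check-readthedocs
```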