Use cibuildwheel to build wheels. #491
Codecov Report

@@           Coverage Diff           @@
##             main     #491   +/-   ##
=======================================
  Coverage   88.96%   88.96%
=======================================
  Files           6        6
  Lines        1685     1685
=======================================
  Hits         1499     1499
  Misses        186      186
Excellent, thank you!
We should have done this sooner. Look at all that code we get to delete!
👏
I've rejiggled the work distribution. Now Windows, macOS and normal (x86_64 and i686) Linux get one job each. Then the really slow QEMU emulation (Linux on aarch64) is split per Python version. Overall runtime is 13 minutes, with Windows being the slowest (despite having half as much work to do as the native Linux runner 🙄).
What do you think about also splitting the deploy-wheels-native.yml matrix into Python versions? Then each one would be maybe 90 seconds or less.
Then the bottleneck would be a QEMU job at around 9 minutes, but all the others should have finished by then.
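A hedged sketch of what that per-version split might look like (the job name, action version and Python list here are illustrative guesses, not taken from the actual workflow file):

```yaml
# Sketch only: fan the native-wheel matrix out per CPython version so each
# job builds a single wheel instead of one job building all of them.
jobs:
  build-wheels:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        python: [cp37, cp38, cp39, cp310]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - name: Build wheels
        uses: pypa/cibuildwheel@v2.3.1
        env:
          # Build only this matrix entry's interpreter, e.g. cp39-*
          CIBW_BUILD: ${{ matrix.python }}-*
```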
Another nice benefit of separate jobs is that each wheel is uploaded to PyPI a few minutes earlier, narrowing the window in which someone installing the new version gets the sdist and has to build it themselves.
Finally, I think it's worth bundling this in with the 3.6 drop = major version bump. Might be safer just in case of incompatibilities. What do you think?
Honestly, I find firing off 58 different jobs to save a few minutes of thumb-twiddling time slightly excessive but, sure, your call...
Those QEMU jobs are doing two jobs each (one glibc and one musl), so we could slice them in half too to bring it down to about 5 minutes.
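Assuming cibuildwheel's build selectors, slicing each emulated job into a glibc half and a musl half could look roughly like this (matrix values are illustrative):

```yaml
# Sketch: one QEMU job per (Python version, libc family) pair instead of
# per Python version alone, halving each emulated job's workload.
strategy:
  matrix:
    python: [cp37, cp38, cp39, cp310]
    libc: [manylinux, musllinux]
env:
  CIBW_ARCHS_LINUX: aarch64
  # Selects e.g. cp39-manylinux_aarch64 or cp39-musllinux_aarch64
  CIBW_BUILD: ${{ matrix.python }}-${{ matrix.libc }}_aarch64
```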
Surely if the timing is that critical then it would be best to make the sdist be uploaded last? i.e. stick it in its own job that launches only when the wheel-building jobs succeed?
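In GitHub Actions terms that would be a `needs:` dependency between jobs; a rough sketch (job names and steps are made up for illustration):

```yaml
# Sketch: gate the sdist upload on every wheel job finishing, so pip users
# can't hit a window where only the sdist of the new version exists on PyPI.
jobs:
  build-wheels:
    # ... wheel-building matrix as before ...
  upload-sdist:
    needs: [build-wheels]  # runs only after all wheel jobs succeed
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: pipx run build --sdist
      - run: pipx run twine upload dist/*
```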
The Windows and macOS wheels are now being built with https://python.org/downloads Pythons instead of those from https://github.com/actions/python-versions, so any difference in the compiler flags used to build Python will now affect those wheels too. Without rummaging through either's build system, the only difference I know of is that python.org sets the macOS deployment target to 10.9, whereas GitHub's is unset and defaults to 10.14 (the macOS version of the VM they build on). So these new wheels should be more compatible in that they also support macOS 10.9 and up. I'd be quite surprised if this were a breaking change, but if you're already about to do a major version bump then we might as well assume the worst.
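That difference shows up in each wheel's platform tag (`macosx_10_9_x86_64` vs `macosx_10_14_x86_64`). A small illustrative helper (the function name is made up, not part of any project code) that pulls the target out of a tag:

```python
import re


def macos_target(platform_tag):
    """Extract the (major, minor) macOS deployment target from a wheel
    platform tag such as 'macosx_10_9_x86_64'; None for non-macOS tags."""
    m = re.match(r"macosx_(\d+)_(\d+)_", platform_tag)
    return (int(m.group(1)), int(m.group(2))) if m else None


print(macos_target("macosx_10_9_x86_64"))    # python.org builds -> (10, 9)
print(macos_target("macosx_10_14_x86_64"))   # GitHub's builds -> (10, 14)
print(macos_target("manylinux2014_x86_64"))  # not macOS -> None
```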
This will enable the building of macOS ARM64 compatible wheels, fixing ultrajson#456 (and also lets us delete lots of code!!!).
Ah, let's go for that :) Can we have
Yep, let's ditch 3.6 from this one too. The new
You can indeed, but it looks like they must all be defined in one file, so I've merged all 3.
Because deploy.yml can now be triggered by PR, we'll need a guard for the Test PyPI upload, like for the other two:
if: |
  github.repository == 'ultrajson/ultrajson' &&
  github.ref == 'refs/heads/main'
Thank you!
Will merge this and #490 in December, to coincide with 3.6 EOL month.
Will likely release before the actual EOL date; the previous 3.6-compatible versions will still be available on PyPI, and python_requires will help out.
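For reference, a minimal sketch of that python_requires guard (the exact form in ultrajson's metadata may differ):

```ini
# setup.cfg sketch: pip on Python 3.6 skips releases carrying this marker
# and falls back to the last 3.6-compatible version on PyPI.
[options]
python_requires = >=3.7
```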
Thank you!
All deployed nicely, thanks again! https://github.com/ultrajson/ultrajson/actions/runs/1587833653
You're welcome 🙂
Fixes #456.