
platform-independent wheels #95

Open
amras0000 opened this issue Nov 10, 2017 · 15 comments
Comments


amras0000 commented Nov 10, 2017

Related to #43. A platform-independent, pure-Python wheel on PyPI would be very useful for dependency resolution in Linux projects:
PyYAML-?.??-py35-none-any.whl
Ultimately a manylinux wheel using libyaml would be ideal, but I understand the complications, so this might be a fair compromise that's simple to implement.


rdb commented Nov 10, 2017

I concur, it would be great to have a platform-agnostic wheel as a fallback when no platform-specific wheel is available.


jayvdb commented Mar 21, 2018

cf. #43

@webknjaz

The shared library is a binary compiled against a specific architecture and Python version; there's no standard for shipping all of them in one wheel. That's why manylinux1 wheels exist: the project ships a whole bunch of them, and then the client (pip) selects which one to download and install.

There's also the sdist (source distribution): a .tar.gz archive of the source with build scripts and some metadata. It's not pre-compiled at all.

So your request is either a duplicate of #43, or I don't understand what you want.

"Generic" wheels are only shipped for projects with no binary dependencies, which is not PyYAML's case at all.
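As an aside on how that selection works: the compatibility tags in a wheel filename (PEP 427) are what pip matches against the running interpreter and platform. The sketch below parses a few illustrative filenames; the version numbers and tags are examples for demonstration, not actual release artifacts.

```python
def wheel_tags(filename: str) -> dict:
    """Split a wheel filename into its name, version, and compatibility tags.

    Assumes the simple 5-part form (no optional build tag), which is
    enough for these illustrative examples.
    """
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,      # e.g. cp38 (CPython 3.8) or py3 (any Python 3)
        "abi": abi_tag,            # e.g. cp38 (extension ABI) or none (no ABI)
        "platform": platform_tag,  # e.g. manylinux1_x86_64 or any
    }

# A binary wheel is pinned to one interpreter, ABI, and platform:
print(wheel_tags("PyYAML-5.3.1-cp38-cp38-manylinux1_x86_64.whl"))
# A pure-Python wheel matches any platform and any Python 3:
print(wheel_tags("PyYAML-5.3.1-py3-none-any.whl"))
```

pip installs the most specific wheel whose tags match the environment, falling back to the sdist when none do; `py3-none-any` is the "matches everywhere" case this issue asks for.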


rdb commented Feb 20, 2019

I think what's being asked here is to produce generic wheels that don't use the LibYAML bindings and are therefore not platform-specific, as a fallback for when no more specific (e.g. manylinux) wheels are available. This would be useful for packaging tools and distribution pipelines that only support wheels and can't build from source distributions.

@webknjaz

Oh, so a pure-Python implementation, then? That would probably not make sense once manylinux1 wheels are out...


rdb commented Feb 21, 2019

I think it would still make sense because not every possible platform is covered even when manylinux wheels are available. So the generic wheels would act as a fallback.


bsolomon1124 commented May 24, 2020

@webknjaz to elaborate on @rdb's point: if --without-libyaml is used (and you do not have Cython installed when building), the result is a pure-Python wheel.

$ pyenv local 3.8.3
$ python3 -m venv venv && . ./venv/bin/activate
$ python -m pip install -U wheel setuptools
$ python setup.py --without-libyaml -q bdist_wheel
$ ls -1 dist
PyYAML-5.3.1-py3-none-any.whl

Note that this is not a py2.py3 universal wheel, because of the split between lib/ and lib3/, so the result would be two pure-Python wheels (one per major Python version).
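A related sketch: at runtime, code can probe whether the compiled libyaml bindings are present without hard-importing them. This assumes the extension lives at yaml._yaml, as in PyYAML 5.x; it is one possible stdlib-only approach, not an official PyYAML API.

```python
import importlib.util


def libyaml_available() -> bool:
    """Return True if the compiled libyaml bindings (yaml._yaml in
    PyYAML 5.x) are importable; False for a pure-Python install, or
    when PyYAML isn't installed at all."""
    try:
        # find_spec locates the submodule without executing the C extension.
        return importlib.util.find_spec("yaml._yaml") is not None
    except ModuleNotFoundError:
        # The yaml package itself isn't installed.
        return False
```

PyYAML also records this itself in the `yaml.__with_libyaml__` flag, which is the usual way installed code checks which implementation it got.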

@bsolomon1124

See some comments from #407 (comment) on this. While it is possible to build a pure-Python wheel with --without-libyaml bdist_wheel, I'm wondering if that would have the unintended side effect of users unknowingly pulling down the pure-Python wheel from PyPI when they had previously fetched the sdist and built with libyaml bindings by default.

@graingert

@bsolomon1124 it might work to put all the pure-Python bits in a yaml-slow package, have yaml depend on yaml-slow, and add only the binary-accelerated parts to yaml.

@dirkroorda

In pyodide you can micropip.install() a package only if it has a pure-Python wheel.
I want to install a package that requires pyyaml and am stumbling over the fact that pyyaml does not have a pure-Python wheel available for download on PyPI.


nitzmahone commented Jul 13, 2021

I think most of us are still of the opinion that the presence of a pure-Python wheel under the pyyaml package on PyPI causes many more problems than it solves. Users on platforms without binary wheels that have forever silently built the libyaml extension from the sdist would suddenly and silently be forced onto a slow pure-Python version on their next upgrade, with the requisite performance penalties and/or import failures, depending on how "hard" they depend on the extension.

I'd argue that splitting the package into separate Python + extension packages doesn't really solve the problem either; it's a matter of deps and discoverability, and a whole lot of new added complexity with two very tightly-coupled packages that arguably aren't very discoverable. Someone's getting hosed depending on what the dependent package's requirements specify, and whether pyyaml defaults to libyaml or not; any existing package that just depends on pyyaml isn't going to "just work" under something like pyodide, since someone would have to know to switch the dependency to the pure-Python subpackage. If we went the other way, anyone that didn't explicitly install pyyaml_fast is now slow and/or broken. A completely standalone pure-Python-only package would solve some of those problems, but introduces others, since Python packaging doesn't really handle top-level package conflicts (i.e., now we have two things providing a yaml top-level package; path ordering becomes a complicating factor, bleh).

The needs for pure-Python wheels in pyyaml have so far been niche enough that it seems like the problem needs to be solved elsewhere (a private index, upstream fixes to those envs to allow install from an sdist, whatever); the likelihood of breaking a huge swath of users is just too great to accommodate it on PyPI, IMO.


Recently, Twisted split its binary and pure-Python wheels, and the feedback has been great so far.


nitzmahone commented Jul 13, 2021

Give it a few releases for breaking API/ABI changes on the native extensions and people that, e.g., have the pure-Python part installed in an OS package and the native part installed with --user; there's a special hell to sorting out mismatched bits there, and to making the runtime resilient to it. We had similar problems with pyyaml recently when we moved the top-level _yaml package that hosts the extension into a subpackage; anyone that was doing import _yaml to probe for the presence of the extension could now pick up a stale copy of the old top-level package and extension from another place on their path, which then tries to load against the new version of the pure-Python bits.

I'm not saying it's impossible, but the !/$ factor seems awfully low for the added complexity to build/test/release and the new potential runtime hassles it exposes (especially since there are no $ involved and all of us are working on this out of selfish desire to keep it working for our own needs ;) ).

@graingert

How about trialing the pattern in a wholly new project namespace, like psycopg3 is doing?

@dirkroorda

I use YAML mostly for config files; not performance-critical at all. Maybe JSON is a better choice, and json is in Python's standard library. Added bonus: good documentation. When I look at https://pyyaml.org/wiki/PyYAMLDocumentation occasionally, I'm bewildered.
