
Libopenblas issues in conda build when adding <pybind11/numpy.h> functionality on OSX #66

Open
EelcoHoogendoorn opened this issue Nov 23, 2020 · 6 comments

Comments

@EelcoHoogendoorn

If I clone this repo, it works great. If I add some simple pybind11/numpy functionality (passing a numpy array in and out), I run into trouble on OSX, when trying to build a conda package.

  • Doing a pip install in my conda env is fine; it compiles, links, and passes tests. Though I have not tried relocating the wheel to an env other than the one it was built in, which I suspect might trigger the same problem.
  • The conda build works just fine on Linux.
  • Obviously, I did add numpy to both my host and run requirements. But it can't find libopenblas when invoking the test, even though libopenblas and liblapack are in the test environment as you would expect, them being numpy dependencies.
  • The problem happens regardless of Python version on OSX.

Perhaps adding a little pybind/numpy to this example repo would be cool. This is probably something simple that more seasoned OSX-linking gurus can spot easily, but it has eluded me after a few days of messing around with it.
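For context, the kind of functionality meant here is just a round trip through pybind11's NumPy support. A minimal sketch (module and function names are made up for illustration, not taken from this repo):

```cpp
// Sketch of "passing a numpy array in and out" via <pybind11/numpy.h>.
// Requires pybind11; built as a Python extension module.
#include <pybind11/pybind11.h>
#include <pybind11/numpy.h>

namespace py = pybind11;

// Return a copy of a 1-D float64 array with every element doubled.
py::array_t<double> twice(
    py::array_t<double, py::array::c_style | py::array::forcecast> in) {
    auto buf = in.request();
    py::array_t<double> out(buf.size);
    const double* src = static_cast<const double*>(buf.ptr);
    double* dst = static_cast<double*>(out.request().ptr);
    for (py::ssize_t i = 0; i < buf.size; ++i)
        dst[i] = 2.0 * src[i];
    return out;
}

PYBIND11_MODULE(example, m) {
    m.def("twice", &twice, "Double every element of a 1-D float64 array");
}
```

Note that this only includes pybind11's own headers; nothing from NumPy itself is needed at compile time.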

Here is what the conda build tests have to say:

Hint: make sure your test modules/packages have valid Python names.
Traceback:
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_pyopcode.py:1: in <module>
    import numpy as np
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/site-packages/numpy/__init__.py:142: in <module>
    from . import add_newdocs
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/site-packages/numpy/add_newdocs.py:13: in <module>
    from numpy.lib import add_newdoc
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/site-packages/numpy/lib/__init__.py:8: in <module>
    from .type_check import *
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/site-packages/numpy/lib/type_check.py:11: in <module>
    import numpy.core.numeric as _nx
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pl/lib/python3.8/site-packages/numpy/core/__init__.py:14: in <module>
    from . import multiarray
E   ImportError: dlopen($PREFIX/lib/python3.8/site-packages/numpy/core/multiarray.cpython-38-darwin.so, 2): Library not loaded: @rpath/libopenblas.dylib
E     Referenced from: $PREFIX/lib/python3.8/site-packages/numpy/core/multiarray.cpython-38-darwin.so
E     Reason: image not found

@EelcoHoogendoorn EelcoHoogendoorn changed the title Linking issues in conda build when adding <pybind11/numpy.h> functionality on osx Libopenblas issues in conda build when adding <pybind11/numpy.h> functionality on OSX Nov 23, 2020
@EelcoHoogendoorn
Author

Managed to narrow down the issue. The problem is a lack of conda-forge numpy 1.11 packages for Python 3.8. So the build works on py37, or on py38 with np1.19, for instance. If those missing deps get filled in by the defaults or anaconda channel, things go wrong.

That still leaves me with some questions. What numpy dependency does pybind11 create in the first place? Does it only need the numpy headers, or is there a dynamic runtime linking requirement? And what would be the recommended way of codifying that in the conda recipe? Does using pybind11/numpy mean that you always have to build a separate package for each python/numpy combination?

Expanding this into a working example for the numpy-side of things would be really valuable I think; happy to make a PR on this repo once I understand what is actually going on!

@henryiii
Collaborator

henryiii commented Nov 24, 2020

I think this is a conda issue more than a pybind11 one - I expect you could reproduce the issue without building an extension at all; it's NumPy's own import that's broken (import numpy as np), not your extension, which hasn't even been loaded (the failure is on line 1 of your test file). pybind11 itself does not need NumPy when building, which makes this all much easier. It imports NumPy at runtime, just like any other package would, so you can build a pybind11 extension without NumPy and then use it with any version of NumPy when you run it.

I would probably make sure you have strict channel priority on, and maybe drop defaults. Also make sure you are not pinning too strictly (see below).

NumPy 1.11 was never released for Python 3.8. See here for a list of some of the minimum versions of NumPy available for Python versions, which is critical for non-pybind11 builds, since you can't load an extension in an older version of NumPy than you used to build it.

Adding NumPy would likely not provide much for the example, but I'd not be too averse to it. We'd probably want to add it to all three examples if we do. I think the main thing that would change would be the conda recipe; maybe we could just add a note once you discover what went wrong? Something like: # note: "channel_priority: strict" recommended if adding NumPy? For most libraries, the correct solution is to add a recipe to conda-forge, and that works out-of-the-box (see boost-histogram, for example).

@EelcoHoogendoorn
Author

EelcoHoogendoorn commented Nov 24, 2020

True; this is a conda problem more than anything. But if the purpose of this repo is to provide a complete project skeleton that 'just works', working around / pointing out the quirks of the toolchain is valuable.

Just coming from boost.python, I was assuming there was going to be some kind of build-time numpy dependency; but if that isn't the case, it's a nontrivial point that would also be valuable to make explicit in this example repo. It sure took me a while to figure out!

Indeed, removing the numpy build dependency and removing the runtime version constraints from the run requirements (I was using python {{ python }} and numpy {{ numpy }} in both host and run before) seems to be the way to go here. Thanks for the input!
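A sketch of what the requirements section implied by this fix might look like (package names are illustrative, not copied from the repo's actual recipe):

```yaml
# conda recipe requirements after the fix: with pybind11, NumPy is not
# needed at build time, so it appears only as an unpinned run dependency.
requirements:
  host:
    - python
    - pip
    - pybind11
  run:
    - python
    - numpy        # runtime-only; no {{ numpy }} pin needed
```
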

@henryiii
Collaborator

I assume you should have been using https://github.com/conda-forge/iminuit-feedstock/blob/abba08ff2067409c066c378e1d25d3a1d244f477/recipe/meta.yaml#L27, but yes, it's not needed for pybind11; nothing special is needed for pybind11.

@EelcoHoogendoorn
Author

Ah, I just realized that my pinning to python {{ python }} also really isn't necessary, as the example implies; I thought I had inferred that it was required to get it to work with numpy, but that must have been confusion with another issue.

The docs don't explicitly call this out either, even though I consider it a major practical difference w.r.t. boost.python; but to be clear: there is absolutely no requirement to link against any particular version of any Python runtime components? That kind of surprises me, given that pybind11 does call back into the Python process when allocating a py::array, right? For sure it crashes when I allocate a py::array with the GIL released; though I don't understand much about how that all works at a low level anyway.
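For what it's worth, the crash described above is consistent with pybind11's documented requirement that the GIL be held whenever Python objects are created. A sketch of the usual guard (function name is made up; assumes the caller may have released the GIL):

```cpp
#include <pybind11/pybind11.h>
#include <pybind11/numpy.h>

namespace py = pybind11;

// Allocating a py::array calls back into the CPython API, so the GIL
// must be held. If the surrounding code released it (e.g. inside a
// py::gil_scoped_release block), reacquire it around any Python
// object construction.
py::array_t<double> make_buffer(py::ssize_t n) {
    py::gil_scoped_acquire gil;     // safe to use even if already held
    return py::array_t<double>(n);  // GIL is held here, so this is OK
}
```
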

@YannickJadoul

but to be clear there is absolutely no requirement to link to any particular version of any python runtime components

If you have a C extension, it will be dynamically loaded into a Python process that already contains all of the symbols that need to be resolved. As far as I know, that's why you don't need to explicitly link against the external symbols.
I believe the NumPy support does something even more nifty, where some function pointers are stored in a Python "capsule" object (numpy.core.multiarray._ARRAY_API), which is why pybind11 doesn't need the numpy headers.
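The capsule mentioned above can be inspected directly from Python (assuming NumPy is installed; note that numpy.core was renamed to numpy._core in NumPy 2.0, so the snippet tries both paths):

```python
# The NumPy C-API is exported as a table of function pointers wrapped in a
# PyCapsule on the multiarray module; extensions fetch it at import time
# instead of linking against a NumPy shared library.
try:
    from numpy._core import multiarray  # NumPy >= 2.0
except ImportError:
    from numpy.core import multiarray   # NumPy < 2.0

capsule = multiarray._ARRAY_API
print(type(capsule).__name__)  # PyCapsule
```
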
