Pinned requirements in the optional dependencies tests #20623
Comments
I made those files as a way to ensure it was possible to test all dependencies and build the docs as part of the release script, where I wanted to pin the versions for reproducibility. Since I already had them ready to set up an Ubuntu 20.04 environment, it was quick and easy to reuse them when migrating to Actions in a hurry. Those files should probably be split up into several parts, e.g. some parts are needed for the tests and some for building the docs, some for apt and some for pip, etc. Also, the reasons for using conda over pip are not the same as they used to be, since packaging with wheels etc. has improved a lot over the years. I don't know about sage. Maybe that got missed out...
Well, conda packaging has also improved because of conda-forge, although I've heard that tensorflow is better to install with pip. For sage, we basically have to use the conda install. It also has to be a separate conda environment, because sage pins a lot of packages itself, and it also breaks mpmath when it is installed. I'm not sure why the Travis install pins a specific conda-forge channel. We might look into whether that is still necessary.
I run the release script in a VirtualBox VM. Initially I gave it 1 GB of memory, but it kept crashing while installing tensorflow, so I had to rebuild the VM with 2 GB of memory. It turns out that the manylinux wheel for tensorflow is 400 MB (https://pypi.org/project/tensorflow/#files). I downloaded it and it's 900 MB uncompressed (wheels are just zip files). I'm guessing pip decompresses the file in memory, and perhaps stores it in memory twice. The reason it is so big is that wheels have to statically link their dependencies, or else include a copy of them to dynamically link against, because wheels can only depend on other pip packages, which have to be Python packages. conda packages can depend on any library, so conda works more like a Linux package manager.
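Since wheels are just zip files, the compressed-versus-uncompressed size difference described above is easy to measure with the standard `zipfile` module. A minimal sketch (the in-memory archive here stands in for a real downloaded `.whl`; for tensorflow you would open the wheel's file path instead):

```python
import io
import zipfile

def wheel_sizes(fileobj):
    """Return (compressed, uncompressed) byte totals for a zip/wheel archive."""
    with zipfile.ZipFile(fileobj) as zf:
        infos = zf.infolist()
        return (sum(i.compress_size for i in infos),
                sum(i.file_size for i in infos))

# Build a toy archive standing in for a wheel: highly repetitive content,
# which compresses well, like the ~900 MB -> ~400 MB tensorflow wheel.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("pkg/lib.py", "x = 1\n" * 1000)

compressed, uncompressed = wheel_sizes(buf)
```

Running the same function on the actual tensorflow wheel would reproduce the 400 MB/900 MB figures quoted above.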
I've just never used conda, so I would find it difficult to set anything up with it. I've tried to do bits through miniconda and other things, but it seems like I really need to install the whole massive Anaconda distribution to get it working properly, and that has never seemed reasonable to me (although I've advised other people to do it in the past). I think this also contributed to the problems I had with rever.
Fixing this would also get rid of the annoying "We found potential security vulnerabilities in your dependencies." banner at the top of https://github.com/sympy/sympy/ |
I have unpinned the versions of the optional dependencies in #20944. I had to remove tensorflow from the list of installed modules because it was pinning the numpy version to 1.19.x. I'm not sure why tensorflow pinned numpy when installed by pip in Actions but not with conda on Travis. There were failures with numpy 1.20.x that showed up on Travis. Those would now also show up on Actions except that I have now fixed them (np.complex and np.float are deprecated in numpy 1.20 and were used in a few places in sympy). There are a number of remaining todos for Actions:
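For reference on the numpy 1.20 deprecations mentioned above: `np.complex` and `np.float` were plain aliases for the Python builtins, so the fix is to use the builtins directly. A minimal illustration (not the actual sympy diff):

```python
# Before (deprecated in numpy 1.20 and removed in later releases):
#     z = np.complex(1.0, 2.0)
#     x = np.float("3.5")
# After: the builtins are drop-in replacements for the removed aliases.
z = complex(1.0, 2.0)
x = float("3.5")
```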
If anyone wants to work on any of these then please go ahead.
There are some stats tests being skipped in the optional dependency build still:
I guess this is tensorflow:
These doctests are also being skipped:
The requirements are no longer pinned, and we now have optional dependency tests running on Python 3.8, 3.9 and 3.10 and PyPy 3.8. We also have a bleeding-edge 3.11-beta job that runs with the master branches of cython, numpy and scipy (sympy/.github/workflows/runtests.yml, lines 165 to 217 at 596257e).
A few optional dependencies are excluded under 3.11 and should be added when it becomes possible to install them, maybe it already is (sympy/.github/workflows/runtests.yml, lines 207 to 209 at 596257e).
The latest 3.10 optional dependency run shows only 3 doctests skipped. Two of those are in rubi and one is in lambdify; I think that's due to tensorflow, which is tested in another job. The main tests for the 3.10 optional dependency job show 112 skipped. That seems to be due to cupy and tensorflow, and I also see pymc3.

It doesn't look like cupy is tested anywhere. Maybe that should be fixed. It might need its own job like tensorflow (the problem with tensorflow was that it pinned the numpy version). For pymc3, perhaps just the tests need to be updated after #23650 (CC @oscargus).
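The skip counts above come from tests that bail out when the optional module cannot be imported. A minimal sketch of that pattern, using a hypothetical `has_module` helper (sympy's actual machinery for this is `sympy.external.import_module`):

```python
import importlib.util

def has_module(name):
    """True if the optional dependency is importable in this environment."""
    return importlib.util.find_spec(name) is not None

# Tests guarded like this are reported as skipped rather than failed
# when e.g. cupy or tensorflow is absent from the CI environment.
def test_gpu_feature():
    if not has_module("cupy"):
        return "skipped"
    return "ran"
```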
I would like to keep a bleeding-edge job, and also add master testing for gmpy2 and mpmath there as well. Another thing that might be good is a bleeding-edge sphinx job, because the sphinx build typically breaks with every new sphinx release. Note that sphinx is currently pinned (line 7 at 596257e).
The issue there has now been fixed, and I think that the fix is in sphinx 5.0.2, so it can probably be unpinned now: sphinx-doc/sphinx#10539
cupy isn't tested because you can't install it on CI (it requires a GPU, which generally isn't available in free CI resources).
Let's keep Sphinx unpinned by default, and only pin it when issues crop up. |
I believe pymc3 was renamed to just pymc. |
Yes, but what should SymPy do? Should it support both, like:

```python
try:
    import pymc
except ImportError:
    import pymc3 as pymc
```

Do we need to do that? Currently SymPy is unable to make use of pymc because it is imported under a different name. Simply replacing pymc3 with pymc would mean dropping support for pymc3.
I guess. It looks like pymc 4 was released fairly recently: https://www.pymc.io/blog/v4_announcement.html. I have no idea if pymc3 is still supported by the pymc devs, but it's probably as simple as that to support both for now, so we can do that and then remove the pymc3 fallback later.
This is what I know: pymc version 3 is imported as pymc3 and relies on theano, which is deprecated. pymc version 4 is back to pymc as the module name. It relies on aesara, which we do support and which is under active development. As pymc3 gave deprecation warnings and issues with theano, it was removed from the imports, and now that pymc 4 is released it was added back. However, the SymPy code relies on pymc3 (so it was actually not that useful to add pymc back...). There are currently 126 mentions of pymc3 in the code base, concentrated in one file in particular. I guess that there may be code that relies on explicitly calling pymc3 somewhere, so at least one should still support passing that as an argument.
Maybe @brandonwillard can advise. Is it safe to just do this:

```python
try:
    import pymc
except ImportError:
    import pymc3 as pymc
```

Then in a few years just change it to just `import pymc`?
That depends on the exact parts of PyMC being tested, of course, but the user-level API should be pretty consistent between pymc3 and pymc 4. From a quick search, it looks like most/all uses of pymc3 in SymPy fall within that shared API.
Thanks @brandonwillard. I think everything is tested if we just make sure that the tests actually run. They currently don't, just because they're not even trying to import pymc.
With #23678 there are 107 skipped tests remaining.
I think that the only thing pinned now is
Hopefully these go away once we tag a release: https://github.com/sympy/sympy/security/dependabot
The optional dependency tests are installing from pinned requirements files in the release/ directory (see https://github.com/sympy/sympy/blob/master/release/aptinstall.sh and https://github.com/sympy/sympy/blob/master/release/requirements.txt).
Is there a reason for this? I don't think it's a good idea to pin the dependencies, unless we have to. Otherwise, we won't catch bugs from new versions of the dependencies.
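For concreteness, the issue is the difference between exact pins like `numpy==1.19.5` and open-ended requirements like `numpy>=1.19` or just `numpy`. A toy check that flags exact pins in a requirements list (simplified regex, hypothetical helper; real requirement parsing should use the `packaging` library):

```python
import re

def is_exact_pin(requirement):
    """True for '==' pins like 'numpy==1.19.5' (deliberately simplified)."""
    return re.search(r"==", requirement) is not None

reqs = ["numpy==1.19.5", "scipy>=1.5", "mpmath"]
exact_pins = [r for r in reqs if is_exact_pin(r)]
```

Only the exact pin freezes the tested version; the other two specifiers let CI pick up new releases and catch regressions against them.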
CC @oscarbenjamin
I'm also surprised that we are using pip instead of conda. If pip works I guess that's fine, but I would expect conda to be necessary for at least some dependencies.
Side note: is Sage being tested on GitHub Actions right now? I don't see it.