--upgrade-packages fails to constrain subdeps, if either absent from a preexisting output file, or if --upgrade is also passed #1550

Closed
jamescw19 opened this issue Jan 13, 2022 · 21 comments · Fixed by #1578
Labels
bug (Something is not working), cli (Related to command line interface things)

Comments

@jamescw19

The documentation states that --upgrade and --upgrade-packages can be used together; however, the --upgrade-packages flag is ignored when --upgrade is supplied.

Environment Versions

  1. OS Type: macOS and Ubuntu 18.04
  2. Python version: python 3.9.0
  3. pip version: pip 21.3.1
  4. pip-tools version: pip-compile 6.4.0

Steps to replicate

Using different arguments from the documentation's example, since requests is not in django's requirements and requests<=3.0.0 would not affect the "latest" release anyway.

# Create requirements.in file
echo "django" > requirements.in
# Run pip-compile
pip-compile --upgrade --upgrade-package 'sqlparse<=0.4.0'

Expected result

asgiref==3.4.1
    # via django
django==4.0.1
    # via -r requirements.in
sqlparse==0.4.0
    # via django

Actual result

asgiref==3.4.1
    # via django
django==4.0.1
    # via -r requirements.in
sqlparse==0.4.2 
    # via django

sqlparse is not restricted to <=0.4.0. If I omit the --upgrade flag, only supplying --upgrade-packages 'sqlparse<=0.4.0', I get the expected result (but my other packages are not upgraded to later versions). This also doesn't work if the argument is sqlparse==0.4.0.

My current workaround is just to delete the requirements file(s) before running pip-compile, omitting the --upgrade flag so the existing files aren't used as a cache. I suspect that either the --upgrade-packages constraint logic sits inside the if not upgrade... conditional, or that --upgrade-packages only extends, rather than constrains, the complete set of packages.

Either this combination should be supported, or the documentation should be changed so it no longer states that this is possible.

@jamescw19
Author

jamescw19 commented Jan 13, 2022

After some more investigation: this issue also occurs without the --upgrade flag if there is no existing output file. It only "works" if --upgrade is not given and the output file is already present (so the relevant pip-compile conditional branch is taken).

This actually affects my workaround above, as I'd have to run pip-compile twice after deleting the previous requirements files: once to resolve the latest versions, and a second time to recognise the --upgrade-packages versions.
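
A rough sketch of that two-step workaround, using the sqlparse example from above (this is just my reading of the behaviour; results may vary by pip-tools version):

# Remove the existing output so it isn't used as a cache
rm requirements.txt
# First run: everything resolves to the latest versions; the -P restriction isn't applied yet
pip-compile --upgrade-package 'sqlparse<=0.4.0'
# Second run: the output file now exists, so the -P restriction takes effect
pip-compile --upgrade-package 'sqlparse<=0.4.0'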

richafrank added the cli (Related to command line interface things) label on Jan 16, 2022
@richafrank
Contributor

Looks like this behavior was changed in #1031. There's relevant discussion in that PR, but I can't tell whether this exact case was intentionally changed. @AndydeCleyre might be able to share more.

@AndydeCleyre
Contributor

I did not realize they were intended to potentially be used together, as I didn't see sense in the sentiment "keep/install an old version of package x AND upgrade all packages to their latest" -- I figured the upgrade-all bit would trump the other bit. I was wrong.

I'll try to review this whole thing.

@AndydeCleyre
Contributor

Until then, does it help to put your restricted versions into the input file?
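
For example, something like this in requirements.in (borrowing the sqlparse case from the report):

django
sqlparse<=0.4.0  # temporary restriction, remove when no longer needed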

@ghost

ghost commented Feb 4, 2022

That makes sense. The requirements.in file approach would probably work, yeah. This might just be a documentation issue then, if it's not the intended behaviour, because the docs have an example showing the two flags used together.

I can see why the sentiment of upgrading all and pinning a specific package would seem strange. In my specific case, I'm actually trying to force the requirements file to contain a specific, CPU-only flavour of PyTorch with torch==1.10.0+cpu, but upgrade all the other "standard" packages. At the moment, I end up with torch==1.10.0 in my requirements.txt. So it's a bit of a non-standard use case.
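
For reference, the invocation I'd expect to work for that case is roughly the following (assuming the PyTorch CPU wheel index is already configured, e.g. via --extra-index-url; the exact pin is only illustrative):

pip-compile --upgrade --upgrade-package 'torch==1.10.0+cpu'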

@AndydeCleyre
Contributor

OK, reviewing the discussion at #759, I think a key note is right at the start. @atugushev opened with:

Pip-compile should raise an error while trying to upgrade a package which doesn't exist in a requirements.in.

Through the discussion, the "penalty" was softened and we did not end up raising any errors in this case.

But the expectation illuminated there is that packages passed to --upgrade-package are either present in the input file or considered irrelevant. In other words, any package whose version you want to control should be in the input file. That's my understanding, anyway.

With that in mind:

requirements.in:

django
sqlparse  # for django
$ pip-compile --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django
$ pip-compile --upgrade --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django
$ rm requirements.txt
$ pip-compile --upgrade --upgrade-package 'sqlparse<=0.4.0'
asgiref==3.5.0
    # via django
django==4.0.2
    # via -r requirements.in
sqlparse==0.4.0
    # via
    #   -r requirements.in
    #   django

So

  • is that a reasonable and good expectation?
  • if so
    • does it address the issues here?
    • we probably should make that clearer in the docs

@richafrank
Contributor

Pip-compile should raise an error while trying to upgrade a package which doesn't exist in a requirements.in.

is that a reasonable and good expectation?

For what it's worth, I relied on the behavior requested here (upgrading specific packages not declared in reqs.in) for a while and didn't realize it was no longer available! Certainly a question is whether the complexity is worthwhile, but I found it really valuable to be able to easily upgrade a specific sub-dependency that had recently had a security fix release. pip-compile ensured I still had a set of compatible packages without forcing me to upgrade everything.

Of course, that use case can be worked around by temporarily adding the sub-dependency to reqs.in and then removing it, but it would be nice to have that automated. I didn't want the particular package in reqs.in permanently - if the relevant direct dependency no longer needed it, I would want it to disappear from the output.
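
Concretely, the one-liner I'd like to be able to run looks something like this (package and version chosen purely for illustration, reusing the sqlparse example from this thread), bumping only that sub-dependency while leaving everything else pinned:

pip-compile --upgrade-package 'sqlparse>=0.4.2'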

@AndydeCleyre
Contributor

I'm not saying things should be one way or another here, but want to mention a relevant alternative:

security.txt:

sqlparse<=0.4.0  # See https://...

requirements.in:

django
-c security.txt
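
With that layout, a plain upgrade run keeps the restriction even though sqlparse is not a direct requirement (output abridged; exact annotations may differ):

$ pip-compile --upgrade
...
sqlparse==0.4.0
    # via django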

@richafrank
Contributor

Yep, definitely an option as well!

AndydeCleyre changed the title from "--upgrade and --upgrade-packages cannot be used together" to "--upgrade-packages ignores packages which are absent from the input file" on Feb 9, 2022
@AndydeCleyre
Contributor

So is there a good reason not to treat --upgrade-package items identically to a -c constraints.txt file containing those items?

@AndydeCleyre
Contributor

Oh right yes: because of the simple case of upgrading to the max. Never mind.

@AndydeCleyre
Contributor

AndydeCleyre commented Feb 9, 2022

I made #1578 to trial different handling of --upgrade-packages. I constructed a temporary file from upgrade_packages and parsed that in -c constraints.txt style, in addition to our existing handling of upgrade_packages.

The result is good. If it's simple enough to do without creating a temp file, I'm interested in that.

It also presents the possibility of including the --upgrade-packages switch as a source in the annotations.
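
If it helps to picture the effect, it's roughly what you'd get by writing the -P arguments into a constraints file by hand and referencing that from the input file, i.e. a manual approximation of what the PR automates (file names here are arbitrary):

$ echo 'sqlparse<=0.4.0' > p-constraints.txt
$ echo '-c p-constraints.txt' >> requirements.in
$ pip-compile --upgrade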

@jamescw19
Author

is that a reasonable and good expectation?

we probably should make that clearer in the docs

I was going to say that this is definitely a reasonable solution, given your previous discussion about the issue -- happy to use requirements.in to define any constraints. But what you've done in #1578 looks like a great solution to the problem, merging CLI upgrade-packages with any existing constraints. If you're happy with the additional logic required, then this would absolutely solve my issue.

Thanks very much for investigating this!

@jamescw19
Author

I've added some tests to validate your changes and opened a PR into your fork. It looks like your changes solve the problem, judging from the test output.

AndydeCleyre changed the title from "--upgrade-packages ignores packages which are absent from the input file" to "--upgrade-packages ignores packages which are absent from the input file, or if --upgrade is also passed" on Feb 10, 2022
@AndydeCleyre
Contributor

Yeah, while I prefer to put these kinds of needs into files rather than use -P, I can't think of a good reason not to treat these like -c constraints.txt entries in addition to what we're doing.

The behavior would be more predictable and I now think more correct. #1578 is up for review.

AndydeCleyre changed the title from "--upgrade-packages ignores packages which are absent from the input file, or if --upgrade is also passed" to "--upgrade-packages fails to constrain subdeps, if either absent from a preexisting output file, or if --upgrade is also passed" on Feb 10, 2022
@jamescw19
Author

Great, thanks for this! I agree, I think this becomes much more predictable.
So does this new behaviour account for the case where there is no requirements.in file, but the requirements come from a setup.[py|cfg] file instead? It seems like it should, from your changes; just checking that I understand what the changes permit.

@AndydeCleyre
Contributor

AndydeCleyre commented Feb 16, 2022

@jammie19 If I understand the question, then yes.

Here's behavior on master, using the pip-tools repo itself as the setup.py package to test:

$ pip-compile setup.py -U 2>&1 | grep tomli
tomli==2.0.1
$ pip-compile setup.py -U -P 'tomli<2.0.1' 2>&1 | grep tomli
tomli==2.0.1

And here's the result with #1578:

$ pip-compile setup.py -U 2>&1 | grep tomli
tomli==2.0.1
$ pip-compile setup.py -U -P 'tomli<2.0.1' 2>&1 | grep tomli
tomli==2.0.0

EDIT: I didn't note before, but tomli is not a direct dep; it's a dep of direct dep pep517.

@jamescw19
Author

jamescw19 commented Feb 17, 2022

Thanks for clarifying, that looks great!

@tboddyspargo

I hope I'm not misreading this issue, but I believe I am experiencing a negative effect of the same (or similar) behavior described above. I'd really appreciate it if you could confirm or correct my expectations in this situation. I'm also happy to open a separate issue if this isn't the right place for it.

I have

# test.in
azure-storage-blob
spacy~=3.2.0

I run the command:

pip-compile --upgrade -P "azure-core~=1.22.0" test.in

Actual Output

Could not find a version that matches typing-extensions<4.0.0.0,>=3.6.4,>=3.7.4,>=3.7.4.1,>=3.7.4.3,>=4.0.1 (from spacy==3.2.3->-r test.in (line 2))
Tried: 3.6.2, 3.6.2, 3.6.2.1, 3.6.2.1, 3.6.5, 3.6.5, 3.6.6, 3.6.6, 3.7.2, 3.7.2, 3.7.4, 3.7.4, 3.7.4.1, 3.7.4.1, 3.7.4.2, 3.7.4.2, 3.7.4.3, 3.7.4.3, 3.10.0.0, 3.10.0.0, 3.10.0.1, 3.10.0.1, 3.10.0.2, 3.10.0.2, 4.0.0, 4.0.0, 4.0.1, 4.0.1, 4.1.0, 4.1.0, 4.1.1, 4.1.1
There are incompatible versions in the resolved dependencies:
  typing-extensions>=4.0.1 (from azure-core==1.23.0->azure-storage-blob==12.9.0->-r test.in (line 1))
  typing-extensions>=3.6.4 (from catalogue==2.0.6->spacy==3.2.3->-r test.in (line 2))
  typing-extensions>=3.7.4.3 (from pydantic==1.8.2->spacy==3.2.3->-r test.in (line 2))
  typing-extensions<4.0.0.0,>=3.7.4 (from spacy==3.2.3->-r test.in (line 2))
  typing-extensions<4.0.0.0,>=3.7.4.1 (from thinc==8.0.13->spacy==3.2.3->-r test.in (line 2))

Expected Output

First, I'm surprised that pip-compile doesn't identify lower versions of azure-core and azure-storage-blob (given that there's no version restriction on them) and check their dependencies against the other constraints.

  • Is it wrong of me to expect it to find the highest version of all dependencies (regardless of "degree of directness") that satisfies all version constraints even if that means a direct dependency won't be at the latest version?
  • Is it basically saying that azure-storage-blob doesn't have a restriction on typing-extensions, therefore it won't check the validity of a lower version of azure-storage-blob even if one of its dependencies (azure-core) has a conflicting requirement on typing-extensions? Is that expected?

Finally, I expected -P "azure-core~=1.22.0" to correctly constrain the azure-core version, which would allow pip-compile to find a version of typing-extensions that satisfies all of the constraints.

@jamescw19
Author

jamescw19 commented Mar 9, 2022

@tboddyspargo Your final point, that -P "azure-core~=1.22.0" combined with -U should constrain azure-core while the other dependencies are upgraded, is exactly the issue highlighted in this thread; #1578 should fix this behaviour.

The main issue you're facing, the inability to resolve typing-extensions, is due to the new pip resolver. It should be fixed by #1539, from what I understand of that PR.

@tboddyspargo

@jammie19 - thank you very much for clarifying for me! I'm so grateful for everyone's time and expertise!
