Disregard existing pins in the output which can't be found by pip, rather than failing to compile #1531
Conversation
```diff
-else:
-    return self.repository.find_best_match(ireq, prereleases)
+with suppress(NoCandidateFound):
+    return self.repository.find_best_match(existing_pin, prereleases)
```
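The fallback pattern in that diff can be tried out in isolation: `contextlib.suppress` swallows the `NoCandidateFound` raised for a stale pin, so control falls through to a fresh lookup instead of a hard failure. A minimal sketch with stand-in classes, not pip-tools' actual repository API:

```python
from contextlib import suppress


class NoCandidateFound(Exception):
    """Stand-in for pip-tools' exception: no release satisfies the requirement."""


class FakeRepository:
    """Stand-in repository that only knows the releases listed in `index`."""

    def __init__(self, index):
        self.index = index  # e.g. {"foo": ["1.0", "2.0"]}

    def find_best_match(self, name, version=None):
        versions = self.index.get(name, [])
        if version is not None:
            if version not in versions:
                raise NoCandidateFound(f"{name}=={version}")
            return f"{name}=={version}"
        if not versions:
            raise NoCandidateFound(name)
        return f"{name}=={max(versions)}"


def resolve(repo, name, existing_pin=None):
    # Prefer the existing pin, but fall through silently if the index
    # no longer carries that version (e.g. the release was removed).
    if existing_pin is not None:
        with suppress(NoCandidateFound):
            return repo.find_best_match(name, existing_pin)
    return repo.find_best_match(name)


repo = FakeRepository({"foo": ["1.0", "2.0"]})
print(resolve(repo, "foo", existing_pin="1.0"))  # → foo==1.0 (pin still exists)
print(resolve(repo, "foo", existing_pin="9.9"))  # → foo==2.0 (stale pin disregarded)
```

Without the `suppress`, the second call would propagate `NoCandidateFound` and abort the whole compile, which is exactly the failure mode this PR targets.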
BTW we can also suppress yanked releases if `best_candidate.link.is_yanked` is True. What do you think?
Also I think it would be nice to print a warning here that the existing pin is not valid / no longer exists and that pip-compile discarded it.
Yes, thanks, we shouldn't use yanked releases -- I didn't realize `find_best_match` would return them. I tested with a found yanked release (input: `gssapi`; output: `gssapi==1.6.6`), and unfortunately in this case the `link` attribute of the `InstallRequirement` candidate, as returned by `find_best_match`, is `None`, so we can't immediately check `candidate.link.is_yanked`.
It's been just over a year since I submitted this PR, and I don't readily recall how we mutate `InstallRequirement`s to fill the attributes with useful info, if that's what's needed.
I also can't find any existing explicit mention/handling/testing of yanked releases in the codebase.
And yes, once we figure out the yanked business, I'll add a logged warning about discarded matches, thanks!
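Until the `link` question is sorted out, a defensive check sidesteps the `None` case while still rejecting yanked releases, and it is a natural place for the discarded-match warning mentioned above. A minimal sketch with stand-in objects, not pip's actual candidate classes:

```python
import logging
from types import SimpleNamespace

log = logging.getLogger(__name__)


def is_usable(candidate):
    # `link` can be None on candidates returned by find_best_match,
    # so guard before touching `is_yanked` (stand-in objects below).
    link = getattr(candidate, "link", None)
    if link is not None and link.is_yanked:
        log.warning("Discarding yanked candidate %s", candidate)
        return False
    return True


normal = SimpleNamespace(link=SimpleNamespace(is_yanked=False))
yanked = SimpleNamespace(link=SimpleNamespace(is_yanked=True))
linkless = SimpleNamespace(link=None)

print([is_usable(c) for c in (normal, yanked, linkless)])  # → [True, False, True]
```

Note the sketch deliberately treats a link-less candidate as usable; rejecting it outright would re-introduce the hard failure this PR is trying to avoid.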
Something like

```diff
diff --git piptools/repositories/pypi.py piptools/repositories/pypi.py
index 41230f6..5cae15b 100644
--- piptools/repositories/pypi.py
+++ piptools/repositories/pypi.py
@@ -142,9 +142,11 @@ def find_best_match(
         )
         matching_candidates = list(
-            itertools.chain.from_iterable(
+            candidate
+            for candidate in itertools.chain.from_iterable(
                 candidates_by_version[ver] for ver in matching_versions
             )
+            if not candidate.link.is_yanked
         )
         if not matching_candidates:
             raise NoCandidateFound(ireq, all_candidates, self.finder)
```
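Detached from pip-tools, the filtering in that diff can be exercised with stand-in candidate objects (hypothetical data shapes; real candidates come from pip's finder and carry a `link` attribute rather than a bare `yanked` flag):

```python
import itertools
from types import SimpleNamespace

# Hypothetical per-version buckets, mimicking candidates_by_version.
candidates_by_version = {
    "1.0": [SimpleNamespace(version="1.0", yanked=False)],
    "1.1": [SimpleNamespace(version="1.1", yanked=True)],
}
matching_versions = ["1.0", "1.1"]

# Flatten the buckets and drop yanked candidates, mirroring the
# generator expression the diff adds.
matching_candidates = [
    candidate
    for candidate in itertools.chain.from_iterable(
        candidates_by_version[ver] for ver in matching_versions
    )
    if not candidate.yanked
]
print([c.version for c in matching_candidates])  # → ["1.0"]
```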
I'm now doubting that we should reject already-pinned yanked versions. Even though a package version is yanked (not deleted), a downstream project may still need that version and understand the risks, or want to take time to handle it mindfully.
Another yanked example is attrs 21.1.0; when that is already pinned during a non-upgrading compile, we get a nice informative warning:

```
WARNING: The candidate selected for download or install is a yanked version: 'attrs' candidate (version 21.1.0 at https://files.pythonhosted.org/packages/2b/00/7f74fb9608daf3626aad5de91181026a5107b997ce2788eabffeb5d41cb1/attrs-21.1.0-py2.py3-none-any.whl (from https://pypi.org/simple/attrs/) (requires-python:>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*))
Reason for being yanked: Installable but not importable on Python 3.4.
```
As an aside, do you know why running `tox -e checkqa` locally always has isort move `from build import BuildBackendException` in `piptools/scripts/compile.py`, but in CI it moves back?
Try `rm -r ./build` in the working dir. isort would treat `build` as a local module if there's a `build` dir on the Python path.
Exactly the problem, thanks!
I copied the warning that you're already using in the backtracking resolver.
About yanked releases, I'm now thinking:
- existing pins which are yanked should indeed be discarded
- yanked releases must still be valid candidates, allowing users to constrain their way into using a yanked release if they need to
- despite being "valid candidates", they must not prevail over a non-yanked alternative release satisfying the constraints
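The three points above amount to a two-pass selection: pick from the non-yanked candidates when any exist, and only fall back to yanked ones otherwise. A sketch assuming hypothetical candidate objects with `version` and `yanked` attributes (not pip's internals):

```python
from types import SimpleNamespace


def pick_best(candidates):
    """Prefer the highest non-yanked candidate; yanked releases stay
    eligible, but only win when nothing else satisfies the constraints."""
    non_yanked = [c for c in candidates if not c.yanked]
    pool = non_yanked or candidates  # fall back to yanked candidates
    if not pool:
        raise LookupError("no candidate found")
    return max(pool, key=lambda c: c.version)


candidates = [
    SimpleNamespace(version=(21, 1, 0), yanked=True),   # e.g. attrs 21.1.0
    SimpleNamespace(version=(21, 0, 0), yanked=False),
]
best = pick_best(candidates)
print(best.version, best.yanked)  # → (21, 0, 0) False: yanked 21.1.0 loses
```

If the user constrains to exactly the yanked version, the non-yanked pool is empty and the yanked release is still selectable, matching the second bullet.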
piptools/repositories/local.py (outdated)

```python
with suppress(NoCandidateFound):
    return self.repository.find_best_match(existing_pin, prereleases)
```
Also note this would probably slow down the re-compile process, because it would always hit index_urls and find_links. I'm not sure how important it is.
Hmm, well it seems to me that in order to do the "right thing" here (avoid a hard failure on bad already-compiled versions), we need to consult the index about the already-compiled versions.
The question is what's more important: "speed" or "the right thing, but in a rare case"?
I did a simple benchmark.

Before the fix:

```shell
$ echo web3 > requirements.in
# first run
$ time pip-compile
real    0m18.898s
user    0m11.924s
sys     0m0.993s
# second run
$ time pip-compile
real    0m0.589s
user    0m0.517s
sys     0m0.072s
```

After the fix:

```shell
$ echo web3 > requirements.in
# first run
$ time pip-compile
real    0m17.062s
user    0m12.388s
sys     0m0.970s
# second run
$ time pip-compile
real    0m3.631s
user    0m2.583s
sys     0m0.120s
```
As I expected, re-compile is slower after the fix (6 times slower in my benchmark) due to hitting the simple API for every compiled package.
I understand the intention to do the right thing, but the issue doesn't seem critical, and the fix slows down pip-compile on re-runs, which might not be good. Also, the legacy resolver will be removed soon, so spending much energy on it might not be worth it.
What do you think if we only merge the test which I find pretty helpful?
Thanks for running the numbers!
> What do you think if we only merge the test which I find pretty helpful?
Sure, sounds good to me.
Thank you! Would you like to open a separate PR?
Sure, let's continue at #1774
Fixes jazzband#1530, the following case: when a needed package is already pinned in the output file but has an invalid or at least unavailable version there, the compilation will fail. The logic change also doesn't bother building an ireq when we've got one already, from finding a match.
…(legacy resolver)
Before running `pip-compile`, the output file may contain a pinned requirement which can't be found in PyPI or whichever repo (e.g. a revoked release). This change aims to disregard these un-find-able pins, rather than cause the compile operation to fail.

Fixes #1530
Contributor checklist

Maintainer checklist
- Label with one of `backwards incompatible`, `feature`, `enhancement`, `deprecation`, `bug`, `dependency`, `docs` or `skip-changelog`, as they determine changelog listing.