cargo publish times out waiting for crate to be in index with sparse-registry with JFrog #11314

Closed
yuvalojfrog opened this issue Oct 31, 2022 · 7 comments · Fixed by #11356

@yuvalojfrog

Problem

At JFrog we have an automation that verifies cargo's sparse registry support using the nightly image.
As soon as cargo 1.66.0-nightly (7e484fc 2022-10-27) was released, we began to notice that publishing was unstable and sometimes took quite a while, which caused timeouts in our tests. Have there been any recent changes to publish that might lead to performance issues?

Steps

No response

Possible Solution(s)

No response

Notes

No response

Version

No response

yuvalojfrog added the C-bug (Category: bug) label on Oct 31, 2022
@epage
Contributor

epage commented Oct 31, 2022

Could you clarify what you mean by unstable?

Could you provide the output?

A likely candidate is our new blocking feature. Any idea why this would be incompatible with JFrog?

@yuvalojfrog
Author

Thank you so much for responding so quickly.
In response to your link, we added timeout = 0 with -Zpublish-timeout and it seems to have resolved the issue.
When we publish two different versions one after the other, the second publish fails most of the time.
Is this expected, or is it an issue that will be fixed before the sparse index feature is stabilized?
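
Concretely, the workaround we applied looks roughly like this (a sketch of our setup; the key is gated behind the unstable flag):

```toml
# .cargo/config.toml
[publish]
timeout = 0   # 0 disables the wait for the crate to appear in the index
```

and we publish with `cargo +nightly publish -Zpublish-timeout -Zsparse-registry`.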

@epage
Contributor

epage commented Oct 31, 2022

The blocking feature is independent of the index protocol used and is expected to stabilize in 1.66. Ideally, we can root-cause this quickly to know whether there is a fundamental bug in it.

There are several parts to where this could be going wrong:

  • publishing the crate, client side
  • publishing the crate, server side
  • blocking on the update / updating the index

Can you provide reproduction steps and verbose cargo publish output (even better, with CARGO_LOG configured; see the example below)?

Can this be reproduced with your git registry support, or with crates.io using either protocol?
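
For the verbose run, something along these lines should capture it (the log filter is only a suggestion; a plain `CARGO_LOG=trace` works too, just noisier):

```console
$ CARGO_LOG=cargo::ops::registry=trace cargo +nightly publish -vv -Z sparse-registry 2> cargo-publish.log
```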

@yuvalojfrog
Author

We can't reproduce it with the git registry.
Here is the cargo log: CARGOLOG.txt

@epage
Contributor

epage commented Nov 2, 2022

I've tried creating various test cases to reproduce this but haven't been able to so far.

There is a chance there is a bug in JFrog's backend. Could you do some debugging on this to help figure out whether it's on cargo's or JFrog's end?
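
One way to narrow it down (the URL below is a placeholder; adjust it to wherever your JFrog registry serves its sparse index) is to publish and then poll the crate's index file directly with curl, since the new wait-for-publish step is essentially re-fetching that file until the new version shows up:

```console
$ # Sparse index path for a crate named "my-crate": /<first two chars>/<next two chars>/<name>
$ curl -s https://registry.example.com/index/my/-c/my-crate | tail -n 1
$ # Each line of the response is a JSON record for one published version. If the new version
$ # takes a long time to appear here after the upload finishes, the delay is on the registry
$ # side; if it appears immediately but cargo still blocks, the problem is likely on cargo's side.
```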

epage changed the title from "cargo publish -Z sparse-registry is unstable" to "cargo publish times out waiting for crate to be in index with sparse-registry with JFrog" on Nov 2, 2022
@yuvalojfrog
Author

We ruled out the possibility that it was a bug on our side for a number of reasons:

  1. We haven't made any changes recently.
  2. We are also seeing the same errors in previous versions of our product, versions that have worked until now.
  3. With an earlier nightly version of cargo we see no problems.
  4. After adding the suggested workaround, everything works.

@epage
Contributor

epage commented Nov 3, 2022

That just tells us that the new cargo publish behavior is exhibiting a bug on one of the two sides. It doesn't tell us which side the bug is on. This could be exposing a bug within the registry that wasn't as noticeable previously. So far, with our test HTTP registry, I am not seeing the problem. As I don't have access to JFrog's registry, I can't debug which side it's on.

bors added a commit that referenced this issue Nov 4, 2022
test(publish): Cover more wait-for-publish cases

These came from trying to guess what cases are causing problems in #11314.  Unfortunately, can't reproduce it so far but figured it'd be good to keep these around.
bors closed this as completed in 0756938 on Nov 10, 2022