
[bug] Private dependencies and disabled/inaccessible remotes. #6854

Closed · awgil opened this issue Apr 14, 2020 · 6 comments

awgil commented Apr 14, 2020

Consider an app -> lib -> sublib dependency graph, where lib -> sublib is a private dependency. Both the lib and sublib packages are created and uploaded to some (private) remote Artifactory instance.
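
For reference, a minimal sketch of how the key part of this graph is declared (Conan 1.x syntax; the package names and versions here are made up for illustration):

```python
from conans import ConanFile  # Conan 1.x import style

class LibConan(ConanFile):
    name = "lib"
    version = "1.0"

    def requirements(self):
        # private=True: sublib is an implementation detail of lib, so its
        # dependency information is not propagated to lib's consumers
        # (such as app); only lib itself links against it.
        self.requires("sublib/1.0", private=True)
```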

On a fresh PC, when you install the dependencies of app (conan install path-to-app-recipe), conan will download the recipe and the package for lib, and the recipe only for sublib. At this point, imagine the remote becomes inaccessible for some reason (e.g. the user goes offline).

If the user now tries to install the dependencies of app again, it will fail. The problem can be traced to the GraphBinariesAnalyzer::_evaluate_remote_pkg() function: it attempts to connect to the remote, gets a Forbidden exception, prints a misleading error message (it seems to imply it tried to download the binary rather than just query its info!) and rethrows.

  1. If the user now disables the remote, a different error is shown, with an even more unhelpful message saying that the particular remote is disabled, without mentioning which package it is trying to read data for.
  2. If the user now removes the remote, everything succeeds (the package is marked as missing, but since it is a private dependency, it is not needed after all).
  3. Note that even if the remote is accessible, the user pays the performance cost of retrieving the binary package info for each indirect private dependency from the remote every time they run an install. This doesn't seem right: one would assume that after a single install, the local cache contains everything needed for fully offline work.

Environment Details (include every applicable attribute)

  • Operating System+version: Windows 10
  • Compiler+version: VS 2019
  • Conan version: 1.24.0
  • Python version: 3.7.6

Steps to reproduce (Include if Applicable)

  1. Create sublib recipe, create lib recipe with private dependency on sublib, create app recipe with dependency on lib.
  2. Create and upload lib & sublib packages to private server.
  3. Clean local cache.
  4. Run conan install on app recipe.
  5. Make remote inaccessible (shut down server for example).
  6. Run conan install on the app recipe and observe the first error described above (the Forbidden exception).
  7. Disable remote.
  8. Run conan install on the app recipe and observe the second error described above (the disabled-remote message).
  9. Remove remote.
  10. Run conan install on app recipe and observe success.
memsharded (Member) commented

Hi @awgil

Some quick feedback:

> Consider an app -> lib -> sublib dependency graph, where lib -> sublib is a private dependency. Both the lib and sublib packages are created and uploaded to some (private) remote Artifactory instance.

If you are talking about a self.requires(.., private=True) dependency, please take into account that this type of dependency is not recommended for most cases. It should be used only in extreme circumstances, and only if there is no other way.

Regarding the issue, it doesn't seem to be a bug; it is by design. private dependencies do not mean that the recipe is not fetched or that it is not part of the graph. Only the download of the binary is skipped; the recipe really needs to be retrieved.

I would strongly recommend using regular requires dependencies.

awgil (Author) commented Apr 15, 2020

I understand that the recipe has to be fetched - no problem with that. The problem is that while the binary package is skipped, conan still asks the remote for the "package info" of the binary: in GraphBinariesAnalyzer::_process_node() it checks whether the binary package exists in the cache; if not, it calls _evaluate_remote_pkg, which fails if the remote is inaccessible. If I remove the remote completely, it succeeds (returning BINARY_MISSING, which is not a problem, since the binary won't be needed after all). The resulting behaviour is at least confusing: I have to remove (not even just disable) an inaccessible remote to be able to complete the install successfully.
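
To make the described flow concrete, a self-contained sketch of that logic (this is NOT the actual Conan source; it only mirrors the names above):

```python
# Illustrative sketch of GraphBinariesAnalyzer behaviour as reported.
BINARY_CACHE, BINARY_DOWNLOAD, BINARY_MISSING = "Cache", "Download", "Missing"

class RemoteUnreachableError(Exception):
    """Raised when a configured remote cannot be queried."""

def evaluate_node(package_id, local_cache, remotes):
    """Mimics _process_node(): decide where a binary would come from."""
    if package_id in local_cache:
        return BINARY_CACHE
    if not remotes:
        # No remotes configured at all: the binary is simply marked
        # missing, which is harmless for a skipped private dependency.
        # This is why *removing* the remote makes the install succeed.
        return BINARY_MISSING
    for remote in remotes:
        # Mimics _evaluate_remote_pkg(): the remote is queried for the
        # binary's info even though, for a private dependency, the binary
        # itself would never be downloaded. An unreachable remote raises
        # here and the whole install fails.
        if not remote["reachable"]:
            raise RemoteUnreachableError("cannot query remote %r" % remote["name"])
        if package_id in remote["packages"]:
            return BINARY_DOWNLOAD
    return BINARY_MISSING

# evaluate_node("sublib:pkgid", set(), [])  -> "Missing" (install proceeds)
# evaluate_node("sublib:pkgid", set(),
#     [{"name": "r1", "reachable": False, "packages": set()}])  -> raises
```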

About not using private dependencies - how would I model private dependencies then? Say, a static library that gets linked into a shared library and is not visible in its API - I definitely don't want consumers of the shared library to implicitly get access to, and link with, the libraries used as implementation details. That's the same reason we don't make all cmake target properties public and instead carefully distinguish between private and public ones.

One more motivating example: consider a tool like https://renderdoc.org/ or https://bitbucket.org/wolfpld/tracy/src/master/. These are debuggers/profilers, consisting of a small client library that can be integrated into the application we're developing, and a large GUI analyzer application. We want the library and application versions to always match, and we want to import the analyzer app executables into the build tree, so that developers can easily use the correct version to analyze the traces they gather. The small client library has no dependencies, but the GUI application requires large UI frameworks to be built. If we build both parts of the tool using conan (useful if we make private patches to the tool) rather than packaging prebuilt binaries, the recipe will have to mention the UI frameworks as requirements - but we definitely don't want these requirements to be propagated to the users of the tool.

memsharded (Member) commented

> I understand that the recipe has to be fetched - no problem with that. The problem is that while the binary package is skipped, conan still asks the remote for the "package info" of the binary: in GraphBinariesAnalyzer::_process_node() it checks whether the binary package exists in the cache; if not, it calls _evaluate_remote_pkg, which fails if the remote is inaccessible. If I remove the remote completely, it succeeds (returning BINARY_MISSING, which is not a problem, since the binary won't be needed after all). The resulting behaviour is at least confusing: I have to remove (not even just disable) an inaccessible remote to be able to complete the install successfully.

Ok, we need to have a look at this then. Thanks for the detailed explanation!

> About not using private dependencies - how would I model private dependencies then? Say, a static library that gets linked into a shared library and is not visible in its API - I definitely don't want consumers of the shared library to implicitly get access to, and link with, the libraries used as implementation details. That's the same reason we don't make all cmake target properties public and instead carefully distinguish between private and public ones.

Yes, this is something that is planned: a better graph model to represent those things. But in the meantime, I would recommend using normal requires and managing things in the build scripts. In most cases, the linker is smart enough to optimize out the unused libraries. If not, the information from the generated cmake files can be manipulated and controlled. That might be less problematic than using private dependencies.

> We want the library and application versions to always match, and we want to import the analyzer app executables into the build tree, so that developers can easily use the correct version to analyze the traces they gather.

Just in case: private dependencies do not guarantee that the same version is used - actually the opposite. They do not guarantee that the versions match, because they were explicitly designed to support the case of different versions. You can depend on libA/1.2.3 in one private requirement and on libA/2.3.4 in a different private requirement of the same dependency graph.

> One more motivating example: consider a tool like https://renderdoc.org/ or https://bitbucket.org/wolfpld/tracy/src/master/. These are debuggers/profilers, consisting of a small client library that can be integrated into the application we're developing, and a large GUI analyzer application.

For debuggers/profilers, I would recommend build_requires instead. The main purpose of build_requires is to use tools (not libraries). They can even be defined in profiles, and they can be installed separately as well with the virtualenv generators.
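
For instance, a minimal sketch of the profile-based approach mentioned above (Conan 1.x profile syntax; the tool reference is hypothetical):

```ini
# myprofile -- injects the profiler tooling into every build that uses it
include(default)

[build_requires]
tracy/0.7.8
```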

awgil (Author) commented Apr 15, 2020

> Just in case: private dependencies do not guarantee that the same version is used - actually the opposite. They do not guarantee that the versions match, because they were explicitly designed to support the case of different versions. You can depend on libA/1.2.3 in one private requirement and on libA/2.3.4 in a different private requirement of the same dependency graph.

Yes, I understand that; what I meant is that myapp depends on tool publicly, and tool depends on large-ui-framework privately, so that this dependency is not propagated to myapp (in fact, if myapp ever uses large-ui-framework itself, it can use a different version, which is great!).

By the way, what is the difference between requires(private=True) and build_requires? I assume "a static library linked into a shared library" is better modeled by requires(private=True), is that right?
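
(For illustration, a rough sketch of the distinction being asked about, in Conan 1.x syntax; all package names are hypothetical:)

```python
from conans import ConanFile

class MySharedLibConan(ConanFile):
    name = "mysharedlib"
    version = "1.0"

    def requirements(self):
        # requires(private=True): a host library that gets linked into
        # this package's binaries but whose dependency information is
        # hidden from consumers of mysharedlib.
        self.requires("implstatic/1.0", private=True)

    def build_requirements(self):
        # build_requires: a tool that only runs on the build machine
        # during the build; it never links into the resulting binaries
        # and is not propagated to consumers at all.
        self.build_requires("codegen/2.0")
```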

schwaerz commented

We have a similar use case. I was quite sure that this would already be available in Conan somehow, but it looks like I will have to model it myself?

  • We build different tools together, using some Conan packages (static libraries) as dependencies.
  • Finally, we want to deliver those tools as a Conan package (to be used with a tool_requires).
  • What we currently do:
    • We use requires to collect all the tools that have to be installed on a machine. Since the package collecting all the tools shall just be a wrapper, it shall not include anything by itself - and build_requires would not support that. (A sketch of such a wrapper follows below.)
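
A rough sketch of such a wrapper recipe (Conan 1.x syntax; names are hypothetical):

```python
from conans import ConanFile

class ToolsBundleConan(ConanFile):
    # A pure wrapper package: it builds and packages nothing itself, and
    # only aggregates the individual tool packages via regular requires,
    # so consumers can pull everything in with a single reference.
    name = "tools-bundle"
    version = "1.0"
    requires = ("tool-a/1.0", "tool-b/1.0")
```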

Unfortunately, even with private dependencies in the tools themselves, this means that the users of our tools will have to have access to the development Conan repositories (and even call the configure methods, which is what actually creates the problem for us), even though that would not be required at all.

memsharded (Member) commented

Let me clarify, because this issue is a bit outdated and Conan 2.0 is imminent:

Conan 2.0 does the following:

  • The graphs, by construction, will always be complete, including all dependencies and all tool_requires/build_requires, at the recipe level. That means it will need the recipes of all dependencies and transitive dependencies to be able to build a graph.
  • Conan 2.0 has a better graph model, so even though it needs the recipes, in many cases it will be able to optimize the fetching/download of many binaries. For example, the binaries of all transitive dependencies of an application, or of a shared library that links with static libraries, will be unnecessary and will be skipped.
  • The full dependency graph is always necessary because, even if the binaries can be skipped, the version and revision information is necessary for the correct computation of the package_id, and for knowing with more safety what needs to be rebuilt from sources and what doesn't. The package_id model in Conan 2.0 is more advanced, complete, flexible, safe, and optimal than the model in 1.X.

That means that the only way to make some dependencies fully invisible or non-existent for the consumers of a package is to actually re-package it into a new one, decoupling the dependencies. Conan 2.0 brings some interesting capabilities for this kind of task:

  • The new deployers might be very useful to collect the necessary things from the dependencies that need to be re-packaged.
  • The new custom commands, together with the new Python API, will allow for much higher levels of automation, integrated within Conan. Defining some conventions for what to repackage and creating a custom repackage command or something like that should be very doable (see the sketch below).
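
As a rough illustration, a sketch of what such a custom command could look like (the repackage command and its behaviour are hypothetical; the @conan_command decorator is the real Conan 2.0 extension entry point):

```python
from conan.api.output import cli_out_write
from conan.cli.command import conan_command

@conan_command(group="Extension")
def repackage(conan_api, parser, *args):
    """
    Hypothetical custom command: load the dependency graph of a recipe and
    report what would be collected into a single, decoupled package.
    """
    parser.add_argument("path", help="Path to a conanfile.py")
    args = parser.parse_args(*args)
    # A real implementation would use the Python API (conan_api) to compute
    # the graph, copy the needed files from each dependency's package
    # folder, and export/create the new re-packaged recipe.
    cli_out_write(f"Would repackage the dependencies of {args.path}")
```

Dropped into the extensions/commands folder of the Conan home, a command like this would be invoked as conan repackage <path>.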

Repackaging is a concept around which we will build examples, docs and maybe even more tooling in the 2.X roadmap, some time after 2.0 is stabilized, but it will be a new concept. The main dependency graph of Conan 2.0 doesn't allow completely hiding the recipes of transitive dependencies, so I think we can close this ticket as won't fix. Please follow the 2.X roadmap; I have just created a dedicated ticket to centralize feedback in #13171.

Thanks all for the feedback!

memsharded closed this as not planned on Feb 20, 2023