
Proposal to simplify the local development flow: conan install + use your build system #19

Merged
merged 5 commits on Feb 26, 2021

Conversation

memsharded
Member

The local development flow in Conan 1.X makes it very challenging to do a developer build that is identical to the one created by conan create, because the build helpers in the build() method have a lot of logic and dynamically inject a lot of information into the build. The proposal is:

  • The main local developer flow would be a conan install, and then use the normal build tools. The recipe generate() method should write files to disk containing all the information necessary to build, in the form of toolchains, find_packages, or environment definition, that can be used directly by developers.
  • The conan package command can be removed, as it does not provide any value that conan export-pkg cannot.
  • The files generated at conan install time: conanbuildinfo.txt, conaninfo.txt, graph_info.json will be removed, as their only use case is to save state so the conan build command can partially recover information.
  • The conan build command might disappear if not relevant anymore. If it is maintained, it will be stateless, executing first a conan install and computing the full dependency graph, so it doesn't depend on any locally saved state, and will be less confusing from a UX point of view.

  • Upvote 👍 or downvote 👎 to show acceptance or rejection of the proposal (other reactions will be ignored)
    • Please, use 👀 to acknowledge you've read it, but it doesn't affect your workflow
  • Comments and reviews to suggest changes to all (or part) of the proposal.
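
As a rough illustration of the intended developer flow (a sketch only; the exact file names and command flags here are assumptions based on the ongoing toolchain work, not part of the proposal text):

$ conan install . --profile=myprofile   # generate() writes conan_toolchain.cmake, find_package files, env scripts
$ cmake . -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake
$ cmake --build .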

@michaelmaguire

We work with several internal teams, each of which chooses to use Conan in a different way. conan build was a very useful abstraction that allowed us to know right away how to build a package without needing to spend time understanding what the appropriate normal build tools would be and how to invoke them.

I feel it would be a great loss if the conan install then conan build workflow disappeared.

@datalogics-kam

datalogics-kam commented Feb 9, 2021

I'm not sure about removing conan build. We use that a lot in the invoke tasks we've built around Conan:

  • We isolated all our building/testing in the conanfile.py so that building in development is the same as making a package.
  • We have tasks cmake, build, and test that essentially do one phase of building by calling the internal conan build mechanism and using without to turn off the steps we aren't doing.
  • We really rely heavily on the fact that running a method in conanfile.py has all the information set up from the install folder, including the PATH to tools, other environment variables (CMAKE_ROOT for instance, stuff for Doxygen), options, and the like.
  • In places where we have unusual builds (SCons...) or testing (strange custom stuff from 2007 based on Python's nose), that, too, is isolated in the conanfile.py, so that it's abstracted.
  • This all wraps up into a one-line build, invoke bootstrap build test, where bootstrap expands to invoke conan.install-config conan.login conan.install cmake. This works on every project, whether it's CMake-based or SCons-based.

Losing conan build or making it slow would have a large impact on us; we'd have to retool a lot of things, and maybe lose some abstractions, or have to recreate them ourselves.
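
For context, the kind of wrapper described above might look roughly like this (a hypothetical sketch using the invoke library; the task and flag names are made up, not the commenter's actual code):

from invoke import task

@task
def build(c, build_folder="build"):
    # Delegate the build to Conan so the recipe's build() logic is reused,
    # instead of duplicating build-system knowledge in the task itself.
    c.run(f"conan build . --build-folder={build_folder}")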



## Migration notes
These commands are not frequently used in automated or CI flows; they are mostly developer-side commands. No migration issues expected.


Every CI workflow at our establishment is using conan build. Not every CI flow in a Conan application produces a package; remember there are projects that can be pure consumers.

Member Author


Good point, conan build might be maintained if there is enough support, not a problem. Just that it will operate slightly differently regarding the saved state and the arguments. I would like to understand, if the generate() abstraction is good enough, wouldn't it be interesting to have a "native build" calling CMake or Visual or whatever directly instead of conan build?



Just that it will operate slightly differently regarding the saved state and the arguments.

Could I still give it an install folder to get the state from?

wouldn't it be interesting to have a "native build" calling CMake or Visual or whatever directly instead of conan build?

In some cases, but then it changes from

invoke build

to

  • If you're on this project then use scons (ETA: We've got a proposal to get that project out of SCons and into CMake; it's just a matter of when)
  • And if you're on this project, you can use cmake --build
  • But if you want to be more direct, then you need to figure out if the underlying CMake generator was Unix Makefiles or Xcode or Visual Studio
  • ...

I mean, we have some people who'll reach for an IDE and know they can find the project in the build folder, and some people who just hit the text files with an editor, and then know they can invoke build.



We are also using conan build in our CI when testing that packages can be built correctly on PRs, without polluting the CI server's cache. I find it much more elegant than using native tools, as it makes the CI script much simpler and hides the complexity of using a special build tool for each specific platform. We abstract away the builds for iOS/Android/Windows/... in a Conan package that all our packages python_requires. This also gives us the ability to have versioning of the build procedure (i.e. old versions supported only iOS device builds, new ones also support Mac Catalyst builds). If that logic lived in our CI script, we would be at risk of using an incompatible build procedure (e.g. the new Mac Catalyst build) on an older version of the package, yielding incomprehensible errors.

Member Author


Good, I see the case, and how it can be convenient.

Could I still give it an install folder to get the state from?

This is exactly what we are trying to get rid of. That state is really problematic, always buggy and incomplete; it is really a nightmare. The proposal is to pass conan build the same arguments you would pass to produce that state, like you do with conan install, mostly the --profile. Just use the same arguments as conan install, or skip conan install entirely, as it can already be called by conan build, which will expand the graph anyway to recover the state.

The performance penalty should be very small; our benchmark for large graphs with hundreds of deps (when already installed in the cache) is a few seconds, which is typically much lower than the build itself, and negligible in CI time.


dheater commented Feb 10, 2021


I like the idea of skipping conan install and just using conan build 👍 as long as conan build is kept.

@memsharded
Member Author

Please @michaelmaguire @datalogics-kam read carefully the proposal, it is not saying the conan build will be removed. It says that it might be maintained if necessary, and it outlines how it will operate if it is maintained. Please let me know if your vote will be an upvote if the proposal is updated to reflect that conan build will be maintained. What is very important is the new integration model in which most of the information must be generated by generate() and not injected dynamically by build() method, because this is the model of the new toolchains and build systems integration, that is already being implemented in Conan 1.X (MSBuildDeps, MSBuildToolchain, CMakeDeps, CMakeToolchain, etc).

@michaelmaguire

michaelmaguire commented Feb 9, 2021

Please @michaelmaguire @datalogics-kam read carefully the proposal, it is not saying the conan build will be removed. It says that it might be maintained if necessary, and it outlines how it will operate if it is maintained. Please let me know if your vote will be an upvote if the proposal is updated to reflect that conan build will be maintained. What is very important is the new integration model in which most of the information must be generated by generate() and not injected dynamically by build() method, because this is the model of the new toolchains and build systems integration, that is already being implemented in Conan 1.X (MSBuildDeps, MSBuildToolchain, CMakeDeps, CMakeToolchain, etc).

The proposal originally said:

The conan build command might disappear

which I downvoted.

I would be happy to change to an upvote if the proposal is changed to state that conan build would be maintained, albeit in the newly suggested "static" form where most work is done in the generate() step.

@datalogics-kam

@memsharded I probably need more time to think about it. Since this is a proposal, I don't really have a prototype to try out, which might alleviate my concerns. If we can guarantee that conan build remains without a significant performance penalty, then I ought to be able to support it.

@a4z

a4z commented Feb 9, 2021

When I started with Conan I also used build/install/package, because it mapped to what I knew from rpm and other package format creation. Meanwhile I have no use for them anymore, so I am fine with it.

@DoDoENT

DoDoENT commented Feb 9, 2021

Although I would be OK with the changed behaviour of conan build, as described in the proposal, I would actually miss the conan package command - it's really useful for testing the package function and checking if the package layout is as expected, without polluting the local cache - this is very important, especially on CI servers.

The only problem with the current implementation of conan package is that it does not calculate the package_id, so we need to call conan export-pkg to do that, and this then copies the package into the local cache, which is something we don't want - we want to be able to perform conan test on a package without exporting it to the local cache.

Currently, our developers manually clean the local cache after developing and testing the packaging procedure, and our CI server has a very ugly hack to ensure that temporary packages exported while building pull requests are not retained in the cache for other builds, so that the current build does not affect other builds being performed on the same server in parallel.

Here are the excerpts from our Jenkins pipeline scripts:

This part tests that the package can be built without polluting the Conan cache at all:

script.dir( 'build-conan-package' ) {
    script.withEnv(environment) {
        script.common.executeCmd( "${scriptPrefix} conan install ../${repository}/${packageSubfolder}/${conanfileName} -s build_type=${buildTypes[buildTypeIndex]} ${runtime} ${profileCmd} ${packageConfiguration} ${buildMissing}" )

        // upload any packages that were built by-dependency during `conan install` invocation
        script.commonConanBuild.uploadAllBuiltPackages()
        // clean any build directories
        script.commonConanBuild.cleanTemporaries()

        if ( requiresConanSourceInvocation ) {
            script.dir( "../${repository}/${packageSubfolder}" ) {
                script.common.executeCmd( "conan source ./${conanfileName}" )
            }
        }
        script.common.executeCmd( "${scriptPrefix} conan build ../${repository}/${packageSubfolder}/${conanfileName} --source-folder=../${repository}/${packageSubfolder} --build-folder=. --package-folder=./conan-package" )
        script.common.executeCmd( "${scriptPrefix} conan package ../${repository}/${packageSubfolder}/${conanfileName} --source-folder=../${repository}/${packageSubfolder} --build-folder=. --package-folder=./conan-package" )
    }
}

Next, since conan test does not work with the result of conan package, we need to do this very ugly stuff:

script.withEnv(environment) {
    def pkgTestError = null

    // conan export-pkg requires git revision because of scm=auto in conan packages - create a dummy git repository
    // PR builds work with source only, the .git folder is not stashed during checkout as it's large and not required until
    // we are actually on branch from which Jenkins should make the release
    // this hack is here only because we want to use scm=auto when exporting to local cache, but here we actually
    // don't want to export the package to the cache - we just want to `conan test` it
    if ( !needCleanupFakeGit ) {
        script.dir( repository ) {
            script.common.executeCmd( 'git init' )
            script.common.executeCmd( 'git add Jenkinsfile' )
            script.common.executeCmd( 'git commit -m "dummy"' )
            script.common.executeCmd( 'git remote add origin dummy' )
        }
        needCleanupFakeGit = true
    }

    def tempPackageName = "${packageName}/${packageVersion}@${conanUsername}/executor${script.env.EXECUTOR_NUMBER}"

    script.common.executeCmd( "conan export-pkg ./${repository}/${packageSubfolder}/${conanfileName} ${tempPackageName} --package-folder=./build-conan-package/conan-package --force -s build_type=${buildTypes[buildTypeIndex]} ${runtime} ${profileCmd} ${packageConfiguration}" )

    if ( !packageRegisteredForCleanup ) {
        packagesForCleanup << tempPackageName
        packageRegisteredForCleanup = true
    }

    // this is to avoid conan error if conan test package does not have exactly the same options as the package being tested
    // why does conan even support options on test-package? It makes no sense to have different package options on
    // package being tested and test package...
    def testPackageConfiguration = packageConfiguration.replaceAll( /-o\s*(\w+(?:=|\s+)[\w'"-]+)/, "-o ${packageName}:\$1" )

    def conanTestCommand = "${scriptPrefix} conan test ./${testPackageFolder} ${packageName}/${packageVersion}@${conanUsername}/executor${script.env.EXECUTOR_NUMBER} -s build_type=${buildTypes[buildTypeIndex]} ${runtime} ${profileCmd} ${testPackageConfiguration} ${buildMissing} --test-build-folder=./test-conan-package"

    try {
        if ( profile.contains( 'ios' ) ) {
            // unlock keychain for application signing on ios node
            script.withCredentials([script.usernamePassword(credentialsId: script.credentials.getIosKeychainCredentialsID(), passwordVariable: 'KEYCHAIN_PASSWORD', usernameVariable: 'KEYCHAIN_USER')]) {
                script.sh "security unlock-keychain -p ${script.env.KEYCHAIN_PASSWORD} ${script.env.KEYCHAIN} && ${conanTestCommand}"
            }
        } else {
            script.common.executeCmd( conanTestCommand )
        }
    } catch (error) {
        script.echo "There was an error during package test: ${error}"
        script.error "There was an error during package test: ${error}"
    }
}

Of course, all this (and much more) is in a big try block to catch any errors and ensure that the following finally block, which cleans up the temporary files, is executed:

} finally {
    if ( needCleanupFakeGit ) {
        script.dir( repository ) {
            if ( script.common.isWindows() ) {
                script.bat 'rd /s /q .git'
            } else {
                script.sh 'rm -rf .git'
            }
        }
    }
    for ( int i = 0; i < packagesForCleanup.size(); i++ ) {
        // cleanup all packages for testing
        try {
            script.common.executeCmd( "conan remove ${packagesForCleanup[ i ]} --force" )
        } catch( ignored ) {}
    }
}

Now, if Conan could work with multiple caches, we could simply do a full conan create of our package into a temporary folder, while all its dependencies go to the default local cache, conan test the package, and then nuke the temporary folder.

@Daniel-Roberts-Bose

I have a general question on this one. If the conan build command goes away, what happens to the build() function in conanfiles? Is the proposal saying this function would become deprecated and instead of having any build functionality in the conanfile, we would simply call cmake or make manually from the command line?

And if the above case is accurate, how would create actually build anything?

@memsharded
Member Author

@Daniel-Roberts-Bose

No, the build() method functionality doesn't go away at all. It is still called when building in the Conan cache, with a conan create or conan install --build call. It is true that the build helpers like CMake() that live in the build() method get simplified, as part of the logic they implement today moves to the CMakeToolchain() class in the generate() method. The whole idea is that for developers, the new flow enables something like cmake . -DCMAKE_TOOLCHAIN_FILE=conantoolchain.cmake that should be the same as calling the conan build command, and the same as when the build() method is called while building with conan create in the cache.
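
A minimal sketch of what a recipe could look like under this model (the import paths and helper names follow the new tools namespace being developed; treat the exact package names and versions as assumptions):

from conan import ConanFile
from conan.tools.cmake import CMakeToolchain, CMake

class PkgConan(ConanFile):
    settings = "os", "compiler", "build_type", "arch"
    requires = "zlib/1.2.11"

    def generate(self):
        # Everything needed to build is written to disk (conan_toolchain.cmake),
        # so a developer can invoke plain CMake with -DCMAKE_TOOLCHAIN_FILE=...
        tc = CMakeToolchain(self)
        tc.generate()

    def build(self):
        # The same generated files are reused when Conan builds in the cache
        # (conan create / conan install --build), so both builds are identical.
        cmake = CMake(self)
        cmake.configure()
        cmake.build()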

In any case, it seems that conan build command will be maintained, as it seems there is demand to keep it.

@jasal82

jasal82 commented Feb 9, 2021

We've also set up a bunch of invoke tasks that consume the conanbuildinfo.json which can optionally be generated. Will this still be possible? We need the information in that file to find tool locations, merge manifests, gather license information, etc.

@memsharded
Member Author

@DoDoENT

Although I would be OK with the changed behaviour of conan build, as described in the proposal, I would actually miss the conan package command - it's really useful for testing the package function and checking if the package layout is as expected, without polluting the local cache - this is very important, especially on CI servers.

But that is the point: conan package should never run on a CI server. It was a developer command to debug the package() method, but I don't see how it could help on a CI server.

@memsharded
Member Author

@jasal82

We've also set up a bunch of invoke tasks that consume the conanbuildinfo.json which can optionally be generated. Will this still be possible? We need the information in that file to find tool locations, merge manifests, gather license information, etc.

Yes, that will still be possible, it is unrelated. That comes from the json generator, which you have explicitly used somewhere in your recipes or commands. It is not connected to those other files that Conan generates always, independently of generators.

@jasal82

jasal82 commented Feb 9, 2021

Do you already have a solution for the Visual Studio builds? Because AFAIK the generator cannot be stored in the toolchain files, a simple conan build .. would become a cmake -G"Visual Studio 2017" -T v140,host=x64 ..

@mietzen

mietzen commented Feb 9, 2021

We also use the conan build and package methods in CI for functional tests. After building and uploading our product Conan package, we consume it from the functional test recipe and build against it. But we do not want to populate the cache with its artifacts, since we only want to store the JUnit results. Therefore building it locally and using the package method to copy our test results is quite useful.
I will also try to get the feedback of some of our developers; I know that some are only using conan build -c and afterwards open their IDEs, but some also use the full "local flow" with install source build package.

Edit:
After reading more about CMakeToolchain and generate I changed my reaction to:

you've read it, but it doesn't affect your workflow

Since we need to rework all conanfiles when upgrading to 2.0 anyway, generate() and export-pkg provide the same functionality as build() and package().

@jasal82

jasal82 commented Feb 9, 2021

A slightly off-topic but still somehow related question regarding the generated conantoolchain.cmake. Our Yocto device image builds produce toolchain SDKs and wrap them automatically in Conan packages. These are then used as build_requires in profiles to 'inject' the toolchain into builds which is kinda convenient because it can be automatically installed on the build machines. In order to fix problems with third party component builds like OpenCV we also had to make the packages generate and export toolchain files which are then passed to CMake via -DCMAKE_TOOLCHAIN_FILE. According to your proposal this parameter will then already be reserved for the conantoolchain.cmake.

@FabienLaurent

FabienLaurent commented Feb 9, 2021

In my projects, developers have to build locally packages that they are not necessarily familiar with. The conan build command provides a nice abstraction and developers don't need to open the Readmes to figure out how to compile everything.

We could adapt our recipes to work with the new proposed implementation. So I would give a thumbs up to this proposal if the command is kept.

@ohanar

ohanar commented Feb 9, 2021

@jasal82 So while the generator cannot be specified via toolchain files, the generator's toolset can be. So the command would be cmake -G"Visual Studio 2017" -DCMAKE_TOOLCHAIN_FILE="conan_toolchain.cmake" .., and the v140,host=x64 would be set in the toolchain file. This means, in theory, same toolchain file could be used for different generators, so it could be used with Visual Studio 2017 or 2019, or something like ninja, although this is not true presently (see conan-io/conan#7908).
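
For illustration, the relevant part of such a toolchain file could look like this (a hedged sketch; CMake allows a toolchain file to initialize the generator toolset, but the exact contents Conan generates may differ):

# conan_toolchain.cmake (excerpt)
set(CMAKE_GENERATOR_TOOLSET "v140,host=x64" CACHE STRING "" FORCE)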

@jasal82

jasal82 commented Feb 9, 2021

@DoDoENT

Obviously we're not the only company that already invested several man-years for creating sophisticated Jenkins pipelines to actually get something built with Conan ;)

Although I would be OK with the changed behaviour of conan build, as described in the proposal, I would actually miss the conan package command - it's really useful for testing the package function and checking if the package layout is as expected, without polluting the local cache - this is very important, especially on CI servers.

Just out of curiosity, it seems you're sharing the same Conan cache between different builds. Do you have the cache on shared storage, or is everything built on the same machine? We decided that it's better to have everything containerized and start from a clean cache (for reproducibility reasons), but for CI it could be beneficial sometimes to have shared cache semantics.

@jasal82

jasal82 commented Feb 9, 2021

@ohanar I see, but if it's not possible to include everything in the toolchain file I cannot agree with this proposal. It would be hard to keep track of the conan install settings used in different build folders, which then have to be replicated in the CMake command, potentially days later. If you fail to replicate the same settings, the build will be inconsistent. EDIT: It would probably help if the toolset version was always configured in the toolchain file, but then it could still cause confusion when the generated project files are incompatible with the expected VS IDE version or when new dependencies are added that require a conan install to update the project files.

@memsharded
Member Author

Just out of curiosity, it seems you're sharing the same Conan cache between different builds. Do you have the cache on shared storage, or is everything built on the same machine? We decided that it's better to have everything containerized and start from a clean cache (for reproducibility reasons), but for CI it could be beneficial sometimes to have shared cache semantics.

@jasal82 @DoDoENT Please recall that the Conan cache is not designed for concurrency, so it shouldn't be shared among concurrent CI jobs. Although it might work for some cases, it can also easily break in unexpected ways in many other situations. It can be especially bad if the cache is put in a shared folder, as any possible synchronization mechanism would be completely invalid in such shared folders.

@memsharded
Member Author

memsharded commented Feb 9, 2021

@jasal82 @ohanar

cmake -G"Visual Studio 2017" -T v140,host=x64

Exactly what @ohanar said: cmake -G"Visual Studio 2017" -DCMAKE_TOOLCHAIN_FILE="conan_toolchain.cmake"

This is one of the main ideas of the proposal: that this should be possible. It is really bad, and many, many users complain that the integration with build systems is not transparent and that you actually need conan build to build, and you cannot use your build system as usual. The main goal of this proposal is to enable this flow, to have the mechanisms that will allow cmake -G"Visual Studio 2017" -DCMAKE_TOOLCHAIN_FILE="conan_toolchain.cmake" to achieve exactly the same build that you can achieve with conan build. This makes the conan build a convenient command, maybe very convenient, but not a completely fundamental and necessary command. So I am fine to keep the conan build for achieving such convenience.

@ohanar

ohanar commented Feb 9, 2021

@jasal82 For us it would be an anti-feature to enforce a particular cmake generator for a given configuration. We have different developers who use different cmake generators for the same target platforms.

In your use case, you can always write a wrapper script in the generate method, so something like:

def generate(self):
    ...
    if self.settings.compiler == "msvc":
        # write script to enforce a particular generator
        with open("run_cmake.bat", "w") as file:
            file.write('cmake -G"Visual Studio 2017" -DCMAKE_TOOLCHAIN_FILE="conan_toolchain.cmake" %*\n')
def build(self):
    if self.settings.compiler == "msvc":
        # use our script
        self.run(rf'.\run_cmake.bat {self.source_folder}')
        ...
    else:
        ...

@jasal82

jasal82 commented Feb 9, 2021

@memsharded @ohanar I thought about it some more and I think generating wrapper scripts could solve the problem for us. As long as there is a solution for merging the Conan toolchain file with a potential upstream toolchain file as mentioned in my other comment above I'd say we're good with the proposal.

@ytimenkov

ytimenkov commented Feb 10, 2021

It is really bad, and many, many users complain that the integration with build systems is not transparent and that you actually need conan build to build, and you cannot use your build system as usual.

There are 2 different things: using the build system as usual and configuring it. Configuring was never the easy part, even before, where one needs to tune dozens of parameters according to one's own needs. As I complained before, invoking CMake manually is still much more to type and prone to errors.

Conan elegantly solved all these issues by encapsulating this knowledge in the (uniform) form of options and settings.

But you're too focused on CMake. There are other generators, and it's worse when different build systems are used in the same company for different components, so having a simple uniform workflow to set up a development working copy significantly lowers the entry barrier.

This makes the conan build a convenient command, maybe very convenient, but not a completely fundamental and necessary command. So I am fine to keep the conan build for achieving such convenience.

Or maybe provide a better alternative for just "developer" flow 😊

@ytimenkov

ytimenkov commented Feb 10, 2021

@jasal82 You mean like configuring the generator as in @ohanar's proposal? 🤔

Well, from Conan architecture perspective this could be a reasonable alternative if such a generator is provided by Conan, or if built-in toolchains provide this as an option.

P.S. Upvoted because I do believe that all options should be converted into values in the toolchain file(s) (which we already do), and the conan build deprecation discussion should not overshadow the main idea of the proposal.

@uboot

uboot commented Feb 10, 2021

A few concerns from my side:

  • I think currently it is possible to pass arbitrary variables to CMake (or any other build system; many more are supported by Conan), including a custom CMAKE_TOOLCHAIN_FILE. Also it is possible to set custom environment variables when executing the build tool. How should all these settings be mapped to the content of the conan_toolchain.cmake? What about other build tools (e.g. Meson)?
  • As @jasal82 already mentioned the CMAKE_TOOLCHAIN_FILE variable might be used for the cross build toolchain provided by Yocto or Buildroot. There is also the option of providing a CMAKE_PROJECT_<PROJECT-NAME>_INCLUDE variable. But again, this might also be used for other purposes in selected projects.
  • I agree with @DoDoENT that conan package comes handy to test the install step without changing the package cache.

@memsharded
Member Author

@uboot

I think currently it is possible to pass arbitrary variables to CMake (or any other build system; many more are supported by Conan), including a custom CMAKE_TOOLCHAIN_FILE. Also it is possible to set custom environment variables when executing the build tool. How should all these settings be mapped to the content of the conan_toolchain.cmake? What about other build tools (e.g. Meson)?

Those variables can also be defined in the toolchain. The current CMakeToolchain allows defining a .variables field in the generate() method, and they will be put into the conantoolchain.cmake. It will also be possible to add your own toolchain, replacing the Conan one or appending to it; using the Conan-provided toolchains is not mandatory, they are just helpers. The proposed flow is that such information must be provided at conan install time, that is, created by the generate() method, instead of existing only in memory while executing the build() method, which makes the process non-reproducible for a developer using an IDE, for example.
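
A sketch of what that could look like in a recipe (the .variables attribute is the one mentioned above; the variable names here are made up for illustration):

def generate(self):
    tc = CMakeToolchain(self)
    # Extra variables end up as set(...) entries in the generated toolchain file,
    # so they are visible both to conan build and to a plain CMake invocation.
    tc.variables["ENABLE_FEATURE_X"] = True
    tc.variables["CUSTOM_DEPLOY_DIR"] = "/opt/deploy"
    tc.generate()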

The new MesonToolchain generator is already there as well. Same with MSBuildToolchain. All will operate with the same idea.

I agree with @DoDoENT that conan package comes handy to test the install step without changing the package cache.

The proposed flow is instead:

$ conan export-pkg ....
> Package created in folder <folder>
> Package created mypkg/version@user/channel#rrev:package_id#prev
# If you want to get rid of it, copy and paste
$ conan remove mypkg/version@user/channel#rrev:package_id#prev

It will be basically the same lines; just instead of a local rm -rf mypkgfolder, it will be a conan remove. But the advantages are many: from a more complete and real process, in which the package_id is computed, to being able to actually test that package from another consumer project (or conan test).

@DoDoENT

DoDoENT commented Feb 10, 2021

I think currently it is possible to pass arbitrary variables to CMake (or any other build system; many more are supported by Conan), including a custom CMAKE_TOOLCHAIN_FILE. Also, it is possible to set custom environment variables when executing the build tool. How should all these settings be mapped to the content of the conan_toolchain.cmake? What about other build tools (e.g. Meson)?

My understanding is that all flags that are actually enforced by current conan build command will be serialized into this conan_toolchain.cmake so that using cmake -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake will yield exactly the same behaviour as conan build currently does. I definitely like that, as it will liberate us from scripts that first ensure that Conan toolchains are installed (NDK, iOS, emscripten) and then create bash aliases such as icmake, ecmake or acmake (which are basically something like alias icmake = cmake -GXcode -DCMAKE_TOOLCHAIN_FILE=/path/to/calculated/conan/package/containing/ios.toolchain.cmake).

However, what bothers me is the removal of conan package method instead of improving it by adding package_id calculation support for it. Currently, the only way to test the conan package is to build it and export it into the local conan cache, which is not desirable for CI environments, as it pollutes the cache for other builds that will follow.

Just out of curiosity, it seems you're sharing the same Conan cache between different builds. Do you have the cache on shared storage, or is everything built on the same machine? We decided that it's better to have everything containerized and start from a clean cache (for reproducibility reasons), but for CI it could be beneficial sometimes to have shared cache semantics.

Yes, we do. Initially, we planned on having the same Conan cache for multiple Jenkins executors that may use it in parallel, but this didn't work correctly and, as @memsharded said, it's not something Conan was designed for. Then we decided to have a separate Conan cache for each Jenkins executor. We initially tried starting each build with a clean cache, but this was way too slow for us, so we decided to keep the cache between different builds on the same Jenkins executor. This sped up our PR builds by incredible amounts (I was actually planning on giving a talk about that at the inaugural ConanDays, but the pandemic prevented that).

For example, a single android debug arm64 conan install with clean cache for one of our most downstream projects takes about 2 minutes and 34 seconds and downloads around 6.4 GB of data in local conan cache (76 conan packages in total - no package was built by dependency; all binaries have been downloaded from local Artifactory server on gigabit ethernet). On the other hand, the very same conan install with full cache takes about 2 seconds.

Now, if you consider that the actual compilation of the project takes about 11 seconds, you will clearly see why we don't want to start with an empty cache on each build (we'd rather spend the saved time on running more tests). Then, when a CI server needs to perform such a build for both debug and release for 4 Android architectures, 3 on iOS (device, simulator, catalyst), 1 on Windows, 2 on Linux (gcc and clang), 1 on macOS and 1 (currently - we plan to add an additional flavour) on emscripten, the time spent downloading Conan packages gets significantly larger than the actual build and test time of the project. We simply can't afford that. Furthermore, if there are multiple Jenkins executors doing lots of Conan API calls to the Artifactory server, it is sometimes driven to the breaking point, where Artifactory simply has too many connections and refuses new ones - in those cases, we get failed builds that need to be re-played.

We achieve the reproducibility by keeping the cache clean - we don't allow local packages that are being tested with conan test on PR builds to remain in those caches. This is why I argue that it should be possible for conan to either work with multiple caches simultaneously or to be able to conan test the package without exporting it to the local cache. If we kept the temporary packages in executor's local cache, we would have at least two problems:

  • the cache will grow out of control and we would have to clean it more often than we have to do currently (we currently do it once a week, so usually only the builds on Monday morning are slower because they need to pull all conan packages from the Artifactory)
  • a different build on the same executor, executed by Jenkins after the current one, would be able to use the temporary package, which is not meant for use

The whole reason why we are using conan is because of its very well designed binary cache - this reduced our build times from more than an hour (while building the entire codebase from source each and every time) to just several minutes (provided that the cache is full). We then used the saved time to run much more different (and more complex) tests on our PR builds.

@DoDoENT

DoDoENT commented Feb 17, 2021

if the android ndk is your only concern, link it, done. Bum, all of a sudden your extraction time is gone

Unfortunately, it's not. The NDK is just one example of a larger package. There are also the emscripten SDK and lots of our internal packages that are several hundred megabytes in size. But that is not the largest problem (it can be worked around with fast disks and a fast network) - the problem is recipe fetching, which is slow and sequential no matter how fast your network is - in my benchmark, just downloading recipes for 76 packages took more than half of the reported time (the download cache does not contain recipes, only binaries). This is why I advocate for the non-polluting cache.

Not that I have something against those, I would love to see a useful build/package/export. But it should be actually useful.
Build doesn't even mention profiles.

I agree. I voted in favour of a simpler conan build that combines the current conan install and conan build, including adding flags that are currently specific to conan install.

My issue is with exporting untested packages into the cache, namely the need to export a package to the cache in order to test it and inspect whether the package layout is as expected.

I have the feeling there is a loud minority here asking to keep the complexity, and maybe even to extend it. Because of, and for, something that is broken.

I would argue that the current solution is more complex - because, as @kenfred stated:

The current cache model suffers from an issue of multiple responsibilities. It's a cache for what's in the remotes, it's a sandbox for development, and it's a staging area to upload to remotes. These responsibilities can be conflicting and, IMO, not the best user experience.

By enabling testing of the Conan package (and inspection of its layout, which is just a type of test) without needing to export it to the local cache, the complexity would be lowered - because the cache would then be just a cache for what's in the remotes (something a cache is supposed to be).

@KerstinKeller

@memsharded Can we have a separate discussion about the Conan cache topic, also with feedback from the developers?
I think both @kenfred and @DoDoENT have pointed out valid shortcomings of the current cache design, and I feel it's worth having a discussion about it.

As for this topic, I've also given a thumbs up now. I will try to integrate the generate() method into some of our recipes and see how it plays together with the CMakePaths generator.

@memsharded
Member Author

One of the issues that I see: the assertion that the cache would then be just the cache for what's in the remotes is not true. Because there is the need to "build dependencies from sources", not everything is a conan create. When building dependencies from sources, you will get new builds and new packages. These packages will not exist in any server, and then the assumption that the cache should only contain things that are in the remotes is not valid.

Those packages might not work either, they are still "under testing", because recipes might not be complete for the current platform or subsystem, and now the cache would be polluted in exactly the same way as if you did a conan export-pkg.

So the Conan cache must be understood as bidirectional storage, from the server to the client and from the client to the server. In both directions packages in the cache can be broken, so aiming by design for the cache to always be "pristine" is impossible.

I agree with @KerstinKeller that we are deviating from the main proposal of the thread, and this should be taken separately to another thread. There will be more discussions and other proposals related to the development flow and the cache, which doesn't mean we cannot agree on the following (as described in this proposal):

  • The conan install with the generate() method, followed by the native build system, is a desired flow, instead of always being forced to use the conan build command.
  • The conan build command will be "complete", to reduce some of the complexity derived from the saved state and also the unexpected UX that doesn't allow passing "install" arguments as many users expect.
  • The conan package command, as-is, doesn't serve the development flow, besides a very limited debugging capability for the package() method. So it can be removed; such debugging capability is perfectly served by export-pkg. We will keep working on other proposals to improve the co-development and package-testing development experience, and if a conan package command is defined then, it might be a completely new, from-scratch proposal, not the current command (so the current command in its present form will be removed).

@DoDoENT

DoDoENT commented Feb 17, 2021

Those packages might not work either, they are still "under testing", because recipes might not be complete for the current platform or subsystem, and now the cache would be polluted in exactly the same way as if you did a conan export-pkg.

Indeed, this is true. I agree.

@KerstinKeller

So the Conan cache must be understood as bidirectional storage, from the server to the client and from the client to the server. In both directions packages in the cache can be broken, so aiming by design for the cache to always be "pristine" is impossible.

Yes, I agree that with the current way you designed the cache, it is impossible to keep it pristine.
But at the same time, I'd argue (while you probably had very good reasons to design the cache the way it is) that this design is not the only thinkable design of a Conan cache.

There could very well be a design of, maybe, a multi-layer cache, with a read-only layer that gets fed from the servers, a writable layer where packages can be created locally, and a staging layer for uploads (however that is realized technically).

Or there could be a design where in-source package creation is the default behavior of a conan create command. It is then followed by an explicit conan promote to promote locally created packages to the cache (no more source and build folders ever in the cache!).

I hope that you agree that some of the problems that users are facing (for me personally, test_package only works for packages already in the cache) are due to the current design of the cache. Whether discussing it, or possibly making changes to this design, is in scope for Conan 2.0 is a totally different topic. 😄

@kenfred

kenfred commented Feb 17, 2021

One of the issues that I see: the assertion that the cache would then be just the cache for what's in the remotes is not true. Because there is the need to "build dependencies from sources", not everything is a conan create. When building dependencies from sources, you will get new builds and new packages. These packages will not exist in any server, and then the assumption that the cache should only contain things that are in the remotes is not valid.

I think you're getting hung up on the "in development" terminology. Of course no software is ever "done." If you'd like, think about it as a versioning and package immutability issue. The cache should only contain packages that have been released, given an official version and revision. If the package fails on my platform, I can be confident it's because the released package is deficient and not because I was actively developing it. Personally, I would only release a package through CI, so that would make the cache unidirectional and give confidence that all packages on the server and in the cache are released packages.

  • The conan install with the generate() method, followed by the native build system, is a desired flow, instead of always being forced to use the conan build command.
  • The conan build command will be "complete", to reduce some of the complexity derived from the saved state and also the unexpected UX that doesn't allow passing "install" arguments as many users expect.
  • The conan package command, as-is, doesn't serve the development flow, besides a very limited debugging capability for the package() method. So it can be removed; such debugging capability is perfectly served by export-pkg. We will keep working on other proposals to improve the co-development and package-testing development experience, and if a conan package command is defined then, it might be a completely new, from-scratch proposal, not the current command (so the current command in its present form will be removed).

I know it has seemed a bit off topic, but it is actually completely relevant to this proposal. I have to downvote because keeping conan install + conan build + conan package is critical to enabling out-of-cache, heterogeneous package co-development. They serve as the build-system-agnostic API that a super-project makefile would use to build them all together and make sure that you rebuild one if another one changes.

@memsharded
Member Author

I know it has seemed a bit off topic, but it is actually completely relevant to this proposal. I have to downvote because keeping conan install + conan build + conan package is critical to enabling out-of-cache, heterogeneous package co-development. They serve as the build-system-agnostic API that a super-project makefile would use to build them all together and make sure that you rebuild one if another one changes.

@kenfred this is what is confusing to me. The result of the conan package command is invisible to other packages in co-development, as all the mechanisms will refer to either the Conan cache or to the "build" folder of the developed package. The output of generators will never point to the folder where conan package put the artifacts, and editable mode works on the build layout and not that package folder. Can you please clarify this?

@kenfred

kenfred commented Feb 17, 2021

@kenfred this is what is confusing to me. The result of the conan package command is invisible to other packages in co-development, as all the mechanisms will refer to either the Conan cache or to the "build" folder of the developed package. The output of generators will never point to the folder where conan package put the artifacts, and editable mode works on the build layout and not that package folder. Can you please clarify this?

I am assuming that layouts and editables are going to get a significant redesign. At least that is what @jgsogo implied here. You can see in that same thread that I'm strongly advocating getting rid of layouts and instead use package folders. Trying to describe how to reach into the build output folder is problematic and redundant since everything is already known via package_info. I also try to convey the issues with editables and the current cmake workspace generator. It is my understanding that these are all being considered for redesign. Is that not correct?

To summarize my proposal:

  • Get rid of layouts and editables
  • Use the workspace file to point to the source dir of all the packages to be co-developed
  • conan install on the workspace file will:
    • Create "build" and "package" folders for each of the packages in the group at the cwd. (This local folder is akin to @DoDoENT's local-cache concept. It will resemble the cache in a lot of ways).
    • Generate a CMakeLists.txt or Makefile or VS Solution, etc. that is a super build. Running that build will install, build, and package each of the packages in the group in the reverse order of the dependency tree. The generators for the individual packages in the group will point to the "package" folders in this local directory for packages in the group.

This has a ton of benefits, which I've tried to describe. You can see that this idea relies on the fact that install, build, and package are things you can do on a conanfile. So the proposal at hand, specifically removing the conan package command and perhaps changing conan build, would further entrench us in the multi-responsibility of the cache and harm opportunities to improve co-development.

@memsharded
Member Author

@kenfred your proposal about workspaces (and co-development and the local flow in general) has an important problem: it cannot work with the package folder, it must work with the build folder. Build systems like CMake and VS, in those super-projects that can be generated, will not understand the extra package() step, and they will stop at the built artifacts in the build folder, because that is what they know about. Furthermore, having the user somehow call conan package for every modified and built package is impractical, and users have firmly rejected doing it, saying they only want to do normal builds of the super-project, and everything should incrementally process modified files and build and link together, with no extra conan commands or steps.

This has been the major challenge with workspaces and editables, I really wish it was possible for build systems to automate and understand the package step, but so far it has been impossible, and this is the reason we have tried to implement the editable concept (as a base for workspaces) via layout text files. It is true, we are actively working on this, trying to improve the experience (removing the external layout files), but still modelling where the built artifacts are at build() stage (not package() stage) seems very necessary.

@kenfred

kenfred commented Feb 18, 2021

Thank you for this discussion!

@kenfred your proposal about workspaces (and co-development and the local flow in general) has an important problem: it cannot work with the package folder, it must work with the build folder. Build systems like CMake and VS, in those super-projects that can be generated, will not understand the extra package() step, and they will stop at the built artifacts in the build folder, because that is what they know about.

This is not true. The rules and targets of these systems can run any custom step, expect outputs, and make those outputs dependencies of other steps. With CMake's ExternalProject, there are arguments for INSTALL_COMMAND and INSTALL_DIR, which could be loaded with conan package and the "package" folder, respectively. Additionally, the DEPENDS argument will allow you to set up package dependencies.

Side note: There are some idiosyncrasies in CMake's ExternalProject. If it's found to be unsuitable, custom commands/targets could work, too.
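
A rough sketch of the idea (hedged; the target names and folder layout are hypothetical, and a real super-project would also need working directories and toolchain arguments):

include(ExternalProject)
ExternalProject_Add(mylib
    SOURCE_DIR        ${CMAKE_CURRENT_LIST_DIR}/mylib
    CONFIGURE_COMMAND conan install <SOURCE_DIR> --install-folder <BINARY_DIR>
    BUILD_COMMAND     conan build <SOURCE_DIR> --build-folder <BINARY_DIR>
    INSTALL_COMMAND   conan package <SOURCE_DIR> --build-folder <BINARY_DIR> --package-folder <INSTALL_DIR>
    INSTALL_DIR       ${CMAKE_CURRENT_BINARY_DIR}/package/mylib
    DEPENDS           myotherlib)   # rebuild/repackage mylib when myotherlib changes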

Furthermore, having the user somehow call conan package for every modified and built package is impractical, and users have firmly rejected doing it, saying they only want to do normal builds of the super-project, and everything should incrementally process modified files and build and link together, with no extra conan commands or steps.

See above. The super project will call conan package for you on all of the packages. Builds will be incremental because the individual conan builds will respect their individual build output folders. Subsequently, the conan package step will copy the build products to the "package" folder. You could argue that this conan package step would be redundant if nothing was rebuilt for that package. I would argue that it's normally a simple copy and no big deal. But if you wanted to, you could have a separate cmake custom command that depends on the conan build custom command and does the conan package only when necessary. (Often cmake.install is done within the build method anyway, so at least for cmake packages, you're not saving anything by separating the two).

All of my babbling is to say, conan package is useful and could potentially be used for more things and I vote against its removal.

@ytimenkov

@kenfred

See above. The super project will call conan package for you on all of the packages. Builds will be incremental because the individual conan builds will respect their individual build output folders.

You're missing the point: it's not an incremental build. It's a script unconditionally calling a number of commands. As good as always running conan install as a pre-build step.

I think it was me (at least) who insisted that the workspace feature should see through the whole build to set up transitive dependencies properly at the file level, regardless of whether they come from the same package or different ones.

Without it you can't properly handle diamond (or other complex) package dependencies: you either have to build them one by one or deal with inconsistent builds because dependent packages are built before their dependencies (in the original workspace implementation). Or you need to invent a much more complex meta-generator which can propagate dependency information from Conan down to the build system.

And this is not to mention wasting time on waiting while those "incremental" builds finish. This way you get back to the speed of autotools where no-op build may take a minute.

@ytimenkov

I'm quite frustrated to see that long list of anti-patterns, many of which are explicitly discouraged by the Conan authors yet still being pushed for: using a CI workflow for development (exporting packages to the local cache, "polluting" it), using developer commands on CI (build and package)...

No, it's fine to find other ways to use tools, but what is being asked at the bottom of this thread is to use a microscope as a hammer: a total misuse of the tool. And the ask is "no, we want a microscope to hammer nails with" 😔.

My understanding of the Tribe was to commit to trying experimental things to get feedback on how they work and find possible improvements, like being a pre-alpha tester for a feature, and not to guard a pre-Conan way of setting up CI and hold everyone back with "we used this command and we would like to continue using it".

@ytimenkov

And maybe to dot the i's: a "development flow" is when you edit code in an IDE with code navigation, debugging and other good stuff, not running a do-it-all command in the terminal.

Therefore a clear separation between "conan time" and "IDE time": the latter should not involve conan at all, only natural edit-build-debug cycle.

@a4z

a4z commented Feb 18, 2021

Reading through this, I think Conan should have a stable API and a plugin system; then people can extend it to their needs.
If something is useful, gets shared and used by others, it can be considered whether it is commonly useful or fills a niche.

It is clear that in the tribe there are mostly people who think a little bit more about what they need, and that there are passionate views about each particular use case. And that people have perfect ideas on how to solve their very special use case.

But Conan has to deliver a solid, simple base that covers the very basic workflows, and on which you can build.
So keep it simple, stable, solid, and provide an API for extension. And follow a clear vision. If it continues like this, there will not be much chance to get rid of a lot of things, and expensive-to-maintain complexity will be inevitable. There is enough history of various projects to see how that will end. I hope for a better future for Conan.

@jgsogo
Contributor

jgsogo commented Feb 18, 2021

First of all, I want to share my view about the proposal here.

Local workflow (inside IDE development)

I think we all agree that Conan will benefit from the effort we are doing regarding toolchains. They will make it possible to develop a package using an IDE in a more natural way. This is local workflow for the same package, it is like using Conan as a consumer. The new conan install + <build_system> conan-toolchain is a huge improvement for local development (some build_systems might require an environment, like Visual Studio via CMake to set the generator, or any other project when it comes to cross-building). It doesn't invalidate this improvement.

The build() method will survive; it is necessary to tell Conan how to build a package when using conan create and to provide a single command-line call for CI systems. The good news is that these toolchain efforts should be able to simplify the logic encoded inside the build() method. It is a win-win.


Creation of local builds/packages

I feel like the problem around conan build can be summed up with this quote someone wrote above:

  • conan build as a way to abstract the build process

Sometimes we need a command to generate a local package (whatever that means, see below), and conan build is very convenient for abstracting the build process. Yes, it is needed. We need a way to create a build for a package locally (some of us are interested only in the build step, others find it valuable to test the package step too). We can call it conan create-local [--no-package-step] as someone suggested, or conan build [--run-package-step]; it is only a name, and we are far from naming issues right now. We need the functionality, even though we don't know the interface yet.

conan build stays, maybe with improvements, but the functionality is totally needed.


Co-development, local/temporary caches,

And this topic of a local package is very related to some other comments about temporary or local caches. You should know that we are exploring the look and feel of a new cache, so it is time to write and think about it and gather all the requirements and limitations (conan-io/conan#8510).

  • we want to be able to perform conan test on a package without exporting it to the local cache
  • multiple caches ([feature request] add support for simultaneous multiple conan caches, conan#5513), i.e. temporary ones, will allow a local conan create
  • that packages were first always created in a local, project specific cache, build, tested, and only then "promoted" to the system wide cache
  • The current cache model suffers from an issue of multiple responsibilities. It's a cache for what's in the remotes, it's a sandbox for development, and it's a staging area to upload to remotes. These responsibilities can be conflicting and, IMO, not the best user experience.

Co-developing two packages locally that belong to the same graph is challenging in Conan. We thought that workspaces could be the answer, but they are very challenging and maybe doomed to work only if the whole graph uses the same build system. Still, this is something most of us are doing today, and so far, sharing the cache among those projects, and using the same cache for everything Conan does, is very inconvenient. And if you use two caches, then we are no longer talking about co-developing. I experienced this when working on the IDE plugins: working on several packages in parallel requires too much attention from the developer (basically, rebuilding projects in order after changes upstream).

It looks like the answer may lie in a better cache design: a cache shared between the projects we are working on (co-development), a temporary cache (testing local builds/packages/test_packages), ... without adding too much complexity for the solo developer who just wants a tool that works.

Implementation won't be the problem if we accept a database-based cache (conan-io/conan#8510), so we need to think about the business logic.

It looks like we need a shared cache for all the Conan processes running on a machine; it will contain packages coming from the remotes and local packages committed/promoted to it. Then we need some staging cache that we can share between different projects (co-development) or use for a single project (local build/package/test_package), with packages that won't be taken into account by other Conan processes. A read/write cache and a multi-layered/multi-colored cache are probably different conceptualizations of the same idea.

This shared/temporary cache can live inside the main Conan cache (~/.conan/<cache>) or in a local folder ($(pwd)/.conan/<cache>). As we are talking about sharing a local/temporary cache, we need some pointer to it:

  • the user is adding something to every command: conan install --use-cache=/this/folder,
  • the cache is found in some local folder like $(pwd)/.conan and Conan automagically uses it.

And we are dangerously approaching a different topic: inside $(pwd)/.conan there isn't the cache itself, but some workspace configuration able to override the one at ~/.conan/conan.conf... and we end up with User/Workspace settings like many other tools (essentially an automagical management of the CONAN_USER_HOME env variable). Because it is not only the cache: different projects might use different package_id settings. --> This is out of scope, material for another tribe design document.
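
As a point of reference, something close to a project-local cache can already be approximated today by relocating the Conan home; this is only a sketch of the idea using the existing CONAN_USER_HOME variable, not the proposed design:

    # use a cache private to this project (created on first use)
    $ export CONAN_USER_HOME=$(pwd)
    $ conan install . --build=missing   # packages land in $(pwd)/.conan/data
    $ conan create . user/testing       # same private cache, invisible to other projects

The proposal above is essentially about making this kind of separation first-class (shared vs. staging caches) instead of relying on environment tricks.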


This is an exciting topic and something WE WILL design. I'll try to organize this information, split it into different topics, and provide a place to talk about them. Otherwise, all these valuable discussions will be hard to follow.

@kenfred
Copy link

kenfred commented Feb 18, 2021

@jgsogo Thanks! I'm concerned about your direction of multiple caches and the "doom" of workspaces. Nevertheless, I think we're all mostly in agreement on the main pain points and broad outlines of a solution. I hope that we can discuss further before you get too far down a path in your design. For now, I'll continue the discussion of multiple caches/local workspace build folder at conan-io/conan#5513.

@datalogics-kam
Copy link

I really don't like that conan build will imply an install...but I've tinkered with the toolchains and I can rework our build tasks to get around that. We'll probably have to take one of our bigger projects and move it from SCons to CMake, and maybe Conan 2.0 will help drive that effort.

One of the really nice parts of the "package development lifecycle" was that the piecewise and well-factored steps meant that if you wanted to do specific aspects of the process individually, you could. Requirements resolution is a lot of overhead; it may need network access, and it's at least NP-hard, so a requirements graph of any decent complexity is going to take some time to resolve. It really doesn't belong in a compile-debug-fix cycle. Thus, the utility of conan build will be lost on us anyway.

The one thing I'd be missing with a task refactoring is that Conan sets up the environment prior to running commands. I can make our stuff read the deps_env_info out of the output of the json generator, and use tools.environment_append() around running commands. Those will still be there, right?
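
For what it is worth, a minimal sketch of that approach as it works today (assuming the json generator keeps its deps_env_info section; the exact key layout may differ between Conan versions):

    import json
    from conans import tools

    # environment captured by: conan install . -g json
    with open("conanbuildinfo.json") as f:
        data = json.load(f)

    env = data.get("deps_env_info", {})   # e.g. {"PATH": [...], "CMAKE_ROOT": "..."}

    # apply it only around the commands that need it
    with tools.environment_append(env):
        # run the build/test commands here (invoke tasks, subprocess, ...)
        pass

Whether this mechanism survives into 2.0 is exactly the question answered below.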

@jamesweir-tomtom
Copy link

jamesweir-tomtom commented Feb 23, 2021

We make use of conan install -j JSON for a number of processes; I presume this is unaffected by this proposal?

Regarding conan build, we currently use it extensively, at the very least to invoke cmake with all the expected configuration rather than having to replicate that effort ourselves. If this is replaced with toolchains, find-package modules, etc., there would still be a need to pass at least some of this to the build system invocation (cmake in this case). Or do you see that as something that would be embodied within the build system description itself, much like the include of conanbuildinfo.cmake?

@memsharded
Copy link
Member Author

Hi @jamesweir-tomtom

conan install is not affected at all by this proposal.

Yes, the idea is that what needs to be passed to cmake directly as a consumer is much less, something a human could easily do without much overhead: mostly the CMake generator and the CMake toolchain file. With that, it should achieve exactly the same build as calling the conan build command. The usage you are proposing, wrapping that invocation, will still remain; it just might need some minor changes to the UI (for example, the previous conan install might not be mandatory, or it will accept more arguments to specify the configuration).
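
For example, the intended consumer flow would look roughly like this (a sketch; conan_toolchain.cmake is the file produced by the CMakeToolchain generator, and the build_type value is only illustrative):

    $ conan install . -s build_type=Release
    $ cmake . -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release
    $ cmake --build .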

@memsharded
Copy link
Member Author

@datalogics-kam

The one thing I'd be missing with a task refactoring is that Conan sets up the environment prior to running commands. I can make our stuff read the deps_env_info out of the output of the json generator, and use tools.environment_append() around running commands. Those will still be there, right?

Environment management might become more explicit. You can see an ongoing effort in Conan 1.X in conan-io/conan#8534. The goal is to make it similar to the other toolchains: the environment needs to become explicit, and finer control of what is added to it is needed. Especially in build-host context scenarios, it is necessary to distinguish the "build environment" from the "run environment" a bit more, and there are a few issues/bugs/requests blocked by the fact that there is a single deps_env_info that aggregates the whole environment.

The overall idea still remains, but it might have a different interface: instead of tools.environment_append(), you might want to define those things in the generate() method, so they are stored in an environment script that subsequent commands will read to inject the environment. It is still at an early stage, but as the plan is to introduce it as an experimental opt-in (not breaking) in Conan 1.X from conan-io/conan#8534, there will be time to evaluate it, test it and adopt it before getting to Conan 2.0, and we will probably discuss the lessons learned from it here later.
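
As a very rough sketch of that "environment stored in a script" idea (this is not the API proposed in conan-io/conan#8534; it just writes a plain .sh file from deps_env_info, and the file name and Unix-only output are assumptions):

    import os
    from conans import ConanFile

    class PkgConan(ConanFile):
        settings = "os", "compiler", "build_type", "arch"

        def generate(self):
            # Dump the aggregated dependency environment to a script that the
            # developer can "source" before building, and that later commands
            # could read to inject the same environment.
            lines = []
            for name, value in self.deps_env_info.vars.items():
                if isinstance(value, list):           # e.g. PATH-like variables
                    value = os.pathsep.join(value + ["$" + name])
                lines.append('export %s="%s"' % (name, value))
            with open("conanbuildenv.sh", "w") as f:  # written to the install folder
                f.write("\n".join(lines) + "\n")

The real feature will presumably also handle build vs. run contexts and Windows scripts, which is precisely what a single deps_env_info cannot express today.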

@memsharded
Copy link
Member Author

We have added a few further notes to this proposal in 7951d20 to connect it with other efforts; for example, we have started to consider a 2-level cache design as a result of this discussion. With these comments, we think the proposal is good to be merged, and we can start working on the implementation in the develop2 branch. As always, this is the current design proposal; we will keep evolving it as we learn from the implementation and feedback, and as other connected pieces evolve.

Thanks very much all for all these useful discussions and feedback!

@memsharded memsharded merged commit 32f8848 into conan-io:main Feb 26, 2021
@memsharded memsharded deleted the proposal/local_development_flow branch February 26, 2021 13:10
@DoDoENT
Copy link

DoDoENT commented Feb 26, 2021

... and there are a few blocked issues/bugs/requests by the fact that there is a single deps_env_info that aggregates all the environment.

Is this issue also part of that? It's been idle for 14 days now...

@memsharded
Copy link
Member Author

@DoDoENT not really. That initiative is so far about the pure environment and env-vars, while the "build_modules" issues belong to cpp_info and build system generators such as cmake. Many of the things we are learning in the environment proposal will probably carry over to the build system integrations, in particular a more explicit management of the build-host context.
