
RFE: env var for post build and test command #115

Open · jonathanunderwood opened this issue Dec 20, 2018 · 10 comments


@jonathanunderwood commented Dec 20, 2018

The current model for cibuildwheel is that the only artefact that needs to be copied out of the docker container is the resulting wheel.

However, since cibuildwheel also offers functionality to test the built wheel inside the docker container, I think it's worth considering copying out other artefacts as well, or providing a way to specify post-test commands to run. One example would be a test coverage report produced by the tests that then needs to be uploaded to a service such as codecov, coveralls, etc.

Currently, it's very clunky to do that: you'd need to move the coverage report to the output directory, and then upload it.

It would be really useful to have a way to specify a command that's run after the wheel build and test operations. An alternative would be an option to specify other artefacts that need to be copied out.

@joerick (Contributor) commented Dec 21, 2018

Hey Jonathan! Thanks for the issue. Have you tried appending your 'copy out' command to the end of the TEST_COMMAND? You can do something like this: CIBW_TEST_COMMAND="test_command && after_successful_test_command". Let me know if it works for your use case.
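
In a CI config this is just an environment variable; a minimal sketch for a GitHub Actions workflow, using the placeholder commands from above, might look like:

    env:
      CIBW_TEST_COMMAND: test_command && after_successful_test_command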

@jonathanunderwood (Author) commented:

Hi. Yes, that's what I am doing presently, and it does work. Perhaps my RFE is unnecessary for this particular use case, but it felt like something that would be more generally useful. Happy for you to close this if you don't think it's worth it - I also see the value in keeping the codebase as simple as possible.

Thank you for this great tool, by the way - I am now using it for the python lz4 bindings. I had previously looked at publishing linux and osx wheels and given up, and then I found your tool :)

@joerick (Contributor) commented Dec 21, 2018

It wouldn't be too much work to add, but maybe for now we could have better docs for this kind of use case. The coverage report is something that we don't support very well with our TEST_COMMAND, I must admit.

Does anyone else have experience gathering coverage reports? I'd be curious what our options are - maybe we could build something specific into CIBW for them.

@jonathanunderwood (Author) commented:

Actually, interestingly, that approach only works for osx builds. For linux builds, for some reason, codecov is unable to recognize the project as a git repo, and coveralls fails similarly. See:

https://travis-ci.org/python-lz4/python-lz4/builds/471116521

There's something quite inconsistent between the linux and osx builds.

@YannickJadoul (Member) commented:

@jonathanunderwood The reason for this seems to be that the tests are run from a different directory and so you're indeed not in a git repository anymore: see https://github.com/joerick/cibuildwheel/blob/master/cibuildwheel/linux.py#L93

I'm not sure of the rationale behind this, but I expect it is to make sure the installed version is being run rather than the code in the git checkout.

I don't know why it wouldn't cause the same problem on macOS, though, since there the command should also be run from $HOME: https://github.com/joerick/cibuildwheel/blob/master/cibuildwheel/macos.py#L131
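
A sketch of a possible workaround (untested, and the CI-detection environment variables may also need to be passed into the container): re-enter the git checkout before uploading, using cibuildwheel's {project} placeholder for the project directory. The pytest/codecov commands are only placeholders:

    env:
      CIBW_TEST_COMMAND: pytest {project}/tests && cd {project} && codecov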

@jonathanunderwood (Author) commented:

@YannickJadoul yep, I figured that out after posting my comment. I worked around that, and then hit further inconsistencies in how environment variables are passed through to the build: issue #117

@s-weigand commented:

First of all, thanks for the awesome tool ❤️

Since you were interested in how to gather coverage reports, this is my hacky approach for GitHub Actions.

The problems with gathering coverage reports (when not uploading them directly from within the docker image) are:

  1. You need to get them out of the docker container.
  2. To combine them, the file paths they record need to exist on the host.

After the actual test command in CIBW_TEST_COMMAND is done, I run the following script to adjust the paths in the .coverage files and make them available on the host.

import sqlite3
import sys
import os
import re


def is_cibuildwheel():
    """Check if run with cibuildwheel."""
    return "CIBUILDWHEEL" in os.environ


def replace_docker_path(path, runner_project_dir):
    """Update path to a file installed in a temp venv to runner_project_dir."""
    pattern = re.compile(r"\/tmp\/.+?\/site-packages")
    return pattern.sub(runner_project_dir, path)


def update_coverage_file(coverage_path, runner_project_dir):
    """
    Since the paths inside of docker vary from the runner paths,
    the paths in the .coverage file need to be adjusted to combine them,
    since 'coverage combine <folder>' checks if the file paths exist.
    """
    # Initialise first so the finally block works even if connect() raises.
    sqliteConnection = None
    try:
        sqliteConnection = sqlite3.connect(coverage_path)
        cursor = sqliteConnection.cursor()
        print("Connected to Coverage SQLite")

        read_file_query = "SELECT id, path from file"
        cursor.execute(read_file_query)

        old_records = cursor.fetchall()
        new_records = [
            (replace_docker_path(path, runner_project_dir), _id) for _id, path in old_records
        ]
        print("Updated coverage file paths:\n", new_records)

        sql_update_query = "Update file set path = ? where id = ?"
        cursor.executemany(sql_update_query, new_records)
        sqliteConnection.commit()
        print("Coverage Updated successfully")
        cursor.close()

    except sqlite3.Error as error:
        print("Failed to coverage: ", error)
    finally:
        if sqliteConnection:
            sqliteConnection.close()
            print("The sqlite connection is closed")


def copy_coverage_cibuildwheel_docker(runner_project_dir):
    """
    When run with cibuildwheel under linux, the tests run in the folder /project
    inside docker and the coverage files need to be copied to the output folder.
    """
    coverage_path = "/project/tests/.coverage"
    if os.path.isfile(coverage_path):
        update_coverage_file(coverage_path, runner_project_dir)
        # /output is copied back to the host output folder by cibuildwheel, so give each
        # coverage file a name that is unique to this build environment.
        env_hash = hash((sys.version, os.environ.get("AUDITWHEEL_PLAT", "")))
        os.makedirs("/output", exist_ok=True)
        os.rename(coverage_path, "/output/.coverage.{}".format(env_hash))


if __name__ == "__main__":
    if is_cibuildwheel():
        # for CIBW under linux
        copy_coverage_cibuildwheel_docker("/home/runner/work/<project_name>/<project_name>")
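
For context, the script is chained onto the end of the test command, roughly like this (the script filename and the pytest invocation are just placeholders; adapt them to your repo):

    env:
      CIBW_TEST_COMMAND: pytest {project}/tests && python {project}/fix_coverage_paths.py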

Then I combine the files with coverage and upload them with codecov/codecov-action.

    - name: Set up Python 3.8 to combine coverage Linux
      if: runner.os == 'Linux'
      uses: actions/setup-python@v2
      with:
        python-version: 3.8

    - name: Combine coverage Linux
      if: runner.os == 'Linux'
      run: |
        echo '############ PWD'
        pwd
        python -m pip install coverage[toml]
        echo '############ combine'
        coverage combine ./wheelhouse
        echo '############ XML'
        coverage xml -o ./coverage.xml
        echo '############ FIND'
        find . -name ".coverage.*"
        find . -name coverage.xml

    - name: Codecov Upload
      uses: codecov/codecov-action@v1
      with:
        file: ./coverage.xml

I hope someone can point me to a better solution, or at least that this will help someone with the same struggles.

@joerick (Contributor) commented May 4, 2021

Thank you @s-weigand - your copy_coverage_cibuildwheel_docker approach, using the /output dir, is a clever trick! I wonder if we could build in a post-test command that supported this better... perhaps the user could do CIBW_AFTER_TEST=cd {output_dir} && coverage combine --append {package_dir} and that would aggregate coverage reports in the output folder.
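
To make that concrete, the configuration might then look something like this (to be clear, CIBW_AFTER_TEST doesn't exist yet; this is only the option proposed above):

    env:
      CIBW_AFTER_TEST: cd {output_dir} && coverage combine --append {package_dir}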

@s-weigand commented:

@joerick Sorry for the late reply.
Since coverage.py should now properly support relative paths (nedbat/coveragepy#1147), my update_coverage_file hack shouldn't be needed anymore.
As an alternative to using the output folder, one could also mount a separate dedicated dir and copy the output there, so as not to break existing workflows where all files get copied with a plain wildcard *.
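
For reference, relative paths are opt-in in coverage.py; assuming it is configured via pyproject.toml rather than .coveragerc, the setting would be:

    [tool.coverage.run]
    relative_files = true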

@vespakoen commented:

I agree there should be an "after-all" / "after-test" command that can be run inside the container. I need it to create a tar.gz archive of the compiled Boost.Python libraries, which should be moved into the /host folder once all builds have completed.
The test command almost works for this purpose, but the problem is that it tries to install the (compiled) python modules, which fails when cross-compiling for different architectures.
