# This code is licensed from CircleCI to the user under the MIT license.
# See here for details: https://circleci.com/developer/orbs/licensing
# TODO a mypy job
# publish
# TODO release requires something like version_variable = pyproject.toml:version to be present;
# could we make it easier? let be configured via orb and do runtime file manipulations? cli arg?
# https://circleci.com/gh/dialoguemd/orbs-test/8
# https://github.com/relekang/python-semantic-release/issues/119
# TODO ability to publish to pypi instead of gemfury
# TODO is there a way to make this transient issue go away? https://circleci.com/gh/dialoguemd/hippocrates/1252?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-build-link
version: 2.1
orbs:
slack: dialogue/slack@1.1.3
utils: dialogue/utils@3.2.6
base: dialogue/base@1.9.1
k8s: dialogue/k8s@4.2.6
description: |
This orb's commands all assume that `setup` from dialogue/base, and `setup` from this orb, have been run prior.
This orb encapsulates logic related to working with python projects, namely:
- installing dependencies
- running tests
- publishing packages
The commands all transparently (zero-config) support the following package managers:
- poetry: https://poetry.eustace.io
- pipenv: https://pipenv.readthedocs.io
- pip: https://pypi.org/project/pip/
You must have GEMFURY_TOKEN exported to install packages from and publish packages to Gemfury.
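# For illustration only, a minimal sketch of consuming this orb from a project's
# .circleci/config.yml (the import name and version pin are hypothetical):
#
#   version: 2.1
#   orbs:
#     python: dialogue/python@x.y
#   workflows:
#     main:
#       jobs:
#         - python/test
#         - python/publish:
#             requires: [python/test]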
aliases:
- &param__executor
executor:
description: The executor to use for a given job.
type: executor
default: python
- &param__fix_if
fix_if:
description: A bash test that dynamically enables or disables autofix commits.
type: string
default: '[ "$ENABLE_AUTOFIX" ]'
- &param_autofix__if
if:
type: string
default: '[ "$CIRCLE_BRANCH_IS_BRANCH" ]'
description: |
Dynamic bash expression to determine if autofix should take place. Defaults to being disabled on trunk (env var comes from https://circleci.com/orbs/registry/orb/dialogue/base#commands-export_circle_extras).
To learn about or refresh on bash `if`:
- https://www.tldp.org/LDP/Bash-Beginners-Guide/html/sect_07_01.html
- https://linuxacademy.com/blog/linux/conditions-in-bash-scripting-if-statements
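# A sketch of a custom `if` expression (the branch pattern is illustrative): keep
# autofix on feature branches but skip bot branches:
#   if: '[ "$CIRCLE_BRANCH_IS_BRANCH" ] && [[ ! "$CIRCLE_BRANCH" =~ ^renovate/ ]]'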
- &param__include_dev
include_dev:
# Opt-in flag until further discussion with team
type: boolean
default: false
description: Also publish development packages on feature branches
- &param__working_directory
working_directory:
type: string
default: "."
description: Directory to run post-checkout steps
- &param__tests_directory
tests_directory:
type: string
default: tests
description: Tell pytest where the tests are
- &param_job__source_directory
source_directory:
type: string
default: ""
description: 'Tell pytest where your source code is. Resolution algo is: (1) use this if given; (2) match working_directory if not "."; (3) match repo name'
- &param_cmd__source_directory
source_directory:
type: string
default: "$CIRCLE_PROJECT_REPONAME"
description: Tell pytest where your source code is
- &param__codecov_flag
codecov_flag:
type: string
default: ""
description: "Specify a codecov flag. Flags must be lowercase, alphanumeric, and not exceed 45 characters. Only lowercase, alphanumeric values are accepted : `^[a-z0-9_]{1,45}$`. Check https://docs.codecov.io/docs/flags for more info."
- &param__pytest_options
pytest_options:
type: string
default: ""
description: "Specify a pytest additional options. Options must be separated with spaces : --downgrade-db --show-locals"
- &param__before_install_steps
before_install_steps:
description: "Steps that will be executed before installing dependencies"
type: steps
default: []
jobs:
test:
description: |
Refer to command version for docs.
parameters:
<<: *param__executor
<<: *param__working_directory
<<: *param__tests_directory
<<: *param_job__source_directory
<<: *param__codecov_flag
<<: *param__pytest_options
<<: *param__before_install_steps
before_tests_steps:
description: "Steps that will be executed after dependencies are installed, but before tests are run"
type: steps
default: []
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps:
before_install_steps: << parameters.before_install_steps >>
- steps: << parameters.before_tests_steps >>
- run:
name: "python: resolve source directory"
command: |
param_working_directory="<<parameters.working_directory>>"
param_source_directory="<<parameters.source_directory>>"
if [ "$param_source_directory" ]; then
cci-export RESOLVED_SOURCE_DIRECTORY "$param_source_directory"
elif [ "$param_working_directory" != "." ]; then
cci-export RESOLVED_SOURCE_DIRECTORY "$param_working_directory"
else
cci-export RESOLVED_SOURCE_DIRECTORY "$CIRCLE_PROJECT_REPONAME"
fi
- test:
source_directory: "$RESOLVED_SOURCE_DIRECTORY"
tests_directory: <<parameters.tests_directory>>
codecov_flag: <<parameters.codecov_flag>>
pytest_options: <<parameters.pytest_options>>
run:
description: |
Run CLIs installed by your project. Wraps `pipenv run` / `poetry run` respectively if either in use.
parameters:
run:
description: Command to execute
type: string
<<: *param__executor
<<: *param__working_directory
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps
- run:
name: "python: run <<parameters.run>>"
command: ppp-run <<parameters.run>>
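# For example (the CLI name is hypothetical), a workflow could run a project script:
#   - python/run:
#       run: alembic upgrade head
# On a poetry project this resolves to `poetry run alembic upgrade head`.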
pylama:
description: |
Run pylama, a linting tool (https://github.com/klen/pylama).
A version of pylama is not provided, your project must depend on it.
parameters:
<<: *param__executor
<<: *param__working_directory
<<: *param__before_install_steps
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps:
before_install_steps: << parameters.before_install_steps >>
- run:
name: "python: pylama"
command: ppp-run pylama
isort:
description: |
Run isort, an imports auto-formatter (https://github.com/timothycrosley/isort). By default, when on a feature branch (simply, not trunk) it will commit the fixes back to the repo. These commits do not trigger CircleCI builds because the commit message includes a skip ci directive (https://circleci.com/docs/2.0/skip-build/#skipping-a-build).
Export env var ENABLE_AUTOFIX to enable autofix.
parameters:
<<: *param__executor
<<: *param__fix_if
<<: *param_autofix__if
<<: *param__working_directory
<<: *param__before_install_steps
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps:
before_install_steps: << parameters.before_install_steps >>
- utils/autofix:
message: isort
check_command: ppp-run isort --check-only --recursive .
fix_command: ppp-run isort --recursive .
fix_if: <<parameters.fix_if>>
if: <<parameters.if>>
black:
description: |
Run black, a code auto-formatter (https://github.com/ambv/black). By default, when on a feature branch (simply, not trunk) it will commit the fixes back to the repo. These commits do not trigger CircleCI builds because the commit message includes a skip ci directive (https://circleci.com/docs/2.0/skip-build/#skipping-a-build).
Export env var ENABLE_AUTOFIX to enable autofix.
parameters:
<<: *param__executor
<<: *param__fix_if
<<: *param_autofix__if
<<: *param__working_directory
<<: *param__before_install_steps
exclude:
description: |
RegEx pattern of files to ignore, passed to black's --exclude flag, appended to a base set provided by this job.
type: string
default: ""
flags:
description: |
String appended to black invocation. Pass desired flags here, reference https://github.com/ambv/black#command-line-options.
type: string
default: ""
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps:
before_install_steps: << parameters.before_install_steps >>
- utils/autofix:
message: black
check_command: ppp-run black --check --exclude '.vscode|.eggs|node_modules|.serverless|venv|.*/snapshots/.*<<# parameters.exclude>>|<<parameters.exclude>><</ parameters.exclude>>' <<parameters.flags>> .
fix_command: ppp-run black --exclude '.vscode|.eggs|node_modules|.serverless|venv|.*/snapshots/.*<<# parameters.exclude>>|<<parameters.exclude>><</ parameters.exclude>>' <<parameters.flags>> .
fix_if: <<parameters.fix_if>>
if: <<parameters.if>>
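# Example usage (values are illustrative):
#   - python/black:
#       exclude: 'migrations'
#       flags: '--line-length 100'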
openapi:
description: |
Generate openapi (https://github.com/OAI/OpenAPI-Specification) spec from FastAPI (https://fastapi.tiangolo.com/#interactive-api-docs) framework
The resulting openapi.json file will be stored in the workspace for use in other jobs.
parameters:
<<: *param__executor
<<: *param__working_directory
root_module_name:
type: string
default: "$CIRCLE_PROJECT_REPONAME"
description: Name of the root python module of the python package
executor: <<parameters.executor>>
working_directory: ~/project/<<parameters.working_directory>>
steps:
- base/setup
- setup
- install_deps
- attach_workspace:
at: /tmp/workspace
- run:
name: Generate openapi spec
command: |
param_working_directory="<<parameters.working_directory>>"
export param_root_module_name="<<parameters.root_module_name>>"
cat > openapi.py \<<EOF
import fastapi
import fastapi.openapi.utils
import gc
import json
import os
import pkg_resources
import pkgutil
dist = pkg_resources.get_distribution(os.environ["param_root_module_name"])
root_module_name = pkg_resources.to_filename(dist.project_name)
module = __import__(root_module_name)
for loader, name, is_pkg in pkgutil.walk_packages(module.__path__):
__import__(f"{root_module_name}.{name}")
server = None
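# Scan live objects for a FastAPI application instance created during the imports above.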
for obj in gc.get_objects():
if isinstance(obj, fastapi.FastAPI):
server = obj
break
if server:
openapi_output = fastapi.openapi.utils.get_openapi(title=dist.project_name, version=dist.version, routes=server.routes)
with open('openapi.json', 'w') as fp:
json.dump(openapi_output, fp)
else:
print("FastAPI server no found. OpenAPI has not been generated.")
EOF
ppp-run python openapi.py
if [[ -f "openapi.json" ]]; then
jq . openapi.json
if [[ ! "$param_working_directory" == "." ]]; then
mkdir "/tmp/workspace/$param_working_directory"
fi
mv openapi.json "/tmp/workspace/$param_working_directory"
else
log "There were no openapi spec generated. The python REST framework is not FastAPI or an error happened while generation the spec."
fi
- persist_to_workspace:
root: /tmp/workspace
paths:
- "<<parameters.working_directory>>/*.json"
- "*.json"
publish:
description: Refer to command version for docs.
parameters:
<<: *param__executor
executor: <<parameters.executor>>
steps:
- base/setup
- setup
- publish
- slack/message:
key: release
thread: true
if: '[ "$POST_MESSAGE_RESPONSE" ]'
# It would be nice to turn the package@version text into a link, but gemfury
# does not seem to make it possible to construct URLs from the data we have.
text: Published python package ${PACKAGE_NAME}@${NEXT_VER} from cci build <$CIRCLE_BUILD_URL|#$CIRCLE_BUILD_NUM>
publish_package:
description: |
DEPRECATED Use publish job instead
Refer to command version for docs.
parameters:
<<: *param__include_dev
<<: *param__executor
executor: <<parameters.executor>>
steps:
- base/setup
- setup
- publish_package:
include_dev: <<parameters.include_dev>>
- slack/message:
key: release
thread: true
if: '[ "$POST_MESSAGE_RESPONSE" ]'
# It would be nice to turn the package@version text into a link, but gemfury
# does not seem to make it possible to construct URLs from the data we have.
text: Published python package ${PACKAGE_NAME}@${PACKAGE_VERSION} from cci build <$CIRCLE_BUILD_URL|#$CIRCLE_BUILD_NUM>
build_and_publish_docs:
description: Builds the docs using Sphinx and deploys them to https://docs.dev.dialoguecorp.com.
parameters:
slug:
description: The slug of the documentation. For multi-component apps, join the name of the app and the component with a slash, e.g. "beautiful-mind/action-server".
type: string
default: $CIRCLE_PROJECT_REPONAME
docs_source_path:
description: Path to the docs source in the working directory.
type: string
default: docs
bucket_name:
description: Name of the bucket holding the docs contents.
type: string
default: dialoguecorp.docs.dev.ca-central-1
stage:
description: The stage in which the docs app and bucket are deployed.
type: string
default: dev
place:
description: The place in which the docs app and bucket are deployed.
type: string
default: ca2
<<: *param__working_directory
# TODO: improve this by potentially using s3fs
executor: base/launchpad
working_directory: ~/project/<<parameters.working_directory>>
steps:
- k8s/setup:
stage: <<parameters.stage>>
place: <<parameters.place>>
- setup
- install_deps
- run:
name: "python: generate docs"
command: |
ppp-run sphinx-build <<parameters.docs_source_path>> generated_docs -b dirhtml -D version=${CIRCLE_GIT_TAG} -D release=${CIRCLE_GIT_TAG}
- run:
name: "python: upload generated docs to s3"
command: |
if [ "$CIRCLE_BRANCH_IS_TRUNK" ]; then
aws s3 cp generated_docs s3://<<parameters.bucket_name>>/<<parameters.slug>> --recursive
else
aws s3 cp generated_docs s3://<<parameters.bucket_name>>/branches/<<parameters.slug>>/${CIRCLE_BRANCH_SLUG} --recursive
fi
if [ "$CIRCLE_GIT_TAG" ]; then
aws s3 cp generated_docs s3://<<parameters.bucket_name>>/tags/<<parameters.slug>>/${CIRCLE_GIT_TAG} --recursive
fi
- run:
name: "python: restart the docs app"
command: |
kubectl rollout restart deployment app --namespace docs
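# Example usage (slug and source path are illustrative):
#   - python/build_and_publish_docs:
#       slug: beautiful-mind/action-server
#       docs_source_path: docs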
commands:
publish:
description: |
This command publishes a package to gemfury. It does not currently support publishing to PyPi.
On every build of a non-trunk branch, the following version pattern is published:
0.dev+<GIT_COMMIT_SHA1>
For example:
0.dev+c20210e8091da85dba2705419f5f6a301b00e079
The pattern conforms to https://www.python.org/dev/peps/pep-0440/#developmental-releases.
On every build of a trunk branch, the previous semver value is found in the git tag history, bumped, and then git-tagged on the current commit. The bump type is derived from the commit message in accordance with conventional commits.
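For example, under conventional commits a fix commit yields a patch bump, a feat commit a minor bump, and a breaking change a major bump; so a feat commit on top of tag 1.2.3 would publish 1.3.0.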
steps:
- run:
name: "python: publish package"
command: |
# Export for access in downstream steps e.g. slack messages
cci-export PACKAGE_NAME $(pcregrep -o1 'name *= *"(.+)"' < pyproject.toml)
if [ "$CIRCLE_BRANCH_IS_TRUNK" ]; then
cci-export NEXT_VER "$(semver::cci-calc-next-ver || kill $$)"
else
# Originally inspired by https://pypi.org/project/vcver/
# Related PEP https://www.python.org/dev/peps/pep-0440/
#
# The version published must always be unique, it cannot overwrite
# a previously published version. We need 0.dev0+ prefix to satisfy
# PEP 440.
#
# The prefix "0." is chosen because it is not _generally_ known at
# branch time what the next version will be: patch, minor, major?
#
cci-export NEXT_VER "0.dev+${CIRCLE_SHA1}"
log "development release, but based on current git history and commit message, next version would be $(semver::cci-calc-next-ver || kill $$)"
fi
# Manipulate pyproject version so that:
# 1. build produces a corresponding file name for the built packages
# 2. publish (later step) reads version in manifest to find packages
yj -tj < pyproject.toml | jq '.tool.poetry.version = $ENV.NEXT_VER' | yj -jt > pyproject.toml.next
mv pyproject.toml.next pyproject.toml
poetry publish --build --no-interaction --repository dialogue
- run:
name: "python: push git tag (trunk only)"
command: |
if [ "$CIRCLE_BRANCH_IS_TRUNK" ]; then
git tag "$NEXT_VER"
git push --tags
else
skip-step "git tags are not set for development releases"
fi
publish_package:
description: |
DEPRECATED Use publish command instead
This command publishes a package to gemfury. It does not currently support publishing to PyPi.
By default, publishing only happens on the trunk branch. However, you can opt in to development releases as well. When development releases are enabled, every build on a feature branch (i.e. not trunk) publishes the following version pattern:
0.dev+<GIT_COMMIT_SHA1>
For example:
0.dev+c20210e8091da85dba2705419f5f6a301b00e079
The pattern conforms to https://www.python.org/dev/peps/pep-0440/#developmental-releases. This is an experimental feature and may become opt-out rather than opt-in in a future major.
parameters:
<<: *param__include_dev
steps:
- run:
name: "python: Skip job if not applicable"
command: |
_PARAM_INCLUDE_DEV="<<parameters.include_dev>>"
if [ "$CIRCLE_BRANCH_IS_BRANCH" ] && [ "$_PARAM_INCLUDE_DEV" = "false" ]; then
skip-job "Not trunk and dev releases not enabled"
fi
- run:
name: "python: Build package"
command: |
# Perform manifest manipulations if this is a dev release
if [ "$CIRCLE_BRANCH_IS_BRANCH" ]; then
# Originally inspired by https://pypi.org/project/vcver/
# Related PEP https://www.python.org/dev/peps/pep-0440/
#
# The version published must always be unique, it cannot overwrite
# a previously published version. We need 0.dev0+ prefix to satisfy
# PEP 440.
#
# The prefix "0." is chosen because it is not _generally_ known at
# feature branch time in source code what the next version will be:
# patch, minor, major?
#
export _PACKAGE_VERSION="0.dev+${CIRCLE_SHA1}"
if [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then
# Manipulate pyproject version so that:
# 1. build produces a corresponding file name for the built packages
# 2. publish (later step) reads version in manifest to find packages
yj -tj < pyproject.toml | jq '.tool.poetry.version = $ENV._PACKAGE_VERSION' | yj -jt > pyproject.toml.replace
mv pyproject.toml.replace pyproject.toml
else
# Manipulate the setup.py version so that:
# 1. build produces a corresponding file name for the built package
# 2. publish (later step) reads version in setup.py to find package file
pcregrep '__version__ *= *.+' < setup.py > /dev/null # assert match b/c sed does not
sed -E -i.bak "s/__version__ *= *.+/__version__ = \"$_PACKAGE_VERSION\"/" setup.py
rm setup.py.bak
fi
fi
# build distributable packages
if [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then
poetry build
else
python setup.py sdist
fi
- run:
name: "python: Publish package"
command: |
# export for access in downstream steps e.g. slack messages
if [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then
cci-export PACKAGE_NAME $(pcregrep -o1 'name *= *"(.+)"' < pyproject.toml)
cci-export PACKAGE_VERSION $(pcregrep -o1 'version *= *"(.+)"' < pyproject.toml)
else
cci-export PACKAGE_NAME $(python setup.py --name)
cci-export PACKAGE_VERSION $(python setup.py --version)
fi
if [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then
poetry publish --no-interaction --repository dialogue
else
log "publishing ${PACKAGE_NAME}@${PACKAGE_VERSION}"
curl \
--form package=@dist/${PACKAGE_NAME}-${PACKAGE_VERSION}.tar.gz \
--fail \
https://${GEMFURY_TOKEN}@push.fury.io/dialogue/
fi
test:
description: |
Run the test suite using pytest.
Your project is responsible for installing pytest as well as coverage and pytest-cov.
Integration with CircleCI test data collection is automatic. Learn/refresh about that CCI feature here https://circleci.com/docs/2.0/collect-test-data/.
Enable Code Climate integration by exporting CC_TEST_REPORTER_ID. Refer to dialogue/utils/with_codeclimate docs for further detail (https://circleci.com/orbs/registry/orb/dialogue/utils#commands-with_codeclimate).
Enable Codecov integration by exporting CODECOV_TOKEN. Refer to dialogue/utils/send_coverage_to_codecov docs for further detail (https://circleci.com/orbs/registry/orb/dialogue/utils#commands-send_coverage_to_codecov).
As an optional/bonus alternative to coverage services, coverage report UI is automatically stored as an artifact which can then be browsed.
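For example (versions illustrative), a poetry project might declare dev dependencies such as pytest = "^6.0", pytest-cov = "^2.6", and coverage = "^4.5" so the coverage flags used below resolve.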
parameters:
<<: *param__tests_directory
<<: *param_cmd__source_directory
<<: *param__codecov_flag
<<: *param__pytest_options
steps:
- utils/with_codeclimate:
steps:
- run:
name: "python: pytest"
command: |
tests_directory="<<parameters.tests_directory>>"
source_directory="<<parameters.source_directory>>"
pytest_options="<<# parameters.pytest_options >><<parameters.pytest_options>><</ parameters.pytest_options >>"
# TODO can fail if user does not have coverage installed e.g.:
# coverage = "^4.5"
# pytest-cov = "^2.6"
# ref: https://circleci.com/gh/dialoguemd/dxa-stories-formatter/317?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-build-link
# Should we inspect for those deps and adjust accordingly automatically?
# Or, an explicit cov param
# Or, a custom command param
# explicit cov param allows for putting several steps behind conditional
# NOTES
#
# [1] for codeclimate and codecov
# [2] for artifact storage, for self-hosted cov explorer, NTH for debugging, not replacing a real service
# [3] for circleci test data collection
#
# [1] [2] [3]
ppp-run pytest "${tests_directory}" -vv --showlocals --cov-report xml --cov-report html --junitxml=test-reports/pytest/junit.xml --cov "${source_directory}" ${pytest_options}
- utils/send_coverage_to_codecov:
codecov_flag: <<parameters.codecov_flag>>
- store_test_results:
path: test-reports # [3]
- store_artifacts:
path: htmlcov # [2]
- store_artifacts:
path: test-reports # [2]
- store_artifacts:
path: coverage.xml # [2]
- store_artifacts:
path: .coverage # [2]
install_deps:
description: |
Install your project's dependencies.
When using pip:
- Install command runs as `pip install -e . [[-r <YOUR REQUIREMENTS FILE>] ...]`
- A virtual env is setup and exported so future steps run within it.
- Requirements files are automatically inferred from checksums present in pip_cache_key_requirement_files param
- All checksum files are temporarily created if not already present so checksumming and passing to pip -r will work
- By default the supported requirements file names are as follows; try to conform your project to this, otherwise customize via the param:
- requirements.txt
- requirements-dev.txt
- requirements-docs.txt
- requirements-tests.txt
- When passing a custom command a virtual env will still be setup as will the _REQUIREMENT_ARGS env var containing e.g. "-r requirements.txt -r requirements-dev.txt"
When using pipenv:
- Install command runs as `pipenv sync --dev`
When using poetry:
- Install command runs as `poetry install`
Tip, about globally managing custom flags/command:
The parameters `command` and `flags` have been defaulted to environment variables because this provides a way for a project to centrally alter how install works across all existing jobs. The alternative would be for you to write your own jobs, or to have this orb updated to thread command/flags params through all jobs, which would still be burdensome since every user would have to set them in each job of their workflow.
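For example (value illustrative), exporting PYTHON_PACKAGING_TOOL_FLAGS="--no-dev" in the project's build environment turns the poetry install into poetry install --no-dev across every job, with no per-job parameter threading.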
parameters:
command:
description: Replace the install command with your own version in raw bash.
type: string
default: "${PYTHON_PACKAGING_TOOL_COMMAND}"
flags:
description: Append flags to the install command.
type: string
default: "${PYTHON_PACKAGING_TOOL_FLAGS}"
tool:
description: Which package manager to use.
type: enum
enum: ["poetry", "pip", "pipenv", "${PYTHON_PACKAGING_TOOL}"]
default: ${PYTHON_PACKAGING_TOOL}
pip_cache_key_requirement_files:
description: Only for pip projects. Which requirement files to install from. Expressed as a CCI cache key.
type: string
default: '{{ checksum "requirements.txt" }}-{{ checksum "requirements-dev.txt" }}-{{ checksum "requirements-docs.txt" }}-{{ checksum "requirements-tests.txt" }}'
<<: *param__before_install_steps
steps:
- steps: << parameters.before_install_steps >>
- run:
name: "python: Setup package manager coexist checksum files"
command: |
# Setup a file that will track all the temporary files we create so that we
# can properly clean them up in a later step
cci-export TEMP_CHECKSUM_FILES_TRACKING_FILE temporary-checksum-files.txt
_FILENAMES=(
Pipfile.lock
poetry.lock
$(echo $'<<parameters.pip_cache_key_requirement_files>>' | pcregrep -o1 'checksum "([^"]+)"')
)
log "Temporarily creating missing dependency checksum files"
touch "$TEMP_CHECKSUM_FILES_TRACKING_FILE"
for F in "${_FILENAMES[@]}"; do
if [ -f "$F" ]; then
log-detail "$F, FOUND, doing nothing"
else
log-detail "$F, MISSING, creating an empty temporary file of that name"
touch "$F"
echo "$F" >> "$TEMP_CHECKSUM_FILES_TRACKING_FILE"
fi
done
- utils/with_cache:
key: '{{ checksum "Pipfile.lock" }}-{{ checksum "poetry.lock" }}-<<parameters.pip_cache_key_requirement_files>>'
namespace: pip-or-pipenv-or-poetry
path: ~/.cache/pip
path2: ~/.cache/pipenv
path3: ~/.cache/pypoetry
steps:
- run:
name: "python: Install python dependencies"
command: |
# pip based package management is a non-trivial beast... some references that
# came in handy while building this:
#
# - https://pip.pypa.io/en/stable/reference/pip_install/#caching
# - https://circleci.com/docs/2.0/language-python/
# - https://circleci.com/docs/2.0/caching/#using-a-lock-file
# - https://discuss.circleci.com/t/caching-installed-python-packages/10835/8
# - https://stackoverflow.com/questions/34578168/where-is-pip-cache-folder
_PARAM_FLAGS="<<parameters.flags>>"
_PARAM_COMMAND="<<parameters.command>>"
_PARAM_TOOL="<<parameters.tool>>"
# Run special setup respective to the packaging tool in use.
# Do this separately from actual install so that cci
# command parameter, if used, relies upon this setup.
if [ "$_PARAM_TOOL" = "pip" ]; then
# Activate a virtual environment. There is an alternative method
# that avoids using virtualenv, should we ever need it:
#
# _PATH=$(python -c "import site; print(site.getsitepackages()[0])")
# sudo chown -R circleci:circleci /usr/local/bin
# sudo chown -R circleci:circleci $_PATH
#
python -m venv venv
cci-export-bash '. venv/bin/activate'
# Build up the pip cli requirement file flags by detecting every single
# one present in the project root.
log "Will build requirement args for pip cli"
_REQUIREMENT_ARGS=""
for _FILENAME in *requirement*.txt; do
_REQUIREMENT_ARGS+=" --requirement $_FILENAME"
done
log-detail "$_REQUIREMENT_ARGS"
fi
# Run the actual install command
if [ "$_PARAM_COMMAND" ]; then
echo "==> Doing a custom dependency install"
eval "$_PARAM_COMMAND"
elif [ "$_PARAM_TOOL" = "pipenv" ]; then
echo "==> Doing a pipenv-based dependency install"
pipenv sync --dev $_PARAM_FLAGS
elif [ "$_PARAM_TOOL" = "poetry" ]; then
echo "==> Doing a poetry-based dependency install"
poetry install $_PARAM_FLAGS
elif [ "$_PARAM_TOOL" = "pip" ]; then
echo "==> Doing a pip-based dependency install"
eval "pip install -e . $_REQUIREMENT_ARGS $_PARAM_FLAGS"
else
bail "
Unknown type of python packaging for \$_PARAM_TOOL: $_PARAM_TOOL
"
fi
- run:
name: "python: Cleanup package manager coexist checksum files"
command: |
echo "==> Will remove temporary files that supported cache key checksum"
while read _FILENAME; do
rm "$_FILENAME"
echo "--> Did remove $_FILENAME"
done < "$TEMP_CHECKSUM_FILES_TRACKING_FILE"
echo "==> Will remove tracking file itself"
rm "$TEMP_CHECKSUM_FILES_TRACKING_FILE"
echo "--> OK"
setup:
description: |
Requires environment variable GEMFURY_TOKEN.
This command sets up support for multiple package managers. It does this by exporting some bash functions and environment variables that abstract the underlying tool. It uses the following simple heuristics to decide which tool is in use:
- pipenv if a `Pipfile` is found in repo root
- poetry if a `poetry.lock` is found in repo root
- pip if nothing else matches (aka. default)
An environment variable PYTHON_PACKAGING_TOOL is exported to indicate what was detected. Its possible values are: pipenv, poetry, pip. There is also a convenience env var exported for the tool in use to facilitate bash conditional logic, _one_ of the following:
- `PYTHON_PACKAGING_TOOL_IS_PIPENV`
- `PYTHON_PACKAGING_TOOL_IS_POETRY`
- `PYTHON_PACKAGING_TOOL_IS_PIP`
A bash function `ppp-run` is also exported to abstract running project-installed CLIs in bash. For example:
given: `ppp-run pytest`
then:
poetry: `poetry run pytest`
pipenv: `pipenv run pytest`
pip: `pytest`
If poetry, then:
- the latest version of poetry is installed.
- the dialogue repository is configured
Exports PIP_EXTRA_INDEX_URL to install private dependencies from gemfury.
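For example, a custom step can branch on the convenience vars, a sketch: if [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then poetry show --tree; fi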
steps:
- run:
name: "python: Setup and configure package manager"
command: |
# TODO org-global-v2 should only keep the token, not whole string
cci-export -s -o PIP_EXTRA_INDEX_URL "https://pypi.fury.io/$GEMFURY_TOKEN/dialogue/"
log 'will inspect project to detect if poetry, pipenv, or just pip is in use'
if [ -f "Pipfile" ]; then
cci-export PYTHON_PACKAGING_TOOL pipenv
cci-export PYTHON_PACKAGING_TOOL_IS_PIPENV true
elif [ -f "poetry.lock" ]; then
cci-export PYTHON_PACKAGING_TOOL poetry
cci-export PYTHON_PACKAGING_TOOL_IS_POETRY true
else
cci-export PYTHON_PACKAGING_TOOL pip
cci-export PYTHON_PACKAGING_TOOL_IS_PIP true
fi
# Bash alias does not work on CCI for unknown reason;
# There is an unanswered thread in the community asking
# about it: https://discuss.circleci.com/t/why-doesnt-alias-work-at-all/13032
if [ "$PYTHON_PACKAGING_TOOL_IS_PIPENV" ]; then
cci-export-bash 'ppp-run () { pipenv run "$@"; }'
elif [ "$PYTHON_PACKAGING_TOOL_IS_POETRY" ]; then
cci-export-bash 'ppp-run () { poetry run "$@"; }'
curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python
cci-export-bash 'source $HOME/.poetry/env'
log 'configure poetry cache'
poetry config cache-dir ~/.cache/pypoetry
log 'configure poetry to work with gemfury'
poetry config repositories.dialogue https://pypi.fury.io/dialogue
poetry config http-basic.dialogue $GEMFURY_TOKEN ""
else
cci-export-bash 'ppp-run () { "$@"; }'
fi
executors:
python:
description: |
Based on the official CircleCI python convenience images. Defaults to python version 3.7 (latest patch version); node is excluded unless node_enabled is set. Used by this orb's jobs.
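Example usage from a consuming config (assuming the orb is imported as `python`):
  executor:
    name: python/python
    version: "3.8"
    node_enabled: true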
parameters:
version:
description: Python version to be pulled
type: enum
default: "3.7"
enum:
- "3.9"
- "3.9.0"
- "3.8"
- "3.8.6"
- "3.8.5"
- "3.8.4"
- "3.8.3"
- "3.8.2"
- "3.8.1"
- "3.8.0"
- "3.7"
- "3.7.9"
- "3.7.8"
- "3.7.7"
- "3.7.6"
- "3.7.5"
- "3.7.4"
- "3.7.3"
- "3.7.2"
- "3.7.1"
- "3.7.0"
- "3.6"
- "3.6.8"
- "3.6.7"
- "3.6.6"
- "3.6.5"
- "3.6.4"
- "3.6.3"
node_enabled:
description: Whether or not node should be included.
type: boolean
default: false
node:
description: The node variant to use.
type: enum
default: node
enum:
- node
- node-browsers
- node-browsers-legacy
docker:
- image: cimg/python:<<parameters.version>><<# parameters.node_enabled>>-<<parameters.node>><</ parameters.node_enabled>>
auth:
username: $DOCKERHUB_USER
password: $DOCKERHUB_ACCESS_TOKEN
python36:
description: |
Based on the official CircleCI convenience image python 3.6.8. This is an opinionated convenience executor, as opposed to configuring the flexible python executor yourself.
docker:
- image: circleci/python:3.6.8-node
auth:
username: $DOCKERHUB_USER
password: $DOCKERHUB_ACCESS_TOKEN