♻️ REFACTOR: Remove reentry requirement (#5058)
This commit replaces the use of `reentry` for entry point loading
with `importlib_metadata` and, in turn,
removes the requirement for users to run `reentry scan` after installations.

aiida-core makes heavy use of entry-points to define plugins.
The `reentry` package was introduced to load these plugins since,
at the time, the de facto `pkg_resources` method for using entry points
was too slow, in particular for responsive CLI usage.
This, however, came with the drawback that users had to perform an extra step
to register the plugins before aiida-core could be used, and again whenever new plugins were installed.

In recent years `importlib.metadata` and its backport `importlib_metadata`
have replaced `pkg_resources`, and as of python/importlib_metadata#317
are now on a par with `reentry` for performance.

For now, we use `importlib_metadata` for all python versions,
rather than the built-in (as of python 3.8) `importlib.metadata`,
so that we can use the new python 3.10 API and performance boosts.
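As a sketch of what the replacement enables, entry points can be looked up and loaded with the selectable API (Python 3.10+ / `importlib_metadata` >= 3.6), falling back to the older dict-style return value on Python 3.8/3.9. The helper below is illustrative only, not aiida-core's actual implementation:

```python
from importlib import metadata


def load_plugin(group: str, name: str):
    """Load a single entry point by group and name (illustrative helper)."""
    eps = metadata.entry_points()
    if hasattr(eps, 'select'):  # Python 3.10+ selectable API
        candidates = list(eps.select(group=group))
    else:  # Python 3.8/3.9: entry_points() returns a dict of group -> list
        candidates = list(eps.get(group, []))
    for ep in candidates:
        if ep.name == name:
            return ep.load()
    raise LookupError(f'no entry point {name!r} in group {group!r}')
```

Because the lookup goes straight to the installed distributions' metadata, no separate plugin cache (and hence no `reentry scan`) is needed.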
chrisjsewell committed Aug 12, 2021
1 parent 6f93253 commit 3ad0712
Showing 34 changed files with 99 additions and 236 deletions.
3 changes: 0 additions & 3 deletions .docker/opt/configure-aiida.sh
@@ -8,9 +8,6 @@ set -x
# Environment.
export SHELL=/bin/bash

# Update the list of installed aiida plugins.
reentry scan

# Setup AiiDA autocompletion.
grep _VERDI_COMPLETE /home/${SYSTEM_USER}/.bashrc &> /dev/null || echo 'eval "$(_VERDI_COMPLETE=source verdi)"' >> /home/${SYSTEM_USER}/.bashrc

4 changes: 2 additions & 2 deletions .github/system_tests/test_verdi_load_time.sh
@@ -21,10 +21,10 @@ while true; do
load_time=$(/usr/bin/time -q -f "%e" $VERDI 2>&1 > /dev/null)

if (( $(echo "$load_time < $LOAD_LIMIT" | bc -l) )); then
echo "SUCCESS: loading time $load_time at iteration $iteration below $load_limit"
echo "SUCCESS: loading time $load_time at iteration $iteration below $LOAD_LIMIT"
break
else
echo "WARNING: loading time $load_time at iteration $iteration above $load_limit"
echo "WARNING: loading time $load_time at iteration $iteration above $LOAD_LIMIT"

if [ $iteration -eq $MAX_NUMBER_ATTEMPTS ]; then
echo "ERROR: loading time exceeded the load limit $iteration consecutive times."
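The load-time guard in the script above can be mimicked in Python for local experimentation; this rough sketch is not part of the commit, and the command timed here is just a stand-in:

```python
import subprocess
import sys
import time


def startup_time(cmd) -> float:
    """Wall-clock seconds to run a command once, discarding its output."""
    start = time.perf_counter()
    subprocess.run(cmd, capture_output=True, check=False)
    return time.perf_counter() - start


# Example: time a bare Python interpreter start-up, analogous to timing `verdi`.
elapsed = startup_time([sys.executable, '-c', 'pass'])
```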
1 change: 0 additions & 1 deletion .github/workflows/benchmark.yml
@@ -54,7 +54,6 @@ jobs:
python -m pip install --upgrade pip
pip install -r requirements/requirements-py-3.8.txt
pip install --no-deps -e .
reentry scan
pip freeze
- name: Run benchmarks
12 changes: 8 additions & 4 deletions .github/workflows/ci-code.yml
@@ -98,7 +98,6 @@ jobs:
run: |
pip install --use-feature=2020-resolver -r requirements/requirements-py-${{ matrix.python-version }}.txt
pip install --use-feature=2020-resolver --no-deps -e .
reentry scan
pip freeze
- name: Setup environment
@@ -125,15 +124,20 @@
verdi:

runs-on: ubuntu-latest
timeout-minutes: 30
timeout-minutes: 15

strategy:
fail-fast: false
matrix:
python-version: [3.8, 3.9]

steps:
- uses: actions/checkout@v2

- name: Set up Python 3.8
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: 3.8
python-version: ${{ matrix.python-version }}

- name: Install python dependencies
run: pip install -e .
1 change: 0 additions & 1 deletion .github/workflows/rabbitmq.yml
@@ -60,7 +60,6 @@ jobs:
run: |
pip install -r requirements/requirements-py-3.8.txt
pip install --no-deps -e .
reentry scan
pip freeze
- name: Run tests
1 change: 0 additions & 1 deletion .github/workflows/release.yml
@@ -85,7 +85,6 @@ jobs:
pip install --upgrade pip setuptools
pip install -r requirements/requirements-py-3.8.txt
pip install --no-deps -e .
reentry scan
- name: Run sub-set of test suite
run: pytest -sv -k 'requires_rmq'

1 change: 0 additions & 1 deletion .github/workflows/test-install.yml
@@ -160,7 +160,6 @@ jobs:
- name: Install aiida-core
run: |
pip install -e .[atomic_tools,docs,notebook,rest,tests]
reentry scan
- run: pip freeze

4 changes: 2 additions & 2 deletions .github/workflows/verdi.sh
@@ -20,10 +20,10 @@ while true; do
load_time=$(/usr/bin/time -q -f "%e" $VERDI 2>&1 > /dev/null)

if (( $(echo "$load_time < $LOAD_LIMIT" | bc -l) )); then
echo "SUCCESS: loading time $load_time at iteration $iteration below $load_limit"
echo "SUCCESS: loading time $load_time at iteration $iteration below $LOAD_LIMIT"
break
else
echo "WARNING: loading time $load_time at iteration $iteration above $load_limit"
echo "WARNING: loading time $load_time at iteration $iteration above $LOAD_LIMIT"

if [ $iteration -eq $MAX_NUMBER_ATTEMPTS ]; then
echo "ERROR: loading time exceeded the load limit $iteration consecutive times."
4 changes: 0 additions & 4 deletions .molecule/default/setup_aiida.yml
@@ -12,10 +12,6 @@

tasks:

- name: reentry scan
command: "{{ venv_bin }}/reentry scan"
changed_when: false

- name: Create a new database with name "{{ aiida_backend }}"
postgresql_db:
name: "{{ aiida_backend }}"
16 changes: 2 additions & 14 deletions .pre-commit-config.yaml
@@ -1,7 +1,7 @@
ci:
autoupdate_schedule: monthly
autofix_prs: true
skip: [mypy, pylint, dm-generate-all, pyproject, dependencies, verdi-autodocs, version-number]
skip: [mypy, pylint, dm-generate-all, dependencies, verdi-autodocs, version-number]

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -70,6 +70,7 @@ repos:
aiida/manage/database/delete/nodes.py|
aiida/orm/nodes/node.py|
aiida/orm/nodes/process/.*py|
aiida/plugins/entry_point.py|
aiida/repository/.*py|
aiida/tools/graph/graph_traversers.py|
aiida/tools/groups/paths.py|
@@ -97,19 +98,6 @@ repos:
utils/dependency_management.py
)$
- id: pyproject
name: Validate pyproject.toml
entry: python ./utils/dependency_management.py validate-pyproject-toml
language: system
pass_filenames: false
files: >-
(?x)^(
setup.json|
setup.py|
utils/dependency_management.py|
pyproject.toml
)$
- id: dependencies
name: Validate environment.yml
entry: python ./utils/dependency_management.py validate-environment-yml
6 changes: 3 additions & 3 deletions aiida/cmdline/commands/cmd_computer.py
@@ -20,7 +20,7 @@
from aiida.cmdline.utils import echo
from aiida.cmdline.utils.decorators import with_dbenv
from aiida.common.exceptions import ValidationError
from aiida.plugins.entry_point import get_entry_points
from aiida.plugins.entry_point import get_entry_point_names
from aiida.transports import cli as transport_cli


@@ -597,5 +597,5 @@ def computer_config_show(computer, user, defaults, as_option_string):
echo.echo(tabulate.tabulate(table, tablefmt='plain'))


for ep in get_entry_points('aiida.transports'):
computer_configure.add_command(transport_cli.create_configure_cmd(ep.name))
for ep_name in get_entry_point_names('aiida.transports'):
computer_configure.add_command(transport_cli.create_configure_cmd(ep_name))
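A minimal version of the `get_entry_point_names` helper the diff switches to could be built directly on `importlib.metadata`; this is an assumption about its shape, as the real function lives in `aiida.plugins.entry_point`:

```python
from importlib import metadata


def get_entry_point_names(group: str) -> list:
    """Return the sorted names of all entry points registered in ``group``."""
    eps = metadata.entry_points()
    if hasattr(eps, 'select'):  # Python 3.10+ selectable API
        found = eps.select(group=group)
    else:  # Python 3.8/3.9 dict-style API
        found = eps.get(group, [])
    return sorted(ep.name for ep in found)
```

Returning names rather than full entry-point objects is enough here, since the CLI only needs the name to build each `configure` sub-command.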
7 changes: 4 additions & 3 deletions aiida/manage/database/integrity/plugins.py
@@ -96,8 +96,7 @@ class of `JobCalculation`, would get `calculation.job.quantumespresso.pw.PwCalcu
:param type_strings: a set of type strings whose entry point is to be inferred
:return: a mapping of current node type string to the inferred entry point name
"""
from reentry.entrypoint import EntryPoint
from aiida.plugins.entry_point import get_entry_points
from aiida.plugins.entry_point import get_entry_points, parse_entry_point

prefix_calc_job = 'calculation.job.'
entry_point_group = 'aiida.calculations'
@@ -109,7 +108,9 @@ class of `JobCalculation`, would get `calculation.job.quantumespresso.pw.PwCalcu
# from the aiida-registry. Note that if entry points with the same name are found in both sets, the entry point
# from the local environment is kept as leading.
entry_points_local = get_entry_points(group=entry_point_group)
entry_points_registry = [EntryPoint.parse(entry_point) for entry_point in registered_calculation_entry_points]
entry_points_registry = [
parse_entry_point(entry_point_group, entry_point) for entry_point in registered_calculation_entry_points
]

entry_points = entry_points_local
entry_point_names = [entry_point.name for entry_point in entry_points]
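The `parse_entry_point` helper that replaces reentry's `EntryPoint.parse` can be sketched with `importlib.metadata.EntryPoint`; the `'name = module:attr'` spec format is the standard setuptools one, but the function body here is illustrative, not aiida-core's actual code:

```python
from importlib.metadata import EntryPoint


def parse_entry_point(group: str, spec: str) -> EntryPoint:
    """Build an EntryPoint from a 'name = module:attr' specification string."""
    name, _, value = (part.strip() for part in spec.partition('='))
    return EntryPoint(name=name, value=value, group=group)
```

Unlike reentry's class method, the group must be passed explicitly, which is why the diff threads `entry_point_group` through the call.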
4 changes: 2 additions & 2 deletions aiida/orm/nodes/node.py
@@ -199,8 +199,8 @@ def validate_storability(self) -> None:
if not is_registered_entry_point(self.__module__, self.__class__.__name__, groups=('aiida.node', 'aiida.data')):
raise exceptions.StoringNotAllowed(
f'class `{self.__module__}:{self.__class__.__name__}` does not have a registered entry point. '
'Consider running `reentry scan`. If the issue persists, check that the corresponding plugin is '
'installed and that the entry point shows up in `verdi plugin list`.'
'Check that the corresponding plugin is installed '
'and that the entry point shows up in `verdi plugin list`.'
)

@classproperty
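A rough sketch of a check in the spirit of `is_registered_entry_point` (hypothetical signature; the real helper is in `aiida.plugins.entry_point`): scan the given groups for an entry point whose value matches the class's `module:name` path.

```python
from importlib import metadata


def is_registered_entry_point(module: str, class_name: str, groups) -> bool:
    """Return True if ``module:class_name`` is exposed by an entry point in ``groups``."""
    target = f'{module}:{class_name}'
    eps = metadata.entry_points()
    for group in groups:
        if hasattr(eps, 'select'):  # Python 3.10+ selectable API
            found = eps.select(group=group)
        else:  # Python 3.8/3.9 dict-style API
            found = eps.get(group, [])
        if any(ep.value == target for ep in found):
            return True
    return False
```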
2 changes: 1 addition & 1 deletion aiida/orm/nodes/process/calculation/calcjob.py
@@ -67,7 +67,7 @@ def tools(self) -> 'CalculationTools':
if self._tools is None:
entry_point_string = self.process_type

if is_valid_entry_point_string(entry_point_string):
if entry_point_string and is_valid_entry_point_string(entry_point_string):
entry_point = get_entry_point_from_string(entry_point_string)

try:
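The added `entry_point_string and ...` guard protects against a `None` process type. A validator in the spirit of `is_valid_entry_point_string` (illustrative, assuming the `'group:name'` format aiida uses for process types) shows why the guard matters if the validator itself does not tolerate `None`:

```python
def is_valid_entry_point_string(value) -> bool:
    """Check that ``value`` looks like 'group:name', e.g. 'aiida.calculations:arithmetic.add'."""
    try:
        group, name = value.split(':')
    except (AttributeError, ValueError):  # None input, or wrong number of colons
        return False
    return bool(group) and bool(name)
```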
1 change: 1 addition & 0 deletions aiida/plugins/__init__.py
@@ -32,6 +32,7 @@
'WorkflowFactory',
'load_entry_point',
'load_entry_point_from_string',
'parse_entry_point',
)

# yapf: enable
