
Renaming nose compatible methods in flavour of regular pytest naming #29035

Merged
merged 1 commit into from Jan 20, 2023

Conversation

Taragolis
Contributor

@Taragolis Taragolis commented Jan 19, 2023

In pytest 7.2, executing nose-compatible methods such as setup()/teardown() was deprecated: pytest-dev/pytest#9886

Even though we use an older version of pytest, it is better to resolve this before we upgrade to the newest pytest.

Most of the methods in this PR are actually fixtures, and I don't think they are affected by this deprecation.

@Taragolis Taragolis added changelog:skip Changes that should be skipped from the changelog (CI, tests, etc..) type:misc/internal Changelog: Misc changes that should appear in change log and removed provider:amazon-aws AWS/Amazon - related issues area:providers area:logging labels Jan 19, 2023
@Taragolis
Contributor Author

And after finding out about the deprecated functionality in pytest's changelog, I finally realised why tests which use setup/teardown methods worked previously 🤣

@uranusjr
Member

These should continue to work after they remove Nose compatibility, so I think this PR is unnecessary.

@Taragolis
Contributor Author

Well, even if this works, the feature is at least undocumented in pytest; only these methods are documented.
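For reference, the pytest-documented xunit-style names are setup_method()/teardown_method() for methods inside a class (and setup_function()/teardown_function() at module level). A minimal sketch of the class-level form — class and test names here are illustrative, not from this PR:

```python
# Sketch of the pytest-documented xunit-style hooks that replace the
# nose-style setup()/teardown() names.

class TestPytestNaming:
    def setup_method(self, method):
        # pytest calls this before every test method in the class
        self.variable = 1

    def teardown_method(self, method):
        # pytest calls this after every test method in the class
        self.variable = 0

    def test_setup_ran(self):
        assert self.variable == 1
```

These names emit no deprecation warning on pytest 7.2+ and keep working after nose support is removed.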

@Taragolis
Contributor Author

Taragolis commented Jan 19, 2023

Yep, I checked and in general this actually works (except the teardown fixture):

import pytest


class TestNoseSetupTearDown:
    def setup(self):
        self.variable = 1

    def teardown(self):
        self.variable = 0

    def test_nose_setup(self):
        assert self.variable == 1


class TestFixturesUsesNoseNaming:
    @pytest.fixture(autouse=True)
    def setup(self):
        self.variable = 1

    @pytest.fixture(autouse=True)
    def teardown(self):
        """This fixture will fail until it is renamed"""
        # It makes no sense to use a fixture this way, and internally we do not
        # use fixtures with this name
        yield
        self.variable = 0

    def test_pytest_fixtures(self):
        assert self.variable == 1

But with additional warnings:

pytest tests/test_nose.py
================================================================================ test session starts ================================================================================
platform darwin -- Python 3.9.9, pytest-7.2.1, pluggy-1.0.0
rootdir: /Users/taragolis/Projects/common/airflow-local, configfile: pytest.ini
plugins: anyio-3.6.2
collected 2 items                                                                                                                                                                   

tests/test_nose.py ..E                                                                                                                                                        [100%]

====================================================================================== ERRORS =======================================================================================
_______________________________________________________ ERROR at teardown of TestFixturesUsesNoseNaming.test_pytest_fixtures ________________________________________________________
Fixture "teardown" called directly. Fixtures are not meant to be called directly,
but are created automatically when test functions request them as parameters.
See https://docs.pytest.org/en/stable/explanation/fixtures.html for more information about fixtures, and
https://docs.pytest.org/en/stable/deprecations.html#calling-fixtures-directly about how to update your code.
================================================================================= warnings summary ==================================================================================
tests/test_nose.py::TestNoseSetupTearDown::test_nose_setup
  /Users/taragolis/Projects/common/airflow-local/.nox/develop/lib/python3.9/site-packages/_pytest/fixtures.py:901: PytestRemovedIn8Warning: Support for nose tests is deprecated and will be removed in a future release.
  tests/test_nose.py::TestNoseSetupTearDown::test_nose_setup is using nose-specific method: `setup(self)`
  To remove this warning, rename it to `setup_method(self)`
  See docs: https://docs.pytest.org/en/stable/deprecations.html#support-for-tests-written-for-nose
    fixture_result = next(generator)

tests/test_nose.py::TestNoseSetupTearDown::test_nose_setup
  /Users/taragolis/Projects/common/airflow-local/.nox/develop/lib/python3.9/site-packages/_pytest/fixtures.py:917: PytestRemovedIn8Warning: Support for nose tests is deprecated and will be removed in a future release.
  tests/test_nose.py::TestNoseSetupTearDown::test_nose_setup is using nose-specific method: `teardown(self)`
  To remove this warning, rename it to `teardown_method(self)`
  See docs: https://docs.pytest.org/en/stable/deprecations.html#support-for-tests-written-for-nose
    next(it)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================== short test summary info ==============================================================================
ERROR tests/test_nose.py::TestFixturesUsesNoseNaming::test_pytest_fixtures
====================================================================== 2 passed, 2 warnings, 1 error in 0.03s =======================================================================

@Taragolis
Contributor Author

@uranusjr In addition, we could add -p no:nose to addopts in pytest.ini, which disables the nose-style autouse of setup() and teardown() methods. WDYT?
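For reference, disabling the built-in plugin is a one-line change to addopts — a sketch, assuming a minimal pytest.ini (Airflow's actual file carries more options):

```ini
[pytest]
addopts = -p no:nose
```

With the nose plugin disabled, plain setup()/teardown() methods are simply never called, so any test that still relies on them fails loudly instead of silently depending on deprecated compatibility code.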

@Taragolis Taragolis added the full tests needed We need to run full set of tests for this PR to merge label Jan 19, 2023
@potiuk
Member

potiuk commented Jan 19, 2023

Yeah. I'd like to unify the tests and make our tests pytest-only. Great that we can disable the built-in plugin to verify it (and to prevent it from accidentally happening in the future).

@potiuk
Member

potiuk commented Jan 19, 2023

(@uranusjr ?)

@Taragolis
Contributor Author

Well... I thought that the k8s tests failed due to issues with GitHub Actions; however, they also use setup methods, and we use pytest.ini across all tests.

Let's migrate them first.

@potiuk
Member

potiuk commented Jan 20, 2023

Well... I thought that the k8s tests failed due to issues with GitHub Actions; however, they also use setup methods, and we use pytest.ini across all tests.

Let's migrate them first.

Life is like a box of chocolates, you never know which one you get. You got the Helm test this time :)

@Taragolis
Contributor Author

Life is like a box of chocolates, you never know which one you get. You got the Helm test this time :)

I have postponed migrating these tests to pytest as much as possible, so this looks like the right time to do it. I just need to check locally that everything works as expected, and I will create a PR for that.

Also, we need to use changes in pytest.ini as a trigger for all tests, because changes to it might affect all tests in the selected directories:

  • test
  • docker_tests
  • kubernetes_tests
  • dev/breeze/tests

@potiuk, is it the right place?

FileGroupForCi.ENVIRONMENT_FILES: [
    r"^.github/workflows",
    r"^dev/breeze",
    r"^dev/.*\.py$",
    r"^Dockerfile",
    r"^scripts",
    r"^setup.py",
    r"^setup.cfg",
    r"^generated/provider_dependencies.json$",
],

@potiuk
Member

potiuk commented Jan 20, 2023

@potiuk, is it the right place?

Yep. But you can also add the "full tests needed" label on a PR to run the complete set of tests.

@Taragolis
Contributor Author

@potiuk, is it the right place?

Yep. But you can also add the "full tests needed" label on a PR to run the complete set of tests.

And it is very easy to forget to add this label, even if the author of the PR has access to do that. ¯\_(ツ)_/¯
This file does not change frequently.

Or we could move the content from pytest.ini to pyproject.toml
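For reference, pytest reads configuration from pyproject.toml under the [tool.pytest.ini_options] table (supported since pytest 6.0). A sketch with illustrative values, not the actual Airflow settings:

```toml
[tool.pytest.ini_options]
addopts = "-p no:nose"
testpaths = ["tests"]
```

Since pyproject.toml changes other tooling too, it would likely already be covered by broader CI triggers.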

@Taragolis
Contributor Author

Just a single failure that is not relevant to this PR. Somehow distributed could bypass the caplog filter: https://github.com/apache/airflow/actions/runs/3970014684/jobs/6805354398#step:6:6520

        with caplog.at_level(logging.ERROR, logger="airflow.jobs.backfill_job.BackfillJob"):
            caplog.clear()
            job.run()
>           assert "Backfill cannot be created for DagRun" in caplog.messages[0]
E           AssertionError: assert 'Backfill cannot be created for DagRun' in 'Failed to reconnect to scheduler after 30.00 seconds, closing client'

...

------------------------------ Captured log call -------------------------------
INFO     airflow.models.dag:dag.py:2692 Sync 1 DAGs
INFO     airflow.models.dag:dag.py:2713 Creating ORM DAG for test_backfill_skip_active_scheduled_dagrun
INFO     airflow.models.dag:dag.py:3439 Setting next_dagrun for test_backfill_skip_active_scheduled_dagrun to 2016-01-01T00:00:00+00:00, run_after=2016-01-02T00:00:00+00:00
ERROR    distributed.client:client.py:1312 Failed to reconnect to scheduler after 30.00 seconds, closing client
ERROR    airflow.jobs.backfill_job.BackfillJob:backfill_job.py:830 Backfill cannot be created for DagRun scheduled__2016-01-01T00:00:00+00:00 in 2016-01-01T00:00:00, as there's already scheduled in a RUNNING state.
ERROR    airflow.jobs.backfill_job.BackfillJob:backfill_job.py:837 Changing DagRun into BACKFILL would cause scheduler to lose track of executing tasks. Not changing DagRun type into BACKFILL, and trying insert another DagRun into database would cause database constraint violation for dag_id + execution_date combination. Please adjust backfill dates or wait for this DagRun to finish.

@potiuk
Member

potiuk commented Jan 20, 2023

Just a single failure that is not relevant to this PR. Somehow distributed could bypass the caplog filter: https://github.com/apache/airflow/actions/runs/3970014684/jobs/6805354398#step:6:6520

No - not skipped - simply bad timing / a flaky test:

Failed to reconnect to scheduler after 30.00 seconds, closing client

@potiuk potiuk merged commit bab57f1 into apache:main Jan 20, 2023
@Taragolis
Contributor Author

Taragolis commented Jan 20, 2023

No - not skipped - simply bad timing / a flaky test:

Failed to reconnect to scheduler after 30.00 seconds, closing client

Technically we should not capture this record. We capture logs and check them inside with caplog.at_level(...), and we clear previously captured logs.

So it could be:

  1. A problem with the caplog fixture; I will check whether the same issue has been reported in pytest
  2. It is possible that distributed changes some logging configuration during execution
  3. Some part of Airflow logging
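The leakage can be reproduced with plain logging: caplog's capture handler effectively sits at the root of the logger hierarchy, and caplog.at_level(..., logger="...") only adjusts that one logger's level, so records emitted by any other logger still propagate to the root and land in caplog.messages. A minimal sketch — logger names are taken from the output above, and ListHandler is a stand-in for caplog, not pytest internals:

```python
import logging


class ListHandler(logging.Handler):
    """Stand-in for caplog's capture handler: records every message it receives."""

    def __init__(self):
        super().__init__()
        self.messages = []

    def emit(self, record):
        self.messages.append(record.getMessage())


handler = ListHandler()
root = logging.getLogger()
root.addHandler(handler)
root.setLevel(logging.ERROR)

# Both loggers propagate to the root logger by default...
logging.getLogger("distributed.client").error(
    "Failed to reconnect to scheduler after 30.00 seconds, closing client")
logging.getLogger("airflow.jobs.backfill_job.BackfillJob").error(
    "Backfill cannot be created for DagRun ...")

# ...so messages[0] can be the unrelated distributed.client record, which is
# exactly what made the assertion on caplog.messages[0] flaky.
print(handler.messages[0])
```

A less order-dependent assertion, e.g. any("Backfill cannot be created" in m for m in caplog.messages), would be robust against such interleaved records.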

@potiuk
Member

potiuk commented Jan 20, 2023

Ah - isn't that the "airflow.jobs.backfill_job.BackfillJob" generated log? I thought the backfill job simply sends the log entry when reconnecting? Because if not, then maybe for some reason the logger is hijacked

@Taragolis
Contributor Author

Ah - isn't that the "airflow.jobs.backfill_job.BackfillJob" generated log ?

Yeah, it is the distributed.client:client.py logger that creates this message:

ERROR    distributed.client:client.py:1312 Failed to reconnect to scheduler after 30.00 seconds, closing client

Let's have a look whether it happens more often, and then do something about it. I don't know the probability - maybe 1:1000 or maybe 1:1M.
