
What's missing for feature parity with standard Odoo testing machinery? #67

yajo opened this issue May 9, 2024 · 2 comments

yajo commented May 9, 2024

This is more a question than an issue, but it could become a feature request if feature parity is feasible.

AFAICS, years ago pytest-odoo had a gap to bridge: running tests without installing or updating addons.

pytest-odoo/README.rst, lines 7 to 13 in 9315331:

> Also allowing to run tests without updating given module.
> Odoo's `--test-enable` machinery and pytest-odoo do not cover exactly the same scope. The Odoo's test runner runs the tests during the upgrades of the addons, that's why they need the "at install" and "post install" markers.
> Running tests during upgrades of addons is waaay too slow to work efficiently in a TDD mode, that's where pytest-odoo shines.
> Consider that all the tests are running `post-install` with pytest-odoo, as you must run the upgrade of the addon before (but only when needed vs each run).

Those sentences are no longer true. You can achieve the same with pure Odoo:

  • Run it with --test-file ./addon/tests/test_this.py and it won't install or update, only test.
  • Run it with --test-tags .test_name and only the test with that name will run.

So, we could say that the main goal of pytest-odoo is no longer an issue.

However, it still has the benefits of having the whole pytest ecosystem:

  • It can generate coverage reports easily.
  • It produces nicer output. Extremely nicer.
  • It can produce XML reports.
  • Nice IDE integration for test detection, execution and debugging.
  • Probably more plugins that I'm forgetting right now.

I pushed odoo/odoo#151728 some time ago to address these issues directly in Odoo, but it's clearly doomed for oblivion. So I'd be more than happy to entirely replace odoo testing machinery with pytest-odoo. But then...

> Pytest-odoo can be considered a development tool, but not the tool that should replace entirely `--test-enable` in a CI.

Sadness.

People out there seem to ignore this warning. I'd like to ignore it too if possible. But I love my CI too much to feed it with unhealthy food.

So... what's missing to be able to remove that warning from the README? What's exactly the difference between pytest-odoo and odoo --test-enable that makes it not suitable for CI workloads?

IIUC these 2 points:

  1. Lack of subtest support in pytest. This was true when pytest-odoo was born, but nowadays pytest-subtests should cover this. Does it work with Odoo? Has anybody tested it?
  2. Inability to install or update addons, so they are not tested under their canonical conditions.
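On point 1: a minimal sketch of the pattern, using plain unittest's self.subTest, which the pytest-subtests plugin reports as individual results when run under pytest. The test class and data below are hypothetical, not taken from any Odoo addon; whether this composes cleanly with pytest-odoo's database setup is exactly the open question.

```python
import unittest


class TestPartnerNames(unittest.TestCase):
    """Hypothetical example: Odoo tests often loop over many records.

    Without subtest support, one failing iteration aborts the whole
    test and hides the rest. With self.subTest (reported per-iteration
    by the pytest-subtests plugin), every iteration is checked and
    reported separately.
    """

    def test_names_are_title_case(self):
        # Stand-in data for records that would come from the ORM.
        for name in ["Alice", "Bob", "Carol"]:
            with self.subTest(name=name):
                self.assertTrue(name.istitle())
```

Running this under `pytest` with pytest-subtests installed should show each sub-test outcome on its own, similarly to Odoo's own runner.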

Is it possible to reach feature parity and then reliably use pytest-odoo on CI workloads, entirely replacing (or wrapping) odoo?

Thanks!

@moduon MT-1075

@ap-wtioit

> People out there odoo/odoo#151728 (comment). I'd like to ignore it too if possible. But I love my CI too much to feed it with unhealthy food.

Not so nice mention ;-)

Regarding ignoring the warning: we have some safeguards in place to deal with the issues, and tests run with pytest-odoo are as stable as the tests on the Odoo runbot / the default Odoo tests. I feel the issues caused by the not-so-great demo data in Odoo are much worse than the issues caused by pytest-odoo.

Our way of running the tests without pytest-odoo (also needed when running tests before the post_install "fixes"):

  • odoo -i module_to_test --stop-after-init
  • odoo -u module_to_test --test-enable --stop-after-init (also needs some fixes for handling work configs and other odoo config parameters incompatible with tests)

The second step can easily be replaced with pytest-odoo. And pytest-odoo lets us select a test_something.py in the PyCharm IDE and pick "run tests" from the context menu (with some adaptations to the args, needed/done by script when using doodba). We can also fine-tune which tests to run in a PR/MR based on the changed files.

Most issues I had with pytest-odoo were because tests written for Odoo are not compatible with pytest itself (not with pytest-odoo specifically):

  • abstract methods
  • methods named test_* that are not actually tests (mostly workarounds for the lack of proper parameterized tests)
  • imported extended test classes being run twice if you do not del ImportedTestClass at the end of the module

yajo commented May 24, 2024

> Not so nice mention ;-)

Sorry I didn't mean to be rude 😅

Thanks a lot for your comments.

> Our way of running the tests without pytest-odoo (also needed when running tests before the post_install "fixes"):
>
>   • odoo -i module_to_test --stop-after-init
>   • odoo -u module_to_test --test-enable --stop-after-init (also needs some fixes for handling work configs and other odoo config parameters incompatible with tests)

Well, in my case I wouldn't run tests like that. Tests tagged at_install are designed to run while the modules are being installed, and they don't see the full registry, unlike tests tagged post_install, which run at the end with the full registry loaded.

IIUC you're treating all tests as if they were post_install, but that has always failed somewhere for me.

I prefer to do all in one shot:

odoo --stop-after-init -i mod1,mod2 --test-enable --test-tags /mod1,/mod2

That will install all the dependencies and run all at_install tests for mod1,mod2 during the install phase (as expected) and their post_install tests after loading the full registry (as expected). This is faster and adds zero surprises.

I wish there were some way to do that with pytest-odoo, but in reality all I need for it is the junit reports, so maybe https://github.com/orgs/OCA/discussions/167 is a better alternative.
