Single failed/skipped subTest causes whole test to fail/skip immediately #4027

Closed
rplevka opened this issue Nov 24, 2016 · 1 comment

Labels
Bug This is an issue with the robottelo framework

rplevka (Member) commented Nov 24, 2016

We have a problem with subTest usage.
In a test such as this one we use subTest to iterate over multiple API endpoints, and we want to test them all.
However, because a single failed/skipped subTest causes the whole test to fail/skip immediately (the behavior in the title), we exit the test right after the first subtest failure/skip and never get to the rest.
This leads to seemingly random failures/skips across builds: every build feeds the subTests a randomly ordered set of values, so the problematic values sometimes sit at the beginning, sometimes somewhere in the middle, and so on.

The bigger problem is that we simply skip the rest of the subtests when such a scenario takes place.

Perhaps we might implement some sort of subTest decorator which would collect the assertion exceptions in a list without raising them immediately; the final test assertion would then be done on the length of this list,
e.g.:

if len(assertionErrors) != 0:
    e = AssertionError()
    for error in assertionErrors:
        e.args += (error,)
    raise e
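
A minimal sketch of what such a collector could look like, written as a context manager rather than a decorator; the SubTestCollector name and its methods are hypothetical, not existing robottelo API:

from contextlib import contextmanager

class SubTestCollector(object):
    """Collects AssertionErrors from sub-checks instead of raising them."""

    def __init__(self):
        self.errors = []

    @contextmanager
    def collect(self, **params):
        # Run one sub-check; on assertion failure, record it together
        # with the subtest parameters and keep going.
        try:
            yield
        except AssertionError as err:
            self.errors.append((params, err))

    def assert_no_errors(self):
        # Final assertion on the collected list, as proposed above.
        if self.errors:
            raise AssertionError(
                '%d sub-check(s) failed: %r' % (len(self.errors), self.errors))

Used inside a test it would look like:

    def test_even(self):
        collector = SubTestCollector()
        for i in range(0, 6):
            with collector.collect(i=i):
                self.assertEqual(i % 2, 0)
        collector.assert_no_errors()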

external references: pytest-dev/pytest#1367

If my problem description is not clear, this example should demonstrate the issue:

from unittest2 import TestCase

class NumbersTest(TestCase):
    def test_even(self):
        """
        Test that numbers between 0 and 5 are all even.
        """
        for i in range(0, 6): 
            with self.subTest(i=i):
                if i == 3:
                    self.skipTest("Skip 3.")
                self.assertEqual(i % 2, 0)
$ pytest unittest_test.py
======================= test session starts =======================
platform linux2 -- Python 2.7.11, pytest-3.0.4, py-1.4.31, pluggy-0.4.0
rootdir: /home/rplevka/work/playground, inifile: 
plugins: xdist-1.14
collected 1 items 

unittest_test.py F

============================ FAILURES =============================
______________________ NumbersTest.test_even ______________________

self = <unittest_test.NumbersTest testMethod=test_even>

    def test_even(self):
        """
            Test that numbers between 0 and 5 are all even.
            """
        for i in range(0, 6):
            with self.subTest(i=i):
                if i == 3:
                    self.skipTest("Skip 3.")
>               self.assertEqual(i % 2, 0)
E               AssertionError: 1 != 0

unittest_test.py:12: AssertionError
==================== 1 failed in 0.02 seconds =====================
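
For comparison, Python 3's stdlib unittest runner does honor subTest and keeps reporting past the first failed/skipped parameter (swap the unittest2 import for unittest); output elided:

$ python3 -m unittest unittest_test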
@rplevka rplevka added the Bug This is an issue with the robottelo framework label Nov 24, 2016
lpramuk (Contributor) commented Nov 25, 2016

Sometimes skipping after the first failure is intended, typically when we iterate over similar values.
But sometimes it is unwanted, namely when we iterate over very different values.
For example, a docker resource:

  • unix socket
  • local tcp
  • remote tcp

Once one type of docker CR fails, the other types are not tested at all!
We currently test these in random order. A better approach is to split such cases into multiple TestCase classes with similar tests (then we wouldn't mind failing early in a subtest); see the sketch below.
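
A minimal sketch of that split, assuming a shared mixin; the class names and resource URLs are illustrative only, not actual robottelo code:

from unittest2 import TestCase

class _DockerCRTestsMixin(object):
    # Shared test logic; each subclass pins one resource URL, so a
    # failure for one URL cannot skip the tests for the others.
    resource_url = None

    def test_resource_url_set(self):
        self.assertIsNotNone(self.resource_url)
        # ... exercise the docker compute resource at self.resource_url ...

class DockerUnixSocketTestCase(_DockerCRTestsMixin, TestCase):
    resource_url = 'unix:///var/run/docker.sock'

class DockerLocalTcpTestCase(_DockerCRTestsMixin, TestCase):
    resource_url = 'tcp://localhost:2375'

class DockerRemoteTcpTestCase(_DockerCRTestsMixin, TestCase):
    resource_url = 'tcp://docker.example.com:2375'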

@rochacbruno rochacbruno added the TestFailure Issues and PRs related to a test failing in automation label Jan 13, 2017
@rplevka rplevka closed this as completed Nov 1, 2018
@rplevka rplevka removed the TestFailure Issues and PRs related to a test failing in automation label Nov 1, 2018