
Generate tests parametrizing many cases #3070

Closed
jbrockmendel opened this issue Dec 29, 2017 · 3 comments
Labels
type: question general question, might be closed after 2 weeks of inactivity

Comments

@jbrockmendel

Over in dateutil we've been slowly transitioning towards using pytest. In my first attempt to use pytest.mark.parametrize I generated an absurd number of combinations by specifying half a dozen parameters and iterating over large ranges of possible dates. Millions of tests were collected, RAM use exploded during collection, and I had to kill the process. A painless learning experience, as these things go.

Is there a supported way to build these tests generator-style instead of list-style? Something analogous to nose's yield-based tests?
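For illustration (a hypothetical sketch; the parameter names and ranges are invented, not from the report), stacked `pytest.mark.parametrize` decorators produce the Cartesian product of their argument lists, and pytest materializes every combination as a test item at collection time:

```python
import pytest

# Illustrative parameter ranges (hypothetical, for the size estimate only).
YEARS = list(range(1990, 2020))    # 30 values
MONTHS = list(range(1, 13))        # 12 values
DAYS = list(range(1, 29))          # 28 values
OFFSETS = list(range(-12, 13))     # 25 values

# Stacked decorators multiply: pytest collects one item per combination,
# so this single function yields 30 * 12 * 28 * 25 = 252,000 items
# before a single test runs.
@pytest.mark.parametrize("offset", OFFSETS)
@pytest.mark.parametrize("day", DAYS)
@pytest.mark.parametrize("month", MONTHS)
@pytest.mark.parametrize("year", YEARS)
def test_build_date(year, month, day, offset):
    assert 1 <= day <= 28

n_cases = len(YEARS) * len(MONTHS) * len(DAYS) * len(OFFSETS)
```

With two more parameters of similar size the product reaches the millions described above, which is why collection-time memory use blows up.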

@pytestbot
Contributor

GitMate.io thinks the contributor most likely able to help you is @RonnyPfannschmidt.

@RonnyPfannschmidt
Member

For generating and verifying millions of examples, it's best to use Hypothesis and its strategies; pytest would have to collect all of those tests with all their metadata.

Collection always happens before test running.
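A minimal sketch of the Hypothesis approach (assuming the `hypothesis` package is installed; the round-trip property is illustrative, not from the thread): instead of enumerating every date up front, a strategy draws examples lazily at run time, so pytest collects only a single test item:

```python
import datetime

from hypothesis import given, strategies as st


@given(st.dates(min_value=datetime.date(1990, 1, 1),
                max_value=datetime.date(2020, 12, 31)))
def test_isoformat_roundtrip(d):
    # Hypothesis draws dates at run time; nothing is materialized
    # during collection, and failing examples are shrunk automatically.
    assert datetime.date.fromisoformat(d.isoformat()) == d
```

Besides avoiding the collection blow-up, Hypothesis shrinks any failing example to a minimal reproducer, which exhaustive parametrization cannot do.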

@RonnyPfannschmidt RonnyPfannschmidt added the type: question general question, might be closed after 2 weeks of inactivity label Dec 31, 2017
@nicoddemus
Member

I think the short answer for now is that pytest doesn't support generator-style tests, because the current model requires that collection happen before the run phase. I don't see this changing: it would be a major shift in design and would probably mean major breakage.

I suppose an alternative would be to support subtest-style tests in pytest (#1367), where you would have a single test item but multiple reports. This is probably feasible because each test item already produces multiple reports, and some plugins (pytest-repeat, for example) take this further and generate multiple "call" reports per item anyway.
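The subtest idea can be illustrated with the stdlib `unittest` analogue (a sketch of the concept, not the pytest proposal itself): one collected test, with each sub-case reported independently:

```python
import unittest


class DateCases(unittest.TestCase):
    def test_many_days(self):
        # A single test item; each sub-case still gets its own
        # pass/fail report instead of aborting on the first failure.
        for day in range(1, 29):
            with self.subTest(day=day):
                self.assertTrue(1 <= day <= 28)


# One test is collected and run, regardless of how many sub-cases execute.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DateCases)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

This is the shape the issue points at: the number of reports grows with the data, but the number of collected items stays constant.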

I'm closing this for now but feel free to follow up with any questions you may have.
