
Support Python 3.4's subTest feature #5

Open
ibrahima opened this issue Feb 15, 2018 · 2 comments

Comments

@ibrahima
Contributor

https://docs.python.org/3/library/unittest.html#distinguishing-test-iterations-using-subtests

This makes it easier to write parametrized tests.

@ibrahima
Contributor Author

Hi @lrperlmu, thanks for the detailed request in #22! If you'd like, you could elaborate on your expected behavior here as well.

Since filing that issue we have added a partial credit feature, which might serve as a workaround if you want to merge multiple subtests into one test with partial credit: https://github.com/gradescope/gradescope-utils/blob/master/gradescope_utils/autograder_utils/decorators.py#L103

I haven't dug into this much, so I'm not sure how the unittest framework represents subtests internally, i.e. whether we get a separate passing or failing result for each subtest. From the documentation it looks like it does, so hopefully it would be reasonable to add such a feature.

It looks like we'd need to add an addSubTest method to the JSONTestResult that would do the right thing and produce multiple outputs for each subtest, or at least store the appropriate data on the TestResult object so that the JSONTestRunner can produce the desired output at the end.

This isn't something that is on our current roadmap, but if anyone is able to submit a PR then we can consider adding this feature!

@lrperlmu

@ibrahima thanks for your reply! Pasting the relevant info from the duplicate #22 here:

Is your feature request related to a problem? Please describe.
gradescope-utils does not produce output for subtests
(for definition of subtest, see https://docs.python.org/3/library/unittest.html#distinguishing-test-iterations-using-subtests)

Describe the solution you'd like
Output a separate json object for each subtest, so that each subtest appears in the json output just like each test.

To elaborate, I have some test code that needs to run sequentially, i.e. it would not work to split it into separate tests. But I want to make several assertions in the middle of that code block, and I want the rest of the test code to continue executing regardless of the outcome of each assertion. A regular test exits at the first failed assertion, so the code after it never runs. Subtests support what I'm asking for: each assertion can live in its own subtest, and the test keeps running if one subtest fails.

I don't have the capacity right now to make a PR, but it's on my radar in case my availability opens up. Good to know this project is open to outside contributions :)
