Nose doesn't honor the decorator 'unittest.expectedFailure' #33

Open
jpellerin opened this issue Dec 14, 2011 · 8 comments
@jpellerin
Member

What steps will reproduce the problem?

  1. Create a unit test method and decorate it with unittest.expectedFailure
     (introduced in Python 2.7).
  2. Run the above test using 'nosetests <test_module.py>'.
  3. The warning 'RuntimeWarning: TestResult has no addExpectedFailure
     method, reporting as passes' is printed to the console.

What is the expected output? What do you see instead?

When a test module containing tests decorated with unittest.expectedFailure
is run with 'python <test_module>.py', no warning is printed and the test is
reported as an expected failure. Running the same module with nose (i.e.
'nosetests <test_module.py>') prints the RuntimeWarning quoted above and
reports the test as an ordinary pass.

What version of the product are you using? On what operating system?

$ nosetests -V
nosetests-script.py version 1.0.0
OS: Windows XP SP3
Python version : 2.7

Please provide any additional information below.

Here are the detailed steps to reproduce the problem.

Consider a dummy class
%cat report.py

class Hello(object):

    def value(self):
        return 100

And the associated unit test module
(both files reside in the same directory)

%cat report_tests.py

import unittest
from report import Hello

class TestReport(unittest.TestCase):

    @unittest.expectedFailure
    def test_value(self):
        h = Hello()
        self.assertEqual(10, h.value())

if __name__ == '__main__':
    unittest.main()

%nosetests -V
nosetests-script.py version 1.0.0

% nosetests report_tests.py
C:\Python27\lib\unittest\case.py:327: RuntimeWarning: TestResult has no addExpectedFailure method, reporting as passes
RuntimeWarning)

.

Ran 1 test in 0.000s

OK

%python -V
Python 2.7

%python report_tests.py

x

Ran 1 test in 0.000s

OK (expected failures=1)

As the above steps show, the test run doesn't emit any warning when launched
with python, but it prints the warning (and reports the test as a plain pass
rather than an expected failure) when launched with nosetests.
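
For context: in Python 2.7, unittest.TestCase.run reports an expected failure through result.addExpectedFailure(test, err); if the result object has no such method, it emits the RuntimeWarning shown above and records the test with addSuccess instead. The stdlib-only sketch below illustrates those hooks; the RecordingResult name is made up for the illustration.

import unittest


class RecordingResult(unittest.TestResult):
    """Result object that implements the Python 2.7 expected-failure hooks."""

    def addExpectedFailure(self, test, err):
        # Called by TestCase.run instead of addFailure when a test decorated
        # with unittest.expectedFailure fails, as expected.
        super(RecordingResult, self).addExpectedFailure(test, err)
        print('expected failure: %s' % test)

    def addUnexpectedSuccess(self, test):
        # Called when a test decorated with unittest.expectedFailure passes.
        super(RecordingResult, self).addUnexpectedSuccess(test)
        print('unexpected success: %s' % test)


class Demo(unittest.TestCase):
    @unittest.expectedFailure
    def test_value(self):
        self.assertEqual(10, 100)


if __name__ == '__main__':
    suite = unittest.TestLoader().loadTestsFromTestCase(Demo)
    suite.run(RecordingResult())
    # With the hooks present there is no RuntimeWarning; a result object
    # lacking them triggers the fallback warning and a plain pass.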

Google Code Info:
Issue #: 428
Author: sateeshp...@gmail.com
Created On: 2011-06-14T08:42:23.000Z
Closed On:

@ghost assigned jpellerin Dec 14, 2011
@jpellerin
Member Author

Here is a patch that adds a plugin along the lines of nose.plugins.skip for handling Python 2.7's expected failures. It needs some work and unit tests, but it's functional.
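
(Not the attached patch, which is not reproduced here; only a rough sketch of the shape such a plugin could take. It uses nose's prepareTestResult plugin hook to give the result object the addExpectedFailure/addUnexpectedSuccess methods that Python 2.7's unittest looks for. The plugin name is invented, the bookkeeping is minimal, and the interaction with nose's result proxying is unverified.)

from nose.plugins import Plugin


class ExpectedFailurePlugin(Plugin):
    # Hypothetical name; with nose's default option handling this would be
    # enabled via --with-expected-failure.
    name = 'expected-failure'

    def prepareTestResult(self, result):
        # Only patch result objects that predate the Python 2.7 hooks.
        if not hasattr(result, 'addExpectedFailure'):
            result.expectedFailures = []

            def addExpectedFailure(test, err):
                # Record the expected failure so unittest does not fall back
                # to the RuntimeWarning and count the test as a plain pass.
                result.expectedFailures.append((test, err))

            result.addExpectedFailure = addExpectedFailure

        if not hasattr(result, 'addUnexpectedSuccess'):
            result.unexpectedSuccesses = []

            def addUnexpectedSuccess(test):
                result.unexpectedSuccesses.append(test)

            result.addUnexpectedSuccess = addUnexpectedSuccess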

Google Code Info:
Author: robert.kern@gmail.com
Created On: 2011-11-10T23:17:58.000Z

@startling

Bump. This issue is pretty annoying. If there's a patch, what else needs to be done? How can I help out?

@jpellerin
Member Author

The comment on the patch says it needs tests and "some work", so I haven't reviewed it yet. Updating the patch would be one way to go, or you could try out nose2, which supports all unittest2 features (except the load_tests protocol, which I haven't gotten to yet).

@onip

onip commented Jul 23, 2013

this is still a problem with nose 1.3.0

csdev added a commit to csdev/nose that referenced this issue Jan 3, 2015
csdev added a commit to csdev/nose that referenced this issue Jan 5, 2015
@lukeyeager

Still an issue with Nose 1.3.4

@jszakmeister
Contributor

Patches welcome.

@pradyunsg

Anyone interested in merging in the patch linked (csdev's fork)?

@jszakmeister
Contributor

See #881. The contributor stalled on it, so the answer is likely no.
