Conversation
|
@blink1073 WOW. Looks like you figured out the checklists AND kicked the checklist's ass! =) |
|
Help @scikit-image/core, I've got the whole thing done save for one error. The test passes when I run it locally, but fails on Travis:

```
======================================================================
FAIL: skimage.filters.rank.tests.test_rank.test_all
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/venv/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/travis/build/blink1073/scikit-image/skimage/filters/rank/tests/test_rank.py", line 18, in test_all
    check_all()
  File "/home/travis/build/blink1073/scikit-image/skimage/filters/rank/tests/test_rank.py", line 27, in check_all
    rank.autolevel(image, selem))
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 256, in assert_equal
    return assert_array_equal(actual, desired, err_msg, verbose)
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 707, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 636, in assert_array_compare
    raise AssertionError(msg)
AssertionError:
Arrays are not equal
(mismatch 92.16%)
 x: array([[  0, 255,  37, 171,   8, 255,   0, 218, 255,  98, 255,   0,  25,
        249,   0,  49,   0, 255, 225, 165, 255, 195, 126, 255,   0],
       [255,   0, 241, 126, 168,  42, 255,  80, 148,   0, 196, 253,  31,...
 y: array([[  0, 201,   0, 216,  55,   0, 193,  36, 255,   0, 255,   0, 179,
          0, 255, 110,   0, 255,   4, 199,   0, 216,  70, 222, 255],
       [220, 255,  42, 200, 255,  99,  20,  72, 224,  11,  66, 207, 130,...
----------------------------------------------------------------------
```
|
Well, it's fixed, but it's a head scratcher. There's a call to |
|
This is ready for review. I had one failed build in the last commit due to a network timeout, restarted with fingers crossed! |
skimage/_shared/testing.py
This is a bit of a misdirection... I would prefer `from skimage._shared._warnings import all_warnings`. Does that make sense?
And would not `with warnings.catch_warnings():` work? I think it reads better.
Final thought: I don't know whether it's feasible but it would be nice to specifically discard expected warnings, not all warnings... Sorry for putting that out there... =P Thoughts?
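For reference, a minimal stdlib sketch of what `warnings.catch_warnings()` gives you (this is plain Python, not skimage's `all_warnings`; `simplefilter('always')` is needed because the once-per-location deduplication would otherwise swallow repeated warnings):

```python
import warnings

# Record the warnings raised inside the block; 'always' disables the
# once-per-location deduplication so every warning is actually delivered.
with warnings.catch_warnings(record=True) as w:
    warnings.simplefilter('always')
    warnings.warn('precision loss')

print(len(w), w[0].message)  # -> 1 precision loss
```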
|
@blink1073 A phenomenal amount of work! Sorry to pile more on... There are a few nitpicky things, a few clarifications that I'd like to have in the code for future us, but mostly, I'm not entirely convinced that the approach of hiding all warnings in a block of code is right (rather than just expected warnings). For example, in try-except clauses, we always specify the type of error we want to catch, so that if something truly unexpected happens, you don't suppress it silently. Would it be possible, for example, to match the message in the warning, adding the capability to write:

```python
with all_warnings(matching='precision'):
    img_as_ubyte(float_image)
```

I'm hoping this won't be too hard to fix since you've been so diligent with commenting the type of error in your context calls... What do you think? |
|
@jni, you're right, it did feel like I was writing a bare `except`. |
|
@jni what do you think of this:

```python
@contextmanager
def catch_warning(matching):
    """Context for use in testing to catch a known warning matching a regex.

    Parameters
    ----------
    matching : str or compiled regex
        Regex for the desired warning to catch.

    Examples
    --------
    >>> from skimage import data, img_as_ubyte, img_as_float
    >>> with catch_warning('precision loss'):
    ...     d = img_as_ubyte(img_as_float(data.coins()))

    Notes
    -----
    Uses `all_warnings` to ensure all warnings are raised.
    Upon exiting, it checks the recorded warnings for the desired matching
    string.
    """
    with all_warnings() as w:
        yield w
    matched = False
    for warn in w:
        if re.search(matching, str(warn.message)):
            matched = True
    if not matched:
        raise ValueError('No warning raised matching: "%s"' % matching)
```
|
@jni improved version:

```python
@contextmanager
def catch_warning(matching):
    """Context for use in testing to catch a known warning matching a regex.

    Parameters
    ----------
    matching : str or compiled regex
        Regex for the desired warning to catch.

    Examples
    --------
    >>> from skimage import data, img_as_ubyte, img_as_float
    >>> with catch_warning('precision loss'):
    ...     d = img_as_ubyte(img_as_float(data.coins()))

    Notes
    -----
    Uses `all_warnings` to ensure all warnings are raised.
    Upon exiting, it checks the recorded warnings for the desired matching
    string. Raises a ValueError if the match was not found or an unexpected
    warning is found.
    """
    with all_warnings() as w:
        yield w
    matched = False
    for warn in w:
        if re.search(matching, str(warn.message)):
            matched = True
        else:
            raise ValueError('Unexpected warning: %s' % str(warn.message))
    if not matched:
        raise ValueError('No warning raised matching: "%s"' % matching)
```
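For readers without skimage on hand, here is a self-contained sketch of the same pattern, with the stdlib `warnings.catch_warnings(record=True)` standing in for `all_warnings` (which, per its docstring, does extra work to ensure all warnings are raised):

```python
import re
import warnings
from contextlib import contextmanager

@contextmanager
def catch_warning(matching):
    """Fail unless a warning matching the regex `matching` is raised,
    and fail on any recorded warning that does not match it."""
    # Stand-in for skimage's all_warnings: record everything raised inside.
    with warnings.catch_warnings(record=True) as w:
        warnings.simplefilter('always')
        yield w
    matched = False
    for warn in w:
        if re.search(matching, str(warn.message)):
            matched = True
        else:
            raise ValueError('Unexpected warning: %s' % str(warn.message))
    if not matched:
        raise ValueError('No warning raised matching: "%s"' % matching)

# The expected warning passes through silently:
with catch_warning('precision'):
    warnings.warn('precision loss')
```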
|
Well, that did not work; I'm not sure why. |
|
cc/ @tacaswell, any idea why we'd get this traceback after a test? I made sure we are calling

```
Traceback (most recent call last):
  File "/home/travis/venv/lib/python2.6/site-packages/matplotlib/backends/backend_qt5.py", line 427, in idle_draw
    self.draw()
  File "/home/travis/venv/lib/python2.6/site-packages/matplotlib/backends/backend_qt5agg.py", line 149, in draw
    self._priv_update()
RuntimeError: Internal C++ object (FigureCanvas) already deleted.
```
|
@blink1073 there's a couple of other warnings missed: https://travis-ci.org/scikit-image/scikit-image/jobs/45190537#L6695 Personally I think this PR is already extremely valuable and would be happy to leave these till next time. Your call if you want to complete it! |
|
@jni, all four of those are in doctests, and I'd rather run the doctest and get a warning than skip it. All of the warnings are reasonable to me. |
|
@blink1073 but why not ignore the doctest output? |
|
Sure, but how? |
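For what it's worth (the suggested command is lost from the thread), the stock mechanism for ignoring a doctest's output is doctest's directive syntax, e.g. `+SKIP`:

```python
import doctest

def demo():
    """A doctest whose example is skipped entirely.

    >>> 1 + 1  # doctest: +SKIP
    3
    """

# Run the docstring examples; the skipped example is never executed,
# so it can neither warn nor fail.
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(demo):
    runner.run(test)
print(runner.failures)  # -> 0
```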
|
|
|
That is not a built-in command, and I'm not sure if |
|
Welp, |
|
Welp indeed. =P maybe rip out that last commit and I'll merge? |
|
I say it can't hurt to have it in there, there is some leakage when you don't use it. |
|
Well, you've certainly earned your say. In it goes! ;) |
|
Boom. Awesome. |
|
Yay! That one sucked... |
|
Another PR to remove the |
|
Sucked for you, but @blink1073, it is one of the rockingest PRs ever. Super-clean builds are a great thing. And the warning catcher is excellent. Thank you! |
|
@blink1073 I won't say no...! =) |
|
I may be the team janitor, but I make that toilet shine! |
|
ROTFL |
|
The team janitor did great work! :-) |
|
@jni I see another import of "filter" instead of "filters" in this PR |
|
@stefanv crap! Looks like my... filters... need updating! =P |
Addresses #1182.

- Adds `test_setup` and `test_teardown` functions in `skimage._shared.testing` to control the test environment; these are called from each test folder in its `__init__.py`.
- The `test_setup` function imports packages known to raise warnings on import, then sets `warnings.simplefilter('error')` to raise errors for warnings.
- `test_teardown` removes the filter, allowing doctests to raise warnings. It was deemed unwise to skip large swaths of doctests just because they were raising warnings.
- … `test_teardown`. This technique could be used in future improvement endeavours.
- Adds `expected_warnings` in `skimage._shared._warnings`, which allows us to flexibly mark tests for any combination of expected warnings, instead of them causing errors.
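The setup/teardown technique described above can be sketched with just the stdlib (a simplified illustration of the idea, not the actual skimage code):

```python
import warnings

def test_setup():
    # Promote every warning raised during the test run to an error, so
    # unexpected warnings fail loudly instead of passing silently.
    warnings.simplefilter('error')

def test_teardown():
    # Drop the filter again so doctests may warn without erroring.
    warnings.resetwarnings()

test_setup()
try:
    warnings.warn('precision loss')
    outcome = 'warned silently'
except UserWarning as exc:
    outcome = 'raised: %s' % exc
test_teardown()
print(outcome)  # -> raised: precision loss
```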