
I really want to avoid these annoying numpy warnings since I have to deal with a lot of NaNs. I know this is usually done with seterr, but for some reason here it does not work:

import numpy as np
data = np.random.random(100000).reshape(10, 100, 100) * np.nan
np.seterr(all="ignore")
np.nanmedian(data, axis=[1, 2])

It gives me a runtime warning even though I set numpy to ignore all errors...any help?

Edit (this is the warning that is received):

/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/numpy/lib/nanfunctions.py:612: RuntimeWarning: All-NaN slice encountered
  warnings.warn("All-NaN slice encountered", RuntimeWarning)

prosoitos
HansSnah
  • This should also be in `seterr` like other errors; if some code can handle missing values and there is no straightforward way to avoid `All-NaN` slices (e.g. some sort of downsampling among non-missing values), it should just return `nan` and be quiet. – dashesy Jun 13 '15 at 23:56

4 Answers


Warnings can often be useful, and in most cases I wouldn't advise this, but you can always use the `warnings` module to ignore all warnings with `filterwarnings`:

import warnings
warnings.filterwarnings('ignore')

Should you want to suppress only your particular warning, you can specify it with:

with warnings.catch_warnings():
    warnings.filterwarnings('ignore', r'All-NaN (slice|axis) encountered')
    # ... code that triggers the warning goes here ...
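Applied to the question's `nanmedian` call, a minimal self-contained sketch (the all-NaN array mirrors the data from the question) might look like:

```python
import warnings

import numpy as np

# all-NaN data, as in the question
data = np.full((10, 100, 100), np.nan)

with warnings.catch_warnings():
    # suppress only the All-NaN warning, only inside this block
    warnings.filterwarnings('ignore', r'All-NaN (slice|axis) encountered')
    result = np.nanmedian(data, axis=(1, 2))  # no RuntimeWarning printed

print(result.shape)  # (10,)
```

Outside the `with` block the normal warning filters are restored, so other code still warns as usual.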
Robert Kern
miradulo
    Thanks for showing how. I understand your reason for recommending against it. OTOH, lately Python package authors' warnings are getting out of hand. If one uses numpy, tensorflow and the like, it is common to have thousands of lines of warnings. This is a bad situation: if warnings become too onerous, people just ignore or suppress them, not least because they make finding actual error messages while developing very difficult. – Eric M Oct 07 '19 at 16:11
    An example: I'm here because I'm taking logarithms and occasionally one of the inputs is 0. The NaN result is all I need down the line, so I'd rather not be spammed by warnings. – WolfLink May 18 '22 at 08:25

The warnings controlled by `seterr()` are those issued by the numpy ufunc machinery, e.g. when `A / B` creates a NaN in the C code that implements the division, say because there was an `inf/inf` somewhere in those arrays. Other numpy code may issue its own warnings for other reasons.

In this case, you are using one of the NaN-ignoring reduction functions, like `nanmin()` or the like. You are passing it an array that contains all NaNs, or at least all NaNs along the axis you requested the reduction over. Since the usual reason one uses `nanmin()` is precisely to not get another NaN out, `nanmin()` will issue a warning that it has no choice but to give you a NaN. This goes directly to the standard library `warnings` machinery, not the numpy ufunc error-control machinery, since it isn't a ufunc and this production of a NaN isn't the same as what `seterr(invalid=...)` otherwise deals with.
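The distinction can be demonstrated with a short sketch (not part of the original answer): `seterr` silences the ufunc-level invalid-value warning from `inf/inf`, but has no effect on the `warnings`-module warning emitted by `nanmedian`:

```python
import warnings

import numpy as np

a = np.array([np.inf])

# seterr governs ufunc-level floating-point issues: inf/inf -> nan
np.seterr(all='ignore')
quotient = a / a  # no "invalid value" RuntimeWarning now

# ...but it has no effect on nanmedian's All-NaN warning, which goes
# through the stdlib warnings machinery instead
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    np.nanmedian(np.array([np.nan]))  # still warns despite seterr

print(any('All-NaN' in str(w.message) for w in caught))  # True
```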

Robert Kern
  • I have code that can handle missing data (`nan`), so I would like the warning ignored only for the input to that function, without silencing it globally. `seterr` is great because one can temporarily silence something; unfortunately these new `nan` warnings are not tunable with it. – dashesy Jun 13 '15 at 23:48
    You can also [temporarily silence these warnings](https://docs.python.org/2/library/warnings.html#temporarily-suppressing-warnings) with the standard library `warnings` machinery. – Robert Kern Jun 14 '15 at 15:43
    Is there any difference between the np.warnings version and the python warnings machinery? – nedlrichards Apr 19 '19 at 17:08
    `numpy.warnings` is just an accidental alias to the stdlib `warnings` module. `numpy/__init__.py` imports it to use it but neglects to delete it from its namespace. Don't use `numpy.warnings`; just `import warnings`. – Robert Kern Jun 08 '19 at 00:20

You may want to avoid suppressing the warning, because numpy raises it for a good reason. If you want to clean up your output, handle the case explicitly by returning a pre-defined value when your array is all NaN:

import numpy as np

def clean_nanmedian(s):
    if np.all(np.isnan(s)):
        return np.nan
    return np.nanmedian(s)

Also, keep in mind that by default this RuntimeWarning is shown only the first time it is triggered from a given location, because of Python's default warning filter.
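Since the question reduces over axes 1 and 2, one way to apply `clean_nanmedian` per slice is to collapse the reduced axes and loop over the rest (a sketch; the reshape trick and sample data are illustrative assumptions, not from the answer):

```python
import numpy as np

def clean_nanmedian(s):
    if np.all(np.isnan(s)):
        return np.nan
    return np.nanmedian(s)

data = np.full((10, 100, 100), np.nan)
data[0] = 1.0  # give one slice real values

# collapse axes 1 and 2 so each row corresponds to one slice
flat = data.reshape(data.shape[0], -1)
result = np.array([clean_nanmedian(row) for row in flat])

print(result[0], np.isnan(result[1:]).all())  # 1.0 True
```

All-NaN rows take the `np.isnan` branch, so `np.nanmedian` is never called on them and no warning is emitted.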

idnavid
  • Good tidbit about how it only warns on the first occurrence. That actually answered a separate question I had. Thank you! – Chris Wong Sep 30 '22 at 23:36

Using

def fxn():
    warnings.warn("deprecated", DeprecationWarning)
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    warnings.filterwarnings('ignore', r'All-NaN (slice|axis) encountered')
    fxn()

did not suppress all numpy warnings; in fact I could still see sklearn/numpy-related warnings like

/usr/local/lib/python3.7/site-packages/sklearn/linear_model/randomized_l1.py:580: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  eps=4 * np.finfo(np.float).eps, n_jobs=None,
/usr/local/lib/python3.7/site-packages/sklearn/feature_extraction/image.py:167: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  dtype=np.int):

To remove those as well, my solution was as simple as:

import numpy as np
np.seterr(all="ignore")

Putting it all together, I wrote this `suppressAllWarnings` helper, which can be customized further:

import warnings

def suppressAllWarnings(params=None):
    '''
        LP: suppress all warnings
        params = { "regex": [ String ] }
    '''
    options = {
        "level": "ignore",
        "regex": [ r'All-NaN (slice|axis) encountered' ]
    }
    options.update(params or {})
    # system-wide warnings
    def fxn():
        warnings.warn("deprecated", DeprecationWarning)
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")
        # LP custom regex warnings
        for regex in options['regex']:
            warnings.filterwarnings('ignore', regex)
        fxn()
    # by-module warnings
    try:
        # numpy floating-point warnings
        import numpy as np
        np.seterr(all=options['level'])
    except ImportError:
        pass

suppressAllWarnings()

[EDIT]

This function cannot solve import-level issues; in that case it may help to wrap the imports in `np.testing.suppress_warnings`:

with np.testing.suppress_warnings() as sup:
    sup.filter(DeprecationWarning)
    from sklearn.linear_model import LogisticRegression
    from sklearn.feature_extraction.text import TfidfVectorizer
loretoparisi