184

I want to view an image in Jupyter notebook. It's a 9.9MB .png file.

from IPython.display import Image
Image(filename='path_to_image/image.png')

I get the below error:

IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.

A bit surprising and reported elsewhere.

Is this expected and is there a simple solution?

(Error msg suggests changing limit in --NotebookApp.iopub_data_rate_limit.)

Merlin
lmart999
    Is there a way to increase this from just the notebook itself instead of changing the config on the command line? I'm working in a shared environment and don't have rights to change jupyter on the command line. – Mike Pone Nov 23 '21 at 14:40

13 Answers

207

Try this:

jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10

Merlin
98

I ran into this using networkx and bokeh.

This works for me in Windows 7 (taken from here):

  1. To create a jupyter_notebook_config.py file, with all the defaults commented out, you can use the following command line:

    $ jupyter notebook --generate-config

  2. Open the file and search for c.NotebookApp.iopub_data_rate_limit

  3. Uncomment the line c.NotebookApp.iopub_data_rate_limit = 1000000 and change it to a higher rate. I used c.NotebookApp.iopub_data_rate_limit = 10000000
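After step 3, the relevant part of jupyter_notebook_config.py would look something like this (the value is just one example of a higher limit; `c` is the config object Jupyter provides when it loads the file):

```python
# jupyter_notebook_config.py
# Default was: #c.NotebookApp.iopub_data_rate_limit = 1000000
c.NotebookApp.iopub_data_rate_limit = 10000000  # bytes/sec
```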

This unforgiving default config is popping up in a lot of places; see the related GitHub issues. It looks like it might get resolved with the 5.1 release.

Update:

Jupyter notebook is now on release 5.2.2. This problem should have been resolved. Upgrade using conda or pip.

Itay Livni
  • After creating the config file, you can run `jupyter notebook`; since jupyter_notebook_config.py was written to your Jupyter folder (for me: C:\Users\nnd\.jupyter\jupyter_notebook_config.p), Jupyter will pick up your changes. – Nate Anderson May 07 '17 at 22:36
  • If you cannot run jupyter notebook from cmd.exe (Windows 10), try doing so from 'Anaconda prompt' (if you have that installed). – andyw Aug 30 '17 at 10:58
  • @Itay Livni: my Windows command line doesn't recognise the command `$ jupyter notebook --generate-config` and says the command is either wrong or couldn't be found. I have Windows 10. Any suggestions? – artre Oct 22 '17 at 11:00
  • @artre `jupyter notebook --generate-config` should be typed, without the dollar sign. – Itay Livni Oct 22 '17 at 15:00
  • @ItayLivni I tried that, but also without the dollar sign I get the same error message. – artre Oct 22 '17 at 16:10
  • @artre You should browse to the location of your jupyter.exe and then run the command provided above. For me it was in `\documents\anaconda2\scripts`. – Salain Nov 21 '17 at 16:51
  • Release 5.2.2 is not in the changelog. The default config is the same in my 6.4.9 version. – liakoyras Mar 16 '22 at 12:55
  • I'm on 6.4.6, and still have the issue. – Hugh Perkins Jun 30 '22 at 14:23
  • v6.5.2 just gave me this issue. Never seen it previously. – sh37211 Mar 18 '23 at 02:48
32

Removing print statements can also fix the problem.

Apart from loading images, this "IOPub data rate exceeded" error also occurs when your code prints continuously at a high rate, e.g. if you have a print statement inside a for loop that is called over 1000 times.
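A minimal sketch of that alternative: buffer the loop's output and print one compact summary instead of thousands of individual lines (all names here are illustrative):

```python
messages = []
for i in range(5000):
    # ... do work ...
    messages.append(f"processed item {i}")

# One short line instead of 5000 separate print() calls
print(f"{len(messages)} items processed; last: {messages[-1]}")
```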

azizbro
9

Typing jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10 in the Anaconda PowerShell or prompt opens Jupyter Notebook with the new configuration. Now try running your query again.

Merlin
6

Some additional advice for Windows 10 users:

  1. If you are using Anaconda Prompt/PowerShell for the first time, type "Anaconda" in the search field of your Windows task bar and you will see the suggested software.
  2. Make sure to open the Anaconda prompt as administrator.
  3. Always navigate to your user directory or the directory with your Jupyter Notebook files first before running the command. Otherwise you might end up somewhere in your system files and be confused by an unfamiliar file tree.

The correct way to open Jupyter notebook with new data limit from the Anaconda Prompt on my own Windows 10 PC is:

(base) C:\Users\mobarget\Google Drive\Jupyter Notebook>jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10
OnceUponATime
3

I have the same problem in my Jupyter NB on Win 10 when querying from a MySQL database.

Removing any print statements solved my problem.

2

I ran into this problem running version 6.3.0. When I tried the top-rated solution by Merlin, the PowerShell prompt notified me that iopub_data_rate_limit has moved from NotebookApp to ServerApp, i.e. --ServerApp.iopub_data_rate_limit. The solution still worked, but the variation is worth mentioning, especially as internal handling of the config option may be deprecated.

desertBorn
1

For already running Docker containers, try editing the file ~/.jupyter/jupyter_notebook_config.py: uncomment the line c.NotebookApp.iopub_data_rate_limit = ... and set it to a high number like 1e10. Restart the container and it should fix the problem.
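Inside the container, that edit can also be scripted rather than done by hand. A sketch, assuming the config lives at the default path (the sed pattern uncomments the line and raises the limit in one pass):

```shell
CONFIG="$HOME/.jupyter/jupyter_notebook_config.py"
mkdir -p "$(dirname "$CONFIG")"
# Create a stub with the commented-out default if no config exists yet
[ -f "$CONFIG" ] || echo '#c.NotebookApp.iopub_data_rate_limit = 1000000' > "$CONFIG"
# Uncomment the setting and raise the limit
sed -i 's/^#\{0,1\} *c\.NotebookApp\.iopub_data_rate_limit.*/c.NotebookApp.iopub_data_rate_limit = 1e10/' "$CONFIG"
grep iopub_data_rate_limit "$CONFIG"
```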

Sincole Brans
0

An easy workaround is to loop and print the components one at a time; then there won't be any issue. Printing wcc directly would cause the error if the graph is huge. Hence either of the snippets below works as a workaround:

wcc = list(nx.weakly_connected_components(train_graph))
for i in range(1, 10):
    print(wcc[i])

for component in wcc:
    print(component)

0

Using Visual Studio Code, the Jupyter extension is able to handle big outputs. Launch it from Anaconda Navigator.

Skynet
0

Like others pointed out, printing at a high rate can cause this. Resolve it by printing only every k-th iteration, using the modulo operator in an if statement. Example in Python:

k = 10
if i % k == 0:
    print("Something")

Increase k if the warning persists.
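In context, the check sits inside the loop, so only every k-th iteration produces output; a runnable sketch with illustrative bounds:

```python
k = 100
printed = 0
for i in range(1000):
    # ... do work ...
    if i % k == 0:
        print(i)  # 10 lines of output instead of 1000
        printed += 1
```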

Toonia
0

In general, trying to print something that is too long will trigger this error. I tried to print a string that was 9221593 characters long, and that triggered it.

0

I get the same error message in JupyterLab 3.6.3 (on Python 3.10.0 on Windows 10) when I use the help() on Pandas.

Although the help() function does not use print explicitly, the pandas documentation is hundreds of pages long, so it probably exceeds JupyterLab's or Jupyter Notebook's capacity to display it.

The help() function returns None, so it likely calls print() internally, or writes the rendered text to stdout in some equivalent way, rather than returning the documentation as a string.

import pandas

help(pandas)

IOPub data rate exceeded. The Jupyter server will temporarily stop sending output to the client in order to avoid crashing it. To change this limit, set the config variable --ServerApp.iopub_data_rate_limit.

I get the same error message in Jupyter Classic NB started from the Help menu of JupyterLab 3.6.3, with the current values reported as ServerApp.iopub_data_rate_limit=1000000.0 (bytes/sec) and ServerApp.rate_limit_window=3.0 (secs).

Server information: Jupyter NbClassic, Jupyter Server v2.5.0, Jupyter nbclassic v0.5.3 (started using the "Launch Jupyter Classic Notebook" dropdown menu within the JupyterLab "Help" menu).

The solution above (jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10) eliminated the help() output error message and gives me the full pandas help documentation within the JupyterLab NB output cell.
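An alternative that sidesteps the rate limit entirely: the standard-library pydoc module can return the same text help() would print as a string, which you can then truncate before displaying. A sketch:

```python
import pydoc

# render_doc returns the rendered help text instead of printing it
text = pydoc.render_doc(str)
print(text[:500])  # display only the beginning
print(f"... ({len(text)} characters total)")
```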

Thanks for the answers.

Rich Lysakowski PhD