39

I'm trying to open a Jupyter notebook, and it takes a long time. I can see at the bottom that it's trying to load various [MathJax] extensions; e.g., at the bottom left of the Chrome browser it says:

Loading [MathJax]/extensions/safe.js

Eventually the notebook loads, but it's frozen, and the bottom left keeps showing that it's trying to load other [MathJax] .js files.

Meanwhile, the "pages unresponsive, do you want to kill them" pop-up keeps appearing.

I have no equations or plots in my notebook, so I can't understand what is going on. My notebook never did this before.

I googled this, and some people said to delete the IPython checkpoints. Where would those be? I'm on macOS and using Anaconda.

profhoff

5 Answers

34
  1. conda install -c conda-forge nbstripout

  2. nbstripout filename.ipynb. Make sure that there is no whitespace in the filename (or quote the filename if there is).
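
If you'd rather not install another tool, the same stripping can be done with nbformat, which is installed alongside Jupyter. A minimal sketch (the filename here is just a placeholder):

```python
# Sketch: strip all stored outputs from a notebook with nbformat
# (ships with Jupyter). "notebook.ipynb" is a placeholder filename.
import nbformat

nb = nbformat.read("notebook.ipynb", as_version=4)

for cell in nb.cells:
    if cell.cell_type == "code":
        cell.outputs = []            # drop the saved output
        cell.execution_count = None  # reset the In[...] counter

nbformat.write(nb, "notebook.ipynb")
```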

Laurel
Anas
31

I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared the output, and that seemed to do the trick!

If Jupyter crashes while opening the .ipynb file, try "using nbstripout to clear output directly from the .ipynb file via command line" (bndwang). Install it with pip install nbstripout.

evn
profhoff
  • Confirmed this was the problem for me too - I had to use nbstripout to clear output directly from the .ipynb file via the command line, because I couldn't open it in Jupyter to 'Clear Output' in the first place. – gonfy Jul 24 '18 at 14:15
  • Restarting the kernel didn't work for me (nor did removing the MathJax cookie). I was about to open the script in Firefox to see what error might be raised, as [suggested in this thread](https://github.com/jupyter/jupyter/issues/227); oddly, just opening Firefox on its own (without entering the URL) seemed to cause my ipynb to load in Chrome. – virtualxtc Aug 13 '18 at 00:08
  • profhoff, I downvoted because restarting the kernel is only available on a working notebook! "nbstripout" is not part of Jupyter but is available on PyPI (and through pip). – SteveWithamDuplicate Jun 22 '19 at 03:13
  • For me, I am experiencing serious slowness when working with Jupyter Notebook in Chrome; the Edge browser, for instance, is fine. Maybe I have too many bookmarks and too much history in Chrome?! – mah65 Jun 10 '20 at 01:19
  • I confirm that clearing output from the menu worked for an extreme 0.5 GB notebook produced by some stress-tests (reduced it to 200kB, no `nbstripout` was needed). Make sure you save it before closing, or you're back to square one:) – mirekphd Aug 23 '20 at 20:33
  • You just saved me. – Elessar Jan 04 '21 at 19:38
13

I was having the same problem with Jupyter Notebook. My recommendations are as follows:

First, check the size of the .ipynb file you are trying to open. It is probably several megabytes. One possible reason is the stored output of a dataset for which you previously displayed all rows.

For example, in order to check a dataset, I sometimes use pd.set_option('display.max_rows', None) instead of the .head() function, and so I view all the rows in the data set. That large volume of output is saved into the file, increasing its size and making the notebook slower. Try to delete such outputs.
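
As a concrete illustration of the difference (df here stands in for your own DataFrame):

```python
import pandas as pd

df = pd.DataFrame({"x": range(100_000)})  # placeholder data

# This makes the notebook render every row, and all of that
# text is stored in the .ipynb file when you save:
pd.set_option('display.max_rows', None)
df

# Safer for a quick look: restore the default row limit
# and display only the first few rows.
pd.reset_option('display.max_rows')
df.head()
```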

I think this will solve your problem.

baligoyem
  • Your first recommendation solved the issue for me. Indeed, I had two cells where I ran functions with high verbosity, and the notebook reached 44 MB. I didn't realize it would have this effect. Thank you! – Mihaela Grigore Apr 13 '21 at 14:19
3

Here restarting your kernel will not help. Instead, use nbstripout to strip the output from the command line. Run this command: nbstripout FILE.ipynb. Install nbstripout if it is not there: https://pypi.org/project/nbstripout/

1

It happened to me the time I decided to print a matrix 100,000 times. The notebook file became 150 MB, and Jupyter (in Chrome) was not able to open it: it showed all the symptoms you describe, and then the page died saying it was "OutOfMemory".

I solved the issue by opening it in Visual Studio Code, which has a "Clear All Output" button; then I saved the notebook again, and it was back to some hundreds of KB, which I could open normally.

If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit if you use Linux, or Notepad++ on Windows) and try to delete the output cells by hand. This is trickier, since you have to pay close attention to what you are deleting, otherwise the notebook will stop working.
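
A less error-prone alternative to hand-editing: a notebook is just JSON, so a short script can drop the outputs for you. Here is a sketch using only the standard library, assuming the modern (v4) notebook format; the filename is a placeholder:

```python
# Sketch: clear stored outputs by editing the notebook's JSON directly.
# Standard library only; "big_notebook.ipynb" is a placeholder name,
# and a v4-format notebook (cells at the top level) is assumed.
import json

with open("big_notebook.ipynb", encoding="utf-8") as f:
    nb = json.load(f)

for cell in nb.get("cells", []):
    if cell.get("cell_type") == "code":
        cell["outputs"] = []
        cell["execution_count"] = None

with open("big_notebook.ipynb", "w", encoding="utf-8") as f:
    json.dump(nb, f, indent=1)
```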