A Jupyter notebook of about 600 lines suddenly got very laggy this past week. Previously all cells ran instantly; the file is mostly simple pandas aggregations and column functions.

Now, even after resetting the kernel and clearing all outputs, just clicking into a cell, not even running it, takes ~6 seconds: the screen freezes, then goes back to normal after a bit. Running the cell causes the same freeze, even when the cell is just a simple column rename.

Any ideas? I do not recall changing anything about my Python or Jupyter installation: same packages and everything as a week ago, when it worked fine.

Edit: I should add that I have restarted my computer several times as well, but the performance is still just as poor.

I also created a new notebook. It runs fine at first, but after a couple of hours it slows down as well, even though memory usage is low (just a small DataFrame).

AxW
  • How large is the file? – Derek O Mar 24 '21 at 03:44
  • Only about 80 KB, and the file ran totally fine last week but is now excruciatingly slow. It must be something under the hood rather than a CPU or memory bottleneck – AxW Mar 24 '21 at 03:45

2 Answers

If you use nbextensions for Jupyter, try disabling the Variable Inspector extension (Edit -> nbextensions config, or `jupyter nbextension disable varInspector/main` from the command line; that require path assumes the standard jupyter_contrib_nbextensions install). Variable Inspector re-inspects the kernel's variables after every cell execution, which can make even trivial cells feel slow.

illuminato

The following causes are possible:

  1. Something else on your system changed (other software, updates, or background processes).
  2. The data pandas is handling is much bigger now, and it consumes more memory (see the quick check after this list).
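
To test the second possibility, pandas can report a DataFrame's in-memory footprint directly. A minimal sketch; `df` here is just a placeholder for whatever DataFrame your notebook actually builds:

```python
import pandas as pd

# Placeholder DataFrame standing in for the notebook's real data.
df = pd.DataFrame({"a": range(1_000), "b": ["x"] * 1_000})

# deep=True measures object (string) columns accurately; the default
# shallow estimate only counts their pointers.
per_column = df.memory_usage(deep=True)
print(per_column)
print(f"total: {per_column.sum() / 1024:.1f} KiB")
```

If the total is still tiny, as an 80 KB source file suggests it should be, the slowdown is more likely in the Jupyter front end than in pandas itself.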

Possible ways to find the cause:

  1. Try the same notebook with smaller datasets.
  2. Run the same Python code from the command line instead of from within Jupyter Notebook (see the sketch below).

In both cases, restart your computer before running the test, and monitor CPU, disk, and memory utilization before, during, and after.
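
A minimal command-line benchmark along those lines; the file name, column name, and aggregation below are hypothetical stand-ins for whatever the notebook actually does:

```python
# bench.py - run with `python bench.py` outside Jupyter while watching
# CPU, disk, and memory in a system monitor.
import time

import pandas as pd

start = time.perf_counter()

df = pd.read_csv("data.csv")              # stand-in for the ~80 KB file
result = df.groupby("some_column").sum()  # a representative aggregation

elapsed = time.perf_counter() - start
print(result.head())
print(f"elapsed: {elapsed:.3f}s")
```

If this finishes instantly while the notebook still lags, the bottleneck is in the Jupyter front end (for example, an extension such as Variable Inspector) rather than in pandas or the data itself.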

yoonghm