
I am stuck trying to deal with large Python notebook files and running each cell afterwards. The one I am working on is ~70.1 KB, and when I open it, it spends a long time waiting for localhost and for the socket to become available, plus some seconds (sometimes minutes) loading extensions such as [MathJax]/extensions/Safe.js. When the file contains lots of outputs, it crashes Jupyter Notebook. I recently cleared them with this command in cmd:

jupyter nbconvert --ClearOutputPreprocessor.enabled=True --inplace Notebook.ipynb

(from "How to clear an IPython Notebook's output in all cells from the Linux terminal?"), and after that it was possible to open the notebook with less delay. In compensation, saving a checkpoint and interrupting the kernel became worse: those buttons stay blue as if they were being clicked, and nothing happens, so I end up copying the cells into my notepad separately and running them one by one in another Python 3 notebook. Do you know any technique or procedure to fix this Jupyter Notebook slowness and make it faster, or does it have something to do with the PC's performance? Thanks in advance
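For reference, the nbconvert command above can be approximated in plain Python, since an .ipynb file is just JSON with a top-level "cells" list (nbformat 4 layout). This is my own stdlib-only sketch, not nbconvert's actual implementation, and the function name clear_outputs is hypothetical:

```python
import json

def clear_outputs(path):
    """Strip outputs and execution counts from every code cell of a notebook.

    Rough stdlib-only equivalent of:
      jupyter nbconvert --ClearOutputPreprocessor.enabled=True --inplace Notebook.ipynb
    Assumes the nbformat 4 layout, where cells live under the "cells" key.
    """
    with open(path, encoding="utf-8") as f:
        nb = json.load(f)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []            # drop all stored outputs
            cell["execution_count"] = None  # reset the In [n] counter
    with open(path, "w", encoding="utf-8") as f:
        json.dump(nb, f, indent=1)
```

Running clear_outputs("Notebook.ipynb") rewrites the file in place, which is what makes a bloated notebook open quickly again.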

Victor

1 Answer


My performance problems on Windows were mitigated by: greatly increasing the pagefile size, shutting down the kernels of unused notebooks, and installing the memory widget to monitor memory usage. Double-check http://localhost:8888/tree to verify that all notebooks are shut down.
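If you cannot install the memory widget, a crude stopgap is to print the kernel's own peak memory from inside a cell with the standard library. This is my own sketch, not the widget's implementation; note that the resource module is Unix-only (on Windows you would use Task Manager or a package such as psutil instead), and ru_maxrss is reported in kilobytes on Linux:

```python
import resource

def peak_memory_mb():
    """Peak resident set size of this process in MB (Unix only).

    Assumes Linux, where ru_maxrss is in kilobytes
    (on macOS it is in bytes).
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024
```

Calling peak_memory_mb() at the end of a heavy cell shows whether it is the kernel, rather than the browser rendering outputs, that is eating RAM.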

BSalita