
If I keep a Jupyter notebook / Colab notebook open for some time without any activity, the notebook "forgets" all its values (e.g. CSV files loaded into DataFrames) and I need to re-run the whole notebook, which is time-consuming.

By the way, I diligently save the notebook with each change.

It also happens when I reload the notebook, which seems odd.

Any ideas how to prevent it? Thank you

Dan Mintz
  • AFAIK there isn't a way to achieve this (in Google Colab at least; not enough experience in Jupyter): it would take up additional space on Google's servers to store the state permanently, versus storing it temporarily at run time as the data set is needed. – CyberStems Jan 03 '20 at 18:18
  • Following on @CyberStems' comment, you can do things to make it easier to return to approximately the state you left off in your session. For example, you can pickle (serialize) your active DataFrames to a memory-efficient file with `df.to_pickle("data.pkl")` so they can be re-read to pick back up more easily. Building on that, you can use tar to bundle items together for easier storage and unarchiving later. You can also use "Dill" to retain your whole session state if you need it, see [here](https://stackoverflow.com/a/50985430/8508004). – Wayne Jan 03 '20 at 21:18
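A minimal sketch of the pickle workflow suggested in the comment above (the DataFrame contents and the filename `data.pkl` are placeholders for whatever your notebook actually loads):

```python
import pandas as pd

# Stand-in for a DataFrame that was expensive to build (e.g. loaded from CSV).
df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

# Serialize the DataFrame to a compact binary file before the session ends.
df.to_pickle("data.pkl")

# In a fresh session, restore it instead of re-running the loading steps.
restored = pd.read_pickle("data.pkl")
print(restored.equals(df))
```

In Colab, note that `data.pkl` would land on the ephemeral runtime disk, so you would still need to download it or copy it to mounted Google Drive for it to survive a disconnect.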

0 Answers