
Has anyone figured out a way to keep files persisted across sessions in Google's newly open sourced Colaboratory?

Using the sample notebooks, I'm successfully authenticating and transferring CSV files from my Google Drive instance, and I have stashed them in /tmp, my ~, and ~/datalab. Pandas can read them just fine off of disk too. But once the session times out, it looks like the whole filesystem is wiped and a new VM is spun up, without the downloaded files.

I guess this isn't surprising given Google's Colaboratory FAQ:

Q: Where is my code executed? What happens to my execution state if I close the browser window?

A: Code is executed in a virtual machine dedicated to your account. Virtual machines are recycled when idle for a while, and have a maximum lifetime enforced by the system.

Given that, maybe this is a feature (i.e., "go use Google Cloud Storage, which works fine in Colaboratory")? When I first used the tool, I was hoping that any .csv files in the My File/Colab Notebooks Google Drive folder would also be loaded onto the VM instance that the notebook was running on :/

user3424705

8 Answers


Put this before your code, so it will always download your file before your code runs:

!wget -q http://www.yoursite.com/file.csv
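The same idea in pure Python, with a guard so the file is only fetched when it isn't already on disk; the function name and URL are illustrative, not part of Colab:

```python
import os
import urllib.request

def fetch_once(url, dest):
    """Download url to dest unless it already exists. On a fresh Colab VM
    the file is gone, so this re-downloads; within a live session it's a
    no-op."""
    if not os.path.exists(dest):
        urllib.request.urlretrieve(url, dest)
    return dest
```

Then e.g. `fetch_once('http://www.yoursite.com/file.csv', 'file.csv')` at the top of the notebook.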
Marcel Pinheiro

Your interpretation is correct. VMs are ephemeral and recycled after periods of inactivity. There's no mechanism for persistent data on the VM itself right now.

In order for data to persist, you'll need to store it somewhere outside of the VM, e.g., Drive, GCS, or any other cloud hosting provider.

Some recipes for loading and saving data from external sources are available in the I/O example notebook.

Bob Smith

Not sure whether this is the best solution, but you can sync your data between Colab and Drive with automated authentication like this: https://gist.github.com/rdinse/159f5d77f13d03e0183cb8f7154b170a

Robin Dinse

Include this for files in your Google Drive:

from google.colab import drive
drive.mount('/content/drive')

After it runs, you will see it mounted in your Files tab, and you can access your files with a path like:

'/content/drive/MyDrive/<your folder inside drive>/file.ext'
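Anything written under that mount lives in Google Drive rather than on the VM, so it survives recycling. A minimal sketch (the folder name is made up; pass a different `base_dir` to try it outside Colab):

```python
import os

# Illustrative folder under the Drive mount; anything written here
# persists in Google Drive even after the Colab VM is recycled.
DRIVE_DIR = '/content/drive/MyDrive/my_project'

def persist_text(text, name, base_dir=DRIVE_DIR):
    """Write a small text artifact under the Drive mount."""
    os.makedirs(base_dir, exist_ok=True)
    path = os.path.join(base_dir, name)
    with open(path, 'w') as f:
        f.write(text)
    return path
```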
kidd

Clouderizer may provide some data persistence, at the cost of a long setup (because you use Google Colab only as a host) and little space to work in.

But, in my opinion, that's better than having your file(s) "recycled" when you forget to save your progress.

LeandroHumb

As you pointed out, Google Colaboratory's file system is ephemeral. There are workarounds, though there's a network latency penalty and code overhead: e.g. you can use boilerplate code in your notebooks to mount external file systems like GDrive (see their example notebook).

Alternatively, while this is not supported in Colaboratory, other Jupyter hosting services – like Jupyo – provision dedicated VMs with persistent file systems so the data and the notebooks persist across sessions.

artur

If anyone's interested in saving and restoring the whole session, here's a snippet I'm using that you might find useful:

import os
import dill
from google.colab import drive

backup_dir = 'drive/My Drive/colab_sessions'
backup_file = 'notebook_env.db'
backup_path = backup_dir + '/' + backup_file

def init_drive():
  # mount Drive and create the backup directory if it doesn't exist
  drive.mount('drive')
  os.makedirs(backup_dir, exist_ok=True)

def restart_kernel():
  os._exit(00)

def save_session():
  init_drive()
  dill.dump_session(backup_path)

def load_session():
  init_drive()
  dill.load_session(backup_path)

Edit: This works fine as long as your session state isn't too big; you'll need to check whether it works for your use case.
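If you only need a few variables rather than the whole interpreter state, a standard-library variant of the same pattern avoids the dill dependency (the function names here are illustrative):

```python
import pickle

def save_vars(path, **variables):
    """Pickle selected variables to a file (e.g. one under a Drive mount)."""
    with open(path, 'wb') as f:
        pickle.dump(variables, f)

def load_vars(path):
    """Return the dict of variables saved by save_vars."""
    with open(path, 'rb') as f:
        return pickle.load(f)
```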

penduDev

I was interested in importing a module in a separate .py file.

What I ended up doing is copying the .py file contents to the first cell in my notebook, adding the following text as the first line:

%%writefile mymodule.py

This creates a separate file named mymodule.py in the working directory so your notebook can use it with an import line.

I know that running all of the code in the module would make its variables and functions available in the notebook, but my code required importing a module, so this was good enough for me.
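Outside of IPython the same trick can be done with plain file I/O, which also makes the mechanics explicit (the module name and contents below are made up for illustration):

```python
import pathlib
import sys
import tempfile

# Write the module source to a directory on sys.path, then import it --
# the plain-Python equivalent of a %%writefile cell. In Colab the working
# directory is already on sys.path, so writing there is enough.
module_dir = tempfile.mkdtemp()
pathlib.Path(module_dir, 'mymodule.py').write_text(
    "def greet(name):\n"
    "    return f'hello, {name}'\n"
)
sys.path.insert(0, module_dir)

import mymodule
```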