
I would like to pay for Google Colab Pro in order to have full access to the GPUs, but my Google Drive is already full. Does that mean that, besides paying for Colab Pro, I should also pay for a Google Drive storage extension? My Drive is already full and I have a 30 GB dataset to classify in Google Colab. Is there any Python trick to load and manipulate my dataset in Colab without having to read it from Google Drive? Can I load the dataset from my computer or, say, from another cloud service like Dropbox?

P.S.: there is a related question here, but the solution does not seem to work.

mad
  • There is functionality to use datasets from your local PC; you can upload them directly. You can read more here: https://colab.research.google.com/notebooks/io.ipynb – Juhil Somaiya Feb 26 '20 at 10:11
  • @JuhilSomaiya But by uploading, does that mean my dataset will be uploaded to Colab (where I hypothetically have unlimited space) and not to my Google Drive? It seems all of Colab's data is always read from Drive... – mad Feb 26 '20 at 10:14
  • If the dataset is available online (a download link is available), you can simply use wget or curl to download the data directly to Colab, but you will have to do it every time you start a session – SajanGohil Feb 26 '20 at 10:17
  • @mad I agree that Colab can save data to Drive, but I don't think an uploaded dataset is saved to Drive: if you upload an external dataset file and then share the notebook with someone else, the data will not be there; they have to upload it again themselves. – Juhil Somaiya Feb 26 '20 at 10:20
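Following the suggestion in the comments, a minimal sketch of streaming a dataset straight to the Colab VM's local disk (e.g. under /content), bypassing Google Drive entirely. The Dropbox URL and file names below are placeholders, not from the original question; note that for a Dropbox shared link you change the trailing `?dl=0` to `?dl=1` to get the raw file instead of the preview page:

```python
import os
import shutil
import urllib.request


def download_to_colab(url: str, dest: str) -> str:
    """Stream a remote file to the VM's local disk, not to Drive.

    Uses shutil.copyfileobj so the 30 GB file is copied in chunks
    rather than loaded into memory all at once.
    """
    os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as f:
        shutil.copyfileobj(resp, f)  # chunked copy, low memory use
    return dest


# Hypothetical usage inside a Colab cell:
# download_to_colab(
#     "https://www.dropbox.com/s/abc123/dataset.zip?dl=1",  # placeholder link
#     "/content/dataset.zip",
# )
```

As SajanGohil notes, the VM's disk is wiped when the session ends, so this download has to be repeated at the start of each session. For loading from your own computer instead, the `google.colab.files.upload()` helper shown in the io.ipynb notebook linked above works, but uploading 30 GB through the browser each session is likely impractical.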

0 Answers