
How do I access my local dataset in a Jupyter notebook on Google Cloud ML Engine?

I have created a VM on Google Cloud ML Engine and installed Anaconda on the same VM.

How can I access public and private image datasets from a Jupyter notebook?

I have uploaded a very small dataset using the upload button in the Jupyter notebook, but that way I have to upload the images one by one. Is there a way to upload the train and test dataset folders in one go?

How can I access a public dataset using a URL?

Maxim

1 Answer


You may want to look into gsutil, which ships with the Google Cloud SDK. It lets you copy local data to Google Cloud Storage, from where your notebooks can access it.
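For example, assuming a bucket named `my-bucket` and local `train` and `test` folders (all placeholder names), a single recursive copy uploads both folders at once, which also covers the folder-upload part of your question:

```
# Create a bucket (names must be globally unique)
gsutil mb gs://my-bucket

# Recursively copy both folders; -m parallelizes the transfer
gsutil -m cp -r train test gs://my-bucket/
```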

Dan Kowalczyk
  • Yes, I have created a bucket and uploaded directories and files to it, but how do I read those files from a Jupyter notebook? –  Jul 15 '18 at 12:30
  • Have you tried the Python client libraries from Jupyter? E.g. [for reading](https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-code_sample): blob.download_to_filename(destination_file_name); [for writing](https://stackoverflow.com/a/43685299/9457843): blob.upload_from_filename(source_file_name). Another option you may want to consider is a [Deep Learning Virtual Machine](https://cloud.google.com/deep-learning-vm/docs/), and then [connecting](https://cloud.google.com/deep-learning-vm/docs/jupyter) to Jupyter on it. – rsantiago Oct 16 '18 at 17:58
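A minimal sketch of the client-library approach mentioned in the comment above, assuming the `google-cloud-storage` package is installed and a bucket named `my-bucket` with a `train/` prefix already exists (both names are placeholders):

```python
from google.cloud import storage

# Uses the VM's default service-account credentials
client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name

# Download a single object to the local filesystem
blob = bucket.blob("train/image_0001.jpg")  # placeholder object path
blob.download_to_filename("image_0001.jpg")

# List every object under the train/ prefix, e.g. to loop over a dataset
for b in client.list_blobs("my-bucket", prefix="train/"):
    print(b.name)
```

From a Jupyter cell on the VM this needs no extra authentication, as long as the VM's service account has read access to the bucket.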