I want to use Kaggle datasets from a Google Cloud Storage bucket when working in Colab.
First: is there a way to upload Kaggle datasets directly to a Google bucket via the Kaggle API?
Second: how can I use data in a Google bucket from Colab without copying it into the notebook?
At the moment, my only experience with using a Google bucket from Colab is passing a URI for audio transcription, like this:
from google.cloud.speech import types  # google-cloud-speech 1.x import path

gcs_uri = 'gs://bucket_name/file_name.wav'
audio = types.RecognitionAudio(uri=gcs_uri)
I'm guessing I can do something similar to load data into a pandas DataFrame directly from a URI. My experience with the Kaggle API is on my local machine, for example:
kaggle competitions download -c petfinder-adoption-prediction
which downloads the data via the Kaggle API. If I download data into a Colab notebook, it is removed between sessions, so my intention in using a Google bucket is to keep the data available across multiple sessions.
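For my second question, here is a hedged sketch of what I think should work: as far as I can tell, pandas can read straight from a `gs://` URI when the `gcsfs` package is installed in the Colab VM. The bucket and file names below are hypothetical, and the local `StringIO` stand-in just shows the same call shape without needing bucket credentials:

```python
import io
import pandas as pd

# With gcsfs installed, this kind of call should read directly from the
# bucket (hypothetical bucket/file names, not runnable without credentials):
#   df = pd.read_csv('gs://bucket_name/petfinder/train.csv')

# read_csv accepts any path or file-like object; a local stand-in to show
# the resulting DataFrame:
csv_data = io.StringIO("PetID,AdoptionSpeed\nabc123,2\ndef456,4\n")
df = pd.read_csv(csv_data)
print(df.shape)  # → (2, 2)
```

If that's right, question 2 may reduce to just installing `gcsfs` and authenticating the notebook to the bucket.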
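For my first question, I haven't found a direct Kaggle-to-bucket option in the API docs, so my working assumption is a two-step copy through the Colab VM: download with the Kaggle CLI, then push to the bucket with gsutil. The bucket name and local path here are hypothetical, and both commands need their respective credentials (kaggle.json and an authenticated gcloud session):

```shell
# Download the competition files into a local directory on the Colab VM
# (requires kaggle.json credentials; path is hypothetical):
kaggle competitions download -c petfinder-adoption-prediction -p /tmp/petfinder

# Copy everything into the bucket (hypothetical bucket name);
# -m parallelizes the transfer, -r recurses into the directory:
gsutil -m cp -r /tmp/petfinder gs://bucket_name/petfinder/
```

Is there a cleaner way that skips the intermediate copy on the VM's disk?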