
I want to upload nested folders and files to a Google Cloud Storage bucket.

So far I've used this command to do so:

gsutil -m cp -R [dir] gs://[bucket]

It works, but when I go to the Firebase console, I cannot generate an access token for the uploaded files.

If I upload the same file manually, the access token is generated automatically.

I wonder if there's a way to make gsutil upload the files so that they get an access token.

I'd appreciate your hints and suggestions.


1 Answer


Download URLs are normally only generated by the Firebase SDK for Cloud Storage.

Luckily though, somebody figured out that if you set the right metadata on a file, that gives it a download URL too. Since metadata can be set through gsutil, it should be possible to generate download URLs that way as well.

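For illustration, here is a minimal sketch of that approach. It assumes the relevant metadata key is firebaseStorageDownloadTokens (the key Firebase Storage download URLs are commonly reported to rely on), that uuidgen is available, and that [bucket] and [path/to/file] are placeholders for your own bucket and object:

    # Attach a random UUID as the download token; custom metadata is set
    # with gsutil's x-goog-meta- header prefix. (The key name is an
    # assumption based on how Firebase download tokens are usually described.)
    TOKEN=$(uuidgen)
    gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$TOKEN" gs://[bucket]/[path/to/file]

    # If this works, the file should be reachable at a URL of the form
    # (URL-encode the object path, e.g. replace "/" with "%2F"):
    #   https://firebasestorage.googleapis.com/v0/b/[bucket]/o/[path%2Fto%2Ffile]?alt=media&token=$TOKEN

The token is just an opaque string that the Firebase SDK would normally generate for you, so any sufficiently random value should do.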

– Frank van Puffelen
  • Thanks, I ended up writing a client that uses the Firebase SDK to upload files recursively. Yes, gsutil does work if setmeta is called for each of the uploaded files, but I wasn't sure how to automate the whole upload process and the metadata setting with gsutil. Not sure if that's even possible using the command line. – Kourosh Apr 21 '21 at 05:01
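For completeness, here is a rough sketch of how that automation could be done purely from the command line, assuming a Unix shell with uuidgen and the same firebaseStorageDownloadTokens metadata key as above:

    # Upload the directory tree first, as in the question.
    gsutil -m cp -R [dir] gs://[bucket]

    # Then give every uploaded object its own token. The ** wildcard makes
    # gsutil list objects recursively; quote it so the local shell doesn't
    # try to expand it.
    gsutil ls "gs://[bucket]/[dir]/**" | while read -r obj; do
        gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$(uuidgen)" "$obj"
    done

Each object needs its own token, which is why this loops and calls setmeta per file instead of passing a single -h header to cp. For large trees that means one extra request per object, so the Firebase SDK client described in the comment above may well be the more practical route.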