
I have data in Google Cloud Storage that I need to transfer to an S3 bucket in a serverless fashion. One possible approach is to use a Cloud Function that transmits data from Cloud Storage to the S3 bucket, using gsutil and boto3 for the AWS credentials. I believe there is an extra fee from Google for outbound network traffic, but this approach is possible.

Does anyone have a better approach or a suggestion?

  • Is the serverless function necessary? There is already a simple approach described [here](https://stackoverflow.com/a/39333278/3058302) – Mangu Jan 11 '19 at 08:28
  • Yup, ideally we would like to go serverless. I have seen many options (including the one you pointed to) that involve a server, but going serverless makes it a little tricky. Any help would be really appreciated! – Karan Jan 11 '19 at 19:42

1 Answer


At present there is nothing better than gsutil: https://stackoverflow.com/a/39333278/10801700
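
For reference, a transfer along the lines of that answer would look something like the following. The bucket names are placeholders, and gsutil expects the AWS keys in the `[Credentials]` section of your `~/.boto` config file:

```
# Placeholder bucket names; set aws_access_key_id and
# aws_secret_access_key in ~/.boto before running.
gsutil -m rsync -r gs://my-gcs-bucket s3://my-s3-bucket
```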

Did you consider the Always Free usage limits in your evaluation? There is "1 GB from North America to each GCP egress destination (Australia and China excluded)" each month. Once you've passed this limit, the general network usage pricing applies.

Note that for Cloud Storage: "Always Free is only available in us-east1, us-west1, and us-central1 regions. Usage calculations are combined across those regions."

  • Found a way: using boto3 and google-cloud-storage as dependencies, I wrote a Google Cloud Function that triggers when a file lands in GCS, reads the file into memory, and passes it on to the S3 bucket (a sketch of this approach follows the comments below). A 150 MB file took less than 30 seconds. – Karan Feb 09 '19 at 15:01
  • @Karan could you share the snippet? – Chukwuma Nwaugha Feb 11 '21 at 14:39
  • Lol right. I agree, please share – njho Nov 03 '21 at 01:03
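
Since the snippet was never posted, here is a minimal sketch of the approach Karan describes, assuming a background Cloud Function with a `google.storage.object.finalize` trigger, AWS credentials in the standard `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` environment variables, and a hypothetical `DEST_S3_BUCKET` environment variable naming the destination bucket:

```python
import os

import boto3
from google.cloud import storage

# Hypothetical env var naming the destination S3 bucket.
DEST_S3_BUCKET = os.environ["DEST_S3_BUCKET"]


def gcs_to_s3(event, context):
    """Background Cloud Function: copies a newly finalized GCS object to S3.

    `event` carries the GCS object metadata, including the source
    bucket and object name.
    """
    bucket_name = event["bucket"]
    blob_name = event["name"]

    # Read the new object fully into memory. This matches the comment
    # above and is fine for files within the function's memory limit;
    # larger objects would need a streaming approach instead.
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    data = blob.download_as_bytes()

    # boto3 picks up the AWS credentials from the environment.
    boto3.client("s3").put_object(Bucket=DEST_S3_BUCKET, Key=blob_name, Body=data)
```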