
I am using a Google Cloud Function to send files from Cloud Storage to SharePoint using the SharePoint REST API. This works great except for large files, because the function runs out of memory (set to the 2 GB maximum).

Currently, the Cloud Function is triggered when a new file arrives in the storage bucket. It downloads the file into the /tmp directory and then makes a POST request to the SharePoint endpoint.

Is there a way to bypass downloading the file to /tmp? Can a POST request be made by specifying a remote file location? Any other recommendations?
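
For reference, the current flow looks roughly like this (a minimal sketch: the bucket and file names come from the trigger event, while the SharePoint site URL and bearer token are placeholders):

    import requests
    from google.cloud import storage

    def upload_to_sharepoint(event, context):
        """Triggered by a google.storage.object.finalize event."""
        client = storage.Client()
        blob = client.bucket(event["bucket"]).blob(event["name"])

        # Download the whole object to /tmp. In Cloud Functions, /tmp is an
        # in-memory filesystem, so a large file counts against the 2 GB
        # memory limit -- this is where the function falls over.
        local_path = f"/tmp/{event['name']}"
        blob.download_to_filename(local_path)

        # POST the file to SharePoint (placeholder site URL and token).
        url = (
            "https://example.sharepoint.com/sites/mysite/_api/web"
            "/GetFolderByServerRelativeUrl('Shared Documents')"
            f"/Files/add(url='{event['name']}',overwrite=true)"
        )
        with open(local_path, "rb") as f:
            resp = requests.post(
                url,
                headers={"Authorization": "Bearer <token>"},
                data=f,
            )
        resp.raise_for_status()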

David Beaudway
  • It is not possible to upload a file to Cloud Storage from a remote URL. See https://stackoverflow.com/questions/54235721/transfer-file-from-url-to-cloud-storage – Frank van Puffelen Mar 26 '20 at 02:51
  • Can you bypass /tmp? Yes. How well do you understand TCP, HTTP and REST (programming)? If you are experienced, you can read the Cloud Storage object in chunks and POST the data (chunk encoded) to SharePoint; see the sketch below. It would be cheaper to use Compute Engine to do the actual transfer (download/upload) than to code a function this complex in Cloud Functions. – John Hanley Mar 26 '20 at 03:23
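
Following up on John Hanley's suggestion, here is a minimal sketch of the streaming variant, assuming the SharePoint endpoint will accept a streamed request body (same placeholder URL and token as above). `blob.open("rb")` returns a file-like reader that pulls the object down in chunks, and `requests` streams a file-like `data` argument instead of loading it into memory:

    import requests
    from google.cloud import storage

    def stream_to_sharepoint(event, context):
        """Stream a Cloud Storage object to SharePoint without touching /tmp."""
        client = storage.Client()
        blob = client.bucket(event["bucket"]).blob(event["name"])

        url = (
            "https://example.sharepoint.com/sites/mysite/_api/web"
            "/GetFolderByServerRelativeUrl('Shared Documents')"
            f"/Files/add(url='{event['name']}',overwrite=true)"
        )

        # BlobReader fetches the object in fixed-size chunks (40 MiB by
        # default), so peak memory stays near one chunk regardless of the
        # total file size.
        with blob.open("rb") as reader:
            resp = requests.post(
                url,
                headers={"Authorization": "Bearer <token>"},
                data=reader,
            )
        resp.raise_for_status()

Note that SharePoint's single-request `Files/add` call has its own size cap; for very large files you would likely need SharePoint's chunked-upload calls (StartUpload / ContinueUpload / FinishUpload), which pair naturally with reading the blob in explicit chunks.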

0 Answers