
The gsutil command has options to optimize upload/download speed for large files, for example:

```
GSUtil:parallel_composite_upload_threshold=150M
GSUtil:sliced_object_download_max_components=8
```

See this page for reference.

What is the equivalent in the google.cloud.storage Python API? I didn't find the relevant parameters in this document.

In general, do the client API and gsutil have a one-to-one correspondence in terms of functionality?

– nos

1 Answer


I don't think it's natively supported.

However (!), if you're willing to split the file yourself and upload the pieces using threading or multiprocessing, there is a compose method that should help you assemble the parts into a single GCS object.
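For illustration, here's a minimal sketch of that approach with the python-storage client: it splits a local file into fixed-size chunks, uploads them concurrently with a `ThreadPoolExecutor`, and stitches them together with `Blob.compose`. The bucket name, file path, chunk size, and part-naming scheme are placeholders, and note that a single compose request accepts at most 32 source objects:

```python
import os
from concurrent.futures import ThreadPoolExecutor

from google.cloud import storage


def parallel_composite_upload(bucket_name, source_file, dest_name,
                              chunk_size=150 * 1024 * 1024, workers=8):
    """Upload source_file in parallel chunks, then compose them into one object."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Compute the (start, end) byte range of each chunk.
    size = os.path.getsize(source_file)
    ranges = [(start, min(start + chunk_size, size))
              for start in range(0, size, chunk_size)]
    # A single compose request accepts at most 32 source objects;
    # for more parts you'd need to compose iteratively.
    assert len(ranges) <= 32

    def upload_part(index, start, end):
        part = bucket.blob(f"{dest_name}.part{index:04d}")
        with open(source_file, "rb") as f:
            f.seek(start)
            # Reads the whole chunk into memory; fine for a sketch.
            part.upload_from_string(f.read(end - start),
                                    content_type="application/octet-stream")
        return part

    # Upload all chunks concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(upload_part, i, s, e)
                   for i, (s, e) in enumerate(ranges)]
        parts = [f.result() for f in futures]

    # Stitch the parts into the final object and clean up the temporaries.
    destination = bucket.blob(dest_name)
    destination.compose(parts)
    for part in parts:
        part.delete()


# Example (placeholder names):
# parallel_composite_upload("my-bucket", "bigfile.bin", "bigfile.bin")
```

This mirrors what gsutil's parallel composite upload does under the hood, minus details like integrity checking and retry handling.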

Ironically, gsutil is itself written in Python, but it implements parallel uploads in a library called gslib. You may be able to use gslib as a template.

– DazWilkin
  • Here's an [open issue](https://github.com/googleapis/python-storage/issues/36) to support parallel operations for copying objects. – Donnald Cucharo May 06 '21 at 02:52