
Goal

We would like users to be able to upload images to Google Cloud Storage.

Problem

We could achieve this indirectly with our server as a middleman: first, the user uploads to our server, then our privileged server uploads to Cloud Storage.

However, we think this is unnecessarily slow, and instead would like the user to upload directly to Cloud Storage.

Proposed Solution

To achieve a direct upload, we generate a Signed URL on our server. The Signed URL specifies an expiration time, and can only be used with the HTTP PUT verb. A user can request a Signed URL, and then - for a limited time only - upload an image to the path specified by the Signed URL.
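For reference, a minimal sketch of generating such a Signed URL with the Node.js @google-cloud/storage client (the bucket name, object path, and expiry are placeholder choices, not from the question):

import { Storage } from '@google-cloud/storage';

const storage = new Storage();

// Generate a V4 signed URL that allows PUT uploads to this object
// until it expires.
async function generateUploadUrl(): Promise<string> {
  const [url] = await storage
    .bucket('my-bucket')            // placeholder bucket
    .file('uploads/image.png')      // placeholder object path
    .getSignedUrl({
      version: 'v4',
      action: 'write',                       // signed for the HTTP PUT verb
      expires: Date.now() + 15 * 60 * 1000,  // 15-minute expiration
      contentType: 'image/png',              // client must send this Content-Type
    });
  return url;
}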

Problem with the Solution

Is there any way to enforce a maximum file upload size? Obviously we would like to avoid users attempting to upload 20GB files when we expect <1MB files.

It seems like this is an obvious vulnerability, yet I don't know how to address it while still using Signed URLs.

There seems to be a way to do this using Policy Documents (Stack Overflow answer), but the question is over 2 years old now.

Ismail Khan

5 Answers


For anyone looking at this answer today, be aware that the header

x-goog-content-length-range: 0,25000

is the way to limit the upload size, here to between 0 and 25000 bytes, in Cloud Storage.

X-Upload-Content-Length will not work; with it you are still able to upload larger files.
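A hedged sketch of what this could look like with the Node.js @google-cloud/storage client (bucket name, object path, and size limit are placeholders). The header is signed into the V4 URL, so the uploader must send the identical header on the PUT request:

import { Storage } from '@google-cloud/storage';

const storage = new Storage();

// Sign the x-goog-content-length-range header into a V4 signed URL.
// Per the XML API header docs, Cloud Storage then enforces that the
// uploaded object is between 0 and 25000 bytes (though see the
// comments below about observed behavior).
async function signWithSizeLimit(): Promise<string> {
  const [url] = await storage
    .bucket('my-bucket')            // placeholder bucket
    .file('uploads/image.png')      // placeholder object path
    .getSignedUrl({
      version: 'v4',
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000, // 15 minutes
      extensionHeaders: {
        'x-goog-content-length-range': '0,25000',
      },
    });
  return url;
}

// The client must then send the same header, e.g.:
// curl -X PUT -H "x-goog-content-length-range: 0,25000" \
//      --upload-file image.png "<signed url>"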

Spl45h
  • I tested `x-goog-content-length-range`, and what it does is truncate the uploaded file; it does not cancel the upload on a wrong size. – Juancki Feb 02 '21 at 11:12
  • I tried this, but x-goog-content-length-range is not working. Where does it go? `$url = $object->signedUrl(new \DateTime('120 min'),['method' => 'PUT','contentType' => "$type",'version' => 'v4', 'x-goog-content-length-range' => '0,1' ]);` – Delir Aug 11 '21 at 17:07

Policy documents are still the right answer. They are documented here: https://cloud.google.com/storage/docs/xml-api/post-object#policydocument

The important part of the policy document you'll need is:

["content-length-range", <min_range>, <max_range>].
Brandon Yarbrough
  • The same link says this: "_Note: Unless you need to use HTML forms (usually through a web browser) to upload objects, we strongly recommend using PUT object instead of POST._" However, there is no `content-length-range` defined for PUT objects. Is this a feature that is included but not documented? – Ismail Khan Aug 23 '17 at 05:32
  • 1
    No, it only exists for POST, unfortunately. – Brandon Yarbrough Aug 23 '17 at 05:35
  • 11
    So Google recommends using PUT for uploading objects to Cloud Storage, but doesn't allow you to specify a maximum file size??? Seems like PUT is totally useless then... – Ismail Khan Aug 23 '17 at 08:50
  • With the `PUT` method, you can try using the `x-goog-content-length-range` header. More info is available at https://cloud.google.com/storage/docs/xml-api/reference-headers#xgoogcontentlengthrange – user482594 Jun 28 '21 at 06:14
  • 1
    i tried this, but x-goog-content-length-range is not working, where does it go in? `$url = $object->signedUrl(new \DateTime('120 min'),['method' => 'PUT','contentType' => "$type",'version' => 'v4', 'x-goog-content-length-range' => '0,1' ]);` – Delir Aug 11 '21 at 17:18

Signing content-length should do the trick.

Google Cloud will not allow uploads with a larger file size, even if the content-length header is set to a lower value.

This is what the signed URL options should look like:

import { GetSignedUrlConfig } from '@google-cloud/storage';

const writeOptions: GetSignedUrlConfig = {
  version: 'v4',
  action: 'write',
  expires: Date.now() + 900000, // 15 minutes
  extensionHeaders: {
    'content-length': length, // desired length in bytes
  },
};
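A usage note on this approach (my own reading, not from the answer): browsers treat Content-Length as a forbidden header, so the client cannot set it by hand; it is computed from the request body. The signature therefore only verifies when the body is exactly `length` bytes, which is why the server should take the intended size from the client before signing. A minimal client call might look like:

// Hypothetical client-side upload: the browser derives Content-Length
// from the body, so this succeeds only if file.size === length.
const res = await fetch(signedUrl, { method: 'PUT', body: file });
if (!res.ok) throw new Error(`Upload rejected: ${res.status}`);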
Radu Diță

My working code in Node.js follows https://blog.koliseo.com/limit-the-size-of-uploaded-files-with-signed-urls-on-google-cloud-storage/. You must use version v4.

  public async getPreSignedUrlForUpload(
    fileName: string,
    contentType: string,
    size: number,
    bucketName: string = this.configService.get('DEFAULT_BUCKET_NAME'),
  ): Promise<string> {
    const bucket = this.storage.bucket(bucketName);
    const file = bucket.file(fileName);

    const response = await file.getSignedUrl({
      action: 'write',
      contentType,
      extensionHeaders: {
        'X-Upload-Content-Length': size,
      },
      expires: Date.now() + 60 * 1000, // 1 minute
      version: 'v4',
    });

    const signedUrl = this.maskSignedUrl(response[0], bucketName);
    return signedUrl;
  }

On the frontend, we must set the same size value in the X-Upload-Content-Length header:

export async function uploadFileToGCP(
  signedUrl: string,
  file: File
): Promise<string> {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.withCredentials = process.env.NODE_ENV === 'production';

    xhr.addEventListener('readystatechange', function () {
      if (this.readyState === 4) {
        // Reject on non-2xx so a blocked upload surfaces as an error
        if (this.status >= 200 && this.status < 300) {
          resolve(this.responseText);
        } else {
          reject(new Error(`Upload failed with status ${this.status}`));
        }
      }
    });

    xhr.open('PUT', signedUrl, true);
    xhr.setRequestHeader('Content-Type', file.type);
    xhr.setRequestHeader('X-Upload-Content-Length', `${file.size}`);

    xhr.send(file);
  });
}

Also, don't forget to configure the responseHeader in the bucket's CORS settings:

gsutil cors get gs://asia-item-images
[{"maxAgeSeconds": 3600, "method": ["GET", "OPTIONS", "PUT"], "origin": ["*"], "responseHeader": ["Content-Type", "Access-Control-Allow-Origin", "X-Upload-Content-Length", "X-Goog-Resumable"]}]
Phat Tran

You can use X-Upload-Content-Length instead of Content-Length. See blog post here.

On the server side (Java):

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;

import com.google.cloud.storage.HttpMethod;
import com.google.cloud.storage.Storage;

Map<String, String> extensionHeaders = new HashMap<>();
extensionHeaders.put("X-Upload-Content-Length", "" + contentLength);
extensionHeaders.put("Content-Type", "application/octet-stream");

var url =
  storage.signUrl(
    blobInfo,
    15,
    TimeUnit.MINUTES,
    Storage.SignUrlOption.httpMethod(HttpMethod.PUT),
    Storage.SignUrlOption.withExtHeaders(extensionHeaders),
    Storage.SignUrlOption.withV4Signature()
  );

On the client side (TypeScript):

const response = await fetch(url, {
  method: 'PUT',
  headers: {
    'X-Upload-Content-Length': `${file.size}`,
    'Content-Type': 'application/octet-stream',
  },
  body: file,
});

You will need to set up a cors policy on your bucket:

[
  {
    "origin": ["https://your-website.com"],
    "responseHeader": [
      "Content-Type",
      "Access-Control-Allow-Origin",
      "X-Upload-Content-Length",
      "x-goog-resumable"
    ],
    "method": ["PUT", "OPTIONS"],
    "maxAgeSeconds": 3600
  }
]
Nacho Coloma
  • This also worked for me. More detail at this link: https://blog.koliseo.com/limit-the-size-of-uploaded-files-with-signed-urls-on-google-cloud-storage/ – Phat Tran Feb 10 '21 at 04:24
  • This allows uploads that go above the desired limit: you only need to lie about X-Upload-Content-Length when signing and then set the same value when doing the actual upload. – Radu Diță Feb 11 '21 at 06:36
  • @RaduDiță The signing is done on the server side, not the client side. The idea is to validate content length before signing, and throw an exception if it's not valid. – Nacho Coloma Feb 11 '21 at 14:14
  • @NachoColoma Even if the checking is done on the server side, you can lie on the client side when uploading. GCS doesn't check whether the provided X-Upload-Content-Length is the actual content length. You can get a valid signature for 1MB, for instance, and upload a 2MB document. The only way I was able to get GCS to actually block the upload is to use Content-Length. – Radu Diță Feb 11 '21 at 14:19
  • Exactly what Radu said. The client asks the server for a signed URL for a file size of 10kb. The server responds with a URL saying: you may upload your file, but it needs to contain this header: "X-Upload-Content-Length: 10kb". The client uploads a 100mb file with the header "X-Upload-Content-Length: 10kb". – creamcheese Feb 09 '23 at 18:12