
I have a bunch of S3 folders for different projects/clients and I would like to estimate their total size (so I can, for instance, consider reducing sizes/costs). What is a good way to determine this?


4 Answers


I can do this with a combination of Python and the AWS CLI:

import os

# `aws s3 ls` prints one bucket per line; read the output and split on newlines
bucket_rows = os.popen('aws s3 ls').read().split('\n')
sizes = dict()

for bucket in bucket_rows:
    if not bucket:
        continue                    # skip the trailing empty line
    buck = bucket.split(' ')[-1]    # the full row also contains the creation date and time
    cmd = f"aws s3 ls --summarize --human-readable --recursive s3://{buck}/ | grep 'Total'"
    sizes[buck] = os.popen(cmd).read()
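
If you'd rather stay in Python without shelling out, a minimal pure-boto3 sketch of the same idea (boto3 is not part of the original answer; this assumes default credentials are configured):

import boto3

s3 = boto3.client('s3')
sizes = dict()

for bucket in s3.list_buckets()['Buckets']:
    name = bucket['Name']
    total = 0
    # list_objects_v2 returns at most 1000 keys per call, so paginate
    for page in s3.get_paginator('list_objects_v2').paginate(Bucket=name):
        for obj in page.get('Contents', []):
            total += obj['Size']
    sizes[name] = total  # total size in bytes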
– Sergio Lucero
  • `aws s3 ls s3://bucket/path/to/data/ --summarize --human-readable --recursive | grep 'Total'` is an excellent way to quickly get subtotal size. Thanks! – Wassadamo Nov 30 '21 at 07:56

As stated here, the AWS CLI natively supports a --query parameter, which can sum the size of every object in an S3 bucket:

 aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
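
Since the question is about per-project/client folders, the same query can be scoped to a single prefix with --prefix (the prefix below is a hypothetical example):

 aws s3api list-objects-v2 --bucket BUCKETNAME --prefix "clients/acme/" --output json --query "[sum(Contents[].Size), length(Contents[])]"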

I hope it helps.

– Vaibhav Jain

This would do the magic:

for bucket_name in $(aws s3 ls | awk '{print $3}'); do
  echo "$bucket_name"
  aws s3 ls "s3://$bucket_name" --recursive --summarize | tail -n2
done
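
If you need the raw byte counts for further processing, a small variant of the loop above (not from the original answer) can parse the summary line directly; "Total Size" is reported in bytes when --human-readable is omitted:

for bucket_name in $(aws s3 ls | awk '{print $3}'); do
  size=$(aws s3 ls "s3://$bucket_name" --recursive --summarize | awk '/Total Size/ {print $3}')
  echo "$bucket_name: $size bytes"
done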
– Jorge Tovar

If you want to check via the console:

  1. If by "folder" you mean a prefix inside a bucket (and not the bucket itself), select that folder, open the "Actions" drop-down, and choose "Get total size".
  2. If by "folder" you mean a whole bucket, go to the Management tab and open Metrics; it will show the entire bucket size. (Also see the CLI sketch below.)
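
The bucket-size figure shown in the Metrics tab is published to CloudWatch as the BucketSizeBytes metric in the AWS/S3 namespace, so it can also be fetched from the CLI; the bucket name and dates below are placeholders:

aws cloudwatch get-metric-statistics \
  --namespace AWS/S3 \
  --metric-name BucketSizeBytes \
  --dimensions Name=BucketName,Value=BUCKETNAME Name=StorageType,Value=StandardStorage \
  --start-time 2021-11-28T00:00:00Z --end-time 2021-11-30T00:00:00Z \
  --period 86400 --statistics Average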
– Abhishek Kumar