I have AWS account. I'm using S3 to store backups from different servers. The question is there any information in the AWS console about how much disk space is in use in my S3 cloud?
-
You have to get all objects, then sum up all the file sizes. You can't do it in a single operation. – Layke Jan 23 '12 at 17:37
-
It's strange that there is no real solution to this problem. Going through all items and calculating is not a solution if you have 10s of millions of files! In AWS's S3 UI you can easily see the usage under Management -> Metrics. Why isn't there a way to get this from the command line? – Sarang Mar 22 '20 at 14:36
19 Answers
The command line tool gives a nice summary by running:
aws s3 ls s3://mybucket --recursive --human-readable --summarize
-
This does not show the true size with versions. Is there a way to check the total size of the S3 bucket with all versions? – Shanika Ediriweera Oct 29 '19 at 13:20
-
Print total size of each of your buckets: `for b in $(aws s3 ls | awk '{ print $NF }'); do printf "$b "; aws s3 ls s3://$b --recursive --human-readable --summarize | tail -1; done` – Matt White Jan 05 '20 at 21:47
Yippee - an update to the AWS CLI allows you to recursively ls through buckets...
aws s3 ls s3://<bucketname> --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
-
print total/1024/1024/1024*.03 gives a nice estimate for $ usage if you are under 1TB. @cudds awesomeness - thanks a ton!!! – chrislovecnm Jun 16 '14 at 17:29
-
AWS CloudWatch now has a metric for bucket size and number of objects that is updated daily. About time! https://aws.amazon.com/blogs/aws/amazon-s3-update-delete-notifications-better-filters-bucket-metrics/ – cudds Jul 28 '15 at 23:13
-
**Example:** `aws cloudwatch get-metric-statistics --namespace AWS/S3 --start-time 2015-07-15T10:00:00 --end-time 2015-07-31T01:00:00 --period 86400 --statistics Average --region eu-west-1 --metric-name BucketSizeBytes --dimensions Name=BucketName,Value=toukakoukan.com Name=StorageType,Value=StandardStorage` **Important:** You must specify both StorageType and BucketName in the dimensions argument, otherwise you will get no results. – Sam Martin Aug 01 '15 at 10:18
-
@SamMartin what does StorageType need to be? Also this answer takes a very long time to compute for buckets bigger than 100 GB – Vivek Katial May 22 '19 at 00:13
-
@VivekKatial have a look at this answer which is _much_ faster https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket/710080#710080 – Sam Martin May 23 '19 at 15:08
-
AWS documentation indicates this command for getting the size of a bucket, and it works well in most cases. However, it isn't suitable for automation: if some buckets have thousands or millions of objects, the command has to iterate through the complete list before it can report the bucket size. – A. Lartey Oct 12 '21 at 02:43
To find out the size of an S3 bucket using the AWS Console:
- Click the S3 bucket name
- Select the "Metrics" tab
- You should see "Bucket metrics", which by default includes "Total bucket size"
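The same daily CloudWatch metrics that back this tab can also be fetched from the CLI. A minimal sketch (bucket name, dates and region below are placeholders; adjust to your setup):

aws cloudwatch get-metric-statistics --namespace AWS/S3 \
  --metric-name NumberOfObjects \
  --dimensions Name=BucketName,Value=my-bucket Name=StorageType,Value=AllStorageTypes \
  --start-time 2021-11-20T00:00:00 --end-time 2021-11-22T00:00:00 \
  --period 86400 --statistics Average --region us-east-1

Swap --metric-name for BucketSizeBytes with Name=StorageType,Value=StandardStorage to get the size in bytes instead of the object count (a fuller BucketSizeBytes example appears in the comments above).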

-
This works faster in case your bucket has TBs of data. The accepted answers take a lot of time to calculate all the objects at that scale. – sokras Oct 01 '18 at 10:07
-
Note also that this will capture hanging incomplete uploads, which the `ls`-based solutions don't. – David Moles Feb 12 '19 at 18:16
-
"Metrics" has its own tab for me. But yeah, this is the fastest way for me. – Ahmed Abo 6 Nov 21 '21 at 10:20
s3cmd can show you this by running s3cmd du, optionally passing the bucket name as an argument.
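For example, a quick sketch (assuming a bucket named mybucket; the -H flag prints human-readable sizes in recent s3cmd releases):

s3cmd du -H s3://mybucket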

-
FYI - I tried this and the aws cli version in cudds answer. They both work fine, but s3cmd was significantly slower in the cases I tried as of release 1.5.0-rc1. – DougW Aug 18 '14 at 20:19
-
@DougW: Thanks, useful info. AWS CLI 1.0.0 was [released in September 2013](https://aws.amazon.com/releasenotes/CLI/6178285823861575), so it didn't exist at the time I wrote my answer. – markusk Aug 27 '14 at 07:03
-
s3cmd doesn't support AWS4 hashing so it won't work with any new regions, including the EU region "eu-central-1" – Koen. Nov 04 '14 at 10:33
-
@Koen.: Thanks, I was not aware of this. Seems the s3cmd maintainer is looking into adding support for AWS4: https://github.com/s3tools/s3cmd/issues/402 – markusk Nov 10 '14 at 07:41
-
@Koen.: s3cmd now supports AWS4 hashing as of 1.5.0, which was released 2015-01-12. See http://s3tools.org/news. – markusk Feb 01 '15 at 08:30
-
I know, but I've moved to AWS CLI. Also because of the messy codebase of s3cmd. – Koen. Feb 01 '15 at 15:49
-
Hmm, why does s3cmd have so much more functionality than aws cli yet it's deprecated? – samthebest Jan 27 '16 at 15:34
The AWS CLI now supports the --query parameter, which takes a JMESPath expression.
This means you can sum the size values given by list-objects using sum(Contents[].Size) and count the objects using length(Contents[]).
This can be run using the official AWS CLI as below, and was introduced in Feb 2014:
aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"

-
I had to use double quotes around the query string in the Windows command line. Works like a champ though. – endeavor Dec 08 '14 at 05:24
-
Beware: if the bucket is empty the command would fail with the following error: `In function sum(), invalid type for value: None, expected one of: ['array-number'], received: "null"` Otherwise the query works great! – mechatroner Feb 24 '20 at 22:14
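A workaround sketch for that empty-bucket failure, using JMESPath's || operator with literal empty arrays as defaults (note the single quotes so the shell does not interpret the backticks; BUCKETNAME is a placeholder):

aws s3api list-objects --bucket BUCKETNAME --output json \
  --query '[sum(Contents[].Size || `[]`), length(Contents[] || `[]`)]'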
-
AWS documentation indicates this command for getting the size of a bucket, and it works well in most cases. However, it isn't suitable for automation: if some buckets have thousands or millions of records, the command has to iterate through the complete list before it can report the bucket size. – A. Lartey Oct 12 '21 at 02:44
On a Linux box that has python (with the pip installer), grep and awk, install the AWS CLI (command line tools for EC2, S3 and many other services):
sudo pip install awscli
Then create a .awssecret file in your home folder with content as below (adjust key, secret and region as needed):
[default]
aws_access_key_id=<YOUR_KEY_HERE>
aws_secret_access_key=<YOUR_SECRET_KEY_HERE>
region=<AWS_REGION>
Make this file readable and writable by your user only:
sudo chmod 600 .awssecret
and export its path to your environment:
export AWS_CONFIG_FILE=/home/<your_name>/.awssecret
Then run in the terminal (this is a single command, split across lines with \ for readability):
aws s3 ls s3://<bucket_name>/foo/bar | \
grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | \
awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
- the aws part lists the bucket (or optionally a 'sub-folder')
- the grep part removes (using -v) the lines that match the regular expression (using -E). ^$ matches blank lines and -- matches the separator lines in the output of aws s3 ls
- the last awk simply adds to total the 3rd column of the resulting output (the size in bytes), then prints it at the end
NOTE this command works for the current bucket or 'folder', not recursively
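If you do want the whole bucket rather than a single 'folder', a sketch of the recursive variant (same idea as the recursive answer above, just printing GB; <bucket_name> is a placeholder):

aws s3 ls s3://<bucket_name> --recursive | \
awk 'BEGIN {total=0} {total+=$3} END {print total/1024/1024/1024" GB"}'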

CloudWatch also allows you to create metrics for your S3 bucket. It shows you metrics for the size and object count. Go to Services > Management Tools > CloudWatch, pick the region where your S3 bucket is, and the size and object count metrics will be among the available metrics.

In addition to Christopher's answer: if you need to count the total size of a versioned bucket, use:
aws s3api list-object-versions --bucket BUCKETNAME --output json --query "[sum(Versions[].Size)]"
It counts both Latest and Archived versions.
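If the versioned bucket might be empty, or you also want a count of versions and delete markers, a hedged variant using the same || default trick shown earlier (a sketch, not verified against every bucket layout; BUCKETNAME is a placeholder):

aws s3api list-object-versions --bucket BUCKETNAME --output json \
  --query '[sum(Versions[].Size || `[]`), length(Versions[] || `[]`), length(DeleteMarkers[] || `[]`)]'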

Getting the size of large buckets via the API (either aws cli or s4cmd) is quite slow. Here's my HowTo explaining how to parse the S3 Usage Report using a bash one-liner:
cat report.csv | awk -F, '{printf "%.2f GB %s %s \n", $7/(1024**3 )/24, $4, $2}' | sort -n

Based on @cudds's answer:
function s3size()
{
for path in $*; do
size=$(aws s3 ls "s3://$path" --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{printf "%.2fGb\n", (total/1024/1024/1024)}')
echo "[s3://$path]=[$size]"
done
}
...
$ s3size bucket-a bucket-b/dir
[s3://bucket-a]=[24.04Gb]
[s3://bucket-b/dir]=[26.69Gb]
Also, Cyberduck conveniently allows calculating the size of a bucket or a folder.

This is an old inquiry, but since I was looking for the answer I ran across it. Some of the answers made me remember I use S3 Browser to manage data. You can click on a bucket and hit properties and it shows you the total. Pretty simple. I highly recommend the browser: https://s3browser.com/default.aspx?v=6-1-1&fam=x64

You asked: is there any information in the AWS console about how much disk space is in use in my S3 cloud?
I go to the Billing Dashboard and check the S3 usage in the current bill.
They give you the information - MTD - in GB to 6 decimal points, IOW, to the KB level.
It's broken down by region, but adding them up (assuming you use more than one region) is easy enough.
BTW: You may need specific IAM permissions to get to the Billing information.

The AWS console won't show you this, but you can use Bucket Explorer or Cloudberry Explorer to get the total size of a bucket. Both have free versions available.
Note: these products still have to get the size of each individual object, so it could take a long time for buckets with lots of objects.

Mini John's answer totally worked for me! Awesome... had to add
--region eu-west-1
from Europe though

Well, you can also do it through an S3 client if you prefer a human-friendly UI.
I use CrossFTP, which is free and cross-platform; there you can right-click on the folder directory -> select "Properties..." -> click the "Calculate" button next to Size, and voila.

So I am going to add Storage Lens from AWS here, with its default dashboard.
It is really useful for identifying hidden storage costs like incomplete multipart uploads.
It should probably now be the first port of call for answering this question, before you reach for code.

Depending on how accurate you want your results to be, you can use the AWS Console, the AWS CLI, or AWS S3 Storage Lens to find out the total size of a bucket or how much space it is using.
Way 1: Using the Console
- Log in to the AWS Management Console
- Open S3
- Click on the "Metrics" tab
- The "Total bucket size" metric shows the size of your bucket
Note: this data is updated every 24 hours, so the latest changes will not be reflected. The benefit is that it's free and you are not charged for any S3 operations.
Here is how it looks: Total Bucket Size
Way 2: Using the AWS CLI
aws s3 ls s3://bucket-name --recursive --human-readable --summarize
As expected, the CLI runs an ls command, so it will cost you money; use it as per your requirement.
If you want to check the Storage Lens way as well, you can check my post.

I use Cloud Turtle to get the size of individual buckets. If the bucket size exceeds 100 GB, it takes some time to display the size. Cloud Turtle is freeware.

-
Be careful with this software. It installs extra Chrome extensions and seems to be rather spammy. – styks Dec 09 '13 at 14:58