Does Amazon provide an easy way to see how much storage my S3 bucket or folder is using? This is so I can calculate my costs, etc.
Lots of options here: http://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket. – jarmod Aug 24 '15 at 22:06
16 Answers
Two ways:
Using the AWS CLI
aws s3 ls --summarize --human-readable --recursive s3://bucket/folder/*
If we omit the trailing /, it will match all folders whose names start with your folder name and report the combined size of all of them.
aws s3 ls --summarize --human-readable --recursive s3://bucket/folder
Using the boto3 API
import boto3

def get_folder_size(bucket, prefix):
    total_size = 0
    for obj in boto3.resource('s3').Bucket(bucket).objects.filter(Prefix=prefix):
        total_size += obj.size
    return total_size
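For illustration only, here is one way you might call the function above and print a human-readable figure; the bucket and prefix names are placeholders, and keep in mind it lists every object under the prefix (one LIST request per 1,000 keys), so it can be slow and incur charges on very large buckets:
# Hypothetical usage of get_folder_size; 'my-bucket' and 'my/folder/' are placeholders
size_bytes = get_folder_size('my-bucket', 'my/folder/')
print(f"{size_bytes:,} bytes ({size_bytes / 1024 ** 3:.2f} GiB)")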
Suitable only for small buckets since it requests metadata for every single object. +1 for that case – geekQ Jul 07 '17 at 08:54
Does not work, lists all files and their respective size regardless of trailing slash. – slothstronaut Dec 25 '17 at 04:01
To skip the per-file listing and show just the total size, add "| grep Size" – Parth Gupta May 26 '20 at 13:37
aws s3 ls s3://bucket_name/folder/ --summarize --human-readable --recursive (option as per the current documentation) – theDbGuy May 31 '20 at 10:25
WARNING: this is going to make a list request to S3. If you're dealing with millions of small objects this can get expensive fast. Currently 1k requests is $.005; you can imagine what this does if you have a few billion objects to gather size metadata on. Using the Get Size button in the console UI could ring up similar charges. – Jake Aug 12 '20 at 18:53
AWS documentation indicates that this command works well for getting the size of a bucket in most cases. However, it's not suitable for automation: some buckets have thousands or millions of objects, and the command has to iterate through the complete list before it can render the required bucket size. We need a command that fetches only the object count/size and is suitable for automation. – A. Lartey Oct 12 '21 at 02:46
Amazon has changed the web interface, so now "Get Size" is available under the "More" menu.

@kevlarr yes, but you can select all files with the checkbox at the top, and it will recursively calculate size for you – Ian Hunter Apr 04 '18 at 18:13
@IanHunter Ah you're right, I think when I tried this I didn't wait long enough for it to calculate.. – kevlarr Apr 04 '18 at 22:22
@kevlarr I need to amend my statement after spending way too much time in S3... The interface pages by 300 objects at a time, so if you have more than 300 root-level objects you'll need to go through page by page and add them up – Ian Hunter Apr 04 '18 at 22:31
@Eduardo You tell me how you feel about that comment when you're comparing the size of 200 separate buckets! – Nick Bull Jul 18 '18 at 09:29
Answer updated for 2021 :)
In your AWS console, under S3 Buckets, find the bucket (or a folder inside it) and click Calculate total size.

N.B.: if versioning is enabled and there are versions present, or there exist incomplete multi-part uploads, these won't be included in the total size: see [knowledge centre post](https://repost.aws/knowledge-center/s3-cli-cloudwatch-metric-discrepancy). Why these file sizes aren't included, or even mentioned by the UI, is beyond me – Arth Feb 17 '23 at 10:31
As of 28 July 2015, you can get this information via CloudWatch.
aws cloudwatch get-metric-statistics --namespace AWS/S3 --start-time 2015-07-15T10:00:00 \
  --end-time 2015-07-31T01:00:00 --period 86400 --statistics Average --region us-east-1 \
  --metric-name BucketSizeBytes --dimensions Name=BucketName,Value=myBucketNameGoesHere \
  Name=StorageType,Value=StandardStorage
Important: You must specify both StorageType and BucketName in the dimensions argument otherwise you will get no results.
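If you would rather pull the same number from a script, a rough boto3 sketch of the equivalent query is shown below; the bucket name and region are placeholders, and note that the metric is only published about once per day, so the query window must cover at least one datapoint:
import boto3
from datetime import datetime, timedelta

# Sketch of the same CloudWatch query via boto3 (bucket name and region are placeholders)
cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')
resp = cloudwatch.get_metric_statistics(
    Namespace='AWS/S3',
    MetricName='BucketSizeBytes',
    Dimensions=[
        {'Name': 'BucketName', 'Value': 'myBucketNameGoesHere'},
        {'Name': 'StorageType', 'Value': 'StandardStorage'},
    ],
    StartTime=datetime.utcnow() - timedelta(days=2),
    EndTime=datetime.utcnow(),
    Period=86400,
    Statistics=['Average'],
)
datapoints = resp['Datapoints']
if datapoints:
    # use the most recent datapoint returned
    latest = max(datapoints, key=lambda p: p['Timestamp'])
    print(int(latest['Average']), 'bytes')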

This is also visible in the Console. Note S3 buckets are regional, so it's important to check any regions where you place buckets. – tedder42 Aug 25 '15 at 01:15
Works fine, but CloudWatch implies some delay. It took hours for my backup bucket just to appear. The availability of a "1 hour" window doesn't convince me too much, since I'm writing this at 09h49, and my latest visible logs are dated from yesterday, 07h03. I suppose using the detailed CW metrics would help. – Balmipour Apr 05 '17 at 07:53
Returns nothing for me: { "Datapoints": [], "Label": "BucketSizeBytes" } The bucket has been live for a few days. – abrkn Dec 18 '17 at 14:08
I found that no data would show up until I selected a longer `period`, i.e. 3 days or longer. 86400 seconds wasn't long enough of a time slice to get any data points. – Dale C. Anderson Aug 21 '19 at 11:51
I'm looking at my metrics right now and it appears to be reporting the metric once per day at 18:00 UTC. This does appear to be the best way to compare aggregate size of many buckets. I checked it in the CloudWatch UI though, but the metric name matches. – Nathan Loyer Nov 18 '19 at 19:39
In case someone needs byte precision:
aws s3 ls --summarize --recursive s3://path | tail -1 | awk '{print $3}'

With PowerShell: aws s3 ls s3://path/ --recursive --summarize --human-readable | Select -Last 2 – Raj Rao Apr 15 '22 at 18:37
Answer updated for 2020:
Go into your bucket, select all folders and files, and click "Actions" -> "Get Total Size".

I use s3cmd du s3://BUCKET/ --human-readable to view the size of folders in S3. It gives quite detailed info about the total number of objects in the bucket and their size, in a very readable form.

Using the AWS Web Console and CloudWatch:
- Go to CloudWatch
- Click Metrics from the left side of the screen
- Click S3
- Click Storage
You will see a list of all buckets. Note there are two possible points of confusion here:
a. You will only see buckets that have at least one object in the bucket.
b. You may not see buckets created in a different region; you might need to switch regions using the pull-down at the top right to see the additional buckets.
- Search for the word "StandardStorage" in the area stating "Search for any metric, dimension or resource id"
- Select the buckets (or all buckets with the checkbox at the left below the word "All") you would like to calculate total size for
- Select at least 3d (3 days) or longer from the time bar towards the top right of the screen
You will now see a graph displaying the daily (or other unit) size of all selected buckets over the selected time period.
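If you would rather script this lookup than click through the console, a rough boto3 sketch (not part of the original answer; the region is a placeholder) that lists every bucket reporting a StandardStorage size metric is:
import boto3

# Sketch only: enumerate buckets that publish a BucketSizeBytes metric
# for StandardStorage in the chosen (placeholder) region
cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')
paginator = cloudwatch.get_paginator('list_metrics')
pages = paginator.paginate(
    Namespace='AWS/S3',
    MetricName='BucketSizeBytes',
    Dimensions=[{'Name': 'StorageType', 'Value': 'StandardStorage'}],
)
for page in pages:
    for metric in page['Metrics']:
        dims = {d['Name']: d['Value'] for d in metric['Dimensions']}
        print(dims['BucketName'])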

I don't get why such a long time slice needs to be selected before any data shows up :-( – Dale C. Anderson Aug 21 '19 at 11:49
The most recent and easiest way is to go to the "Metrics" tab. It provides a clear view of the bucket size and the number of objects inside it.

This is interesting, though if others like me notice that it is empty (I suspect this data to lag behind) it is worth noting that the 'Calculate Total Size' option mentioned in one of the other answers seems to work directly. – Dennis Jaheruddin Jul 27 '21 at 14:09
I'm glad to see this answer, because all the other options can incur non-trivial costs for querying every object stored in the bucket. – cbreezier Dec 02 '21 at 00:56
If you don't need an exact byte count or if the bucket is really large (in the TBs or millions of objects), using CloudWatch metrics is the fastest way, as it doesn't require iterating through all the objects, which can take significant CPU time and can end in a timeout or network error when using a CLI command.
Based on some examples from others on SO for running the aws cloudwatch get-metric-statistics command, I've wrapped it up in a useful Bash function that allows you to optionally specify a profile for the aws command:
# print S3 bucket size and count
# usage: bsize <bucket> [profile]
function bsize() (
  bucket=$1 profile=${2-default}

  if [[ -z "$bucket" ]]; then
    echo >&2 "bsize <bucket> [profile]"
    return 1
  fi

  # ensure aws/jq/numfmt are installed
  for bin in aws jq numfmt; do
    if ! hash $bin 2> /dev/null; then
      echo >&2 "Please install \"$_\" first!"
      return 1
    fi
  done

  # get bucket region
  region=$(aws --profile $profile s3api get-bucket-location --bucket $bucket 2> /dev/null | jq -r '.LocationConstraint // "us-east-1"')
  if [[ -z "$region" ]]; then
    echo >&2 "Invalid bucket/profile name!"
    return 1
  fi

  # get storage class (assumes all objects in same class)
  sclass=$(aws --profile $profile s3api list-objects --bucket $bucket --max-items=1 2> /dev/null | jq -r '.Contents[].StorageClass // "STANDARD"')
  case $sclass in
    REDUCED_REDUNDANCY) sclass="ReducedRedundancyStorage" ;;
    GLACIER)            sclass="GlacierStorage" ;;
    DEEP_ARCHIVE)       sclass="DeepArchiveStorage" ;;
    *)                  sclass="StandardStorage" ;;
  esac

  # _bsize <metric> <stype>
  _bsize() {
    metric=$1 stype=$2
    utnow=$(date +%s)
    aws --profile $profile cloudwatch get-metric-statistics --namespace AWS/S3 --start-time "$(echo "$utnow - 604800" | bc)" --end-time "$utnow" --period 604800 --statistics Average --region $region --metric-name $metric --dimensions Name=BucketName,Value="$bucket" Name=StorageType,Value="$stype" 2> /dev/null | jq -r '.Datapoints[].Average'
  }

  # _print <number> <units> <format> [suffix]
  _print() {
    number=$1 units=$2 format=$3 suffix=$4
    if [[ -n "$number" ]]; then
      numfmt --to="$units" --suffix="$suffix" --format="$format" $number | sed -En 's/([^0-9]+)$/ \1/p'
    fi
  }

  _print "$(_bsize BucketSizeBytes $sclass)" iec-i "%10.2f" B
  _print "$(_bsize NumberOfObjects AllStorageTypes)" si "%8.2f"
)
A few caveats:
- For simplicity, the function assumes that all objects in the bucket are in the same storage class!
- On macOS, use `gnumfmt` instead of `numfmt`.
- If `numfmt` complains about an invalid `--format` option, upgrade GNU `coreutils` for floating-point precision support.

As an alternative, you can try s3cmd, which has a du command like Unix.

s3cmd du --human-readable --recursive s3://Bucket_Name/

`s3cmd` is no longer supported, suggest using `s4cmd` now which is a maintained fork of s3cmd. https://github.com/bloomreach/s4cmd – David Parks May 25 '21 at 21:01
There are many ways to calculate the total size of folders in a bucket:
Using AWS Console
S3 Buckets > #Bucket > #folder > Actions > Calculate total size
Using AWS CLI
aws s3 ls s3://YOUR_BUCKET/YOUR_FOLDER/ --recursive --human-readable --summarize
The command's output shows:
- The date the objects were created
- Individual file size of each object
- The path of each object
- The total number of objects in the S3 bucket
- The total size of the objects in the bucket
Using Bash script
#!/bin/bash
while IFS= read -r line;
do
  echo $line
  aws s3 ls --summarize --human-readable --recursive s3://#bucket/$line --region #region | tail -n 2 | awk '{print $1 $2 $3 $4}'
  echo "----------"
done < folder-name.txt
Sample Output:
test1/
TotalObjects:10
TotalSize:2.1KiB
----------
s3folder1/
TotalObjects:2
TotalSize:18.2KiB
----------
testfolder/
TotalObjects:1
TotalSize:112 MiB
----------

You can visit this URL to see the size of your bucket on the "Metrics" tab in S3: https://s3.console.aws.amazon.com/s3/buckets/{YOUR_BUCKET_NAME}?region={YOUR_REGION}&tab=metrics
The data's actually in CloudWatch so you can just go straight there instead and then save the buckets you're interested in to a dashboard.

In Node.js
// getAllFileList recursively pages through listObjectsV2 results.
// awshelper.getS3Instance() is the author's helper returning an AWS.S3 client.
const getAllFileList = (s3bucket, prefix = null, token = null, files = []) => {
  var opts = { Bucket: s3bucket, Prefix: prefix };
  let s3 = awshelper.getS3Instance();
  if (token) opts.ContinuationToken = token;
  return new Promise(function (resolve, reject) {
    s3.listObjectsV2(opts, async (err, data) => {
      if (err) return reject(err);
      files = files.concat(data.Contents);
      if (data.IsTruncated) {
        resolve(
          await getAllFileList(
            s3bucket,
            prefix,
            data.NextContinuationToken,
            files
          )
        );
      } else {
        resolve(files);
      }
    });
  });
};
const calculateSize = async (bucket, prefix) => {
  let fileList = await getAllFileList(bucket, prefix);
  let size = 0;
  for (let i = 0; i < fileList.length; i++) {
    size += fileList[i].Size;
  }
  return size;
};
Now just call calculateSize("YOUR_BUCKET_NAME", "YOUR_FOLDER_NAME").

This method can take days to run, and cost hundreds of dollars if you're not careful and have large buckets. – thisguy123 Dec 07 '21 at 00:44