
I am using the AmazonS3Client in an Android app using a getObject request to download an image from my Amazon S3 bucket.

Currently, I am getting this exception:

com.amazonaws.services.s3.model.AmazonS3Exception: 
The specified key does not exist.
 (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey;

Even though I am able to see the object with the specified key in my S3 bucket.

barshopen
user4592690
  • 4
    I think this error usually occurs when the object/file does not exist in the specified bucket. Can you double-check the bucket name for typos, and whether it's the same bucket where you see the object/file? This is not an authentication error, for sure. – Shobhit Puri Feb 22 '15 at 02:11
  • 115
    Amazon documentation sucks; this question cannot be downvoted. It's crazy to figure out S3 programming. – Siddharth Feb 22 '15 at 03:33
  • At this time, index.html isn't found underneath any folder, and the AWS S3 bucket permissions don't behave as the AWS documentation says. –  Mar 17 '19 at 20:09
  • 7
    I like how there are a dozen different answers and they are all correct. – Rob Osborne Sep 10 '20 at 12:59
  • A single extra space in a snake_case file name had me pulling my hair out. – Kesava Karri Jun 16 '23 at 09:12

19 Answers

73

Well, this error is actually rather straightforward: it simply means that your file does not exist in the S3 bucket. Several things could be wrong:

  1. You could be trying to reference the wrong file. Double-check the path that you tried to retrieve.

  2. The upload of the file may have failed. Check the logs for your S3Sync process to see if you can find any relevant output.

Source

Kuai Le
Fahim
  • 6
    Links expire, making answers useless after a while; you should extract the information from that link and update this answer to be complete. Feel free to share your source, that's OK. – Siddharth Feb 22 '15 at 03:34
  • Thanks, this helps keep SO clean and useful. – Siddharth Mar 06 '15 at 09:44
  • 3
    I ran into this problem with a React app that handles its own routing. My solution ended up being to redirect both `root` and `errors` to the same `index.html` file. That way the frontend app can act as a catch all and make sense of any URL scheme. – sambecker Dec 20 '18 at 17:58
  • @sambecker spot on! Been trying to solve this for ages. Thanks – Newt Apr 02 '23 at 22:00
38

For me, the object definitely existed and was uploaded correctly; however, its S3 URL still threw the same error:

<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>

I found that the reason was that my file name contained a # character, which may require special handling according to the documentation.

Removing this character and generating a new S3 URL resolved my issue.
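
If you cannot rename the file, percent-encoding the key when building the object URL is another way out: a `#` in a raw URL is interpreted as a fragment delimiter and never reaches S3. A minimal Python sketch (the question itself is Java/Android, and the bucket/region names here are made up, so treat this as an illustration only):

```python
from urllib.parse import quote

def object_url(bucket, key, region="us-east-1"):
    # Percent-encode everything except "/" so characters like '#' and
    # spaces survive in the URL instead of being misparsed by the client.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{quote(key, safe='/')}"

print(object_url("my-bucket", "photos/2015#final.png"))
# https://my-bucket.s3.us-east-1.amazonaws.com/photos/2015%23final.png
```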

lucidyan
TrieuNomad
  • 2
    Same issue here. The offending character was a bracket: ( – Johann Jul 20 '16 at 08:08
  • I am facing same issue and my filename is "abcd.jar". Is the download function extension specific also ? – Rahul Munjal Dec 02 '16 at 07:08
  • @RahulMunjal I'm not too sure about file extensions, maybe try putting it in a .zip folder? and see if that works. Also, double check that the permissions for your file are correct (ex. "read only" access for the All Users Group). – TrieuNomad Dec 06 '16 at 00:33
  • 1
    In a similar vein, I had to use encodeURIComponent on the key to get s3 to find it. The file name was already encoded, which I think is the reason it wasn't working. – Sean Oct 19 '18 at 18:32
  • I had the same issue. I had the '@' symbol for subfolder/prefix on S3. removing that symbol removed that error. – Anil Konsal Mar 02 '19 at 08:44
  • For me it was a whitespace – dustytrash Feb 04 '22 at 17:41
  • parentheses () were the issue for me – Jordan Dodson Oct 04 '22 at 15:45
  • I read the object key from a CreateObject event but aws does not find the object because of some "strange" characters. But why the create event object key field is not encoded? – PeterB Mar 11 '23 at 10:24
20

Note that this may happen even if the file path is correct, due to S3's eventual consistency model: there may be some latency in being able to read an object after it's written. See this documentation for more information.
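
If you do hit such a transient 404, a short retry with exponential backoff is a common mitigation. A minimal Python sketch, with a generic `fetch` callable standing in for the SDK's getObject call (the exception type and delay values are assumptions for illustration, not the AWS SDK's actual API):

```python
import time

def get_with_retry(fetch, attempts=5, base_delay=0.5):
    """Call fetch(); on a missing-key error, back off exponentially and retry."""
    for attempt in range(attempts):
        try:
            return fetch()
        except LookupError:  # stand-in for the SDK's NoSuchKey exception
            if attempt == attempts - 1:
                raise  # out of attempts; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))
```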

Nick Resnick
  • Was also my issue. Now I implemented an automatic retry for a short time. – moritz.vieli Nov 10 '19 at 12:30
  • 2
    At the time of this question, iirc, S3 provided strong read-after-write consistency for PUTs of new objects in all AWS Regions (unless you made a HEAD or GET request to the key name before creating the object). As of December 2020, S3 provides strong read-after-write consistency for PUTs (and DELETEs) of all objects in all AWS Regions (not just new objects). – jarmod May 24 '21 at 20:25
  • 1
    So much for the strong read-after-write consistency AWS claims to have. I need to wait 15 seconds before running a Lambda that reads a recently uploaded file; otherwise I get a 404. – W.M. May 24 '22 at 22:50
10

People with this issue might also want to check out this stack thread.

Going to the bucket settings and changing the error document to be the same as the index document might do the trick.

The Blind Hawk
7

I encountered this issue in a NodeJS Lambda function that was triggered by a file upload to S3.

My mistake was that I was not decoding the object key, which contained a colon. I corrected my code as follows:

let key = decodeURIComponent(event.Records[0].s3.object.key);
Mullins
6

In my case it was because the filename contained spaces. I solved it thanks to this documentation (which is otherwise unrelated to the problem):

from urllib.parse import unquote_plus
key_name = unquote_plus(event['Records'][0]['s3']['object']['key'])

Note that urllib.parse is part of the Python standard library, so on a Python 3 runtime you do not need to package it as a Lambda layer.

The reason is that S3 URL-encodes the object key in the event payload, turning spaces into '+' (form-style encoding), which is easy to trip over.

Antonin GAVREL
3

Don't forget that buckets are region-specific; that might be the issue.

Also try using the S3 console to navigate to the actual object and then clicking Copy Path; you will get something like:

s3://<bucket-name>/<path>/object.txt

As long as whatever you are passing it to parses that properly, I find this is the safest thing to do.
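
If you need to split such a path programmatically, here is a small sketch in Python using only the standard library (the bucket and key below are made up):

```python
from urllib.parse import urlparse

def split_s3_uri(uri):
    """Split an s3://bucket/key URI into a (bucket, key) pair."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    # netloc is the bucket; the path keeps its leading slash, so strip it
    return parsed.netloc, parsed.path.lstrip("/")

print(split_s3_uri("s3://my-bucket/path/object.txt"))
# ('my-bucket', 'path/object.txt')
```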

P Burke
  • Thanks this helped me as my files were in a subfolder. The subfolder needs to be part of the key when using powershell to access. Example: Get-S3Object -BucketName "$Bucket" -Key "$subFolder/$fileName" – Garrett Jul 06 '21 at 14:37
3

In my case the link to the file was wrong: the location was missing. You can check the correct link by copying one from the AWS console.

Vitalii Mytenko
2

In my case the error appeared because I had uploaded the whole folder containing the website files into the bucket.

I solved it by moving all the files out of that folder, directly into the bucket root.

Darush
1

The reason for the issue is a wrong bucket/key name, or a typo. Check that the bucket and key names you are providing actually exist.

Gaurav Sharma
1

If your file is in a subfolder within a bucket, the subfolder should be part of the key rather than the bucket name. Using the following command, I was able to read the file in PowerShell.

Example:

Get-S3Object -BucketName "$Bucket" -Key "$subFolder/$fileName"
Garrett
1

I was having this error in some countries, but not all (tested by selecting different countries with my VPN).

What fixed this for me was creating a "Custom Error Response" within CloudFront.

I redirected a 404 error code to /index.html with a response code of 200.

I then created an invalidation and everything started working again.

Devin
0

Step 1: Get the latest aws-java-sdk

<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.660</version>
</dependency>

Step 2: The correct imports

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectListing;
If you are sure the bucket exists, a "Specified key does not exist" error means the key name is not spelled correctly (or contains unexpected slashes or special characters). Refer to the documentation for the naming conventions.

The document quotes:

If the requested object is available in the bucket and users are still getting the 404 NoSuchKey error from Amazon S3, check the following:

  • Confirm that the request matches the object name exactly, including the capitalization of the object name. Requests for S3 objects are case sensitive. For example, if an object is named myimage.jpg but Myimage.jpg is requested, the requester receives a 404 NoSuchKey error.
  • Confirm that the requested path matches the path to the object. For example, if the path to an object is awsexamplebucket/Downloads/February/Images/image.jpg but the requested path is awsexamplebucket/Downloads/February/image.jpg, the requester receives a 404 NoSuchKey error.
  • If the path to the object contains any spaces, be sure that the request uses the correct syntax to recognize the path. For example, if you're using the AWS CLI to download an object to your Windows machine, you must use quotation marks around the object path, similar to: aws s3 cp "s3://awsexamplebucket/Backup Copy Job 4/3T000000.vbk".
  • Optionally, you can enable server access logging to review request records in further detail for issues that might be causing the 404 error.

AWSCredentials credentials = new BasicAWSCredentials(AWS_ACCESS_KEY_ID, AWS_SECRET_KEY);
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .withRegion(Regions.US_EAST_1)
        .build();
ObjectListing objects = s3Client.listObjects("bigdataanalytics");
System.out.println(objects.getObjectSummaries());
ForeverLearner
0

I also ran into this issue, but in my case I was inadvertently changing the internal state of my source object key when constructing the destination key:

  source_objects.each do |item|
    # Bug: sub! mutates item.key in place; the non-destructive sub should be used
    key = item.key.sub!(source_prefix, dest_prefix)
    item.copy_to(bucket: dest_bucket, key: key)
  end

I'm new to Ruby and missed that sub! has side effects and sub should have been used instead.

ask417
0

I had the same issue: the file I was trying to read from S3 was not there. Check that the file path you are using is correct and that the file is actually present.

mia
0

I got exactly the error described:

404 Not Found
Code: NoSuchKey
Message: The specified key does not exist.
Key: index.html
RequestId: 3C8J01Y73CKKJSCQ
HostId: QiQ6bqe3Cff/XjDDGm10IAArp9j6kajGKFIv4/JiJBfOFjLLxsiE796TuoLviPsCQl3KOma+Ma0=

The reason behind this is that I set index.html as the index document but, rather than dragging the files themselves for uploading, I selected the whole folder.

So the index.html file actually ended up at configuring-s3-code/index.html rather than directly in the bucket root.

So please check the path of your index document carefully; the problem might be there.

Arefe
0

In my case it was as simple as using the wrong slashes (wrong: Windows-style, like "foo\bar.txt"; correct: Unix-style, like "foo/bar.txt"). This can easily happen if you use helper functions like Path.Combine on Windows...
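
One way to guard against this, sketched here in Python (the answer above is about .NET's Path.Combine; this just illustrates the general idea of normalizing a path before using it as a key):

```python
def to_s3_key(path):
    """Normalize a Windows-style path into a forward-slash S3 key."""
    # S3 keys use "/" as the delimiter; backslashes are ordinary characters
    # to S3 and would silently produce a different (non-existent) key.
    return path.replace("\\", "/").lstrip("/")

print(to_s3_key("foo\\bar.txt"))  # foo/bar.txt
```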

MadDave666
0

In my case it was even weirder: my Angular app is hosted in S3 behind CloudFront with the origin path /prod/v1.2, and I was pointing to a URL like example.com/token/sdkjnasdalkjsd.asdasdasd.asdad, where my token contains some periods.

In this case, for some reason, CloudFront does not seem to add the origin path and instead tries to resolve the URL as a file (because the token has periods?), ending in a 404.

I had to redirect 404 to / to solve the issue.

albanx
-5

In my case I had the wrong key name. Make sure you have the correct key name.

ilibilibom