81

I'm trying to upload an image to Windows Azure Blob storage and I'm getting the following error, which I can't handle.

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

The error occurs when I try to create a container.

container.CreateIfNotExists()

Here is my code

try
{
    Microsoft.WindowsAzure.Storage.CloudStorageAccount storageAccount = Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient(); 

    // Retrieve a reference to a container. 
    CloudBlobContainer container = blobClient.GetContainerReference("samples");

    // Create the container if it doesn't already exist.
    // here is the error
    if (container.CreateIfNotExists())
    {
        container.SetPermissions(
            new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Blob
            });
    }
    
    CloudBlockBlob blockBlob = container.GetBlockBlobReference("Image1");
    using (var fileStream = System.IO.File.OpenRead(@"Path"))
    {
        blockBlob.UploadFromStream(fileStream);
    }
}
catch (StorageException ex1)
{
    throw ex1;
}

I have tried a lot of options in my code but I'm still getting the error.

Brandon Minnick
  • 13,342
  • 15
  • 65
  • 123
Fábio Henrique
  • 821
  • 1
  • 6
  • 5
  • 1
    Which version of storage client library are you using? Are you getting this error when trying to create a container in cloud or local storage emulator? If it is local storage emulator, which version of emulator are you using? – Gaurav Mantri Jun 30 '14 at 14:53
  • Hi @GauravMantri, I'm getting this error on my dev machine. Version 4.0.1.0. – Fábio Henrique Jun 30 '14 at 14:58
  • What version of storage emulator are you using? – Gaurav Mantri Jun 30 '14 at 15:01
  • I'm not using storage emulator, I'm trying to create it in the cloud. – Fábio Henrique Jun 30 '14 at 15:05
  • 19
    OK. Please check for 2 things - 1) your account key is correct and 2) Clock on your computer is correct. These are the two reasons which could result in this error. – Gaurav Mantri Jun 30 '14 at 15:07
  • Which should be the correct one? Is there any right format also? – Fábio Henrique Jun 30 '14 at 15:11
  • 1
    I mean, which time should be the correct one? I'm in Brazil, and now we are in 12:13 pm . And the Date format here is 30/06/2014. – Fábio Henrique Jun 30 '14 at 15:13
  • 4
    I see ... I don't think this matters. What you need to see is if your computer's clock is slower than the GMT time. Please check the GMT time on your computer (DateTime.UtcNow) and compare it with the actual GMT time (you would need to find a site which will tell you correct GMT time). If the difference is more than 15 minutes, then you will get this error. – Gaurav Mantri Jun 30 '14 at 15:16
  • 2
    I don't think time is the problem, my current UTC time is + {30/06/2014 15:23:57}, and I checked the GMT time on this site http://wwp.greenwichmeantime.com/info/current-time/ and it seems to be the same. :/ – Fábio Henrique Jun 30 '14 at 15:23
  • 3
    Then please check the account key. – Gaurav Mantri Jun 30 '14 at 15:26
  • 1
    Thank you very much for your help, that was my problem. I had typed my account key entirely in uppercase. It is case sensitive. – Fábio Henrique Jun 30 '14 at 17:00
  • Make sure you are referencing the correct container. I just had this issue and it was due to me switching between containers and forgetting to change that value. I had the value hard coded in the source and forgot all about it despite flipping the key. #prototypecode – Chris Love Jun 03 '16 at 14:03

29 Answers

55

My PC's time was off by 1 hour as suggested by others in the comments. Correcting it solved the problem.

Ashton
  • 1,265
  • 14
  • 23
21

I am using the .NET SDK to upload files to Azure Blob storage with metadata. I got the error "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature." while uploading, but only for a few of the files, not all of them.

The issue: if the file has metadata, the metadata values must not contain special characters (�) or extra spaces at the start or end of the value.

If you correct the metadata values, the file will upload successfully.
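For example, here is a minimal sketch (using the Azure.Storage.Blobs v12 package; the container, blob name and metadata keys are just placeholders) that trims whitespace and base64-encodes non-ASCII metadata values before uploading:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class UploadWithSafeMetadata
{
    static void Main()
    {
        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"), "samples");
        var blob = container.GetBlobClient("Image1");

        var rawMetadata = new Dictionary<string, string>
        {
            ["author"] = " Fábio ",   // leading/trailing spaces trigger the 403
            ["note"]   = "résumé"     // non-ASCII characters can also break the signature
        };

        // Trim each value; base64-encode anything that is not plain printable ASCII.
        var safeMetadata = rawMetadata.ToDictionary(
            kvp => kvp.Key,
            kvp =>
            {
                var value = kvp.Value.Trim();
                return value.All(c => c >= 0x20 && c < 0x7F)
                    ? value
                    : Convert.ToBase64String(Encoding.UTF8.GetBytes(value));
            });

        using var fileStream = File.OpenRead(@"Path");
        blob.Upload(fileStream, new BlobUploadOptions { Metadata = safeMetadata });
    }
}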

Muni Chittem
  • 988
  • 9
  • 17
  • 1
    This was exactly my issue as well. Weird error message though – Niek Jannink Nov 11 '20 at 15:37
  • 2
    This fixed my issues as well. One of the metadata values had "mymetadatavalue " – ttaylor27272727 Feb 04 '21 at 17:06
  • It's very confusing. If you have Chinese or Arabic characters, you can simply `utf8_decode()` them when you add them to the headers, but if you have a simple `é` in your string, Azure won't like it one bit, even if you decode them first. – dearsina Apr 08 '21 at 19:24
  • I used dotnet sdk to add metadata for documents – Muni Chittem Apr 09 '21 at 11:09
  • 3
    A very confusing error message from Microsoft. I now store all metadata string values as base64 (string.ToBase64()) and decode them when retrieving the data (.FromBase64()) to prevent all problems – Sander Oosterwijk Apr 15 '21 at 07:27
  • 1
    Thank you for figuring out the problem here. I would've never guessed by the error message that it was caused by some trailing whitespace in my metadata. – Quiver Jul 21 '21 at 16:07
  • I passed in headers with content-type set. It had a \t in it (came from another system), which caused this error as well (using .Trim() now to remove at least these whitespace characters) `var httpHeaders = new BlobHttpHeaders { ContentType = file.MimeType?.Trim() }; await client.UploadAsync(fileStream, httpHeaders);` – uTILLIty Apr 24 '22 at 15:05
16

I got this message when I was trying to access Blob Storage through the REST API endpoint.

Below is the response I got when I invoked the List Containers operation with an Authorization header:

<?xml version="1.0" encoding="utf-8"?>
<Error>
    <Code>AuthenticationFailed</Code>
    <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:096c6d73-f01e-0054-6816-e8eaed000000
Time:2019-03-31T23:08:43.6593937Z</Message>
    <AuthenticationErrorDetail>Authentication scheme Bearer is not supported in this version.</AuthenticationErrorDetail>
</Error>

The solution was to include the header below:

x-ms-version: 2017-11-09
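A minimal sketch of the same call in C# (assuming you already have an Azure AD bearer token for the https://storage.azure.com resource; the account name is a placeholder):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ListContainersSketch
{
    static async Task Main()
    {
        var token = Environment.GetEnvironmentVariable("STORAGE_BEARER_TOKEN");
        using var http = new HttpClient();

        var request = new HttpRequestMessage(
            HttpMethod.Get,
            "https://mystorageaccount.blob.core.windows.net/?comp=list");

        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token);
        // Versions earlier than 2017-11-09 reject bearer tokens with
        // "Authentication scheme Bearer is not supported in this version."
        request.Headers.Add("x-ms-version", "2017-11-09");

        var response = await http.SendAsync(request);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}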
user2243747
  • 2,767
  • 6
  • 41
  • 61
  • 1
    This worked for me, I think it defaults to using an earlier version of this, so setting to a later version that supports the header / bearer token fixes the problem. – cedd Feb 03 '21 at 16:41
  • 1
    Can I ask how you did this via the sdk api (in the OP)? – smatthews1999 Nov 08 '21 at 22:16
12

In my case it was actually the shared access signature (SAS) that had expired. Updating it (actually creating a new one) in portal.azure.com with an end date a year (or more) in the future fixed all the problems.

Ewald Bos
  • 1,560
  • 1
  • 20
  • 33
  • 3
    @Amit there is not much code to share other than that you go into your portal, go to Storage, go to SAS and create a new token. That was it; copy the new key into your code – Ewald Bos Sep 09 '19 at 21:37
  • 1
    refreshing the key did not help me :( any other ideas? It was working 10 minutes ago. – Numan Karaaslan Oct 01 '19 at 20:00
10

In my case I was passing the storage connection string with an access signature as an argument to a console application. '%' is a special character in command-line parameters, and '%' appears in the access signature (SAS). You have to escape the percent sign by doubling it: %%.

  • Same problem here, the SAS worked fine in local shell but gave the "malformed authentication header" in a pipeline. Escaping by doubling it solved the issue. – SaschaM78 Jul 29 '22 at 13:03
6

ERROR MESSAGE


Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

SOLUTION


I was facing the same issue in my application and I resolved it by generating a shared access signature (in the Azure portal) for key2 instead of key1. Changing the key fixed the error (Settings > Shared access signature). Also keep in mind that the connection string should be updated as well, if used (Settings > Access keys).

emi
  • 2,786
  • 1
  • 16
  • 24
gmavridakis
  • 368
  • 4
  • 13
4

We ran into the same error when we tried to connect to a Storage account from a simple Azure App Service. After a lot of investigation, we asked official Microsoft Support for help. They checked our resources and infrastructure in Azure and pointed out that the problem is related to Application Insights. Here is the official answer:

there is an additional header 'x-ms-request-root-id' in the request which gets added after the authorization header is signed for the storage client request. As a result, when the request is authenticated by Azure Storage it results in a 403. This seems to be getting added by Application Insights, if dependency tracking is enabled.

So after disabling dependency tracking in /dev/wwwroot/ApplicationInsights.config, the error disappeared, and the App Service can connect to the storage account without any problem.
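If you would rather keep dependency tracking enabled, a hedged alternative (based on the GitHub issue linked in the comments below) is to exclude the storage domain from header injection when configuring the DependencyTrackingTelemetryModule in code; the type and property names are from the Microsoft.ApplicationInsights.DependencyCollector package:

using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.ApplicationInsights.Extensibility;

static class TelemetrySetupSketch
{
    public static DependencyTrackingTelemetryModule Create(TelemetryConfiguration config)
    {
        var module = new DependencyTrackingTelemetryModule();

        // Don't inject correlation headers (x-ms-request-root-id etc.) into calls to
        // Azure Storage, because the request is already signed before they are added.
        module.ExcludeComponentCorrelationHttpHeadersOnDomains.Add("core.windows.net");

        module.Initialize(config);
        return module;
    }
}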

gmaklari
  • 49
  • 1
  • This fixed my issue as well. Did microsoft ever provide a solution to use dependency tracking when connecting to storage account? – Jon Koivula Aug 19 '22 at 10:38
  • @JonKoivula yes, see this link https://github.com/Azure/azure-sdk-for-net/issues/3460. Outgoing requests to Microsoft domains, for example `core.windows.net`, can be excluded from adding Application Insights headers. – Anna Gevel Sep 05 '22 at 17:53
3

My web app didn't get access to Table Storage when running as an Azure App Service, even though the connection string was exactly the same as the one I used during development on my local machine.

To solve the problem, I added a system assigned identity to my Azure App Service and gave it Storage Account Contributor role on the Storage Account.

The application was .NET Core 3.1 using .NET library WindowsAzure.Storage version 9.3.2.

[Screenshots: add a system assigned identity to the Azure App Service, then assign the Storage Account Contributor role on the Storage Account]
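For what it's worth, with the newer Azure SDKs a minimal sketch of connecting through the managed identity looks like this (assuming the Azure.Data.Tables and Azure.Identity packages; the account URL and table name are placeholders, and the identity still needs an appropriate role on the account):

using System;
using Azure.Data.Tables;
using Azure.Identity;

// DefaultAzureCredential picks up the App Service's system assigned identity in Azure
// and your developer credentials when running locally.
var tableService = new TableServiceClient(
    new Uri("https://mystorageaccount.table.core.windows.net"),
    new DefaultAzureCredential());

var table = tableService.GetTableClient("mytable");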

Eivind Gussiås Løkseth
  • 1,862
  • 2
  • 21
  • 35
3

Check your URL.

The Blob SAS URL from the shared access token page is the base container URL; it is not a valid request URL on its own and will always result in the "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature." error.

Example of the base SAS URL with token: https://[storage account].blob.core.windows.net/[container]?[blob sas token]

The following valid URLs should work (see the SDK sketch after the list):

  1. listing blobs https://[storage account].blob.core.windows.net/[container]?restype=container&comp=list&[blob sas token]
  2. getting blob https://[storage account].blob.core.windows.net/[container]/[blob name]?[blob sas token]
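Equivalently with the SDK, a minimal sketch (assuming the SAS token grants List permission on the container; the URL is a placeholder) where the client builds the restype=container&comp=list query for you:

using System;
using Azure.Storage.Blobs;

// A container URL that already carries the SAS token
var container = new BlobContainerClient(
    new Uri("https://mystorageaccount.blob.core.windows.net/samples?sv=...&sig=..."));

foreach (var item in container.GetBlobs())
    Console.WriteLine(item.Name);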
Aren Dober
  • 31
  • 1
2

In my case (Python), I had to encode the upload content from string to bytes. Not sure why the error message says this, though:

azure.core.exceptions.ClientAuthenticationError: 
Server failed to authenticate the request. 
Make sure the value of Authorization header is formed correctly including the signature.

Here is what worked.

Versions:

$ python -V
Python 3.7.7
$ pip list | grep azure
azure-core               1.8.1
azure-storage-blob       12.4.0

and the python function to upload:

import logging
import traceback

logger = logging.getLogger(__name__)

def upload_blob(blob_service_client, content: str, container_name, blob_name):
    blob_client = blob_service_client.get_blob_client(container_name, blob_name)
    try:
        # encoding the string to bytes is what resolved the auth error here
        content_bytes = content.encode('utf-8')
        blob_client.upload_blob(content_bytes)
        return True
    except Exception:
        logger.error(traceback.format_exc())
        return False
arun
  • 10,685
  • 6
  • 59
  • 81
1

Check the timezone of your computer or mobile phone.

1

If you're debugging locally but connecting to remote Azure storage and get this error, check that your AzureWebJobsStorage string is up to date in your local.settings.json. There seem to be multiple reasons why this can happen, and this is one of them.
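For reference, a minimal local.settings.json might look like the sketch below (the connection string value and worker runtime are placeholders):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}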

Liam
  • 27,717
  • 28
  • 128
  • 190
1

If you see this error while using a PUT request, make sure the new blob name is provided in the URL.

https://{storageAccount}.blob.core.windows.net/{containerName}/{**NEW BLOB NAME TO CREATE**}?{SAS Token}
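A minimal sketch of that PUT with HttpClient (the account, container, blob name and SAS token are placeholders; note the new blob name in the path and the x-ms-blob-type header that Put Blob requires):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class PutBlobSketch
{
    static async Task Main()
    {
        using var http = new HttpClient();
        var url = "https://mystorageaccount.blob.core.windows.net/samples/new-blob.txt?sv=...&sig=...";

        var request = new HttpRequestMessage(HttpMethod.Put, url)
        {
            Content = new StringContent("hello")
        };
        request.Headers.Add("x-ms-blob-type", "BlockBlob"); // required by Put Blob

        var response = await http.SendAsync(request);
        Console.WriteLine((int)response.StatusCode); // 201 Created on success
    }
}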
Hemant Sakta
  • 615
  • 7
  • 8
1

(Python) I had the same error on a resource GET request with a SAS token, but when trying to GET from my local machine (Python, browser, etc.) it always worked fine. I saw @gmaklari's and @user2243747's comments, which led me to use the requests library instead of urlretrieve (my initial intent was to add headers to the request). It works fine now; I didn't have to add any headers.

wakafa
  • 31
  • 5
1

The very same message will pop up if you have a connectivity problem with the Storage Account.

A quick way to check that on the Storage Account side is to go to the Storage Account in the portal and open Networking in the left sidebar. Then check whether the option Enabled from all networks is selected.


If you use private endpoints you will need to ensure that your DNS is also properly configured.

Daniel Bonetti
  • 2,306
  • 2
  • 24
  • 33
0

I migrated an app from one machine to another, migrating the connection string across. The public IP was unchanged and the URL was not expired, yet I got this error.

It appears that, once used, a connection string URL is tied to that machine; perhaps other fingerprints are added on first use.

I have no evidence for this but it worked after generating a new connection string.

fiat
  • 15,501
  • 9
  • 81
  • 103
0

I was getting the same error, but what is really strange is that I was getting it in only 2 of the 3 storage accounts I was running the same code against. What fixed this for me was updating the Azure.Storage.Files.DataLake library to preview version 12.2.2. That fixed the issue. I tried all the other suggestions, time sync, etc. None of them worked. Really weird issue.

FabianVal
  • 374
  • 1
  • 7
0

I had the same error. The storage connection worked when debugging locally but didn't work when deployed to an Azure App Service. I had deleted a storage account with the same name a bit earlier and recreated it on a different Azure subscription. Could there be a bug in Azure, such that even though the resource name was made available again, it was not "freed correctly" inside Azure, whatever that means?

When I created a storage account with a different name, I didn't get the error anymore.

jtCoder
  • 23
  • 5
0

To form your auth header for Blob storage, you need to use the SDK or refer to this document; the error is usually caused by a SAS token validation error.

I suggest you print your SAS token or capture the header to check whether anything is missing. I'm sharing my example (upload) of SAS token and auth header generation (shell script) for your reference. You can check this link if you want more details.

generatesig() {
#######
# From: https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas?redirectedfrom=MSDN
#
# StringToSign = signedPermissions + "\n" +  
#                signedStart + "\n" +  
#                signedExpiry + "\n" +  
#                canonicalizedResource + "\n" +  
#                signedIdentifier + "\n" +  
#                signedIP + "\n" +  
#                signedProtocol + "\n" +  
#                signedVersion + "\n" +  
#                signedResource + "\n"
#                signedSnapshotTime + "\n" +
#                rscc + "\n" +  
#                rscd + "\n" +  
#                rsce + "\n" +  
#                rscl + "\n" +  
#                rsct  
    local ST=$(TZ=GMT date -u +"%Y-%m-%dT%H:%M:%SZ")
    local SE=$(TZ=GMT date -u +"%Y-%m-%dT%H:%M:%SZ" -d "next day")
    local canonicalizedResource="/blob/${AZURE_STORAGE_ACCOUNT}/${AZURE_CONTAINER_NAME}/${FILE_NAME}"
    # StringToSign="${AZURE_STORAGE_ACCOUNT}\nr\nb\no\n${ST}\n${SE}\n\nhttps\n2020-08-04"
    # decoded_hex_key="$(echo -n $AZURE_ACCESS_KEY | base64 -d -w0 | xxd -p -c256)"
    # sig=$(printf  "$StringToSign" | openssl dgst -sha256 -mac HMAC -macopt "hexkey:$decoded_hex_key" -binary | base64 -w0)

    local StringToSign="r\n${ST}\n${SE}\n${canonicalizedResource}\n\n\nhttps\n2020-08-04\nb\n\n\n\n\n\n"
    # echo "${StringToSign}"   # debug only; leaving this uncommented breaks the capture below
    local decoded_hex_key="$(echo -n $AZURE_ACCESS_KEY | base64 -d -w0 | xxd -p -c256)"
    local sig=$(printf "$StringToSign" | openssl dgst -sha256 -mac HMAC -macopt "hexkey:$decoded_hex_key" -binary | base64 -w0)

    # print only the signature so that $(generatesig) captures it
    echo "${sig}"
}

# top level, outside the function, so no "local" here
request_date=$(TZ=GMT date "+%a, %d %h %Y %H:%M:%S %Z")
storage_service_version="2020-10-02"

# HTTP Request headers
x_ms_date_h="x-ms-date:$request_date"
x_ms_version_h="x-ms-version:$storage_service_version"
signature=$(generatesig)
authorization_header="Authorization: SharedKey $AZURE_STORAGE_ACCOUNT:$signature"

curl -X ${HTTP_METHOD} \
        -o /dev/null -w '%{http_code}' \
        -H "$x_ms_date_h" \
        -H "$x_ms_version_h" \
        -H "$authorization_header" \
        -H "Content-Length: 0" \
        ${REQUEST_URL}
T.R
  • 126
  • 7
0

My situation is a bit different to the OP's, but I was getting the same error. Since this thread has turned into a nice repository of possible solutions for this vague error, here's what fixed it for me:

I was originally asking for permissions/scopes from multiple resources at the same time, e.g. Graph + Storage. When I realised the problem and switched to only using the https://storage.azure.com/user_impersonation scope, the error went away.

Relevant docs: "An access token can be used only for a single resource"
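A hedged sketch with MSAL (Microsoft.Identity.Client), requesting a token for the Storage resource only; the client and tenant IDs are placeholders, and any Graph scopes would need a separate token request:

using System;
using Microsoft.Identity.Client;

var app = PublicClientApplicationBuilder
    .Create("<client-id>")
    .WithTenantId("<tenant-id>")
    .WithRedirectUri("http://localhost")
    .Build();

// Ask only for the Storage scope; mixing it with Graph scopes in one request
// produces a token that Storage rejects.
var result = await app
    .AcquireTokenInteractive(new[] { "https://storage.azure.com/user_impersonation" })
    .ExecuteAsync();

Console.WriteLine(result.AccessToken.Length);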

kostasvs
  • 391
  • 4
  • 12
0

I found yet another source of this error. In my case it was thrown by the Event Hub integration with Function apps, which reads blobs to track progress in each partition.

When a lease is taken out on each blob, a metadata key called OWNINGHOST is added to it. For some reason in my case this had changed to Owninghost; editing the metadata key back to OWNINGHOST fixed the problem.
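A hedged sketch (Azure.Storage.Blobs; the checkpoint container and blob path are placeholders) of restoring the expected casing on the metadata key:

using System;
using Azure.Storage.Blobs;

var blob = new BlobClient(
    Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"),
    "azure-webjobs-eventhub",           // placeholder: checkpoint container used by the host
    "namespace/hub/consumergroup/0");   // placeholder: ownership blob path

var metadata = (await blob.GetPropertiesAsync()).Value.Metadata;

if (metadata.TryGetValue("Owninghost", out var owner))
{
    metadata.Remove("Owninghost");
    metadata["OWNINGHOST"] = owner;     // restore the casing the host expects
    await blob.SetMetadataAsync(metadata);
}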

pjrharley
  • 35
  • 1
  • 5
  • As it turns out I had copied the blobs using azcopy, so the lowercasing was caused by that: https://github.com/Azure/azure-storage-azcopy/issues/113 – pjrharley Mar 16 '22 at 15:38
0

I got this error when using Azurite on my local development machine; I had misconfigured the connection string used by the blob service client. Instead of using the connection string for blob storage, I used the one for queue storage and got this error.

DeMaki
  • 371
  • 4
  • 15
0

Documenting another source of this error.

I got the following error

azure.core.exceptions.ClientAuthenticationError: Server failed to authenticate the request. 
Make sure the value of Authorization header is formed correctly including the signature.
authenticationerrordetail:The MAC signature found in the HTTP request is not the same as any computed signature

The thing causing this error was an incorrect blob_file_path when creating the blob_client:

blob_client = blob_service_client.get_blob_client(
    container=self.container_name,
    blob=blob_file_path)

Please ensure blob_file_path exists and is correct

Sniper
  • 1,428
  • 1
  • 12
  • 28
0

I resolved the error by upgrading my Azure Functions project from netcoreapp3.1 to net6.0.

  <PropertyGroup>
      <TargetFramework>net6.0</TargetFramework>
      <AzureFunctionsVersion>v4</AzureFunctionsVersion>
  </PropertyGroup>
Scott Nimrod
  • 11,206
  • 11
  • 54
  • 118
0

I was also facing the same issue, and after investigation I found that instead of accessing storage using the "AccountName & SAS token" approach I was using the "AccountName & AccountKey" approach. After switching to the SAS token approach, the issue was resolved.

0


You should check the start and expiry date/time of the shared access token, as it has an expiration date. If your shared access token has expired, you should generate a new SAS token and URL. An expired token will prevent access to the specified storage resource.
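If you prefer to generate the SAS in code rather than in the portal, here is a minimal sketch (Azure.Storage.Blobs with a StorageSharedKeyCredential; the account, key, container and blob names are placeholders) with an explicit start and expiry:

using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

var credential = new StorageSharedKeyCredential("<account>", "<account-key>");
var blob = new BlobClient(
    new Uri("https://<account>.blob.core.windows.net/samples/Image1"),
    credential);

var sas = new BlobSasBuilder
{
    BlobContainerName = "samples",
    BlobName = "Image1",
    Resource = "b",
    StartsOn = DateTimeOffset.UtcNow.AddMinutes(-5),   // tolerate small clock skew
    ExpiresOn = DateTimeOffset.UtcNow.AddDays(7)       // explicit, non-expired end time
};
sas.SetPermissions(BlobSasPermissions.Read);

Uri sasUri = blob.GenerateSasUri(sas);
Console.WriteLine(sasUri);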

ruddy simonpour
  • 133
  • 1
  • 1
  • 8
0

I faced this problem in a pipeline where the login was made with one version of az cli and the storage account operation with another (specifically, the login was made with 2.28.0 and the storage account operation with 2.48.0). Just ensure both operations are run with the same az cli version.

emi
  • 2,786
  • 1
  • 16
  • 24
0

I was having the same issue. I rotated the key I was using and it worked after that. Maybe the key had expired.


f4d0
  • 1,182
  • 11
  • 21
0

I have been using a Logic App and accessing Azure Blob storage through an HTTP connector with Managed Identity authentication, but I was getting the same error everyone here has noticed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."

Finally, I found a HACK: open the blob container and change the access level from "Private (no anonymous access)" to "Container (anonymous read access for containers and blobs)". Run the application (which won't give you the above error); once the run is successful, reset the access level back to "Private (no anonymous access)", which fixes the issue.

I would be very eager to know if this HACK helped someone or if I'm the only one :)

Deepak Shaw
  • 461
  • 3
  • 6