I am running a certificate store off of S3 for an ASP.NET cloud application. The class S3CertificateStore reads .pfx files and their matching password files from S3, and reconstitutes the certificates in memory.
private void LoadPrivateCerts(X509Certificate2Collection certificates)
{
    var s3Files = S3Facade.ListObjects(Config.Bucket, Config.PrivatePath).ToList();

    foreach (var filePath in s3Files)
    {
        // Skip the password files themselves and "directory" placeholder keys.
        if (filePath.EndsWith(".pass") || filePath.EndsWith("/"))
        {
            continue;
        }
        try
        {
            var certBytes = S3Facade.GetObject(Config.Bucket, filePath);
            var pwdBytes = S3Facade.GetObject(Config.Bucket, filePath + ".pass");
            var pwd = Encoding.UTF8.GetString(pwdBytes);
            var cert = new X509Certificate2(certBytes, pwd, X509KeyStorageFlags.Exportable); // needs to be exportable!
            certificates.Add(cert);
        }
        catch (Exception e)
        {
            exceptions.Add(e);
        }
    }
}
When I run locally, all the certificates are pulled from S3 and reconstituted correctly. BUT... when I run the same code on an EC2 instance, SOME certificates load fine and others fail, and it is always the same ones that fail.
EXCEPTION: The system cannot find the file specified.
at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)
I'm baffled. Could there be some kind of character encoding difference at work? I don't think any of the passwords contain high-bit characters, but I may be looking at them after something has already munged them.
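For what it's worth, here is the kind of sanity check I'm thinking of running to rule encoding in or out: hex-dump the raw bytes of each .pass object before they ever go through Encoding.UTF8.GetString, so any BOM, trailing newline, or high-bit byte shows up immediately. (The "s3cret" value below is a made-up stand-in for the bytes that would come back from S3, not one of my real passwords.)

```csharp
using System;
using System.Text;

class PasswordDump
{
    static void Main()
    {
        // Stand-in for the byte[] returned by S3Facade.GetObject for a .pass file.
        byte[] pwdBytes = Encoding.UTF8.GetBytes("s3cret");

        // Hex dump of the raw bytes: a UTF-8 BOM (EF-BB-BF), a trailing
        // newline (0A or 0D-0A), or any byte >= 80 would be obvious here.
        Console.WriteLine(BitConverter.ToString(pwdBytes));
    }
}
```

If the dump on EC2 matches the dump on my local machine byte-for-byte, I can presumably rule out encoding and look elsewhere.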
Any suggestions?