
I am running a certificate store off of S3 for an ASP.NET cloud application. The S3CertificateStore class reads .pfx files and their matching password files from S3 and creates the certificates in memory.

private void LoadPrivateCerts(X509Certificate2Collection certificates)
{
    var s3Files = S3Facade.ListObjects(Config.Bucket, Config.PrivatePath).ToList();

    foreach (var filePath in s3Files)
    {
        if (filePath.EndsWith(".pass") || filePath.EndsWith("/"))
        {
            continue;
        }
        try
        {
            var certBytes = S3Facade.GetObject(Config.Bucket, filePath);
            var pwdBytes = S3Facade.GetObject(Config.Bucket, filePath + ".pass");
            var pwd = Encoding.UTF8.GetString(pwdBytes);

            var cert = new X509Certificate2(certBytes, pwd, X509KeyStorageFlags.Exportable);    // needs to be exportable!
            certificates.Add(cert);
        }
        catch (Exception e)
        {
            exceptions.Add(e);
        }
    }
}

When I run locally, all the certificates are pulled from S3 and reconstituted correctly. BUT... when I run the code on an EC2 instance, SOME certificates are fine, and others fail. (The same ones always fail).

EXCEPTION: The system cannot find the file specified.

   at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
   at System.Security.Cryptography.X509Certificates.X509Utils._LoadCertFromBlob(Byte[] rawData, IntPtr password, UInt32 dwFlags, Boolean persistKeySet, SafeCertContextHandle& pCertCtx)
   at System.Security.Cryptography.X509Certificates.X509Certificate.LoadCertificateFromBlob(Byte[] rawData, Object password, X509KeyStorageFlags keyStorageFlags)

I'm baffled. Could there be some kind of character encoding difference at work? I don't think any of the passwords have high-bit characters, but I may be seeing them after something has already been munged.
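
One check I can add is to log the raw password bytes on both machines before constructing the certificate, so a BOM or any other munging would show up immediately. This is just a sketch reusing the helpers above; the Console.WriteLine stands in for whatever logger the app actually uses:

// Sketch: compare this output between the local run and the EC2 run.
// BitConverter.ToString renders the bytes as hex, so a UTF-8 BOM (EF-BB-BF)
// or any high-bit characters would be easy to spot.
var pwdBytes = S3Facade.GetObject(Config.Bucket, filePath + ".pass");
Console.WriteLine("{0}.pass ({1} bytes): {2}",
    filePath, pwdBytes.Length, BitConverter.ToString(pwdBytes));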

Any suggestions?

  • All the files are in the same bucket, and all have the same owner. If I add to the logging, I can see that the .pfx and .pfx.pass files have the same number of bytes when the code runs on prod as when it runs locally. – Mike Kantor Nov 17 '16 at 15:34
  • Have you tried using File.ReadAllBytes() to load it into a byte array and pass the array to the X509Certificate2 constructor? The underlying cryptography library is known to not like symbolic links, for example, and so maybe these files have an attribute that it's not happy with. – bartonjs Nov 17 '16 at 17:43
  • Thanks bartonjs. In this setting, the files are actually byte arrays, as you suggest, at the time we try to construct the certs. – Mike Kantor Nov 17 '16 at 18:15

1 Answer


I found a workable solution. If I set the IIS AppPool setting "Load User Profile" to True, then certificate construction works. The following command in an .ebextensions config file seems to do the trick:

c:\windows\system32\inetsrv\appcmd.exe set config /section:applicationPools "/[name='DefaultAppPool'].processModel.loadUserProfile:true"

I still don't understand why certificate construction was consistently succeeding for some certs and consistently failing for others, before this change.
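
For context, the usual explanation is that importing a .pfx with these flags persists the private key into the current user's key container, which lives under the user profile; if the profile isn't loaded, that folder doesn't exist and the import fails with "The system cannot find the file specified". A commonly suggested alternative, sketched below and not what I deployed, is to key the certificate to the machine store so the app pool doesn't need a user profile at all:

// Sketch only: MachineKeySet persists the imported private key in the
// machine-wide key store instead of the per-user store, so "Load User Profile"
// is not required. The app pool identity still needs rights to that store.
var cert = new X509Certificate2(
    certBytes,
    pwd,
    X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.Exportable);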

  • Here is a different answer that explains why Load User Profile affects certificate loading. http://stackoverflow.com/a/17149834/492405 – vcsjones Nov 17 '16 at 18:19