
We have an ASP.NET MVC application that uses server-to-server communication to retrieve some information.

When we run an installation in the AWS cloud, the request fails because, by default, WebRequest uses TLS 1.0, which we have disabled in our environment. The same code in another project defaults to TLS 1.2. Hardcoding the protocol in ServicePointManager fixes the issue, as in the sketch below.
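For reference, the hardcoded workaround looks roughly like this (a minimal sketch; the exact flag and where it runs, e.g. Application_Start, are just what we happened to try):

using System.Net;

//force TLS 1.2 for every outgoing WebRequest/HttpWebRequest, set once at application startup
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;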

Does anyone have experience with a similar problem, and can anyone explain the underlying cause? I would like to fix this without hardcoding the protocol, because that is not future-proof.

Will
  • Problem is that the default is also hardcoded to be SSL 3 or TLS 1.0. At some point you must hardcode something, if only to disable protocols as they become obsolete. At best, provide some configuration option for setting the protocols, and update it as new ones appear. – Alejandro Sep 19 '19 at 13:50
  • The issue is that, by Microsoft's own documentation, compiling an application against .NET 4.6 should set the default to TLS 1.2 – Aleksandar Trajkov Sep 19 '19 at 13:53
  • 1
    I'm not sure what MS documentation you are citing, but looking at the docs for `ServicePointManager.SecurityProtocol` (https://learn.microsoft.com/en-us/dotnet/api/system.net.servicepointmanager.securityprotocol?view=netframework-4.8) - the following are some important details. Starting with the .NET Framework 4.7, the default value of this property is `SecurityProtocolType.SystemDefault`. This allows....inherit the default security protocols from the operating system..... – Tommy Sep 19 '19 at 14:13
  • For versions of the .NET Framework through the .NET Framework 4.6.2, **no default value is listed for this property.** The security landscape changes constantly, and default protocols and protection levels are changed over time in order to avoid known weaknesses. **Defaults vary depending on individual machine configuration, installed software, and applied patches.** – Tommy Sep 19 '19 at 14:14
  • Lastly - "Your code should **never implicitly depend on using a particular protection level, or on the assumption that a given security level is used by default.** If your app depends on the use of a particular security level, you must explicitly specify that level and then check to be sure that it is actually in use on the established connection. Further, **your code should be designed to be robust in the face of changes to which protocols are supported**, as such changes are often made with little advance notice in order to mitigate emerging threats. " – Tommy Sep 19 '19 at 14:14
  • I was reading [this page](https://learn.microsoft.com/en-us/dotnet/framework/network-programming/tls), specifically, this paragraph: **Because the SecurityProtocolType.SystemDefault setting causes the ServicePointManager to use the default security protocol configured by the operating system, your application may run differently based on the OS it's run on. For example, Windows 7 SP1 uses TLS 1.0 while Windows 8 and Windows 10 use TLS 1.2.**. The issue happens in Win 10, and Windows Server 2016 – Aleksandar Trajkov Sep 19 '19 at 14:23
  • Fair enough, but that default was not used until .NET 4.7 and above. For 4.6.2, there is no default - based on your question and comment response above, it sounds like you are on .NET 4.6.X. Also, relying on the underlying OS configuration means that moving your app to other servers could cause your default to not be what you are expecting. I believe explicitly setting this is recommended due to the reasons cited above (quickly changing security landscape, etc.). – Tommy Sep 19 '19 at 14:31
  • Compiling with 4.7.2 did not change the default, and also, it only happens for this one application, others work fine. Don't get me wrong, I agree that we should not rely on the OS defaults as a definitive source of truth, but it really bothered me that only this one application showed this problem. – Aleksandar Trajkov Sep 19 '19 at 14:54
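A quick way to see which default a particular application actually ends up with, following the comments above, is to log ServicePointManager.SecurityProtocol before any request is made. A diagnostic sketch (the logging sink is illustrative):

using System.Diagnostics;
using System.Net;

//read the effective default before anything changes it: SystemDefault means the OS
//defaults are in effect, while the legacy hard-coded defaults show up as explicit flags
var effective = ServicePointManager.SecurityProtocol;
Trace.WriteLine($"Default SecurityProtocol: {effective} ({(int)effective})");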

1 Answer


I had a similar problem, and ended up simply making it a configuration setting:


using System;
using System.Net;

//read setting as comma-separated string from wherever you want to store settings
//e.g. "SSL3, TLS, TLS11, TLS12"
string tlsSetting = GetSetting("tlsSettings");

//by default, support whatever mix of protocols you want..
var tlsProtocols = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

if (!string.IsNullOrEmpty(tlsSetting))
{
    //we have an explicit setting, So initially set no protocols whatsoever.
    SecurityProtocolType selOpts = (SecurityProtocolType)0;

    //separate the comma-separated list of protocols in the setting.
    var settings = tlsSetting.Split(new[] { ',' });

    //iterate over the list, and see if any parse directly into the available
    //SecurityProtocolType enum values.  
    foreach (var s in settings)
    {
        if (Enum.TryParse<SecurityProtocolType>(s.Trim(), true, out var tmpEnum))
        {
            //It seems we want this protocol.  Add it to the flags enum setting
            // (bitwise or)
            selOpts = selOpts | tmpEnum;
        }
    }

    //if we've allowed any protocols, override our default set earlier.
    if ((int)selOpts != 0)
    {
        tlsProtocols = selOpts;
    }
}

//now set ServicePointManager directly to use our protocols:
ServicePointManager.SecurityProtocol = tlsProtocols;
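Here GetSetting is just a stand-in for however you read configuration; a minimal sketch assuming the value lives in appSettings (the key name is illustrative):

using System.Configuration;

//e.g. <add key="tlsSettings" value="TLS11, TLS12" /> under <appSettings> in web.config
static string GetSetting(string key)
{
    return ConfigurationManager.AppSettings[key] ?? string.Empty;
}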

This way, you can enable/disable specific protocols, and if any values are added to or removed from the enum definition, you won't even need to revisit the code.

Obviously a comma-separated list of values that map to an enum is a little unfriendly as a setting, but you could set up some sort of friendlier mapping if you like (see the sketch below); it suited our needs fine.
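If you want friendlier values than the raw enum names, a small alias table in front of Enum.TryParse would do it. A sketch (the alias names are purely illustrative):

using System;
using System.Collections.Generic;
using System.Net;

static class ProtocolSetting
{
    //hypothetical friendlier aliases mapped onto the enum values; extend as needed
    static readonly Dictionary<string, SecurityProtocolType> Aliases =
        new Dictionary<string, SecurityProtocolType>(StringComparer.OrdinalIgnoreCase)
        {
            { "tls1.1", SecurityProtocolType.Tls11 },
            { "tls1.2", SecurityProtocolType.Tls12 },
        };

    //try the alias table first, then fall back to Enum.TryParse as in the code above
    public static bool TryParse(string value, out SecurityProtocolType result)
    {
        var trimmed = value.Trim();
        return Aliases.TryGetValue(trimmed, out result)
            || Enum.TryParse(trimmed, true, out result);
    }
}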

GPW
  • I like this approach, and I will recommend we do this for future versions. However, it does not explain the cause of the underlying issue, which is really driving me crazy :). – Aleksandar Trajkov Sep 19 '19 at 14:51
  • more detail found here: https://stackoverflow.com/q/28286086/3658528 – Joel Dec 10 '20 at 01:05