In most (if not all) C# (and F# and VB) library and executable projects created in Visual Studio, there is an automatically added app.config file that specifies the runtime version and the target framework moniker (TFM):
<configuration>
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
  . . .
Even when the app.config file is absent entirely, the compiler always seems to emit an assembly-level attribute, as ILDASM shows:
.custom instance void [mscorlib]System.Runtime.Versioning.TargetFrameworkAttribute::.ctor(string) = ( 01 // ....NETFramework
.. // ,Version=v4.6.1.
bytes snipped-> .. // .T..FrameworkDis
.. // playName..NET Fr
61 ) // amework 4.6.1
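Decoded, that blob is just the attribute the compiler emits on its own; in C# source form it would look roughly like this (reconstructed from the IL above, not copied from any generated file):

[assembly: System.Runtime.Versioning.TargetFramework(
    ".NETFramework,Version=v4.6.1",
    FrameworkDisplayName = ".NET Framework 4.6.1")]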
The .csproj file does specify the target framework, and my guess is that this is where the target is picked up and passed on to the compiler during the build.
The executable seems to run just fine without the <startup> section in the config file. The documentation explains what the attributes mean, but, having seen them for many years, I never understood why they are needed in the configuration file. I have mostly dealt with desktop applications for Windows, however.
This answer explicitly states that “making a program compiled to target .NET 4.0 behave like it runs on a higher version is not possible,” and I would be really surprised if the converse, running a program on a lower version of the framework than it targets, were possible either.
So, under what scenarios does the application developer need to specify the version and TFM of the runtime in the application's .config file, and does it always have to duplicate the information that the compiler already hardcodes into the binary? The requirement seems counterintuitive at first sight.
UPDATE 2018-06-29: X-ref: I asked for a clarification of the documentation in the GitHub issue dotnet/docs#6234.