To be able to investigate an issue reported for one of my apps, I need to know the (runtime) environment in which my application runs on the user's side. Besides the application's version and the operating system's version, I also want to display the .NET CLR version and (if possible) the .NET Framework version to the user, so the user can report this information back to me.
I found several solutions for detecting installed .NET Frameworks, but this is not what I need.
I don't care (at first) what other software is installed on the user's computer. And a .NET Framework installation that is actually not used for running my application is by definition other software, not related software.
To cover the CLR version requirement, I found Environment.Version.ToString()
to be useful.
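For illustration, this is roughly how I read the CLR version at the moment (a minimal sketch, not my actual reporting code):

```csharp
using System;

class ClrVersionDemo
{
    static void Main()
    {
        // On .NET Framework 4.x this reports the CLR version (e.g. 4.0.30319.42000),
        // not the installed Framework version such as 4.7.2.
        Console.WriteLine("CLR version: " + Environment.Version);
    }
}
```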
However, reading the documentation makes me anxious. The Remarks section of the Environment.Version documentation points out that
For the .NET Framework 4.5 and later, we do not recommend using the Version property to detect the version of the runtime[...]
I understand the CLR as something like the API, and the Framework as something like the components and actual logic that can be updated. So knowing the CLR version is a good start, but the Framework version may also be required, as it can point out incompatibilities due to bugs in the actual Framework (logic). An error in my application can be the result of a bug in the Framework, meaning there is no bug in my application itself (though a workaround may be possible). Knowing this can drastically speed up the debugging process.
With the remarks about Environment.Version
in mind, what can I safely use to get the CLR version for an application that runs on .NET Framework version 4.0 up to .NET Framework version 4.7.2?
Is there any way to detect the .NET Framework version (components and logic) that is used to run my application? Maybe via reflecting over the loaded assemblies and taking the assembly version from specific assemblies, but is such an approach reliable?
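One approach I have been considering, sketched below, follows Microsoft's documented detection scheme: reading the `Release` DWORD under `HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full` and mapping it to a version. Since .NET Framework 4.x releases are in-place updates, the installed 4.x build should be the one actually running my application. The threshold values are taken from the "How to determine which .NET Framework versions are installed" documentation; the class and method names are just placeholders.

```csharp
using System;
using Microsoft.Win32;

public static class FrameworkVersionDemo
{
    // Reads the Release DWORD written by the .NET Framework 4.5+ installer.
    public static string GetFramework45PlusVersion()
    {
        const string subkey = @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full";
        using (var baseKey = RegistryKey.OpenBaseKey(RegistryHive.LocalMachine, RegistryView.Registry32))
        using (var key = baseKey.OpenSubKey(subkey))
        {
            object release = key?.GetValue("Release");
            if (release == null)
                return "4.0, or no 4.x Framework detected";
            return MapReleaseToVersion((int)release);
        }
    }

    // Minimum Release values per version, per Microsoft's detection table.
    public static string MapReleaseToVersion(int release)
    {
        if (release >= 461808) return "4.7.2 or later";
        if (release >= 461308) return "4.7.1";
        if (release >= 460798) return "4.7";
        if (release >= 394802) return "4.6.2";
        if (release >= 394254) return "4.6.1";
        if (release >= 393295) return "4.6";
        if (release >= 379893) return "4.5.2";
        if (release >= 378675) return "4.5.1";
        if (release >= 378389) return "4.5";
        return "Unknown 4.x release: " + release;
    }

    public static void Main()
    {
        Console.WriteLine(".NET Framework: " + GetFramework45PlusVersion());
    }
}
```

On .NET Framework 4.7.1 and later there is also RuntimeInformation.FrameworkDescription, but since I need to support 4.0 and up, I am unsure whether I can rely on it. Is the registry approach above the reliable way to go, or does it only tell me what is installed rather than what my process actually runs on?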