The main confusion is that there are really two important version numbers with any .NET release: the framework version and the CLR version. An application actually cares less about the .NET Framework version than it does the CLR version. Confusing? I agree. Let's talk more.
When .NET first released, it was .NET 1.0 running on CLR 1.0. Nothing confusing about that. Soon after, .NET 1.1 released with CLR 1.1. While for the most part you didn't need to rewrite code to go from one to the other, CLR 1.1 could not run code compiled for CLR 1.0 and vice versa. Still, that's not too confusing: a 1.0 app works only if you have 1.0, and a 1.1 app only if you have 1.1.
Then came .NET 2.0, and with it CLR 2.0. Again, CLR 2.0 was not backwards compatible, so .NET 1.x apps could not run natively on it. Still clear, though by now people were agitated that they had to juggle three different huge installers. MS had a lot of plans for .NET in this period, so they changed how they deployed things, and this is where it gets real fun, if your idea of "fun" is "difficult".
.NET 3.0 and .NET 3.5 released very close to each other. They both use CLR 2.0. If you had .NET 3.5 installed, because it used CLR 2.0, you could run any .NET 3.5, 3.0, or 2.0 application on it. But it couldn't run .NET 1.0 or 1.1 applications; those still required their respective CLRs. Interestingly enough, you could compile an application targeting .NET 3.5 and run it on a machine with only .NET 2.0 installed, because they shared a compatible CLR. You only got in trouble if you tried using libraries that were new in .NET 3.5, or in the rare instances where behavior changed between the two.
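The "targeting" part happens in the project file. A minimal sketch, assuming the old-style (pre-SDK) .csproj format of that era; everything outside the one property is illustrative boilerplate:

```xml
<!-- Fragment of an old-style .csproj: target .NET 3.5 so the
     compiled app still only requires CLR 2.0 on the client machine. -->
<PropertyGroup>
  <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
</PropertyGroup>
```

Change the value to `v2.0` and the compiler will refuse any API that first appeared in 3.0 or 3.5, which is one way to catch the "new library" trap described above before a customer does.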
.NET 4.0 released with CLR 4, I guess because they ran out of decimals for version numbers. Since it's not CLR 2, it couldn't really run applications configured for anything but .NET 4.0. But that's sort of a lie. You can use the "supportedRuntime" element in your app.config to tell .NET that your application supports CLR 4. Since CLR 4 and CLR 2 are MOSTLY compatible in large ways, MOST .NET 2.0-3.x applications run just fine on CLR 4. But it's not safe unless you verify you aren't relying on something that changed between the two, so they make you explicitly declare it as a desired feature. It's generally easier to just recompile your application for CLR 4, but mucking with the app.config is a good band-aid while you test and verify.
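That switch lives under the startup element of your app.config (the file that sits next to the .exe as MyApp.exe.config). A minimal sketch; the runtime tries the listed versions in order, so this says "run me on CLR 4 if it's there, otherwise CLR 2":

```xml
<!-- MyApp.exe.config: opt a CLR 2-era app into running on CLR 4 -->
<configuration>
  <startup>
    <!-- Preferred: use CLR 4 when installed -->
    <supportedRuntime version="v4.0" />
    <!-- Fallback: the CLR 2 that shipped with .NET 2.0-3.5 -->
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```

The order matters: list the version you'd rather run on first. And to repeat the warning above, declaring support is not the same as having it; test before you ship this.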
.NET 4.5, 4.5.1, 4.5.2, and 4.6 all run on CLR 4, and jeez, that's a lot of versions.
In terms of what you can rely on, it's a mess. The 2.0 CLR is installed by default on Windows Vista and Windows 7, but as .NET 3.0 and .NET 3.5 respectively. The 4.0 CLR is only available by default as of Windows 8 (which ships with 4.5) and Windows 10 (which ships with 4.6). Neither of those ships a 2.0 CLR out of the box, but I know Windows 8 generally responded to attempts to run apps that needed it by offering to install it for you. This is confusing, but until Vista shipped you couldn't assume a client machine had any .NET Framework at all, and every missing version meant another hefty download. So at least we're making progress.
General rules that have always held for .NET deployment since the 3.0 release:
It's harder than you think. Always.
It's easiest to deploy if you compile with whatever is the latest .NET framework, and often you don't need to update your code to make that happen.
When you can't do that, it's easiest to deploy if you're running on whatever the latest CLR happens to be. It is currently 4.
When you can't do that, sometimes your app will just work if you use app.config to force it onto the new CLR. Don't unleash this on customers without a lot of testing and verification first, or at least a warning that you're not 100% certain of it.
If you aren't using .NET 4.x and can't rebuild for CLR 4 and can't use the config file to run, make sure your customers understand they need CLR 2, which means installing the .NET Framework 3.5.
Keep your favorite beverage handy at your desk, always, if you are an installer developer.
ref: More in depth thread