I recently started a new job, and one of the first things I noticed everybody talking about was "updating" all our .NET apps to x64. I initially thought this odd, since we all know .NET compiles to platform-agnostic IL and the appropriate CLR runs the code.
Looking a bit further, I found this helpful article and this SO post, which helped explain things.
So now I understand that the IL isn't changed; only metadata is, basically saying whether or not to run under WOW64 on an x64 system (in a nutshell).
So if I'm on an x64 system, I can specify "Any CPU" to run natively, but that won't support 32-bit DLLs; I can specify "x86", which will support 32-bit DLLs (since both the app and the DLLs would be running under WOW64); but when would I specify "x64"? It seems that 64-bit DLLs would already be supported in the "Any CPU" scenario on an x64 system. Is this for when I want to prevent someone from running my app on a 32-bit system, or to ensure failure when trying to load 32-bit DLLs?
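To illustrate the DLL concern I mean, here's a rough sketch of what I understand goes wrong (`SomeLegacy32BitLibrary.dll` is a made-up name standing in for any 32-bit native DLL):

```csharp
using System;
using System.Runtime.InteropServices;

class Program
{
    // Hypothetical 32-bit native DLL; the name is made up for illustration.
    [DllImport("SomeLegacy32BitLibrary.dll")]
    static extern int DoWork();

    static void Main()
    {
        // In an "Any CPU" build on an x64 system, the process is 64-bit,
        // so loading the 32-bit native DLL fails with a BadImageFormatException.
        // In an "x86" build the process runs under WOW64 and the call works.
        Console.WriteLine(DoWork());
    }
}
```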
It would also seem to me that you would only need to set it to something other than "Any CPU" if you have some 3rd-party DLL to worry about in your project. Is it best to leave it as "Any CPU" for every other project not dealing with other people's DLLs?
If I do happen to set my target to "x86" because I have a 32-bit 3rd-party DLL, is my application actually considered to be running in 64-bit on a 64-bit system, just under WOW64?
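To make that last question concrete: here's how I'd probe the process at runtime (these `Environment` properties exist as of .NET 4). If my understanding is right, an "x86" build on a 64-bit OS would report the values in the comments below, but that's exactly what I'm asking someone to confirm:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // My assumption for an "x86" build running on a 64-bit OS:
        //   Is64BitOperatingSystem -> true  (the OS is 64-bit)
        //   Is64BitProcess         -> false (the process itself is 32-bit, under WOW64)
        //   IntPtr.Size            -> 4     (32-bit pointers)
        Console.WriteLine(Environment.Is64BitOperatingSystem);
        Console.WriteLine(Environment.Is64BitProcess);
        Console.WriteLine(IntPtr.Size);
    }
}
```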