I downloaded Visual Studio 2013 from DreamSpark but it's the 32-bit version and I couldn't find any 64-bit version. Is there none, and if so why is there no 64-bit version of Visual Studio?
-
For the same reasons as for VS 2012: http://stackoverflow.com/questions/13603854/visual-studio-2012-64-bit – Vertexwahn Dec 12 '13 at 14:42
-
@Vertexwahn, looks like searching for `recursion` on Google – Rubens Farias Dec 12 '13 at 15:08
3 Answers
Update (May 2021) Visual Studio 2022 will ship as a 64-bit build: https://visualstudiomagazine.com/articles/2021/04/19/vs-2022.aspx
Original answer (Dec 2013)
First, there is a 64-bit C++ compiler that comes with the Visual Studio tool set. So you can always change your project settings to make 64-bit builds of your app as needed.
Now, to answer the original question.
Think of it from a cost and ROI perspective. From years of shipping software at Microsoft, here's how I've seen the consideration for 64-bit builds get made.
When the 32-bit app works just fine on 64-bit, it's almost a non-starter to consider 64-bit.
Most of the projects at Microsoft aren't simple little Visual Studio projects in which the developer can just flip the Project settings from 32-bit to 64-bit. (I actually don't know if the Visual Studio team compiles Visual Studio with a VS project.) They are often well over a million lines of code that build with the VS compiler set, but from a command line and Makefile environment. Switching to 64-bit means updating a lot of this build infrastructure.
There is a cost of porting from 32-bit to 64-bit. The first cost is just fixing the bugs, getting the code to compile, restructuring the build environment, and all the upfront work just to get the initial build going.
There is an ongoing cost you pay for having separate 32-bit and 64-bit builds of an application. You have to build it twice every day. You have to run the test collateral on it twice every day. It's not a 2x cost, but it's not free either.
Shipping more SKUs from the same code base increases the chances that a developer will break something when he checks in. Of course, automated tests can prevent this, but they will slow the developer down, since he will have to go back and fix the other SKU that he doesn't have installed locally on his test machine.
Now here are some of the motivations for moving to 64-bit:
You really need to take advantage of 64-bit performance and memory architectures. Large database servers that use as much memory as possible will benefit from addressing more than the 2 GB limit imposed on a 32-bit Windows process.
You need to integrate with something already compiled with 64-bit. For example, if you want to write a shell extension for Windows, you will need a 64-bit build to run on 64-bit Windows. That doesn't mean the entire app has to be ported, but it does mean this component will need a separate 64-bit build.
You have a platform or API story for external developers to consider. Usually, they have their own needs for 64-bit builds. Hence, they may need a 64-bit ready API from you even if your native app can get away with 32-bit support.
Your team has just been re-organized into the Windows division and your team's code has been deemed necessary to be included into the next Windows release. There's no decision to be made anymore - your code will be compiling for 32-bit, 64-bit, and ARM (Surface RT).

-
I'd be happy if the next VS was 64-bit only .. this would eliminate the points about dual builds, but not about the investment for the switch itself :} – user2864740 Mar 22 '16 at 23:04
Source code files should not be multiple gigabytes -- there's no reason for a text editor / development environment to use 64-bit pointers, which consume twice as much RAM for no benefit. Larger pointers make data structures containing pointers larger, requiring more memory bandwidth to move them around, and fitting fewer inside the CPU's data cache, so that the number of cache misses may increase as well.
The 32-bit editor is perfectly capable of launching and interacting with the 64-bit compilers, linkers, and debuggers when needed. Having only a 32-bit editor also simplifies the plugin model greatly.

-
1 million 64-bit pointers only cost you 4 MB more RAM than the equivalent 32-bit pointers. RAM is plentiful; most computers I work with have at least 2 GB of RAM, so you're losing 4 / (2*1024) = 1/512, or roughly 0.2%, of your RAM. Put it this way: you lose more memory by accidentally opening Clippy in Office. However, 64-bit long mode on amd64 allows you to use more registers for more efficient code with fewer memory stores/loads. – Emily L. Jan 08 '14 at 14:35
-
@Emily: 4MB more ram is half your L3 cache, and all of your L1 and L2 cache several times over... – Ben Voigt Jan 08 '14 at 15:53
-
You never said anything about CPU caches in the OP and I never commented on it. But I'll bite: if you have 1M pointers that you access so frequently that they cause you to thrash your L1/L2/L3 cache, you have bigger problems than the 32 bits per pointer you waste. In fact, if your pointer-to-data ratio is so high that you are having performance issues, it's not due to running in 64-bit mode. – Emily L. Jan 09 '14 at 08:38
-
@Emily, you might well have data structures that fit in a single cache line with 32-bit pointers and no longer do with 64-bit ones. I shouldn't have to mention that increased RAM usage implies more cache usage, more memory bandwidth needed, and more page faults. Expert programmers know the cost of RAM usage significantly exceeds the penny per megabyte you pay for DRAM modules. – Ben Voigt Jan 09 '14 at 13:21
-
Using 64-bit pointers instead of 32-bit could just as well pad out your data structures in such a way that your application avoids false sharing, improving your performance. Your point about exceeding cache line sizes is perfectly valid, but you fail to see that my objection is more against the blanket statement that "64-bit pointers consume twice as much RAM for no benefit". Some new developer is going to see that and take it as a universal truth, while it is in fact more complicated than that, as you are probably aware. And yes, L1 latency is on the order of 6 cycles and DRAM 130-1000 cycles... – Emily L. Jan 09 '14 at 15:31
-
@Emily: Padding exists in both 32-bit and 64-bit environments. And I only said that compiling as 64-bit is no benefit *in a text editor that isn't used on huge files*. Developers who are prone to cargo-cult programming and stretching rules beyond the explicitly stated application area are lost anyway. On the other hand, this is fairly widely applicable. Take a look at performance numbers for Linux x86_32 systems. The wide pointers are harmful in all except very specialized cases. – Ben Voigt Jan 09 '14 at 15:54
-
On x86_64, the additional and wider registers balance the cost of the increased pointer size... but text editors aren't benefiting from 64-bit arithmetic any more than from a huge address space. In this case, the tradeoff is pretty clearly in favor of 32-bit compilation. (Remember that modern x86 CPUs perform register renaming, so 32-bit code has just as many physical registers available as 64-bit, just fewer names.) – Ben Voigt Jan 09 '14 at 16:00
-
I'm not arguing that 64bit can be slower and/or faster than 32bit depending on your code. I'm disagreeing with your statement "there's no reason for a text editor / development environment to use 64-bit pointers and consume twice as much RAM for no benefit." And as I said on my original comment the amount of RAM consumed is negligible on any modern PC. I added the extra comment that 64 bit may bring more than just extra memory usage so it's not all bad. You deflected the argument into a performance argument of 64 bit vs 32 bit. – Emily L. Jan 10 '14 at 10:10
-
@Emily: You still haven't mentioned any benefit of making a source code editor 64-bit. And I didn't "deflect" the discussion into performance, your very first comment was about performance (or is "more efficient code with less memory stores/loads" about power saving?) – Ben Voigt Jan 10 '14 at 15:05
-
Sweetie, I'm not arguing to make a source editor 64-bit. I'm saying that the argument you used, "pointers consume twice the amount of RAM", is a silly argument to make when arguing to *not* make a source editor 64-bit. If you think that the (possible) performance regression that may arise from using 64-bit long mode is a good argument, then state performance as your key gripe instead of RAM usage. I already stated what that comment was about in my previous comment. – Emily L. Jan 10 '14 at 16:55
-
@Emily. Ah, yes. I will make the performance implications of additional memory usage explicit. When I have time, right now I have a plane to catch. – Ben Voigt Jan 10 '14 at 17:00
-
Tools, plugins, and everything else that makes a full IDE awesome all use memory, not just the lines of code loaded in a few windows. Considering that right now, *in a real environment*, my VS instance is using 2.7+ GB of memory - on a 16 GB box - and trying not to die (again) .. this feels like a "640k" answer. – user2864740 Mar 22 '16 at 22:57
-
@user2864740: You've got two things going on there. One, badly written plugins that try to keep everything in memory. Two, extensions that are written for scalability using process memory as a cache for an actual database. Caches tend to expand to fill available memory -- having twice as much in the Intellisense database won't result in twice the RAM usage. And the SQL database isn't limited by the 4GB virtual address space anyway. It's not as convenient as direct pointer access, but mapping pages gives you a 32-bit window into a much larger allocation, without paying the 2x pointer-size cost. – Ben Voigt Mar 23 '16 at 14:40
The reason is the same as it has always been: it would require a significant effort to port a code base as large as Visual Studio to 64-bit, and according to Microsoft, the benefits would be few and far between.
In fact, MS claims that such a port could slow Visual Studio down due to the consumption of more memory. There would be poorer cache locality due to 64-bit pointers being stored in various places in the code. There is much code in VS that uses custom arena-based allocators, although MS is trying to get rid of them. These could also perform worse, since pointer management within the arena would deal with 64-bit pointers, which occupy twice the space of their current 32-bit counterparts.
Given the tens of millions of lines of code that are Visual Studio, the effort to convert, test and tune a 64-bit version seems fraught with delays while having a seemingly small chance of having a positive outcome. If anything, MS seems more intent on porting Visual Studio to managed code in order to reap the benefits present there - a decision that is hard for us C++ developers to swallow.
For the near term, Microsoft recommends running Visual Studio on a 64-bit version of Windows, thus doubling the available address space (2 GB to 4 GB) without paying a 2x penalty for pointer storage within the VS process.

-
could you give some links to the bits talking about the "custom arena based allocators". I'd like to read up on that just for personal education and a quick google search was of limited help. Thanks – KitsuneYMG Dec 12 '13 at 14:56
-
@KitsuneYMG, if you mean a general description of what that means, here is a link to Wikipedia: http://en.wikipedia.org/wiki/Region-based_memory_management – Michael Goldshteyn Dec 12 '13 at 15:11
-
I'm not sure why the pointer size penalty would be higher with an arena allocator than pointer-intensive data structures using any other allocator... – Ben Voigt Dec 12 '13 at 17:46
-
@Ben Voigt, The storage of the actual (larger) pointers within the fixed size arena as part of certain data structures could invalidate size assumptions about the arena. – Michael Goldshteyn May 13 '14 at 13:50