I searched for the difference between the von Neumann and Harvard architectures and came to the conclusion that, compared with Harvard, von Neumann has no advantages, only drawbacks such as the memory "bottleneck" and security vulnerabilities. So why do most computers use the von Neumann architecture today?

- To those downvoting and voting to close: how is this an opinion-based question? The computer industry has overwhelmingly chosen von Neumann (cache notwithstanding). Asking what reasoning and research caused that choice isn't opinion-based. – Matt Kline Jun 11 '14 at 19:58
- Perhaps, but it's not really a programming question either. More of a history-of-computing question. And it is pretty open-ended. – Tom Zych Jun 11 '14 at 22:48
2 Answers
With a Harvard architecture, the ratio of memory allocated for instructions vs. data is determined by hardware. Once the chip is made, you cannot adjust the ratio. Allowing both to reside in the same memory is far more flexible. This flexibility is important since modern computers (and even microprocessors for embedded devices) are designed to be able to perform a wide variety of tasks.
Modern processors maintain separate instruction and data caches on the processor die as well, giving you the best of both worlds.
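As an illustration of that unified-memory flexibility (a minimal sketch of my own, not from the answer above, assuming a typical unified-memory platform such as x86-64 Linux): because instructions and data share one address space, a program can read its own machine code through an ordinary data pointer. On a strict Harvard machine the same read would need a special instruction, if it were possible at all.

```c
#include <stdio.h>

/* A small function whose machine code we will inspect as data. */
static int add(int a, int b) { return a + b; }

int main(void) {
    /* Treat the function's entry point as a plain byte pointer.
     * Converting a function pointer to a data pointer is not strictly
     * portable ISO C, but it works on common unified-memory platforms
     * precisely because code and data live in one address space. */
    const unsigned char *code = (const unsigned char *)(void *)&add;

    printf("first bytes of add():");
    for (int i = 0; i < 8; i++)
        printf(" %02x", code[i]);
    printf("\n");
    return 0;
}
```

JIT compilers go the other way: they write instructions into memory as data and then jump to them, which only works when both share an address space. That same property is also the source of the "vulnerability" mentioned in the question (e.g. code-injection attacks), which is why modern systems layer protections such as W^X page permissions on top of the unified memory.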

- Apparently von Neumann architectures spend two out of three cycles waiting for memory (i.e. blocked), and parallelism isn't exactly improving this bottleneck. The only computers that fully sidestep this problem are supercomputers with a specialized architecture that supports massive parallelism. – theodore hogberg Feb 01 '15 at 03:00
Because von Neumann uses the same storage for both data and instructions. A von Neumann machine handles only one memory access at a time, while a Harvard machine can perform several actions simultaneously. As a result, the Harvard structure is exposed to race conditions that don't occur in the von Neumann architecture. That's a point in von Neumann's favor.
Today's computers use a combination of both, although the von Neumann part is bigger. The von Neumann architecture is also always deterministic. If you perform several tasks at once in the Harvard structure, execution becomes non-deterministic, so the correctness of your tasks depends on your luck when using the Harvard structure.

- I doubt "luck" has anything to do with the correctness of programs on Harvard designs. Besides, I think, if anything, _von Neumann_ would just as well be non-deterministic (it can process imperative languages), but basically, determinism has nothing to do with either; it's a property of a programming language instead. – Abel Jan 03 '17 at 17:19