The explanation I got for this was that it was common practice for programmers to simply represent the year with 2 digits. But why would anyone do that? If anything, I imagine it would take more effort to make a program roll back to 1900 instead of going on to 2000.
- That _is_ the explanation. What more are you looking for? – Matt Ball Nov 17 '10 at 04:20
- @Matt Ball: I think it's the difference between the how and the why. I could tell you *how* World War II started, but *why* it started is significantly more complex. – Matchu Nov 17 '10 at 04:22
3 Answers
Premium on storage space + lack of foresight = Y2K bug.
Saving bytes was very important in many older systems. Plus, a common fallacy in software development is "no one's going to be using this in X years". So why not save those bytes now? 10/20/30 years down the line this will SURELY be scrapped for a completely new system.
To quote Lex Luthor -- "Wrong."

- That's what I'd heard, but then I thought, "Wait, how do you allocate just enough bits to count up to 100?" Would uncorrected computers literally roll back to 1900, or would they think the year was some nonsense like 19(100), 19(101), ..., 19(255), 1900? – user405163 Nov 17 '10 at 05:11
- There were various results of this bug. Some programs were working with integers and were adding 1900 to user input; some were adding the string "19" in front. In those days the year 2000 was still considered "science-fiction future", so the problem really was a little hard to spot. Two-digit-year dates were useful, since there was no conflict with day and month, and an integer (in contrast to a byte) was too expensive. Ah!.. the good, old days ;-) – Arsen7 Nov 17 '10 at 14:01
- @user405163: Some languages (notably COBOL) use *decimal* arithmetic by default, so you'd declare a 2-*digit* variable instead of an 8-*bit* one. You could also have this bug in C by doing things like `strftime("19%y", ...)` instead of `strftime("%Y", ...)`. – dan04 Mar 30 '11 at 23:42
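To make dan04's comment and Arsen7's "adding 1900 / prepending 19" examples concrete, here is a minimal C sketch (not from the original thread) showing how the buggy and correct formats diverge once the century changes. It assumes nothing beyond the standard `<stdio.h>` and `<time.h>` headers.

```c
/* Minimal sketch of the Y2K formatting bug described in the comments.
 * struct tm.tm_year holds years *since 1900*, so hard-coding the "19"
 * century only looked right until 1999. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);
    struct tm *t = localtime(&now);
    char buggy[16], correct[16];

    /* Buggy: hard-codes the century and appends the two-digit year (%y). */
    strftime(buggy, sizeof buggy, "19%y", t);

    /* Correct: %Y expands to the full four-digit year. */
    strftime(correct, sizeof correct, "%Y", t);

    printf("buggy   : %s\n", buggy);   /* e.g. "1925" when run in 2025 */
    printf("correct : %s\n", correct); /* e.g. "2025" */

    /* The integer variant from the comments: adding 1900 to tm_year is
     * fine, but gluing "19" in front of it gives "19125" in 2025. */
    printf("integer : %d vs 19%d\n", 1900 + t->tm_year, t->tm_year);
    return 0;
}
```

Run after 1999, the hard-coded "19" prefix produces exactly the kind of nonsense ("1900" again, or "19100"-style values) that the comments above describe, while `%Y` and `1900 + tm_year` remain correct.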
Why? Their question was likely, "Why not?" If it saved a few bits in a world where memory usage was significantly more limited, then they figured they may as well save that space.
Obviously, the "why not" was because "your software might actually be in use for significant amounts of time." Some programmers had the foresight to plan ahead, but not all.

The story goes that this was back in the day when 1 kilobyte of RAM cost more than $1000. Omitting the extra digits meant real money savings.
