
As we know, all dates created with the JavaScript Date constructor are measured in milliseconds from 01 January, 1970 00:00:00 Universal Time (UTC), with a day containing 86,400,000 milliseconds. This implies that JS uses a UNIX-style timestamp. I set my system clock to a date beyond 2038 (say 14 Nov 2039) and ran this script:

    <script>
      var d = new Date();
      alert(d.getFullYear()+" "+d.getMonth()+" "+d.getDate());
    </script>

It alerts 2039 10 14 successfully, unlike PHP, which prints "9 Oct, 1903 07:45:59".

How does JS handle this? An explanation would be appreciated, as I am confused!

A-Tech
Parveez Ahmed

5 Answers

14

32-bit PHP uses 32-bit integers, whose max value puts the last UNIX timestamp that can be expressed by them in 2038. That's widely known as the Y2K38 problem and affects virtually all 32-bit software using UNIX timestamps. Moving to 64 bits, or to libraries which work with other timestamp representations (in PHP's case the DateTime class), solves this problem.

Javascript doesn't have an integer type, only floats (IEEE 754 doubles), whose range is vastly larger than a 32-bit integer's (but which in return lose exact precision for large values).
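As a quick illustration (a minimal sketch; the dates chosen here are arbitrary), a timestamp well past the 32-bit cutoff is no problem for a JavaScript Date:

```javascript
// Milliseconds since the epoch for 14 Nov 2039 00:00:00 UTC
// (month is zero-based: 10 = November)
const ms = Date.UTC(2039, 10, 14);

const d = new Date(ms);
console.log(d.getUTCFullYear()); // 2039 -- well past the 32-bit cutoff

// The signed 32-bit cutoff itself, converted from seconds to milliseconds:
const y2k38 = new Date(2147483647 * 1000);
console.log(y2k38.toISOString()); // 2038-01-19T03:14:07.000Z
```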

deceze
  • I don't understand why PHP wouldn't just use unsigned integers. That would effectively double the date range, right? – Alex W Nov 14 '13 at 14:53
  • @Alex Because then you couldn't represent any dates *before* 1970, because PHP tries to keep types simple and doesn't want to bother programmers with the difference between signed and unsigned ints, because using *only* unsigned ints means you couldn't represent *any* negative numbers in PHP, because that would only extend the deadline by some ~70 years instead of solving it entirely. – deceze Nov 14 '13 at 14:55
  • @AlexW it is in order to hold dates prior to 1970 which is considered as an epoch in UNIX timestamp,right? – Parveez Ahmed Nov 14 '13 at 14:55
  • one more thing to ask @deceze, as a long int is 4 bytes and a float is also 4 bytes (although in C), what about JS while it uses float instead of int...? – Parveez Ahmed Nov 14 '13 at 15:05
  • @rosemary What about it? Not sure what you're asking. – deceze Nov 14 '13 at 15:07
  • @deceze i mean to say that a long int requires 4 byte whereas a float requires also 4 bytes.Then, had JS used int like that in PHP it would have same problem as that of PHP. But float is itself 4 byte long, confused,right? – Parveez Ahmed Nov 14 '13 at 15:16
  • @rosemary Please read the linked article about floats, or *any* article about floats, really. Ints are *exact* numbers, where `...00001` is "1", `...00010` is "2" etc, for all possible combinations of 32 bits. Floats on the other hand store numbers as an expression of a mantissa and an exponent, which allows them to express much larger numbers and scale both parts up or down as needed. But in return they cannot express many numbers *exactly*. Floating point rounding errors are a problem you have to contend with. – deceze Nov 14 '13 at 15:31
5

Javascript doesn't have integer numbers, only floating point numbers (details can be found in the standards document).

That means that you can represent some really large numbers, but at the cost of precision. A simple test is this:

    i = 1384440291042
     => 1384440291042
    i = 13844402910429
     => 13844402910429
    i = 138444029104299
     => 138444029104299
    i = 1384440291042999
     => 1384440291042999
    i = 13844402910429999
     => 13844402910430000
    i = 138444029104299999
     => 138444029104300000
    i = 1384440291042999999
     => 1384440291043000000
    i = 13844402910429999999
     => 13844402910430000000

As you can see, the number is not guaranteed to be kept exact. The outer limit of integer precision in javascript (where you will actually get back the same value you put in) is 9007199254740992 (2^53). That would be good up until 285428751-11-12T07:36:32+00:00 according to my conversion test :)

The simple answer is that Javascript internally uses a larger data type than the long int (4 bytes, 32 bit) that is used for the C-style epoch ...

  • fun fact, `Number.MAX_SAFE_INTEGER` reports `9007199254740991` now in chrome. so this issue seems to be resolved already. – GottZ May 21 '19 at 22:53
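The precision cliff shown in the transcript above can be checked directly (a minimal sketch):

```javascript
// 2^53 is the point at which adjacent integers stop being distinguishable
// in an IEEE 754 double; Number.MAX_SAFE_INTEGER is 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);            // 9007199254740991
console.log(2 ** 53 === 2 ** 53 + 1);            // true -- adjacent integers collide
console.log(Number.isSafeInteger(13844402910429999999)); // false
```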
3

It can. Try out new Date(8640000000000000)

Sat Sep 13 275760 03:00:00 GMT+0300 (Eastern European Summer Time)

Year 275760 is a bit beyond 2038 :)

Read the spec section 15.9.1.1

http://ecma-international.org/ecma-262/5.1/#sec-15.9.1.1

A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.

Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.

The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.

The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
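The boundary from the spec can be probed directly (a minimal sketch):

```javascript
// 100,000,000 days in milliseconds -- the spec's outer limit for time values
const MAX_TIME = 8640000000000000;

console.log(new Date(MAX_TIME).toISOString()); // +275760-09-13T00:00:00.000Z
console.log(new Date(MAX_TIME + 1).getTime()); // NaN -- "Invalid Date"
```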

Lukas Liesis
  • Worth noting that `new Date(Number.MAX_SAFE_INTEGER)` yields "Invalid Date". But you can definitely use `Number.MAX_SAFE_INTEGER / 2`, which gets you through year 144,683 AD. Hopefully that's good enough for most use cases, though I suppose you never know. If there's any language that'll be around in year 144k AD, it'll be JavaScript! ☢️ – mikermcneil Sep 24 '19 at 15:43
  • @mikermcneil True, but your answer is not accurate either, updated mine to represent the real limit :) Thanks for making me do it :D – Lukas Liesis Sep 24 '19 at 17:08
2

This implies that JS uses UNIX timestamp.

Just a sidenote: Unix timestamps are seconds since 1970. JS time is milliseconds since 1970. So a JS timestamp stops fitting in a 32-bit int much earlier (but JS does not use a 32-bit int for this).
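A minimal sketch of the factor-of-1000 difference:

```javascript
const nowMs = Date.now();                     // JS: milliseconds since 1970
const unixSeconds = Math.floor(nowMs / 1000); // UNIX-style: seconds since 1970

// A millisecond counter overflows a signed 32-bit integer within weeks of
// the epoch, long before 2038:
const INT32_MAX = 2147483647;
console.log(new Date(INT32_MAX).toISOString()); // 1970-01-25T20:31:23.647Z
```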

HolgerJeromin
1

The year 2038 problem applies to signed 32-bit timestamps only, which PHP and some other systems use. A signed 32-bit timestamp runs out of range in January 2038.

From the Wikipedia article (emphasis mine):

The year 2038 problem may cause some computer software to fail at some point near the year 2038. The problem affects all software and systems that both store system time as a signed 32-bit integer, and interpret this number as the number of seconds since 00:00:00 UTC on Thursday, 1 January 1970.[1] The furthest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038.[2] ... This is caused by integer overflow. The counter "runs out" of usable digits, "increments" the sign bit instead, and reports a maximally negative number (continuing to count up, toward zero). This is likely to cause problems for users of these systems due to erroneous calculations.

Storing a timestamp in a variable with a greater range solves the problem.

Pekka