
Why do Chrome and Firefox convert a date string to a different value than Safari does, as you can see in this example?

var date = new Date("2015-10-22T01:35:17");

https://jsfiddle.net/dkazsvhm/2/

This is pure JavaScript. The date string is coming from Twitter's API, so I don't believe it's in a format that should cause any issues.
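For illustration, here is a minimal sketch of the two interpretations an engine might legitimately pick for that string (the variable names are made up for this example); some older engines may also reject the format outright:

var s = "2015-10-22T01:35:17"; // ISO 8601 date-time with no timezone offset

// Reading the string as UTC (the ES5 behaviour for this format):
var asUTC = new Date(Date.UTC(2015, 9, 22, 1, 35, 17)); // month is zero-based

// Reading the string as local time (the ECMAScript 2015 behaviour):
var asLocal = new Date(2015, 9, 22, 1, 35, 17);

// new Date(s) will match one of these depending on the engine,
// or produce an Invalid Date in engines that can't parse the string.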

MattSavage
  • Because JavaScript doesn't define any specific date string parsing rules. It's up to the implementers to decide how to parse a date string. – gilly3 Oct 22 '15 at 02:04
  • `yyyy-MM-dd` isn't supported by Safari. Look into this: http://stackoverflow.com/questions/4310953/invalid-date-in-safari – Max Oct 22 '15 at 02:04
  • I recommend using moment to parse dates: http://momentjs.com/ – justspamjustin Oct 22 '15 at 02:17
  • @gilly—that's incorrect, see [*ECMA-262 §20.3.3.2*](http://www.ecma-international.org/ecma-262/6.0/#sec-date.parse). – RobG Oct 22 '15 at 02:55
  • @MaxMastalerz—also incorrect. The quoted answer is from 2010; Safari has supported ISO 8601 format date strings for a number of years now. However, I agree with the duplicate answer that *Date.parse* should not be used for parsing strings. – RobG Oct 22 '15 at 02:56
  • Prior to ES5, parsing of date strings was entirely implementation dependent. In ES5, parsing ISO 8601 long format strings without a timezone assumed UTC. In ECMAScript 2015, such strings are parsed based on the system timezone (aka local). So the string in the OP might return three different values across browsers in use: NaN, the UTC equivalent, or the local timezone equivalent, and all are "standards compliant". – RobG Oct 22 '15 at 03:04
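Given RobG's point that *Date.parse* shouldn't be relied on for strings, one common workaround is to split the string and call the multi-argument Date constructor, which is unambiguous across engines. A minimal sketch, assuming the `yyyy-MM-ddTHH:mm:ss` shape from the question (`parseISOLocal` is a made-up name):

// Hypothetical helper: parse "yyyy-MM-ddTHH:mm:ss" explicitly as local time,
// sidestepping engine-dependent string parsing.
function parseISOLocal(s) {
  var b = s.split(/\D/); // split on every non-digit character
  return new Date(b[0], b[1] - 1, b[2], b[3], b[4], b[5]); // month is zero-based
}

var date = parseISOLocal("2015-10-22T01:35:17");

If the Twitter timestamp should instead be treated as UTC, build the value with Date.UTC inside the helper rather than the plain constructor.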

0 Answers