
I'm on the Pacific Coast, currently UTC-7 (-420 minutes), and

var d = new Date();
var n = d.getTimezoneOffset(); // time zone offset in minutes

returns 420. Shouldn't it return -420?

I'm trying to calculate the local time to save on the server by passing the local UTC offset. My server runs in UTC,

so I wanted to do something like

group.DateCreatedLocalTime = DateTime.UtcNow.AddMinutes(viewModel.UtcOffset);

but if the offset is 420 and not -420, how will I know whether to add or subtract minutes from DateTime.UtcNow?

chuckd
  • Look at the answers to https://stackoverflow.com/questions/1091372/getting-the-clients-timezone-in-javascript – Dror Aug 08 '18 at 04:15
  • So I assume I need to subtract minutes instead of adding them? – chuckd Aug 08 '18 at 04:20
  • Possible duplicate of [Why does JavaScript Date.getTimezoneOffset() consider "-05:00" as a positive offset?](https://stackoverflow.com/questions/21102435/why-does-javascript-date-gettimezoneoffset-consider-0500-as-a-positive-off) – Dror Aug 08 '18 at 04:24

1 Answer


In JavaScript you can use the setMinutes() function.

Example:

var d = new Date();
var n = d.getTimezoneOffset();    // UTC minus local time, in minutes (420 for UTC-7)
d.setMinutes(d.getMinutes() - n); // shift the instant back by the offset

You subtract because getTimezoneOffset() returns UTC minus local time, which is positive for zones west of UTC.
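
As a usage sketch (assuming the goal is to send the local wall-clock time to the UTC server from the question), the shifted Date can then be serialized with toISOString(); its UTC output now reads as the local time:

var d = new Date();
var n = d.getTimezoneOffset();    // e.g. 420 in Pacific Daylight Time (UTC-7)
d.setMinutes(d.getMinutes() - n); // shift the instant back by the offset

// The UTC rendering of the shifted Date now shows local wall-clock time,
// e.g. "2018-08-07T21:15:00.000Z" when the local time is 21:15 on Aug 7.
console.log(d.toISOString());

Equivalently, if the positive offset is posted to the server as in the question, the server-side line would presumably subtract it: DateTime.UtcNow.AddMinutes(-viewModel.UtcOffset).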

Bharat