
There is code that sets an expiration time for a cookie in JavaScript:

d.setTime(d.getTime() + (exdays*24*60*60*1000));

Can someone explain what the 1000 is for? Since 1 day = 24*60*60 (24 hours * 3600 sec), why do we multiply by 1000?


1 Answer


Cookie expirations are given in epoch time, also known as a Unix timestamp, which is the time elapsed since January 1st, 1970 (UTC).

This number is usually expressed in seconds or in milliseconds, so it is very common to have to work out which one you need. Sometimes you divide by 1000 to convert from ms to s, and other times you multiply by 1000 to convert from s to ms.
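For example (the variable names here are just illustrative):

const ms = Date.now();            // current time in milliseconds
const s = Math.floor(ms / 1000);  // same instant in seconds
const backToMs = s * 1000;        // seconds back to milliseconds (the sub-second part is lost)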

Which one you want depends on the precision you need. For example, in a chat application you may send two messages within one second, so Unix time at second precision is inadequate to order those messages correctly.
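As a rough sketch with made-up values, two messages sent a few hundred milliseconds apart collapse to the same value at second precision:

const msgA = 1592339616120;            // first message, in ms
const msgB = 1592339616730;            // sent ~600 ms later

console.log(Math.floor(msgA / 1000));  // 1592339616
console.log(Math.floor(msgB / 1000));  // 1592339616 – identical, so the order is lost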

In other cases, the extra three digits of precision may be overkill, such as the timestamp of a user signing up for your website. Each character stored in the database has a storage cost associated with it.

On top of this, you are putting the expiration timestamp somewhere, and in this case setTime() expects the value to be in milliseconds. You can compare any two timestamps as long as they are at the same precision.
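As a minimal sketch of the surrounding code the question's line usually comes from (exdays and the cookie name/value are placeholders):

const exdays = 7;                                          // hypothetical: expire 7 days from now
const d = new Date();
d.setTime(d.getTime() + (exdays * 24 * 60 * 60 * 1000));   // getTime() is in ms, so the offset must be in ms too

document.cookie = "user=John; expires=" + d.toUTCString() + "; path=/";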

For example, consider these two timestamps:

1592339616920

1592340380 

Later timestamps are larger, so Unix time is typically used for comparisons such as "how long ago was this?" or "how long until this?".

You can first observe how many digits each has: 10 or 13. Then make them both 10 digits, or both 13, and subtract the smaller from the larger to see how much time has elapsed. If you subtract at second precision, you lose three digits of precision.
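For instance, bringing both example values above to millisecond precision before subtracting (illustrative only):

const a = 1592339616920;           // 13 digits: already milliseconds
const b = 1592340380;              // 10 digits: seconds, so scale it up

console.log(b * 1000 - a);         // 763080 ms, i.e. roughly 12.7 minutes apart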

In JavaScript, you can quickly gain access to the ms timestamp using:

console.log('current timestamp:', new Date().valueOf());
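Date.now() returns the same millisecond value without constructing a Date object:

console.log('current timestamp:', Date.now());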