I do not know which method is right for getting the UTC time.

My code is

System.currentTimeMillis()

for Java on Android.

Is the result correct internationally?

Can the user change the device time so that the result is different? (Does that affect the UTC value?)

Omid Aghakhani
  • FYI, Unix is an operating system. – Federico klez Culloca Jul 26 '19 at 09:39
  • @FedericoklezCulloca Correct. A count of seconds since the Unix epoch of 1970-01-01T00:00:00Z is known as a Unix timestamp and is widely used outside the Unix world too. [Wikipedia article: Unix time](https://en.wikipedia.org/wiki/Unix_time). – Ole V.V. Aug 04 '19 at 10:13
  • Related: [How to use an Internet time server to get the time?](https://stackoverflow.com/questions/4442192/how-to-use-an-internet-time-server-to-get-the-time) There are a number of similar questions. – Ole V.V. Aug 04 '19 at 10:17

2 Answers

On the Linux platform, the system clock should be set to UTC. Whether it actually is, and whether it is accurate, is ultimately up to the user.

Calling System.currentTimeMillis() will give the number of milliseconds elapsed since 1970-01-01T00:00:00Z (UTC).
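
To make that concrete, here is a minimal sketch (assuming java.time is available; on Android that means API level 26+ or core library desugaring). It shows that the millisecond value is an absolute count on the UTC timeline, and only its presentation depends on a time zone:

```java
import java.time.Instant;
import java.time.ZoneId;

public class UtcDemo {
    public static void main(String[] args) {
        long millis = System.currentTimeMillis();

        // Instant is an absolute point on the UTC timeline.
        Instant now = Instant.ofEpochMilli(millis);
        System.out.println("Epoch millis: " + millis);
        System.out.println("As UTC instant: " + now);

        // Changing how the instant is *displayed* does not change
        // the underlying millisecond value.
        System.out.println("Same instant in Tokyo: "
                + now.atZone(ZoneId.of("Asia/Tokyo")));
    }
}
```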

Is the result correct internationally?

Yes, provided that the clock is synced with a decent network time source and the user hasn't messed with it.

Can the user change the device time so that the result is different?

Yes, they can. There is not much you can do about it.

You could attempt to connect to a network time server, but the user could block that, or cause your game to connect to a fake time server. If they "own" the platform that your game runs on, you probably have no way to get guaranteed reliable time.

(The Wikipedia page on NTP discusses man-in-the-middle attacks. Unfortunately, standard NTP doesn't have a way to deal with them. There is a draft RFC for Network Time Security, but work on it has been going on since 2015, and it still hadn't been finalized at the time of writing.)
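
If you do want to ask a time server anyway, here is a rough, unauthenticated SNTP query in plain Java. This is only an illustration, not a hardened implementation: pool.ntp.org and the 3-second timeout are arbitrary choices, network delay is ignored, and on Android the request must run off the main thread. As explained above, the user can still block or spoof it:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class SntpSketch {
    // Seconds between the NTP epoch (1900) and the Unix epoch (1970).
    private static final long NTP_TO_UNIX_SECONDS = 2_208_988_800L;

    public static void main(String[] args) throws Exception {
        byte[] buf = new byte[48];
        buf[0] = 0x1B; // LI = 0, Version = 3, Mode = 3 (client request)

        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(3000);
            InetAddress server = InetAddress.getByName("pool.ntp.org");
            socket.send(new DatagramPacket(buf, buf.length, server, 123));

            // The reply overwrites buf with the server's packet.
            socket.receive(new DatagramPacket(buf, buf.length));
        }

        // Transmit timestamp: 32-bit big-endian seconds since 1900,
        // at byte offset 40 of the packet.
        long seconds = 0;
        for (int i = 40; i < 44; i++) {
            seconds = (seconds << 8) | (buf[i] & 0xFF);
        }
        long unixMillis = (seconds - NTP_TO_UNIX_SECONDS) * 1000L;
        System.out.println("Server time (approximate): "
                + new java.util.Date(unixMillis));
    }
}
```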

Stephen C

There is a common misconception around getting the correct "local" or "international" time. Time itself is unaware of these concepts, and I'll try to give an explanation here since other people still search for it.

Note that the following assumes the machine's clock counts time accurately and that the initial time setup at boot was correct. Also note that most connected devices sync this internal clock with a network source from time to time to keep it accurate.

Time itself is something we don't control, it passes along one moment after another. But time presentation (seconds, hours, days, years, decades) is a human-created concept, made simply to understand the passage of time more clearly. You can imagine the confusion if we discussed time in moments - "Hey, I'm going to the store in 901400203150 moments, would you like to join?".

Anyway, the same is true for time zones, they are a human-created concept. We live on Earth, near the Sun and the Moon, and there is some revolving happening in space that made us recognize a longer time period called "year" and shorter ones called "month". We also recognized that morning comes earlier in some places, and later in others - thus, we introduced time calculations and time zones to make time tracking easier for everyone on the planet. But 10000 years ago, nobody knew about time zones, and yet time was passing by.

So, to answer the original question with that in mind: yes, that method will work, if you're interested in getting the absolute current time.

As mentioned, there is no such thing as "international" or "local" time; time is the same for everyone. We use these terms when referring to a time value converted to and presented in a certain time zone's format. Today we have something called the Epoch (for humans: 00:00:00 UTC on 1 January 1970; for most machines: 0), so fetching "milliseconds since the Epoch" gives you the amount of raw time, expressed in milliseconds, that has passed since the Epoch.

Time itself does not know about time zones or years or months; these are human constructs that you have to convert to on your own. Basically, what you get from the OS is raw time, and then you convert it to the desired time zone, format, and language to make it easier for the user to read. And how to convert time - that's a completely different question. :)
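
To illustrate that conversion step, here is a small sketch using java.time (the zone IDs and the format pattern are just example choices):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class PresentTime {
    public static void main(String[] args) {
        // Raw time from the OS: milliseconds since the Unix epoch.
        Instant raw = Instant.ofEpochMilli(System.currentTimeMillis());

        DateTimeFormatter fmt = DateTimeFormatter
                .ofPattern("EEEE, d MMMM yyyy HH:mm:ss zzz", Locale.ENGLISH);

        // The same moment, presented three different ways.
        System.out.println(raw.atZone(ZoneId.of("UTC")).format(fmt));
        System.out.println(raw.atZone(ZoneId.of("America/New_York")).format(fmt));
        System.out.println(raw.atZone(ZoneId.of("Europe/Belgrade")).format(fmt));
    }
}
```

All three lines print the same instant; only the human presentation differs.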

milosmns
  • Wow, thank you for the extra details. Yes, you're right. – Omid Aghakhani Jul 26 '19 at 13:02
  • This is interesting and correct in what it says about time zones and "local" versus "international". But it misses the fact that time sources (clocks) can drift. In the 1800s, people needed to adjust their pocket watches on a daily basis ... against a more reliable clock; e.g. at the local clockmaker's shop, or from the town hall clock. Otherwise, they might miss the stagecoach. Modern computer clocks drift a lot less ... but for the OP's application he probably needs time to be accurate to a fraction of a second. This can be a problem. – Stephen C Jul 27 '19 at 04:29
  • @StephenC Yeah of course, assuming that the internal OS clock is correctly measuring time and wasn't set to something random at boot. :) – milosmns Jul 27 '19 at 18:24
  • Also, if not reconfigured, Android will by default do a time sync from time to time to adjust its internal clock - so in this sense I don't think we can expect a more accurate time in millis than what we get from the OS. – milosmns Jul 27 '19 at 18:29
  • It is true that Android tries to sync with an external time source. However, the point of the OP's question is whether the user (a cheater) can do things to stop that (yes they can) and whether there is anything the game developer can do to get a guaranteed reliable time source on the user's machine (no there isn't). But I was actually commenting on your historical explanation of time and time measurement. Clock syncing became a historical requirement back when people needed to know the time accurate to a few minutes. – Stephen C Jul 28 '19 at 01:44