In my server and client main loops, I have some basic custom timers. I use System.currentTimeMillis() to get the current time in milliseconds and compare it against a separate variable for each timer. If the timer variable is less than tickCount, the loop runs that timer's code and then sets the variable to tickCount + UpdateTime.
Here is an example:
long tickCount = System.currentTimeMillis();
if (LastUpdateTime_WoodCutting < tickCount) {
    woodcutting();
    LastUpdateTime_WoodCutting = tickCount + UpdateTime_WoodCutting;
}
UpdateTime_WoodCutting is set to 10, so in theory woodcutting() should run every 10ms. I'm sure it isn't exactly that accurate, but the problem is that, overall, this is supposed to behave as a 10 second timer, which would be 10000ms.
Instead, the timer is taking anywhere from 20 to 30 seconds to get there. The woodcutting() method just checks whether a timer field in the player class is less than 10000; if so, it adds 10 to it, and once it reaches 10000 or more, it executes the code that cuts down a tree in the game.
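To make that concrete, here is roughly what the method looks like (a simplified sketch; the real field and method names in the player class are a bit different):

// Simplified sketch of woodcutting(); names are approximations of my real code
private int woodcuttingTimer = 0; // lives in the player class in the real code

private void woodcutting() {
    if (woodcuttingTimer < 10000) {
        woodcuttingTimer += 10; // +10 per call; at one call every 10ms this should reach 10000 in ~10 seconds
    } else {
        woodcuttingTimer = 0;
        cutDownTree(); // the actual tree-cutting game logic (omitted here)
    }
}

So the 10 seconds comes from 1000 increments spaced 10ms apart, which is why even a few extra milliseconds per tick adds up quickly.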
Another problem is that the client uses the exact same timer code as the server, yet even when both run on the same machine they don't line up: the client's timer finishes when the server's is only about halfway done. I've tried several alternatives to System.currentTimeMillis(), but they all behave pretty much the same, so that hasn't been fruitful.
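For example, one of the variations was something along these lines, just swapping in System.nanoTime() converted to milliseconds, and it made no noticeable difference:

// One alternative clock source: System.nanoTime() converted to milliseconds
// (assuming LastUpdateTime_WoodCutting is also initialized from this same clock)
long tickCount = System.nanoTime() / 1_000_000L;
if (LastUpdateTime_WoodCutting < tickCount) {
    woodcutting();
    LastUpdateTime_WoodCutting = tickCount + UpdateTime_WoodCutting;
}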
Basically, what I'm trying to figure out is how I should be handling these timers, because it doesn't look like I'm handling them properly. Before a batch of changes to my code these timers worked flawlessly, and now suddenly they don't. I don't know whether it's a result of updating Gradle or Java (from 1.7 to 1.8), but it's a game-breaking issue and I'm very frustrated with it.
My source is close to 40k lines, so I can't share all of it, but if there's anything specific someone needs to see to help me with this, I'll provide what I can.