
I have a game in which users must answer each question within a 10-second interval: once the 10 seconds are up, a user who has not already answered is no longer allowed to answer. At that point the server should also reject any answer, because the server knows as well that the 10 seconds have elapsed.

For validity, and to make sure the client cannot be manipulated in any way, I have decided to run the timer on the server and simply display the timer on the client (displaying the timer is important).

The question is: how do I display the timer to the user while making sure the timer shown on the client remains valid?

Socket.IO handles the interaction between the client and server while the user is actually playing; HTTP handles everything else the user does in the game (login, signup, account, payment, etc.).

My stack is Android with Java, a Node.js server, and Socket.IO on both ends. I have everything set up; the only open issue is what approach to take for synchronizing the timer between the client and the server.

I was thinking one approach might be to serve a question and set a variable on the socket like this (on the server):

 var t = new Date();
 t.setSeconds(t.getSeconds() + 12); // 2 seconds extra for network lag
 socket.allowed_time = t;

Then send the question to the client and show a 10-second countdown to the user. Whenever the client responds, check whether the current time is later than socket.allowed_time. This method seems unreliable given network lag.
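To make the idea concrete, here is a minimal sketch of that server-side check. The event names (`question`, `answer_accepted`, `answer_rejected`) and the `serveQuestion`/`handleAnswer` helpers are my own placeholders, not part of the question's code; only the `socket.allowed_time` idea comes from the approach above.

```javascript
const QUESTION_MS = 10 * 1000; // the 10-second answer window
const GRACE_MS = 2 * 1000;     // 2 seconds extra for network lag

// Stamp the socket with a deadline, then push the question to the client.
function serveQuestion(socket, question) {
  socket.allowed_time = Date.now() + QUESTION_MS + GRACE_MS;
  socket.emit('question', question);
}

// Validate an incoming answer against the server-side deadline.
function handleAnswer(socket, answer) {
  if (Date.now() > socket.allowed_time) {
    socket.emit('answer_rejected', { reason: 'time elapsed' });
    return;
  }
  // ... score the answer here ...
  socket.emit('answer_accepted', { answer: answer });
}
```

The client never decides whether the answer was in time; it only renders the countdown, while the `allowed_time` comparison on the server stays authoritative.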

A game like 8 Ball Pool, which I play a lot, has real-time gameplay between two users with the time synchronized between both of them (in real time); I noticed that network lag affects that game a lot as well.

Joshua Majebi

1 Answer


I don't think there is a way that will both 1) cope with network lag¹, and 2) cope with clients pretending there is network lag.

If you don't care that a lagged client / user sometimes gets penalized, then you simply run a 10-second timer on the server side. If the client's reply is not received before the timer goes off, it is rejected.
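That server-only timer might be sketched like this, with a plain `setTimeout` per question. All of the names here (`askQuestion`, the `time_up` / `answer_rejected` events) are illustrative assumptions, not an existing API.

```javascript
const TIMEOUT_MS = 10 * 1000; // hard 10-second server-side window

// Ask one question and enforce the deadline purely on the server.
function askQuestion(socket, question, onAnswer) {
  let expired = false;

  // When the window closes, mark it expired and tell the client.
  const timer = setTimeout(() => {
    expired = true;
    socket.emit('time_up', { questionId: question.id });
  }, TIMEOUT_MS);

  // Accept at most one answer; reject it if it arrives after expiry.
  socket.once('answer', (reply) => {
    if (expired) {
      socket.emit('answer_rejected', { reason: 'too late' });
      return;
    }
    clearTimeout(timer);
    onAnswer(reply);
  });

  socket.emit('question', question);
}
```

Because the only clock consulted is the server's, a client that tampers with its own clock or pretends to lag gains nothing; at worst, a genuinely lagged client gets penalized, which is the trade-off described above.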

You can just start the user's on-screen timer when the question is received by the client. If there is substantial lag, the timer may not reach zero until it is (noticeably) too late to reply. You could compensate for this by estimating the average lag in both directions and adjusting the number of seconds on the clock.
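The lag compensation mentioned above could be estimated from a few ping round trips, halving the average round-trip time and trimming the on-screen clock by that amount. This is a hedged sketch under the usual assumption that up and down lag are roughly symmetric; the function names are mine.

```javascript
// Estimate one-way lag from a list of measured round-trip times (ms),
// assuming the up and down legs take about the same time.
function estimateOneWayLagMs(rttSamples) {
  const avgRtt = rttSamples.reduce((a, b) => a + b, 0) / rttSamples.length;
  return avgRtt / 2;
}

// Shorten the on-screen countdown so it reaches zero close to the
// moment the server's own deadline passes.
function adjustedCountdownSeconds(baseSeconds, oneWayLagMs) {
  return Math.max(0, baseSeconds - oneWayLagMs / 1000);
}
```

For example, with round trips of 100, 200, and 300 ms, the one-way estimate is 100 ms, so a 10-second question would show a 9.9-second clock on that client.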


¹ ... without "unfairly" penalizing the lagged client.

Stephen C
  • I think I can live with a situation where the user is penalized if there is network lag; network lag is the world we live in. That leaves the task of at least making sure the server is the only source of truth when it comes to time. That is where I can be certain of no manipulation. – Joshua Majebi Jul 04 '18 at 13:00