
If a server receives two incoming requests, whether from two different clients or from the same client twice, and both arrive at exactly the same time, does it process them simultaneously, or does it process one and then the other?

Is it even possible for a server to perceive two requests as arriving at exactly the same time, or will there always be some minute time difference? If there is always a difference, how fine-grained is it? Milliseconds? Nanoseconds? Attoseconds?

I did look for an answer before posting, but most of the information I found was either too simplistic or too complex to answer the question sufficiently.

Deimyts
  • http://stackoverflow.com/questions/16952625/how-can-a-web-server-handle-multiple-users-incoming-requests-at-a-time-on-a-sin – bobs_007 May 27 '15 at 03:45
  • This is entirely dependent on whether the server uses blocking vs. non-blocking vs. async socket I/O, whether the server handles the requests within its own process or forks secondary processes to handle them, etc. Your question is too broad. – Remy Lebeau May 27 '15 at 03:45
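
To make the distinction in the comment above concrete, here is a minimal sketch of a thread-per-connection server (Python is an arbitrary choice here, as is port 8080). Even requests that arrive at the same instant are queued by the kernel's accept queue and dequeued by accept() one at a time, but once each is handed off to a thread, their processing can overlap:

    import socket
    import threading

    def handle(conn, addr):
        # Each connection runs on its own thread, so two requests that
        # arrived "at the same time" can be processed concurrently.
        with conn:
            data = conn.recv(1024)  # blocks only this thread, not the accept loop
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 8080))  # the port is arbitrary for this demo
    srv.listen(128)              # backlog: connections arriving while the server
                                 # is busy wait here until accept() dequeues them

    while True:
        conn, addr = srv.accept()  # returns one pending connection at a time
        threading.Thread(target=handle, args=(conn, addr), daemon=True).start()

A single-threaded blocking server would instead call handle() inline after accept(), finishing one request before starting the next; the choice between these models is exactly what the comment is pointing at.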

1 Answer


The source port a client uses is an ephemeral port chosen by its operating system, and it is unique among that client's active connections. As a result, the (source_ip_addr, source_port, dest_ip_addr, dest_port) tuple is unique per TCP connection, which is how the server tells two requests apart even if they arrive at the same instant.
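
To illustrate (a minimal sketch in Python; the loopback address and port 9000 are arbitrary choices), each accepted connection exposes its own 4-tuple, so two connections are distinguishable even when they come from the same client:

    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 9000))  # address and port are arbitrary for the demo
    srv.listen(5)

    for _ in range(2):               # accept two connections, e.g. from one client
        conn, peer = srv.accept()
        local = conn.getsockname()   # (dest_ip_addr, dest_port) on the server side
        # peer is (source_ip_addr, source_port); the client's OS picks a fresh
        # ephemeral source port for each connect(), so the two 4-tuples differ
        # even when both connections originate from the same machine.
        print("4-tuple:", (peer[0], peer[1], local[0], local[1]))
        conn.close()

Connect to it twice (e.g. with `nc 127.0.0.1 9000` run two times) and the printed tuples will differ only in the source port.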