How do web servers implement rate limiting, for example, per client/user?

My doubt is this: the server has to first "get" the request, and only then, if the threshold has been reached, deny it. Isn't the server still using its own resources in that process?

For example:

On the server, we may have logic like this (assuming socket programming):

void acceptConnection(int listenFd) {

    while (true) {

        int clientFd = accept(listenFd, NULL, NULL);

        // rest of logic

    }
}

My doubt starts with this: the server will be listening on a socket/port, so even if it denies a client, it is still listening to fake calls. Doesn't that also contribute to DoS? For example, doesn't the server execute accept() before rejecting a fake call, so that a part of the code gets executed anyway?
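To make the question concrete, here is a minimal sketch of the kind of application-level check I imagine running right after accept(). All names here (allow_request, MAX_REQUESTS, the fixed-size slot table, and so on) are hypothetical, not from any real server: the idea is a fixed-window counter per client IP, where the server closes the connection immediately when a client is over its threshold.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <time.h>

/* Hypothetical sketch of a fixed-window rate limiter keyed by client IP.
 * The constants and the fixed-size table are illustrative only. */

#define MAX_CLIENTS    64
#define WINDOW_SECONDS 60
#define MAX_REQUESTS   5   /* allowed requests per client per window */

struct client_slot {
    char   ip[46];        /* large enough for an IPv6 address string */
    time_t window_start;  /* when the current counting window began */
    int    count;         /* requests seen in the current window */
};

static struct client_slot slots[MAX_CLIENTS];

/* Returns 1 if the request from `ip` is allowed, 0 if it should be denied. */
int allow_request(const char *ip, time_t now)
{
    struct client_slot *slot = NULL;

    /* Find the existing slot for this IP, or remember the first empty one. */
    for (int i = 0; i < MAX_CLIENTS; i++) {
        if (slots[i].ip[0] != '\0' && strcmp(slots[i].ip, ip) == 0) {
            slot = &slots[i];
            break;
        }
        if (slot == NULL && slots[i].ip[0] == '\0')
            slot = &slots[i];
    }
    if (slot == NULL)
        return 0; /* table full: fail closed */

    if (slot->ip[0] == '\0') {
        /* First request from this client: start a fresh window. */
        snprintf(slot->ip, sizeof slot->ip, "%s", ip);
        slot->window_start = now;
        slot->count = 0;
    } else if (now - slot->window_start >= WINDOW_SECONDS) {
        /* Window expired: reset the counter. */
        slot->window_start = now;
        slot->count = 0;
    }

    if (slot->count >= MAX_REQUESTS)
        return 0; /* over the threshold: deny */

    slot->count++;
    return 1;
}
```

In the accept loop above, the server would turn the peer address returned by accept() into a string (e.g. with inet_ntop()), call allow_request(), and close() the socket immediately on denial. Note that this sketch only restates the concern in the question: accept() has already run and the kernel has already completed the TCP handshake before the check happens, so some resources are consumed per connection regardless.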
