
I was looking at this question:

How do you extract POST data in Node.js?

The second and third answers append incoming data to a "body" variable and destroy the connection when it grows too large.

This makes sense to me, but what I don't understand is why no one does this for GET requests. There isn't any limit on the length of a GET request's query string either. Can someone explain?

Also, in the question above, why do they wait until 1e6 bytes? Isn't that a really high threshold, high enough to slow down Node?

Rahul Iyer

2 Answers


There is a built-in request size limit enforced by Node itself, so application developers (you) don't need to do anything extra.

Specifically, the HTTP parser accepts a maximum of 80 KB (81,920 bytes) of total combined headers. That is, the request URI + all headers (Cookie, User-Agent, etc) + protocol overhead must come to less than 80 KB. A request with no headers can therefore have up to 81,903 bytes in the request URL.

As soon as the number of bytes received exceeds the limit, Node simply closes the socket. It does not report an error (i.e. a 414 URI Too Long).

Because this limit is defined in the C source, it cannot be changed at runtime; you'd have to compile a custom build of Node to raise it.

josh3736
  • In UTF-8 "The first 128 characters (US-ASCII) need one byte. The next 1,920 characters need two bytes to encode". So even if one used a non-Latin alphabet in the query string, 80 KB = 40,000 characters, which should suffice for most purposes. XD – Rahul Iyer Nov 09 '14 at 11:12
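The byte-vs-character arithmetic in the comment above is easy to check directly in Node with `Buffer.byteLength` (the specific sample characters below are my own):

```javascript
// UTF-8 encodes US-ASCII in 1 byte; most non-Latin alphabets need 2 bytes
// per character, and some symbols need 3 or more.
console.log(Buffer.byteLength('a', 'utf8')); // → 1
console.log(Buffer.byteLength('ф', 'utf8')); // → 2 (Cyrillic)
console.log(Buffer.byteLength('€', 'utf8')); // → 3
```

So a query string written entirely in a two-byte alphabet fits about half as many characters into the same header-size budget.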

There is a practical limit to the size of GET requests that browsers will send (What is the maximum length of a URL in different browsers?), so there is no real need to protect against excessively large requests.

takinola
    While it is true that browsers won't send a request larger than a couple thousand characters, this does not stop a malicious attacker from using other tools to send a request with a URL of arbitrary length. So yes, there is a very real need to protect against excessively large requests. – josh3736 Nov 07 '14 at 20:12