
I can see that there are some URLs with >11000 characters.

Theoretically, provided the web developer's choices allow for it, is there a maximum limit on how long a URL can be?

A mathy way of asking the same thing is `min(max(S), max(C))` (see the sketch after this list), where

  • S is the set of all max URL lengths across all server technologies, and
  • C is the set of all max URL lengths across all clients, including curl, browsers, Python/R libraries like beautiful soup (Python) and rvest (R), and other available clients.
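
For concreteness, here is a minimal Python sketch of that formula. Every limit below is a made-up placeholder, not a documented value; it only illustrates how `min(max(S), max(C))` combines the two sets:

```python
# Placeholder limits, purely illustrative -- NOT taken from any real
# server or client documentation; real values vary by version and config.
server_limits = {"server A": 8_190, "server B": 65_536, "server C": 4_096}    # S
client_limits = {"client A": 2_083, "client B": 65_536, "client C": 131_072}  # C

best_server = max(server_limits.values())  # max(S): most permissive server
best_client = max(client_limits.values())  # max(C): most permissive client

# The longest URL *some* server/client pair could handle:
print(min(best_server, best_client))  # 65536 with these placeholders
```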

Background

I'm curious because I am using an API which can accept multiple comma-separated values as a parameter value (e.g. like so: `param=value1,value2,value3`, etc.).

I'm curious as to (theoretically) how many comma-separated values it could accept.
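
To get a feel for how quickly such a URL grows, here is a small sketch; the endpoint and parameter name are hypothetical:

```python
# Hypothetical endpoint and parameter name, just to illustrate the pattern.
base = "https://api.example.com/search"
values = ",".join(f"value{i}" for i in range(1, 3001))  # 3000 values
url = f"{base}?param={values}"

# About 28,900 characters here -- far past the ~2,000-character limit
# usually quoted as "safe" for all browsers.
print(len(url))
```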

Notes

  • I understand that in the case of a parameter passed in through a URL, the parameter most likely ends up in some query, and that different databases may balk at long queries (I know, for instance, that SQL Server won't accept more than a few thousand values in an IN clause); a batching workaround is sketched after these notes
  • I also understand that different browsers will have limitations, but I am not interested in that (so long as some browser or other software can access the URL)
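
If a backend limit (like the IN-clause one mentioned above) does bite, one common workaround is to batch the values across several requests. A rough sketch, with a hypothetical `fetch` call standing in for the real request:

```python
def chunked(seq, size):
    """Yield successive batches of `size` items from `seq`."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

values = [f"value{i}" for i in range(1, 10_001)]
for batch in chunked(values, 2_000):  # keep each request under the limit
    query = "param=" + ",".join(batch)
    # fetch(f"https://api.example.com/search?{query}")  # hypothetical request
```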
asked by stevec
  • Do the [answers](https://stackoverflow.com/a/2914825/2225787) to the [linked question](https://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers/11551718) not address your question? If not, please clarify why not. – Agi Hammerthief Aug 15 '19 at 11:54
  • Possible duplicate of [What is the maximum length of a URL in different browsers?](https://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers) – Agi Hammerthief Aug 15 '19 at 11:56
  • @Agi those answers talk practically (about the fact that if a web developer makes URLs longer than 2000 characters, most browsers won't be able to access them). I am just wondering what the maximum length could be if *only one* browser/tool has to access it – obviously whichever tool allows the most determines the achievable maximum – stevec Aug 15 '19 at 11:56
  • That's all well and good, but in the field, you can't guarantee what browser will be used. (Some people *still* use IE/Edge, despite every decent developer avoiding it.) The effective maximum is the smallest value that any of the common browsers can handle. *id est*: If IE can only handle 2000, while Firefox handles 5000, then the maximum is 2000. The fact that Firefox can handle the most is irrelevant (unless you don't care that your Support team's going to be fielding calls from customers complaining that your API doesn't work with IE/Edge), even if you specify that IE/Edge isn't supported. – Agi Hammerthief Aug 15 '19 at 12:11
  • @AgiHammerthief If I make an API and the client it serves (e.g. an application or a simple request through curl), then I can choose the server tech that allows the max and the client that allows the max of all clients. So a mathy way of asking my question is `min(max(s), max(c))` where `s` is the set of all max URL lengths across all servers and `c` is the set of all max URL lengths across all clients (including curl, browsers, Python/R libraries like beautiful soup (Python), rvest (R), etc.) – stevec Aug 15 '19 at 12:14
  • @AgiHammerthief the linked question *sounds* like what I'm asking, but what it's really asking is "what's a *sensible* upper limit on URLs when designing a website that will be accessed from a variety of browsers/clients". What I'm asking is far more specific – stevec Aug 15 '19 at 12:17

0 Answers