I wonder if this is a mistake, but right now I'm using GET for all my search URLs. The reason is that with a GET URL, users can simply copy the link from the address bar and share or save it easily. Google, for example, also seems to use GET for its search form.
Since it's a search form with filters, sorters and such, the length of the generated URL can be unpredictable. I'm worried about edge cases where the URL may exceed the length limit. One possible solution is to POST the data, generate a unique hash from that data, and then use that hash as the URL for the search. This solution requires saving the actual form data (to a database, for example) and re-querying it on every search request, which seems wasteful.
Is there any other way I haven't thought of yet?
Edit: I would like to say thanks for all your answers here; they helped a lot. I'll summarize below what I did to solve this, in the hope it helps someone else:
- As pointed out, if the URL gets too long (over 2000 chars) then I'm probably doing something wrong. So I went back, revisited my algorithm, and managed to cut the info I pass via the GET string to less than half of what I had (previously it could easily get over 500 chars, which worried me).
- I also JSON-encoded my data. The reason is that deeply nested arrays don't serialize well in a query string, and by JSON-encoding the array I actually got a shorter and easier-to-read result.
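The JSON-encoding idea can be sketched roughly like this (the `filters` structure and the `f` parameter name are just illustrative assumptions, not from the original post):

```python
import json
from urllib.parse import urlencode, parse_qs

# Hypothetical nested filter structure that would be awkward to express
# as repeated bracketed keys (filters[0][field]=... style).
filters = {"category": [1, 2, 3], "price": {"min": 10, "max": 50}}

# Serialize the nested part as one compact JSON value in the query string.
query = urlencode({"q": "laptop", "f": json.dumps(filters, separators=(",", ":"))})

# On the backend, decode the parameter back into the original structure.
params = parse_qs(query)
restored = json.loads(params["f"][0])
assert restored == filters
```

Note that `urlencode` percent-escapes the JSON, so the on-the-wire string is a bit longer than the raw JSON, but it's still typically shorter than deeply nested bracket notation.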
- Another solution is to write your own parser. For example, if you want an even shorter URL you could write category=1,2,3,4,5, and since you already know your query structure you can do the parsing in your backend. It takes a bit more work, so I haven't tried it and won't until I really have to.
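A minimal sketch of that custom comma-separated format (the function names are hypothetical):

```python
def encode_categories(ids):
    # Join IDs as "1,2,3,4,5" instead of repeating category[]=1&category[]=2...
    return ",".join(str(i) for i in ids)

def decode_categories(value):
    # Because the backend already knows the structure, a simple split suffices.
    return [int(part) for part in value.split(",") if part]

assert encode_categories([1, 2, 3, 4, 5]) == "1,2,3,4,5"
assert decode_categories("1,2,3,4,5") == [1, 2, 3, 4, 5]
```

The trade-off is that you lose the self-describing nature of a standard query string or JSON, so the format has to be documented and validated by hand.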
- I haven't tried the hashing/token route yet, but I believe it's also a good solution if you really have to handle huge input: POST the input, then issue back a hash string token to use as the search URL.
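The hashing/token route could look something like the sketch below. The in-memory dict stands in for a real database or cache, and the 12-character truncation is just an assumption to keep URLs short; everything here is illustrative, not a tested implementation:

```python
import hashlib
import json

# Hypothetical in-memory store; in practice this would be a DB table or cache.
saved_searches = {}

def save_search(form_data):
    # Canonical JSON (sorted keys, no whitespace) so identical filter sets
    # always map to the same token.
    payload = json.dumps(form_data, sort_keys=True, separators=(",", ":"))
    token = hashlib.sha256(payload.encode()).hexdigest()[:12]
    saved_searches[token] = payload
    return token  # used as e.g. /search/<token>

def load_search(token):
    # Re-query the stored form data on each search request.
    return json.loads(saved_searches[token])

token = save_search({"category": [1, 2], "sort": "price"})
assert load_search(token) == {"category": [1, 2], "sort": "price"}
```

One nice side effect of hashing the canonical payload (rather than issuing random IDs) is deduplication: repeat searches with the same filters reuse the same row and the same shareable URL.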