
A REST question. Let's say I have a database with a million items in it. I want to retrieve, say, 10,000 of them via a REST GET, passing the IDs of the 10,000 items in the GET request. Using URL query parameters, it'll quickly exceed the maximum length of a URL. How do people solve this? Use a POST instead and pass them in the body? That seems hacky.

1 Answer


You should not address this through URL parameters; URLs have a practical limit of about 2,000 characters: Url limit

I guess what you are doing is something like this:

https://localhost/api/applicationcollections/(47495dde-67d2-4854-add0-7e25b2fe88e4,1b470799-cc8a-4284-9ca7-76dc34a5aebb)

If you are planning to get more than 10k records, you can pass the information in the body of the request, which has no practical length limit. Technically you could do it through a POST request, but retrieving data is not the intent of the POST verb's semantics. Even a GET can include a body (HTTP GET with request body), but the body is not considered part of the GET semantics either.
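To make that concrete, here is a minimal sketch in Python of putting the IDs in a POST body instead of the URL. The endpoint path and the `ids` field name are illustrative assumptions, not part of any real API:

```python
import json
import urllib.request

# Two of the GUIDs from the URL example above; in practice this list
# could hold thousands of IDs without hitting any URL length limit.
ids = [
    "47495dde-67d2-4854-add0-7e25b2fe88e4",
    "1b470799-cc8a-4284-9ca7-76dc34a5aebb",
]

# The IDs travel in the request body as JSON, not in the URL.
payload = json.dumps({"ids": ids}).encode("utf-8")

req = urllib.request.Request(
    "https://localhost/api/applicationcollections/search",  # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here
# because the endpoint above does not exist.
```

The URL stays short no matter how many IDs you send; the trade-off is that you are using POST for what is semantically a read.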

Normally you don't filter 10k elements by ID. Instead, you get 10k elements per request, passing pagination parameters through the URL if you want. Even that can strain your app, especially considering that the DTO has more than one field, like

ApplicationDto
 field1
 field2....
 field15

Below is an example of how to pass pagination parameters and get the first 10k records:

https://localhost:44390/api/applications?pageNumber=1&pageSize=10000
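A small Python sketch of building those pagination URLs, so a client can walk through the pages; the base URL is the example above and the parameter names follow it:

```python
from urllib.parse import urlencode

def page_url(base: str, page_number: int, page_size: int) -> str:
    # Append pageNumber/pageSize as query parameters, matching the
    # example URL shown above.
    query = urlencode({"pageNumber": page_number, "pageSize": page_size})
    return f"{base}?{query}"

url = page_url("https://localhost:44390/api/applications", 1, 10000)
# -> https://localhost:44390/api/applications?pageNumber=1&pageSize=10000
```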

Also, the API should return an extra header, let's call it X-Pagination, where you can find out whether there are more pages to fetch, as well as the total number of elements.
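The exact shape of such a header is up to the API; a common choice is a small JSON object. Assuming field names like `currentPage` and `totalPages` (an assumption, not a standard), a client could read it like this:

```python
import json

# Sample X-Pagination header value; the field names are assumptions --
# they depend on how the API chooses to serialize its paging metadata.
header = '{"totalCount": 1000000, "pageSize": 10000, "currentPage": 1, "totalPages": 100}'

meta = json.loads(header)
# Decide whether another page request is needed.
has_next = meta["currentPage"] < meta["totalPages"]
```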

As an extra effort to reduce the size of the response, you can shape the data and request only the fields you need. ApplicationDto should then bring only field1 and field3; see below:

https://localhost:44390/api/applications?fields=field1,field3
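On the server side, that `fields` parameter boils down to projecting each record onto the requested fields. A minimal Python sketch (the function name and record layout are illustrative):

```python
def shape(record: dict, fields: str) -> dict:
    # Keep only the comma-separated fields the client asked for;
    # unknown field names are simply ignored.
    wanted = {f.strip() for f in fields.split(",")}
    return {k: v for k, v in record.items() if k in wanted}

record = {"field1": "a", "field2": "b", "field3": "c"}
shape(record, "field1,field3")  # -> {"field1": "a", "field3": "c"}
```

With 15 fields per DTO, dropping the 13 you don't need shrinks every one of the 10k records in the response.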

See how Twitter addresses this problem as well: Twitter cursor response

Hope this helps

Zinov
  • Thanks, that helps somewhat. But it's difficult since in our application the 10,000 IDs come from one application and are used to retrieve data in a different application. I'm leaning now towards doing multiple REST GETs, splitting it up into, say, batches of 100 at a time. Not really happy with that either. – Stefan Edlund Jun 22 '19 at 02:39
  • @StefanEdlund is this application sending 10k records via HTTP at once? It makes sense to act as a proxy for the data or transform these records into another output. What happens if the request gets interrupted? You lose 10k records because the request didn't complete, no? – Zinov Jun 24 '19 at 13:55
  • Also, my suggestion is to use streams when posting data to or reading data from your API – Zinov Jun 24 '19 at 20:29
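The batching approach mentioned in the comments, splitting 10,000 IDs into requests of 100, is easy to sketch in Python; the batch size and ID format are just for illustration:

```python
def batches(ids: list, size: int = 100):
    # Yield successive slices of at most `size` IDs, each of which
    # would become one GET (or POST) request of its own.
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

ids = [f"id-{n}" for n in range(10_000)]
chunks = list(batches(ids))
len(chunks)  # -> 100 requests of 100 IDs each
```

The upside is that each URL (or body) stays small and a failed request only loses one batch; the downside is 100 round trips instead of one.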