
I have a table with a bunch of user IDs. I want to take those user IDs, make API requests, and get each user's full name. The table will contain duplicate IDs. What is the most elegant/proper way to do this so I don't bash the database with 100s or 1000s of API calls for the same user ID?

Should I store all the user IDs in an array, filter out the duplicates, and then make the API calls? But even after filtering out duplicates I could still have 100s or 1000s of unique IDs, so it would still bash the server with requests. I was also looking at share(), but from my limited understanding, share() is for an API that returns multiple pieces of information in one call, and that one call can be reused for the different pieces, correct? For example, api/{2} would respond with data: { name: jon, id: 2, job: cashier }, so one async call could use the same response for the name in one place and for the job in another, correct?

And shareReplay is also concerned with the data that was returned, correct?
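For context, here's roughly the kind of per-ID caching I was imagining with shareReplay (just a sketch; `getUser` is a placeholder for the real HTTP call and `User` is a made-up shape):

```typescript
import { Observable } from 'rxjs';
import { shareReplay } from 'rxjs/operators';

interface User { id: number; name: string; job: string; }

// getUser is a placeholder for the real API call,
// e.g. (id) => this.http.get<User>(`api/${id}`).
class UserCache {
  // One cached observable per user ID, so repeated IDs reuse the
  // same underlying request instead of firing a new one.
  private cache = new Map<number, Observable<User>>();

  constructor(private getUser: (id: number) => Observable<User>) {}

  get(id: number): Observable<User> {
    if (!this.cache.has(id)) {
      // shareReplay(1) replays the last response to every later subscriber,
      // so each unique ID triggers at most one API request.
      this.cache.set(id, this.getUser(id).pipe(shareReplay(1)));
    }
    return this.cache.get(id)!;
  }
}
```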

Maybe I'm not understanding RxJS correctly.

What I'm trying to get at is: how can I properly/elegantly make unique API calls without redundancy, and limit bashing the server with possibly 1000s of calls back to back?

Thanks in advance for any suggestions and help.

Edit: I forgot to mention that I then want to take the full name and merge it back into the data from the original API call (with mergeMap?).
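Something like this is what I mean by merging the name back in (again just a sketch; `getName` is a placeholder, e.g. the cached lookup above mapped to its `name` field, and `Row` stands in for my table's row type):

```typescript
import { from, Observable } from 'rxjs';
import { map, mergeMap, toArray } from 'rxjs/operators';

interface Row { userId: number; }                    // a row from the table
interface RowWithName extends Row { fullName: string; }

// getName is assumed to return Observable<string> for a given user ID.
function attachNames(
  rows: Row[],
  getName: (id: number) => Observable<string>
): Observable<RowWithName[]> {
  return from(rows).pipe(
    // Look up each row's name and merge it back onto the row.
    mergeMap(row =>
      getName(row.userId).pipe(map(fullName => ({ ...row, fullName })))
    ),
    // Gather the enriched rows back into a single array.
    toArray()
  );
}
```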

  • If your API accepts one ID at a time, you have to make multiple calls to your API; otherwise you can send an array of IDs and return an array of [{ID: number, Name: string}] from your API. – Raycas Feb 24 '21 at 06:17
  • Agree with @Raycas. Additionally, try using [Set](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set)s to obtain unique data. – Kaustubh Badrike Feb 24 '21 at 06:25
  • This might help you, just try: https://stackoverflow.com/questions/61087291/multiple-api-calls-with-same-request – rdr20 Feb 24 '21 at 06:35
  • As suggested, you could use a Set to obtain the unique elements in the array. After that you could follow my post [here](https://stackoverflow.com/a/62872189/6513921) to do a bunch of buffered parallel requests instead of multiple parallel requests at once. – ruth Feb 24 '21 at 10:45
  • `mergeMap` has a `concurrent` argument that can limit the number of concurrent inner subscriptions at once. You can use this to put a max on how many calls are being made to your server at a time. `concatMap` is really just `mergeMap` with `concurrent = 1`. – Mrk Sef Feb 24 '21 at 17:17
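
Putting the comments together (dedupe with a Set, then cap concurrency with mergeMap), a minimal sketch could look like this; `getUserName` is a placeholder for the actual API call:

```typescript
import { from, Observable } from 'rxjs';
import { mergeMap, toArray } from 'rxjs/operators';

interface UserName { id: number; name: string; }

// getUserName is a placeholder for the real API call,
// e.g. (id) => this.http.get<UserName>(`api/${id}`).
function loadUniqueNames(
  userIds: number[],
  getUserName: (id: number) => Observable<UserName>,
  concurrent = 5               // at most 5 requests in flight at once
): Observable<UserName[]> {
  const uniqueIds = Array.from(new Set(userIds)); // drop duplicate IDs up front
  return from(uniqueIds).pipe(
    // The second argument to mergeMap caps concurrent inner subscriptions,
    // so the server never sees more than `concurrent` requests at a time.
    mergeMap(id => getUserName(id), concurrent),
    toArray()
  );
}
```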

0 Answers