
I have hundreds of elements to fetch from a MongoDB database and print on the front-end. Fetching them all in a single request could hurt performance, since it carries a big payload in the response body. So I'm looking for a way to split my Angular request into several, with the constraint that they run simultaneously.

Example :

MONGODB Collection: Elements (_id, name, children, ...) Documents: 10000+

But we only need ~100-500 elements

ANGULAR :

    const observables = [];
    const iteration = 5, idParent = 'eufe4839ffcdsj489483'; // example
    for (let i = 0; i < iteration; i++) {
      observables.push(
        this.myHttpService.customGetMethod<Element>(COLLECTION_NAME, endpoint, 'idParent=' + idParent + '&limit=??')); // url with query
    }

    forkJoin(observables).subscribe(
      data => {
           this.elements.push(data[0]);
           this.elements.push(data[1]);
      },
      err => console.error(err)
    );

I use forkJoin because I need simultaneous requests for better performance. The idea is to send multiple requests to the back-end with different limit parameter values and end up with the whole set of elements in the data object. The only purpose is to split the request to avoid high latency and possible errors due to the size of the payload body.

How can I proceed with the given stack to perform such an operation? I would like to avoid using websockets.
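One way to split the load is to give each parallel request its own non-overlapping `skip`/`limit` window. A minimal sketch of that idea using plain Promises instead of the Angular HttpClient (`fetchChunk`, the chunk size, and the fake backend are assumptions standing in for `customGetMethod` and the real endpoint):

```typescript
// Sketch: split one big fetch into N non-overlapping windows and run them
// in parallel. fetchChunk simulates an HTTP GET with ?skip=..&limit=..
// against a hypothetical endpoint; replace it with your customGetMethod.
const TOTAL = 500;  // elements actually needed
const CHUNKS = 5;   // parallel requests
const LIMIT = TOTAL / CHUNKS;

// Fake backend: 500 elements with ids 0..499.
async function fetchChunk(skip: number, limit: number): Promise<number[]> {
  const all = Array.from({ length: TOTAL }, (_, i) => i);
  return all.slice(skip, skip + limit);
}

async function fetchAll(): Promise<number[]> {
  const requests = Array.from({ length: CHUNKS }, (_, i) =>
    fetchChunk(i * LIMIT, LIMIT)             // windows: 0-99, 100-199, ...
  );
  const pages = await Promise.all(requests); // all requests run in parallel
  return pages.flat();                       // merge in order
}

fetchAll().then(elements => console.log(elements.length)); // 500
```

With RxJS the shape is the same: build one observable per window and pass the array to `forkJoin`, which plays the role of `Promise.all` here.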

mtnp

1 Answer


I think forkJoin is used when you want to resolve all the observables in parallel, but it waits for all of the requests. What if one fails? forkJoin completes as soon as it encounters the first error, and you can't easily tell which observable it came from. But if you handle errors inside the inner observables, you can easily achieve that.

    // needs: import { forkJoin } from 'rxjs'; import { catchError } from 'rxjs/operators';
    const observables = [];
    const iteration = 5, idParent = 'eufe4839ffcdsj489483'; // example
    for (let i = 0; i < iteration; i++) {
      observables.push(
        this.myHttpService.customGetMethod<Element>(COLLECTION_NAME, endpoint, 'idParent=' + idParent + '&limit=??') // url with query
          .pipe(catchError(() => {
            throw `an error in request #${i}`; // re-throw with the request index
          }))
      );
    }

    forkJoin(observables).subscribe(
      data => {
           this.elements.push(data[0]);
           this.elements.push(data[1]);
      },
      err => console.error(err)
    );

The other way could be to introduce infinite scrolling (e.g. ngx-infinite-scroll) if you want to show the data as a list.

You can also add pagination on the frontend if that matches your requirements. One lib which might help you is Syncfusion grids. There are other ways to improve performance on the backend side too.
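For the pagination route, the page-window arithmetic is small enough to keep in a helper; a sketch (function names are made up) that computes the `skip`/`limit` pair for a page and slices an already-loaded array the same way for purely client-side paging:

```typescript
// Sketch: map a page number to a skip/limit window, and slice a loaded
// array with the same window for client-side pagination.
function pageWindow(page: number, pageSize: number): { skip: number; limit: number } {
  return { skip: page * pageSize, limit: pageSize };
}

function pageOf<T>(items: T[], page: number, pageSize: number): T[] {
  const { skip, limit } = pageWindow(page, pageSize);
  return items.slice(skip, skip + limit);
}

const items = Array.from({ length: 12 }, (_, i) => i);
console.log(pageOf(items, 2, 5)); // [10, 11]
```

The same `pageWindow` values can be passed to the backend as query parameters when the data is paged server-side instead.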

Apoorva Chikara
  • Thanks for the error-handling tips. Yes, I want to parallelize my requests because each one queries a different chunk of the global parent object. The problem is: I need to be careful not to fetch the same entities twice and not to miss any. This is why the limit parameter is very important here. But I don't know how to implement this as a reliable Angular/Node/MongoDB architecture (taking into account concurrent requests which can update the DB, MongoDB's sort implementation, ...) – mtnp May 03 '21 at 12:25
  • What do you mean by `taking account concurrent request which can update DB, MongoDB sort implementation`? Node/MongoDB is really good at handling concurrent requests; that's why you have Node.js. You should use mongoose, which is mature and fetches and updates data efficiently. – Apoorva Chikara May 03 '21 at 12:37
  • Concurrent requests that update elements can cause trouble during fetch time. As an example, 5 requests with limits 0-100, 100-200, ..., 400-500 are not reliable because any of these elements can change at any moment. So I'm looking for a way to avoid that. One solution would be to get the 500 elements from the DB in a single operation and then split the response across the Angular requests in NodeJS, if that's possible. – mtnp May 03 '21 at 14:08
  • https://stackoverflow.com/a/60917515 The second paragraph gives an idea of what I'm looking for, but for a GET request. – mtnp May 03 '21 at 14:15
  • Yep, I see, but you want to do the reverse and show the data on the front-end. Whenever there is any update in the data, just send that part to the front-end; it must have some sort of primary key, and you can use that to replace the changed element. I am just trying to think broadly about what else could be done for such scenarios. – Apoorva Chikara May 03 '21 at 14:19