
I want to load and process files in parallel. My problem is that the data processing in my subscription isn't run asynchronously; only the HTTP fetch is. How can I also run the data processing in parallel?

```typescript
await Promise.all(requestsArray.map((request) => {
  return new Promise((resolve, reject) => {
    this.http.post(request, new FormData()).subscribe(json => {
      console.log('load data in parallel from API for ' + request);

      mylongprocessingfunction(json); // blocks further processing of other files

      resolve(json);
    }, error => {
      console.log(error);
      reject({ error: error });
    });
  });
}));
```
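Because JavaScript runs all of these subscribe callbacks on the single main thread, the synchronous call to `mylongprocessingfunction` serializes the processing no matter how the promises are arranged. A minimal sketch of one workaround is to chunk the work so the event loop can interleave the other responses between chunks; this assumes the processing can be split per item, and `processItem` here is a hypothetical stand-in for one unit of `mylongprocessingfunction`'s work:

```typescript
// Process an array in chunks, yielding to the event loop between chunks
// so other callbacks (including the other requests' subscriptions) can run.
async function processInChunks<T, R>(
  items: T[],
  processItem: (item: T) => R,
  chunkSize = 50
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    // setTimeout(…, 0) hands control back to the event loop briefly.
    await new Promise(resolve => setTimeout(resolve, 0));
  }
  return results;
}
```

Note that this only keeps the page responsive and lets the requests interleave; the processing still runs on one CPU core.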
daniel
  • What's in `mylongprocessingfunction(json)`? What exactly is it doing? Can you show us the code for that because the advice on appropriate techniques depends entirely upon what that is doing? – jfriend00 Jan 08 '21 at 18:29
  • time consuming data processing of the json – daniel Jan 08 '21 at 18:30
  • I believe you're looking for web workers, though I've never knowingly used them – rwusana Jan 08 '21 at 18:32
  • Is this code running in the browser or in nodejs? – jfriend00 Jan 08 '21 at 18:36
  • 1
    If this is running in the browser, then this would be relevant [How to process large array with blocking the UI](https://stackoverflow.com/questions/10344498/best-way-to-iterate-over-an-array-without-blocking-the-ui/10344560#10344560) as it covers both chunking the work and using webWorkers. – jfriend00 Jan 08 '21 at 18:37
  • it runs in the browser – daniel Jan 08 '21 at 18:42
  • Then, for actual parallel processing of CPU intensive work, you will have to offload the processing to a webWorker. Only then can you get more than one CPU involved in the processing. – jfriend00 Jan 08 '21 at 19:03
  • Or, get your server to do more of the processing for you so the data is already in an easy-to-consume state. – jfriend00 Jan 08 '21 at 19:11
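For true parallelism across CPU cores, as the comments suggest, the processing has to move into a Web Worker. A sketch of wrapping one worker round-trip in a Promise so it can be awaited like the HTTP call; the `WorkerLike` interface and the `processing.worker.js` filename are assumptions for illustration, not from the question:

```typescript
// Minimal Worker-like interface, so the wrapper can also be exercised
// outside the browser; in the app you would pass a real `new Worker(...)`.
interface WorkerLike {
  postMessage(data: any): void;
  onmessage: ((e: { data: any }) => void) | null;
  onerror: ((e: any) => void) | null;
  terminate(): void;
}

// Wrap one request/response exchange with a worker in a Promise.
// The worker script (e.g. a hypothetical processing.worker.js) would do:
//   self.onmessage = (e) => self.postMessage(mylongprocessingfunction(e.data));
function runInWorker<T, R>(worker: WorkerLike, payload: T): Promise<R> {
  return new Promise<R>((resolve, reject) => {
    worker.onmessage = (e) => { worker.terminate(); resolve(e.data as R); };
    worker.onerror = (err) => { worker.terminate(); reject(err); };
    worker.postMessage(payload);
  });
}
```

In the subscribe callback, `mylongprocessingfunction(json)` would then become something like `await runInWorker(new Worker('processing.worker.js'), json)`, spawning one worker per request so each response is processed on its own thread.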

0 Answers