
I'm building a data-scraping script that fetches millions of records starting from a single URL call and inserts them into the database. But it runs into memory problems, with usage growing constantly. What is the best way to handle this? My code looks something like:

const request = require('request');

request('call to url', function (err, response) {
    // process other stuff based on the response
    // then call another URL for each record, in a loop
    for (let i = 0; i < response.length; i++) {
        request('call to other url', function (err, response) {
            // insert the result into the database
        });
    }
});
asked by Manohar Kumar
  • If `response.length` in your `for` loop is a big number, then you can't run a huge number of `request()` operations all at once without consuming large amounts of memory. Instead, you need to run a measured number of requests in parallel at the same time such as 10-20 requests at once. – jfriend00 Oct 27 '19 at 05:31
  • For an idea how to run N requests at a time see: https://stackoverflow.com/questions/34802539/node-js-socket-explanation/34802932#34802932. – jfriend00 Oct 27 '19 at 05:55
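A minimal sketch of the pattern that comment describes, keeping only a fixed number of requests in flight at a time. It assumes the list of record URLs has already been collected from the first response; the CONCURRENCY value, the placeholder urls array, and the database-insert comment are illustrative, not from the original post.

const request = require('request');

const CONCURRENCY = 10;   // illustrative limit: how many requests to keep in flight at once
const urls = [];          // assumed to be filled with record URLs from the first response

let index = 0;            // next URL to start
let active = 0;           // requests currently in flight

function runNext() {
    // start requests until we hit the concurrency limit or run out of URLs
    while (active < CONCURRENCY && index < urls.length) {
        const url = urls[index++];
        active++;
        request(url, function (err, response, body) {
            if (err) {
                console.error('request failed:', err.message);
            } else {
                // insert `body` into the database here
            }
            active--;
            runNext();    // a slot freed up, start the next URL
        });
    }
}

runNext();

Because at most CONCURRENCY request callbacks and their response bodies are alive at any moment, memory use stays roughly flat no matter how many URLs there are in total.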

0 Answers