I'm building a data-scraping script that fetches millions of records from a single URL call and inserts them into the database, but it suffers from steadily growing memory usage. What is the best way to handle this? My code looks something like:
const request = require('request');

request('call to url', function (error, response, body) {
  // process other stuff based on the response
  // call the other URL in a loop
  for (let i = 0; i < response.length; i++) {
    request('call to other url', function (error, response, body) {
      // insert into the database
    });
  }
});
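One common cause of the memory growth is that the loop fires every inner request at once, so millions of in-flight requests and their responses pile up in memory. A minimal sketch of one way to bound this, processing records in fixed-size batches so only a handful are in memory at a time. The `fetchRecord` and `insertToDb` functions here are hypothetical stand-ins for the real URL call and database insert:

```javascript
// Hypothetical stand-in for the per-record URL call.
async function fetchRecord(id) {
  return { id, payload: `record-${id}` };
}

// Hypothetical stand-in for the database insert.
const inserted = [];
async function insertToDb(record) {
  inserted.push(record.id);
}

// Process `ids` in chunks of `batchSize`: each batch is fully fetched
// and inserted before the next batch starts, so at most `batchSize`
// requests and result objects exist at any moment.
async function processInBatches(ids, batchSize) {
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    const records = await Promise.all(batch.map(fetchRecord));
    await Promise.all(records.map(insertToDb));
  }
}

processInBatches([1, 2, 3, 4, 5], 2).then(() => {
  console.log(inserted.length);
});
```

For a truly huge first response, the same idea applies one level up: stream or page the initial URL instead of parsing millions of records into one array, then feed each page through a batched loop like this.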