I have an array of tracking links (about 30) that I want to resolve one by one to find the real URLs hidden behind them. Once that's done, I want to save the "real" URLs to a JSON file.
The URLs look something like this before they have been "checked": https://www.trackinglink.com/1
and something like this afterwards: https://www.amazon.com/
I have solved the "uncovering" of the tracking links using the `request` package, and that part works. What I can't get to work is delaying the writing of the JSON file until all the URLs have been requested/checked.
I know the solution involves async/await or promises, but I can't get it to work in Node. For someone more experienced, this is probably a matter of a few minutes.
The concept of asynchronous programming is pretty much new to me, but I have spent my fair share of hours researching it. I think I have difficulties transferring the knowledge out there to my specific problem.
I'd really appreciate the help. Cheers!
const request = require('request');
const fs = require('fs');

let listWithRealUrls = [];

function grabAndSaveRealUrls() {
    let Urls = ['https://www.trackinglink/1', 'https://www.trackinglink/2', 'https://www.trackinglink/3'];

    for (const Url of Urls) {
        request.get(Url, function () {
            // `this` is the Request object; `uri.href` is the final URL after redirects
            let realUrl = this.uri.href;
            listWithRealUrls.push(realUrl);
        });
    }

    // Problem: this runs immediately, before any of the callbacks above have fired,
    // so the array is still empty when the file is written
    fs.writeFile('data.json', JSON.stringify(listWithRealUrls), function (err) {
        if (err) {
            console.log(err);
        } else {
            console.log('success');
        }
    });
}

grabAndSaveRealUrls();