I am building a script that retrieves data from an API to build my own database. I'm rate limited to a relatively low number of calls per hour.
I must make several calls to different databases to gather the data I need, and they have to be made sequentially, because I don't know what to request from the next database until I receive the value from the first one.
I need to run this sequence of fetches for each item in an array of items that I want to gather additional data for.
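To make the dependency concrete, the per-item work is shaped roughly like this (the endpoints are placeholders, not my real API):

// Hypothetical sketch: the second request needs a value from the first,
// so the two calls for a single item can't run in parallel.
const fetchItemDetails = (item) =>
    fetch(`https://api.example.com/lookup?name=${encodeURIComponent(item)}`)
        .then((response) => response.json())
        .then(({ id }) =>
            // Only once the id is known can the second database be queried.
            fetch(`https://api.example.com/details/${id}`)
        )
        .then((response) => response.json());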
I have written the script below as boilerplate, but I feel it's not as simple or as close to best practice as it could be, and I'm looking for critique.
The setTimeout is there to slow down the fetches so I don't exceed the rate limit and get blocked. Chaining promises inside a reduce seemed like the best way, but that's the part I'm not sure about. Is there a simpler way to accomplish this?
let myArray = ["beans", "soup", "peanuts", "artichokes"];
myArray.reduce((promise, item) => {
return promise.then(() => {
return itemsPromise(item);
});
}, Promise.resolve()).then(() => {
console.log("ALL DONE");
})
let itemsPromise = (item) => {
console.log("Item: ", item);
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve();
}, 2000);
});
}
This script successfully logs each item to the console with the expected 2-second delay between items. In real life, I'll be running an API call instead of logging to the console.
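For reference, the real itemsPromise will look roughly like this (the endpoint and the commented-out save step are placeholders, not my actual code):

// Hypothetical real-world shape of itemsPromise: make the API call for the item,
// then wait 2 seconds before resolving so the next item's call is spaced out.
let itemsPromise = (item) =>
    fetch(`https://api.example.com/items/${encodeURIComponent(item)}`)
        .then((response) => response.json())
        .then((data) => {
            // saveToDatabase(data); // store the result locally
            return new Promise((resolve) => setTimeout(resolve, 2000));
        });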