You might want to check out puppeteer-cluster (disclaimer: I'm the author of that library), which supports your use case. It runs tasks in parallel, but also takes care of error handling, retrying, and some other things.
You should also keep in mind that opening 10 pages for 10 URLs at the same time is quite costly in terms of CPU and memory. With puppeteer-cluster you can use a pool of browsers or pages instead, so the concurrency stays limited.
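How the pool is built is controlled by the `concurrency` option. As far as I remember from the library's README, there are three modes; treat this fragment as illustrative rather than a complete program:

```javascript
const { Cluster } = require('puppeteer-cluster');

// The concurrency option controls what each worker gets:
// Cluster.CONCURRENCY_PAGE    - one page per worker, all sharing a single browser
// Cluster.CONCURRENCY_CONTEXT - one incognito context per worker (isolated cookies/cache)
// Cluster.CONCURRENCY_BROWSER - one full browser per worker (most isolation, most cost)
const cluster = await Cluster.launch({
    concurrency: Cluster.CONCURRENCY_BROWSER,
    maxConcurrency: 2, // at most two browsers running at the same time
});
```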
Code Sample
You can see a minimal example below. It's also possible to use the library in more complex settings.
    const { Cluster } = require('puppeteer-cluster');

    (async () => {
        const cluster = await Cluster.launch({
            concurrency: Cluster.CONCURRENCY_PAGE, // use one page per worker
            maxConcurrency: 4, // open up to four pages in parallel
        });

        // Define a task to be executed for your data; this function runs for each queued URL
        await cluster.task(async ({ page, data: url }) => {
            await page.goto(url);
            // ...
        });

        // Queue URLs (you can of course read them from an array instead)
        cluster.queue('http://www.google.com/');
        cluster.queue('http://www.wikipedia.org/');
        // ...

        // Wait for the cluster to idle and close it
        await cluster.idle();
        await cluster.close();
    })();
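For intuition, the concurrency limiting the cluster performs can be sketched without any dependencies. `runPool` below is a hypothetical helper for illustration, not part of the library's API:

```javascript
// Run task(item) over all items, with at most maxConcurrency tasks in flight.
// This mirrors the idea behind maxConcurrency in puppeteer-cluster (simplified:
// no retries, no error handling).
async function runPool(items, maxConcurrency, task) {
    const results = [];
    let next = 0;

    // Each worker repeatedly claims the next unprocessed index until none remain.
    async function worker() {
        while (next < items.length) {
            const index = next++; // synchronous claim, so no two workers share an index
            results[index] = await task(items[index]);
        }
    }

    // Start up to maxConcurrency workers and wait for all of them to drain the queue.
    await Promise.all(
        Array.from({ length: Math.min(maxConcurrency, items.length) }, worker)
    );
    return results;
}
```

With 10 URLs and `maxConcurrency` of 4, only four tasks ever run at once; each worker picks up the next URL as soon as its current one finishes.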