
I am using Puppeteer and puppeteer-cluster to take screenshots of received HTML. To "bypass" Node.js' single-threaded processing, I use several Docker containers.

The problem is that the render time is randomly prolonged: the same HTML is rendered once in 200 ms and the next time (same configuration, environment, etc.) in 8.2 s. Node's logs do not contain any errors.

Is there a solution for this?

KRiSTiN

1 Answer


There should be no reason to "bypass Node.js' single-threaded processing," as most Puppeteer actions are executed asynchronously. This sounds to me like you are solving a problem that does not really exist.
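To illustrate the point, a single Node.js process can run many asynchronous tasks concurrently on one thread. Here is a minimal sketch where `renderPage` is a hypothetical stand-in for a Puppeteer screenshot call (not a real Puppeteer API):

```javascript
// Hypothetical stand-in for an async render such as page.screenshot():
// resolves after 50 ms, simulating I/O-bound work.
const renderPage = (id) =>
  new Promise((resolve) => setTimeout(() => resolve(`shot-${id}`), 50));

async function main() {
  const start = Date.now();
  // Launch 8 "renders" at once; they overlap on a single thread,
  // so the total time is roughly one task's time, not 8 x 50 ms.
  const shots = await Promise.all(
    Array.from({ length: 8 }, (_, i) => renderPage(i))
  );
  console.log(shots.length, Date.now() - start);
}

main();
```

This is the same mechanism puppeteer-cluster uses internally: concurrent browser work in one process, without extra containers.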

Running Docker in addition to your script also adds overhead: besides your script, the OS needs to run Docker itself, which can prolong render times.

Check system resources

I'm assuming your system is not capable of running multiple Docker containers with multiple Chrome instances in each container. Monitor CPU and memory to see whether you are hitting any limits.

Thomas Dondorf
  • Don't think so, the CPU usage peaks at 60 %. In addition, there is no difference in processing time whether `maxConcurrency` equals `4` or `128` (I deliberately chose two "extreme" values). – KRiSTiN Aug 20 '19 at 15:46
  • @KRiSTiN Pretty hard to help without seeing any information about your system (CPU, memory, ...). Also, it would be nice to know the load average of your system. – Thomas Dondorf Aug 20 '19 at 15:50