
I've got about 30 databases (on different machines) with the same structure, and I want to run the very same query against each of them.

Normally I prepare the connections and then loop over them with foreach, connecting to each database, sending the query and waiting for the result.

I was thinking about running those queries in parallel processes, so that instead of the total time being the sum of all the waits (i.e. about 1 second per query per server), it would be the time of the longest-running query.

First I thought about mysqli::poll / MYSQLI_ASYNC, but it depends heavily on mysqli; a sketch of that approach is below.
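
For reference, this is roughly what the MYSQLI_ASYNC / mysqli_poll variant could look like. It is only a sketch: the host list, credentials, database name and query are placeholders, and error handling is kept to a minimum.

```php
<?php
// Rough sketch of the MYSQLI_ASYNC idea; $hosts, credentials and the query
// are placeholders.
$hosts = ['db01.example.com', 'db02.example.com' /* ... ~30 hosts ... */];
[$user, $pass, $db] = ['app_user', 'secret', 'app_db'];   // placeholder credentials
$sql = 'SELECT COUNT(*) AS cnt FROM some_table';

$links = [];
foreach ($hosts as $host) {
    $link = mysqli_connect($host, $user, $pass, $db);
    $link->query($sql, MYSQLI_ASYNC);                 // send the query, don't block
    $links[] = $link;
}

$results = [];
$pending = $links;
while ($pending) {
    $read = $error = $reject = array_values($pending);
    if (!mysqli_poll($read, $error, $reject, 1)) {    // wait up to 1 second
        continue;
    }
    foreach ($read as $link) {
        if ($result = $link->reap_async_query()) {    // collect a finished result
            $results[] = $result->fetch_all(MYSQLI_ASSOC);
            $result->free();
        }
    }
    // anything that showed up in one of the three arrays is done, drop it
    foreach (array_merge($read, $error, $reject) as $link) {
        if (($k = array_search($link, $pending, true)) !== false) {
            unset($pending[$k]);
        }
    }
}
// $results now holds one rowset per server, in completion order.
```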

I've found a similar question: PHP asynchronous mysql-query, but it's over 3 years old. Maybe somebody has found another way since?

The only extension-independent solution I can think of right now is using pcntl_fork to split the queries into parallel processes and then collecting the data through shared memory (see the sketch below).
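
A minimal sketch of that pcntl_fork variant, here using one shmop segment per child to pass the data back. Again, hosts, credentials and the query are placeholders and error handling is omitted.

```php
<?php
// Minimal sketch of the pcntl_fork idea: one child per host, each child runs
// the query and writes its serialized rows into its own shmop segment; the
// parent waits for the children and reads everything back.
$hosts = ['db01.example.com', 'db02.example.com' /* ... ~30 hosts ... */];
[$user, $pass, $db] = ['app_user', 'secret', 'app_db'];    // placeholder credentials
$sql  = 'SELECT COUNT(*) AS cnt FROM some_table';
$size = 1024 * 1024;          // 1 MB per result segment, adjust to your data
$base = 0xC0DE;               // arbitrary base key for the shmop segments

$pids = [];
foreach ($hosts as $i => $host) {
    $pid = pcntl_fork();
    if ($pid === 0) {                                       // child process
        $link = mysqli_connect($host, $user, $pass, $db);   // connect only after the fork
        $rows = $link->query($sql)->fetch_all(MYSQLI_ASSOC);
        $shm  = shmop_open($base + $i, 'c', 0644, $size);
        shmop_write($shm, serialize($rows), 0);
        exit(0);
    }
    $pids[$i] = $pid;                                       // parent just keeps forking
}

$results = [];
foreach ($pids as $i => $pid) {
    pcntl_waitpid($pid, $status);                           // wait for each child
    $shm = shmop_open($base + $i, 'a', 0, 0);
    $raw = shmop_read($shm, 0, shmop_size($shm));
    $results[$i] = unserialize(rtrim($raw, "\0"));          // strip the segment's zero padding
    shmop_delete($shm);                                     // release the segment
}
// $results is keyed by host index; total runtime ~ the slowest server.
```

Note that the children open their own MySQL connections after the fork, since sharing a mysqli connection across forked processes is not safe.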

Is there any other method in PHP to work around this and achieve the desired result?

  • IMO, PHP is not a good solution for that. Java / C# would be way better. – Kevin Kopf Feb 02 '18 at 15:45
  • Why not go with the MYSQLI_ASYNC feature? It is tailor-made to solve your problem. – Erik Kalkoken Feb 02 '18 at 15:49
  • You are asking with the tag "mysql", so why not use mysqli? You shouldn't have any server running with mysql anymore, since mysql is deprecated in every PHP version that still has at least community support. If you use a PHP version that still uses mysql, you should think about updating, since it's a security vulnerability. – SophieXLove64 Feb 02 '18 at 15:53
  • @SophieXLove64 `mysql` is a MySQL database tag, not `php-mysql` extension tag... – Kevin Kopf Feb 02 '18 at 15:58
  • @AlexKarshin Read the opening post. He was thinking about using mysqli, but he doesn't use mysqli yet, he uses mysql. It's all in the opening post. – SophieXLove64 Feb 16 '18 at 07:37

0 Answers