2

I have two websites hosted on two different servers. They are somewhat interlinked: sometimes I do something on Website-1 and then need to run a script on Website-2. For example, I edit something on Website-1 and then want to run a script on Website-2 so it updates accordingly on its server.

So far I have been using the following code on Website-1:

$file = file_get_contents('Website-2/update.php');

But the problem with this is that my Website-1 server script stops and waits for the file to return some data, and I don't want to do anything with that data. I just want to run the script.

Is there a better way to do this, or a way to tell PHP to move on to the next line of code?

  • You want the other server to kick off a background job. So you call the 2nd server and give it a command. It will start a background job that can execute that command... This way the direct call can return and the independent background job will do its thing. – ArtisticPhoenix Oct 06 '18 at 02:27
  • @ArtisticPhoenix I don't get what you are trying to say –  Oct 06 '18 at 02:36
  • If you want the request to execute in the background, maybe you are looking for multithreading. Check this post: https://stackoverflow.com/questions/70855/how-can-one-use-multi-threading-in-php-applications –  Oct 06 '18 at 02:55

3 Answers

0
If you want to call the second site without making your user wait for a response, I would recommend using a message queue: the Website-1 request puts a message on the queue, and a cron job on Website-2 checks the queue and runs the update whenever a message exists (a rough sketch follows the links below).

Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
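
Here is a rough sketch of the idea with a plain database table standing in for the queue (the hosted services above work the same way, just through their own client libraries). The connection details, the update_queue table and do_update() are placeholders:

<?php
// enqueue.php (producer, on Website-1) -- assumes both servers can reach a
// shared MySQL database with a table update_queue (id, payload, processed).
$pdo  = new PDO('mysql:host=db.example.com;dbname=shared', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO update_queue (payload, processed) VALUES (?, 0)');
$stmt->execute([json_encode(['action' => 'update', 'changed_at' => time()])]);
// Website-1 returns to the user right away; nothing here waits on Website-2.

<?php
// worker.php (consumer, on Website-2) -- run it from cron, e.g. once a minute.
$pdo = new PDO('mysql:host=db.example.com;dbname=shared', 'user', 'pass');
foreach ($pdo->query('SELECT id, payload FROM update_queue WHERE processed = 0') as $row) {
    do_update(json_decode($row['payload'], true));   // do_update() stands in for update.php's logic
    $pdo->prepare('UPDATE update_queue SET processed = 1 WHERE id = ?')
        ->execute([$row['id']]);
}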
user3720435
0

What you're trying to achieve is called a webhook, and it should be implemented with proper authentication so that not just anybody can execute your scripts at any time and overload your server.
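
For example, a minimal sketch of such a check on Website-2, assuming a shared secret known to both servers (the X-Signature header and WEBHOOK_SECRET are placeholders, not an existing convention):

<?php
// Top of update.php on Website-2: reject requests without a valid HMAC signature.
// Website-1 would POST a body and send hash_hmac('sha256', $body, $secret) in the header.
$secret   = getenv('WEBHOOK_SECRET');              // same secret stored on both servers
$body     = file_get_contents('php://input');
$given    = $_SERVER['HTTP_X_SIGNATURE'] ?? '';    // signature sent by Website-1
$expected = hash_hmac('sha256', $body, $secret);
if (!hash_equals($expected, $given)) {
    http_response_code(403);
    exit('Invalid signature');
}
// ...safe to queue or run the actual update below...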

On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.

You can also make the request from your Server 1 asynchronous. There are many ways to achieve this; here are some links with more details:

Async curl request in PHP

https://segment.com/blog/how-to-make-async-requests-in-php/
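
One of the simpler techniques from those links is a fire-and-forget socket request from Website-1: send the HTTP request and close the connection without reading the response. A rough sketch (the host and path are placeholders):

<?php
// Fire-and-forget call from Website-1: write the request, then close the
// socket without waiting for Website-2 to finish or send anything back.
function ping_async($host, $path) {
    $fp = fsockopen($host, 80, $errno, $errstr, 5);
    if (!$fp) {
        return false;
    }
    $out  = "GET {$path} HTTP/1.1\r\n";
    $out .= "Host: {$host}\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);   // don't read the response; PHP continues immediately
    return true;
}

ping_async('website-2.example.com', '/update.php');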

ege
  • As a user of RabbitMQ on a production server that does over 100 million searches a day, I would say this is a bit overkill for the problem at hand. But that is my opinion – ArtisticPhoenix Oct 06 '18 at 05:33
  • If the script runs so long that he doesn't want to wait for it, then it wouldn't take many requests to bring the server down. There's obviously the question "who would?", but it's still something to think about – ege Oct 06 '18 at 11:58
-1

Call your remote server as normal, but in the PHP script you normally call, move all of the functionality into a third script. Then, from the old script, call the new one with (on Linux):

exec('php -f "{path to new script}.php" $args  > /dev/null &');

The & at the end makes this a background (non-blocking) call. Because you call it from the remote server, you don't have to change anything on the calling server. php -f runs a PHP file, and > /dev/null discards that file's output.

On Windows you can use COM and WScript.Shell to do the same thing:

$WshShell = new \COM('WScript.Shell');
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);

You may want to use escapeshellarg on the filename and any arguments supplied.

So it will look like this

  • Server1 calls Server2
  • Script that was called (on Server2) runs exec and kicks off a background job (Server2) then exits
  • Server1 continues as normal
  • Server2 continues the background process

So, using your example, instead of calling:

file_get_contents('Website-2/update.php');

You will call

file_get_contents('Website-2/update_kickstart.php');

In update_kickstart.php, put this code:

<?php
 exec('php -f "{path}update.php" > /dev/null &');

Which will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to Server 1, which can go about its business, and update.php will run on Server 2 independently.

Simple...

One last note: file_get_contents is a poor choice here. I would use SSH, probably via PHPSecLib 2.0, to connect to Server 2 and run the exec command directly with a user that has access only to that file (chroot it or something similar). As it is, anyone can call that file and run it. Behind an SSH login it's protected, and if chrooted, that "special" user can only run that one file.
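
A rough sketch of that approach with phpseclib 2.0 (installed via Composer; the host, user and key path below are placeholders):

<?php
// On Website-1: SSH into Server 2 as a restricted user and kick off update.php
// in the background. Requires: composer require phpseclib/phpseclib:~2.0
require 'vendor/autoload.php';

use phpseclib\Net\SSH2;
use phpseclib\Crypt\RSA;

$key = new RSA();
$key->loadKey(file_get_contents('/path/to/private_key'));   // placeholder key path

$ssh = new SSH2('website-2.example.com');                    // placeholder host
if (!$ssh->login('update-user', $key)) {                     // the restricted user
    exit('SSH login failed');
}
$ssh->exec('php -f /path/to/update.php > /dev/null 2>&1 &');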

ArtisticPhoenix