
Let's say I have a webserver running something like WordPress for the public-facing interface, with Laravel in the background to generate some dynamic content.

Currently I use an iframe in WordPress to embed the Laravel page. I'd rather use file_get_contents, but the server hosts multiple sites, so I can't just download localhost/laravel/page; it has to be the server's URL/laravel/page.

Is there a way to get the HTML/results of the Laravel page from the command line (in PHP code) without going over the internet just to download the page?

i.e. the command to execute public/index.php, but with the data necessary to trigger the router properly.
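
To make that concrete, here is a minimal sketch of what I have in mind: bootstrap the Laravel install from another PHP script and hand its HTTP kernel a hand-built request, so the route is rendered in-process with no network round trip. The install path /var/www/laravel and the route /laravel/page are placeholders, and this assumes the classic bootstrap/app.php structure that public/index.php uses.

    <?php
    // Assumed install path of the Laravel application.
    require '/var/www/laravel/vendor/autoload.php';
    $app = require '/var/www/laravel/bootstrap/app.php';

    // Same steps public/index.php performs, but with a hand-built
    // request instead of one captured from the webserver.
    $kernel = $app->make(Illuminate\Contracts\Http\Kernel::class);

    $request  = Illuminate\Http\Request::create('/laravel/page', 'GET');
    $response = $kernel->handle($request);

    $html = $response->getContent();   // rendered HTML of the route

    $kernel->terminate($request, $response);

(One caveat with this idea: loading a second framework inside the WordPress process can conflict with WordPress's own global helpers, so it may be safer to run it as a separate CLI script and capture its output.)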

NeoTechni
  • It's not really clear what you mean to be honest, or what you're trying to achieve, or what problem you're trying to solve. Yes, you can download a web page from the command line, using cURL or file_get_contents. But you said you didn't want to download it. But you can't access it without downloading it (even when you put it in an iframe, it still gets downloaded; the only difference is the browser downloads it and then displays it). – ADyson Jul 21 '21 at 16:04
  • @ADyson to get the HTML/result of a Laravel route through a PHP function/command line WITHOUT just downloading it from the URL, because that requires going through the internet, which is slower – NeoTechni Jul 21 '21 at 16:07
  • @Markus Zeller probably, ty. – NeoTechni Jul 21 '21 at 16:08
  • 1
    Downloading from http://localhost will be local and not going thru the internet. The overhead is the webserver, but that should also be no problem. That is common use between docker containers and microservices. – Markus Zeller Jul 21 '21 at 16:14
  • localhost isn't possible due to the server hosting multiple sites and using the URL to determine which one gets accessed – NeoTechni Jul 22 '21 at 19:41
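
If the only obstacle to using localhost is the name-based virtual hosting mentioned in the comments, one possible workaround is to request the loopback address but send the site's public hostname in the Host header, so the webserver still picks the right site while the traffic never leaves the machine. A minimal cURL sketch, where www.example.com and /laravel/page are placeholders:

    <?php
    // Talk to the local webserver directly over loopback...
    $ch = curl_init('http://127.0.0.1/laravel/page');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);                   // return the body instead of printing it
    curl_setopt($ch, CURLOPT_HTTPHEADER, ['Host: www.example.com']);  // ...but tell it which virtual host to serve
    $html = curl_exec($ch);
    curl_close($ch);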

0 Answers