
I have a web request that processes real-time calls (it must finish within 100 ms), but certain operations take too long (250 ms). Is it possible to fire-and-forget a web request while processing the original request? So far, I have gathered:

  1. One can make an asynchronous HTTP POST in PHP: Asynchronous PHP calls?

  2. Apache stops processing as soon as the request connection is closed: http://activelamp.com/blog/development/asynchronous-php-with-message-queues/

  3. Use PHP threads or message queues (similar to #2)

If it is possible, please let me know how and point me to the source.

ssk
  • Yes, it is possible. You can use `curl` on Linux via something like `exec()` (command line) with `> /dev/null &` at the end of the command to detach the shell. – ArtisticPhoenix Mar 12 '18 at 23:10
  • @ArtisticPhoenix If so, can you please tell me how to accomplish this? Thanks. – ssk Mar 12 '18 at 23:12
  • 1
    I would try something like `exec('curl http://example.com/index.php > /dev/null &')` I never tried it before, but it might work. This will probably only work on Linux. – ArtisticPhoenix Mar 12 '18 at 23:15
  • Thanks, one of the comments on this answer points to your suggestion: https://stackoverflow.com/a/124557/376742 `exec("curl $url > /dev/null 2>&1 &");` – ssk Mar 12 '18 at 23:43
  • wget is different than curl, but they both make HTTP requests. Either should work if you're not worried about the return value. Just don't put user-supplied data into `exec` without using `escapeshellarg` or similar: https://unix.stackexchange.com/questions/47434/what-is-the-difference-between-curl-and-wget – ArtisticPhoenix Mar 12 '18 at 23:47
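Following up on the injection warning in the last comment, here is a minimal sketch of quoting a user-supplied URL with `escapeshellarg()` before handing it to `exec()`. The endpoint URL is hypothetical:

```php
<?php
// Hypothetical user-supplied URL; without quoting, characters like
// "&" or ";" in it would be interpreted by the shell.
$url = 'http://example.com/notify?id=42&source=web';

// escapeshellarg() wraps the string in single quotes and escapes any
// embedded quotes, so it reaches curl as a single safe argument.
$cmd = 'curl ' . escapeshellarg($url) . ' > /dev/null 2>&1 &';

// Output is discarded and the command is backgrounded, so exec()
// returns without waiting for the HTTP request to complete.
exec($cmd);
```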

1 Answer


As @ArtisticPhoenix posted in the comments, one can use:

exec("curl $url > /dev/null 2>&1 &");

Due to scaling concerns, I didn't pursue this solution further.
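A quick way to confirm the fire-and-forget behavior: with output redirected and a trailing `&`, `exec()` returns as soon as the shell detaches the command, not when it finishes. A minimal timing sketch, using `sleep` as a stand-in for the slow curl call:

```php
<?php
// Time how long exec() blocks. "sleep 2" simulates a 2-second
// HTTP call; the redirect plus "&" detach it from the shell.
$start = microtime(true);
exec('sleep 2 > /dev/null 2>&1 &');
$elapsed = microtime(true) - $start;

// exec() should return in milliseconds, far under the 2 s the
// background command actually runs for.
printf("dispatch took %.3f s\n", $elapsed);
```

Note that both the redirection and the trailing `&` matter: if the shell still holds the command's stdout, `exec()` waits for it to close, and the caller blocks for the full duration.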

ssk