This is my code:

// surface all PHP errors while debugging
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

// fetch the remote feed and save it locally
$fullcontent = file_get_contents("https://website.co.uk/feeds/google-feed-all-options.php");
file_put_contents("direct/google-feed.xml", $fullcontent);

In my browser it times out, however.

In the file google-feed-all-options.php I have this:

ignore_user_abort(true); // keep the script running even if the browser disconnects
set_time_limit(0);       // remove the execution time limit
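If the goal is to keep the browser connection alive while the feed is generated, one common approach is to flush output incrementally as the loop runs. This is a sketch only, since the actual loop in google-feed-all-options.php isn't shown; $items and build_item() here are hypothetical stand-ins:

ignore_user_abort(true);
set_time_limit(0);
while (ob_get_level() > 0) {
    ob_end_flush(); // unwind any output buffering layers
}
header('Content-Type: application/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
foreach ($items as $item) { // $items is a hypothetical item list
    echo build_item($item); // build_item() is a hypothetical formatter
    flush();                // push each chunk to the browser immediately
}

Note that a proxy or FastCGI buffer in front of PHP can still hold the output back, so flushing alone may not be enough on every stack.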

The code in https://website.co.uk/feeds/google-feed-all-options.php writes its output to a file, google-feed.xml. If I run it from a crontab it works correctly and the file is complete, but if I open https://website.co.uk/feeds/google-feed-all-options.php in my browser, the data gets cut off after only about 20% of the list of items.

In other words, if I echo $fullcontent onto the page, the content doesn't make it through the full list. The code itself works; the problem only appears when I access it through the browser instead of running it internally on the server.

So I have set ignore_user_abort, which keeps the script running even after it times out in the browser (as it is doing); the file then completes, it just isn't output. Ideally, though, I need the output to reach the browser.
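Since the file does finish server-side, another sketch (assuming the existing loop can be wrapped in a function; generate_feed() is a hypothetical stand-in) is to write the feed to disk first and then stream the completed file back in one pass:

ignore_user_abort(true);
set_time_limit(0);
$path = "direct/google-feed.xml";
file_put_contents($path, generate_feed()); // generate_feed() stands in for the existing loop
header('Content-Type: application/xml');
header('Content-Length: ' . filesize($path));
readfile($path); // send the finished file to the browser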

This is not a duplicate: note that I mention set_time_limit, which is set to unlimited, and max_execution_time in php.ini is set to 600, yet the script barely reaches 20 seconds before cutting off. Twenty seconds is not a PHP default, so it can't simply be that those two settings failed to apply.
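A quick sanity check, using only standard ini_get() calls, is to print the limits the running script actually sees, since the CLI (and cron) and web SAPIs often load different php.ini files:

echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'default_socket_timeout: ' . ini_get('default_socket_timeout') . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'loaded php.ini: ' . php_ini_loaded_file() . "\n";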

Any ideas? I'm at a loss.

QuantumArchi
  • Possible duplicate of [How to increase maximum execution time in php](https://stackoverflow.com/questions/16171132/how-to-increase-maximum-execution-time-in-php) – node_modules Dec 03 '17 at 14:58
  • No, I have it set to unlimited; I've set it to 600 in php.ini and it times out before 20 seconds. @Jer – QuantumArchi Dec 03 '17 at 15:00
  • Do you have a reliable internet connection? Are you running on shared hosting? Does this answer help? https://stackoverflow.com/questions/1590441/#1590644 Does the connection get reset from somewhere else? Try to get the data locally first to see if it's a PHP issue or not, using `wget` or `curl`. Why does it take so long to get the info? How big is the file? Can it be a memory limit? What is the status code in the browser? Please document every observation related to the failing part. Because if you just echo stuff from PHP, as far as PHP is concerned, everything is fine. – Alex Dec 03 '17 at 15:12
  • It's on a dedicated AWS system, so it's 100% not my internet. The file is relatively big, 20 MB for an XML, so not small, but not huge. Technically no error displays; it just stops looping through the data at a certain point. I have increased the limits in php.ini and then checked whether more data comes through; there is no increase. I have restarted the server to ensure the changes are active. file_get_contents is similar to curl; it gets to about 4 MB and stops. @AlexanderMP – QuantumArchi Dec 03 '17 at 15:22
  • What's your memory limit? What's your PHP version? Where do you output to stdout? The code only shows that you output the data to the file system; does outputting to stdout happen before or after that? Did you try to get the result of the request from your local computer, without a browser, as well? I can keep asking, or you can test each issue yourself to see whether it influences the problem. – Alex Dec 03 '17 at 15:51
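Following Alex's suggestion to test the fetch outside a browser, a minimal cURL-based version in PHP with explicit timeouts (a sketch; the URL and 600-second limit mirror the question) would show whether the truncation happens client-side and surface any transfer error:

$ch = curl_init("https://website.co.uk/feeds/google-feed-all-options.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // seconds allowed for the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 600);         // seconds allowed for the whole transfer
$fullcontent = curl_exec($ch);
if ($fullcontent === false) {
    echo 'cURL error: ' . curl_error($ch); // report the exact failure
} else {
    echo strlen($fullcontent) . " bytes received"; // compare against the expected ~20 MB
}
curl_close($ch);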

0 Answers