I'm running a Facebook FQL query that returns a big JSON string, which I fetch with file_get_contents(). If I limit the FQL query to 100 results, file_get_contents() returns fine. If I up it to 200 results, file_get_contents() returns false.

I have tested it through the Facebook Graph API Explorer, and both the 100- and 200-result queries run fine. On my localhost I can also get 200 results back without a problem. On production, I can only get 100; when I try 200, file_get_contents() returns false.

Is there a php.ini setting on my production environment that is preventing file_get_contents() from working on too large a response?
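For reference, the fetch looks roughly like this (the FQL query and access token below are placeholders, not my real ones):

    <?php
    // Roughly what I'm doing. Query and token are placeholders.
    $fql = urlencode("SELECT uid, name FROM user WHERE uid IN "
         . "(SELECT uid2 FROM friend WHERE uid1 = me()) LIMIT 200");
    $url = "https://graph.facebook.com/fql?q=" . $fql . "&access_token=MY_TOKEN";

    $json = file_get_contents($url);

    if ($json === false) {
        // This is the branch I hit on production once I go past 100 results.
        die("file_get_contents returned false");
    }

    $data = json_decode($json, true);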

Nate
  • file_get_contents() will suck in anything it's told to. There's no limit to what it can do, as long as PHP has the memory to hold it. – Marc B Jun 23 '14 at 17:05
  • Maybe it's somehow connected to the memory limit? You could also try some other values and check memory usage before running file_get_contents (see the sketch below). I don't know if it's possible, but maybe PHP somehow stops reading if the memory limit is almost used up? What happens if you open this URL directly in a browser instead of using file_get_contents? – Marcin Nabiałek Jun 23 '14 at 18:18
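A minimal sketch of the check Marcin suggests, run just before the fetch ($url is a placeholder for the FQL URL from the question):

    <?php
    // Compare PHP's memory ceiling with what the script is already using
    // before the big fetch; a near-full limit would support the theory above.
    $url = 'https://graph.facebook.com/fql?q=...'; // placeholder

    echo 'memory_limit:      ' . ini_get('memory_limit') . PHP_EOL;
    echo 'used before fetch: ' . memory_get_usage(true) . ' bytes' . PHP_EOL;

    $json = file_get_contents($url);

    echo 'peak after fetch:  ' . memory_get_peak_usage(true) . ' bytes' . PHP_EOL;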

1 Answer

As noted in the manual, file_get_contents() returns the read data, or FALSE on failure.

So if it returns FALSE, something is going wrong somewhere else, and you probably don't know what it is because errors are hidden on your production environment.

Try raising the error reporting level so E_WARNING is included, and log errors to a file on your disk. Note that file_get_contents() emits a warning rather than throwing an exception, so a plain try...catch will not capture the failure; checking error_get_last() right after the call will show you where the error is.
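Something along these lines (the log path and URL are placeholders):

    <?php
    // Surface the warnings that production is currently hiding.
    error_reporting(E_ALL);
    ini_set('log_errors', '1');
    ini_set('error_log', '/tmp/fgc-debug.log'); // placeholder: any writable path

    $url  = 'https://graph.facebook.com/fql?q=...'; // placeholder: the FQL URL
    $json = file_get_contents($url);

    if ($json === false) {
        // file_get_contents() raises an E_WARNING instead of throwing,
        // so read the last error rather than relying on try...catch.
        $err = error_get_last();
        error_log('file_get_contents failed: ' . print_r($err, true));
    }

Once the warning lands in the log, its message usually names the real culprit, such as a timeout, an HTTP error status, or an exhausted memory limit.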

(Remember to upload the updated debugging code to production before changing anything in php.ini.)

Here you can find an example of somebody else facing a possibly similar problem.

clami219