
I'm trying to use @fopen with PHP.

In file A, I'm using @fopen to call file B, which is supposed to send me a JSON object built from a database query.

This query returns about 1,900,000 rows.

If in file B I stop at, for example, 1,000,000 rows, everything works fine and I receive the JSON object in file A with no errors. But if I let the query return all 1,900,000 rows, I get this message: `fopen(the address): failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error`
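To make the setup concrete, here is a rough sketch of what file A does (the URL and the function name are hypothetical, not the asker's actual code):

```php
<?php
// Sketch of file A: fetch file B's output over HTTP and decode it.
// fileB.php is expected to print a JSON-encoded result set.
function fetchRows(string $url): ?array
{
    $handle = @fopen($url, 'r');
    if ($handle === false) {
        return null; // HTTP request failed (e.g. the 500 error above)
    }
    $json = stream_get_contents($handle); // whole body is buffered in memory
    fclose($handle);
    return json_decode($json, true);
}

// Usage (hypothetical URL):
// $rows = fetchRows('http://example.com/fileB.php');
```

Note that the entire response body ends up in memory at once, which matters with 1.9 million rows.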

Do you have any idea?

Thank you beforehand.

Nicolas Cortell
  • Check Apache's `error_log` and PHP's error log on the server that hosts the `B` script. The reason for the error is listed in one of them. I bet it says: *"Fatal error: Allowed memory size of `xxx` bytes exhausted (tried to allocate `yyy` bytes)"*. – axiac Jan 29 '16 at 08:48
    Or it can be a server script timeout error. Check apache and php logs both. You might find your answer there. – Rahul Kate Jan 29 '16 at 08:50
  • I really don't think so, because if I request file B directly it works. The problem happens only if I go to file B through file A using fopen – Nicolas Cortell Jan 29 '16 at 08:52
  • @axiac, maybe an option exists in fopen's context to do this. I already changed the 'timeout', and in both files A and B the memory limit is set to 2048M, which should be enough? – Nicolas Cortell Jan 29 '16 at 09:01
  • `500 Internal Server Error` is an error on the remote server, most probably the `b` script aborted. `fopen()` has very little impact on the way the `b` script runs. You can try to get the exact HTTP request the browser sends (use the browser's developer tools to get it) and replicate it in PHP using `curl` instead of `fopen()`. Maybe you can replicate it using `fopen()` too, crafting a stream context, I don't know; I never did it. – axiac Jan 29 '16 at 09:20
  • @axiac, I'll try. Thx. I'll keep you informed – Nicolas Cortell Jan 29 '16 at 09:21
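As a sketch of axiac's suggestion, file A could fetch B with cURL and stream the response straight to disk, so A never holds the whole 1.9-million-row body in memory (the function name is hypothetical; requires the curl extension):

```php
<?php
// Fetch $url with cURL and write the response body directly to $dest,
// instead of buffering it in a PHP string the way fopen() + read does.
function downloadToFile(string $url, string $dest): bool
{
    $fp = fopen($dest, 'w');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);         // stream body to the file
    curl_setopt($ch, CURLOPT_TIMEOUT, 600);      // generous timeout for a long query
    curl_setopt($ch, CURLOPT_FAILONERROR, true); // treat HTTP >= 400 as failure
    $ok = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    fclose($fp);
    return $ok !== false && $status === 200;
}
```

This also makes the real HTTP status visible via `curl_getinfo()`, which helps when debugging a 500 from the remote script.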

2 Answers


Actually you are crossing the limit of the array. You have to split it: divide your 1.9 million records into two parts. The first pass creates a new text file and writes 1 million records into it, the second pass writes the remaining records, and then you fetch the data from those files.
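The splitting idea above could be sketched like this (function names are hypothetical; `$fetchBatch` stands in for the actual database query, e.g. a `LIMIT`/`OFFSET` query):

```php
<?php
// Write rows to a file in fixed-size batches instead of building one
// huge array in memory. $fetchBatch($offset, $limit) must return the
// next batch of rows, or an empty array when there are no more.
function writeRowsInChunks(callable $fetchBatch, string $file, int $chunkSize): int
{
    $fp = fopen($file, 'w');
    $offset = 0;
    while (true) {
        $rows = $fetchBatch($offset, $chunkSize);
        if ($rows === []) {
            break;
        }
        foreach ($rows as $row) {
            fwrite($fp, json_encode($row) . "\n"); // one JSON object per line
        }
        $offset += count($rows);
    }
    fclose($fp);
    return $offset; // total rows written
}
```

Writing one JSON object per line (rather than one giant array) means the reader can also process the file line by line without decoding everything at once.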

Monty

Ok, I've found a solution:

Instead of sending the data through the fopen stream, I store the JSON object in text files. When I receive confirmation from file B that the process has finished, I use the generated files in file A and then delete them. Be careful if you use this method: if your JSON object is too big you won't be able to write the content into a single file, so you will have to split it into multiple parts.
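The split-and-reassemble step could look roughly like this (function names, file naming scheme, and chunk size are hypothetical):

```php
<?php
// Split a large JSON string across several numbered part files.
function splitToFiles(string $json, string $prefix, int $chunkBytes): int
{
    $parts = str_split($json, $chunkBytes);
    foreach ($parts as $i => $part) {
        file_put_contents($prefix . $i . '.txt', $part);
    }
    return count($parts); // how many part files were written
}

// Concatenate the part files back into the original JSON string,
// deleting each part once it has been consumed (as described above).
function joinFromFiles(string $prefix, int $count): string
{
    $json = '';
    for ($i = 0; $i < $count; $i++) {
        $file = $prefix . $i . '.txt';
        $json .= file_get_contents($file);
        unlink($file);
    }
    return $json;
}
```

Splitting on raw bytes like this is only safe because the parts are concatenated back before `json_decode()` is called; no single part is valid JSON on its own.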

Thanks to @Monty and @axiac, they helped me explore different approaches.

Nicolas Cortell