
I want to display an external PDF on my own site. To do so, I have this code:

$bigPDF = 'https://www.courtlistener.com/recap/gov.uscourts.dcd.219024/gov.uscourts.dcd.219024.9.0_1.pdf';
$smallPDF = 'https://www.courtlistener.com/recap/gov.uscourts.dcd.219024/gov.uscourts.dcd.219024.9.1_1.pdf';
header('Content-type: application/pdf');
readfile($bigPDF);

When I try to run it on my Google Cloud Platform server, I get this error several times, and then the server crashes:

ERROR: unable to read what child say: Bad file descriptor (9)

However, if I change readfile($bigPDF) to readfile($smallPDF), the code works fine and the PDF displays properly.

Also, both PDFs display fine on my local dev server. It is only when I upload to Google Cloud Platform that the big docket crashes the server. To me, this suggests the error has something to do with my Google Cloud server settings, but with such an uninformative error message, I don't know where to start looking.
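For reference, a chunked-streaming alternative to readfile() (the workaround suggested in the comments below) can be sketched as follows. The streamFileChunked name and 1 MB chunk size are illustrative, and opening a remote URL with fopen() assumes allow_url_fopen is enabled:

```php
<?php
// Hypothetical chunked replacement for readfile(): read the source in
// fixed-size chunks and echo each one, so the whole 37 MB PDF is never
// held in memory at once.
function streamFileChunked(string $source, int $chunkSize = 1048576)
{
    $handle = fopen($source, 'rb'); // requires allow_url_fopen for URLs
    if ($handle === false) {
        return false;
    }
    $sent = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;
        $sent += strlen($chunk);
        flush(); // push each chunk toward the client as it is read
    }
    fclose($handle);
    return $sent; // total bytes streamed, or false if the open failed
}
```

The call site stays the same: send the Content-type header first, then streamFileChunked($bigPDF) instead of readfile($bigPDF).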

Noah K
  • Have you tried chunking or streaming the file? Try this answer: https://stackoverflow.com/a/6914978/3103434 – Spudly Jun 20 '20 at 01:35
  • From the manual: "Note: readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level()." Have you checked ob_get_level()? – bumperbox Jun 20 '20 at 01:48
  • @bumperbox I have output buffering turned off in my php.ini, and ob_get_level() returns 0 right before the above code. Is the bad file descriptor error related to memory issues? – Noah K Jun 22 '20 at 17:32
  • @NoahK The output buffer thing was a bit of a long shot. It's not a code issue; the code is fine. Given it only affects large files, you would have to assume it is hitting a memory limit or size limit of some sort. If you host the same PDF on GitHub, and get the code to download it from there, does it also fail? Maybe the source server is slow and something is timing out? – bumperbox Jun 22 '20 at 17:52
  • @Spudly I tried out that answer and got it working on my local server, but on the live server it still gives me an HTTP ERROR 500 (although it no longer gives me any error message, not even 'bad file descriptor'). Not sure what that means. – Noah K Jun 22 '20 at 18:09
  • @NoahK, What is the memory limit set to in php.ini, and how large is the file? It has to be GCP config, so a memory issue makes sense. If the memory looks like it's configured right, trying another approach might surface more details, or at least narrow down what the config issue on GCP might be. I normally use Guzzle for handling HTTP requests in PHP, but you could try running curl in the CLI to see if that gives you a more descriptive error. There is also [stream_copy_to_stream() with fopen()](https://stackoverflow.com/a/42372831/3103434) – Spudly Jun 22 '20 at 18:43
  • @bumperbox I tried changing pdf host and still no success. Once again works on local server but not live server. – Noah K Jun 22 '20 at 19:31
  • @Spudly The memory limit is set to 1024 MB and the file is 37 MB, so that seems fine to me. I've tried curl, as well as other php methods for loading/displaying files and so far I either get no error or the same 'bad file descriptor'. But I'll keep trying different things. – Noah K Jun 22 '20 at 21:28
  • 1
    If you get the same error with different approaches, then there is some kind of config issue with GCP. I'm not sure which product you're using, but did you see [this issue](https://stackoverflow.com/a/58521098/3103434)? – Spudly Jun 23 '20 at 16:21
  • Are you deploying your service to GCP Compute Engine, or are you using a different GCP service? – Mustafiz Jul 02 '20 at 13:40
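The stream_copy_to_stream() approach mentioned in the comments can be sketched as below. It copies the source straight into PHP's output stream without holding the whole file in memory; the streamWithCopy helper name is illustrative, and opening a remote URL again assumes allow_url_fopen is enabled:

```php
<?php
// Hypothetical sketch of the stream_copy_to_stream() suggestion: pipe
// the source stream into php://output instead of buffering it all with
// readfile(). Works the same for a local path or a URL.
function streamWithCopy(string $source)
{
    $in = fopen($source, 'rb');
    if ($in === false) {
        return false;
    }
    $out = fopen('php://output', 'wb');
    $bytes = stream_copy_to_stream($in, $out); // bytes copied, or false
    fclose($in);
    fclose($out);
    return $bytes;
}
```

As with readfile(), the Content-type header must be sent before the first byte of output.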

0 Answers