
My PHP script receives large data (100–500 MB) from a client. I want my PHP script to run fast, without using too much memory.

To save bandwidth, I don't use Base64 or form data; I send the binary data directly in the body of a POST request.

The data consists of two parts: a 2000-byte header, and the rest, which has to be stored as a file on the server.

$fle = file_get_contents("php://input",FALSE,NULL,2000);
file_put_contents("file.bin", $fle);

The problem is that file_get_contents ignores the offset parameter and reads the data from byte 0. Is there a better way to do this?

**I don't want to read the whole body into memory and then slice out the last N-2000 bytes, as I am afraid it would use too much memory.**
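To check whether the stream is even seekable (just a rough sketch; the assumption here is that the offset is ignored because `php://input` is not seekable for this request), the stream metadata can be inspected:

$fi = fopen("php://input", "rb");
$meta = stream_get_meta_data($fi);   // 'seekable' shows whether an offset/fseek can work at all
var_dump($meta['seekable']);
fclose($fi);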

Ivan Kuckir
  • `substr($fle, 2000)` – AbraCadaver Jan 07 '19 at 18:15
  • First of all don't create another variable (`$fle`): `file_put_contents("file.bin", file_get_contents("php://input",FALSE,NULL,2000));` – Accountant م Jan 07 '19 at 18:21
  • @Accountantم still reads everything into memory before flushing it to disk. – Sammitch Jan 07 '19 at 18:32
  • @Sammitch Yes I know, this will just prevent creating another copy of the data in memory. Great answer you posted, by the way; I did a similar buffer [before](https://stackoverflow.com/questions/53437729/allowed-memory-size-exhausted-in-php-for-loop/53438993#53438993) – Accountant م Jan 07 '19 at 18:39
  • @Sammitch I'm sorry, I confused what is here with the predefined variables I read about before [here](https://phptherightway.com/pages/The-Basics.html#variable-declarations), which doesn't apply in this case. Using this variable will not duplicate the memory consumed by the script. – Accountant م Jan 07 '19 at 18:52
  • @Accountantم yes it's 1x very large rather than 2x very large. 1x very large is still too much though. In cases like this I tend to ignore the number itself and worry about how many digits it has instead. – Sammitch Jan 07 '19 at 18:59
  • @Sammitch Um, now I'm more confused. Do you mean that using the variable `$fle` in the question **does** make the data 2x larger in memory? – Accountant م Jan 07 '19 at 19:02
  • @Accountantم ah, I misread your previous comment. It should be 1x with or without the variable assignment. – Sammitch Jan 07 '19 at 19:04
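A rough way to settle the 1x-vs-2x question from the comments above (just a sketch; the exact numbers depend on the PHP build and request size) is to log the peak memory around the whole-body read:

// Sketch: measure how much the peak memory grows when the whole body is read.
// memory_get_peak_usage(true) reports the real peak allocated by PHP.
$before = memory_get_peak_usage(true);
$fle = file_get_contents("php://input");
$after = memory_get_peak_usage(true);
error_log("peak grew by " . ($after - $before) . " bytes");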

1 Answer


Use the lower-level file IO functions and read/write a little bit at a time.

$bufsz = 4096;                      // copy the body in 4 kB chunks

$fi = fopen("php://input", "rb");   // raw request body
$fo = fopen("file.bin", "wb");      // destination file on disk

fseek($fi, 2000);                   // skip the 2000-byte header

while( $buf = fread($fi, $bufsz) ) {
  fwrite($fo, $buf);                // only one small buffer is ever in memory
}

fclose($fi);
fclose($fo);

This will read/write in 4kB chunks.
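An alternative sketch, assuming the 2000-byte header can simply be discarded with a plain `fread()` (so no seek on `php://input` is needed), is to let `stream_copy_to_stream()` do the chunked copy internally:

$fi = fopen("php://input", "rb");
$fo = fopen("file.bin", "wb");

// Discard the 2000-byte header; this assumes the first read returns it in
// full, which holds as long as the stream delivers at least 2000 bytes per read.
fread($fi, 2000);

// Copy the remainder of the request body to disk in internal chunks.
stream_copy_to_stream($fi, $fo);

fclose($fi);
fclose($fo);

Either way, only a small buffer is held in memory at any moment rather than the whole upload.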

Sammitch
  • One thing to note with this is your execution time - ensure you have set this in your php.ini. Great answer with explanation though! I didn't know you could specify the size to read consistently; I always used `filesize()` as the second arg. Great to know for future reference. – Jaquarh Jan 07 '19 at 18:41
  • BTW, is there a simple way to read exactly N bytes from `$fi`? In my case, fread() often returns less than the value of the second parameter. – Ivan Kuckir Jan 07 '19 at 18:59
  • @IvanKuckir that should only ever happen *once* per file, at most, and only ever at the end of the file when it's not an even multiple of the buffer size. This is fine. Any other observed case is a different problem that needs to be addressed. – Sammitch Jan 07 '19 at 19:02
  • @Sammitch I ran your code with $bufsz = 1000000; (which is larger than the size of my file). The file is saved, but it never reads more than 8192 bytes at once. – Ivan Kuckir Jan 07 '19 at 19:27
  • @IvanKuckir `php://input` is a special stream and may have some peculiarities associated with it. Unless you have some special requirement to hold ~1MB of data in memory for some reason, a buffer size of 8192 bytes or smaller should work just fine. – Sammitch Jan 07 '19 at 19:46
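A minimal sketch of what the last comments describe (collecting exactly N bytes even when `fread()` returns a short chunk) is to loop until the requested count has arrived; the helper name `read_exact` is made up here:

// Hypothetical helper: keep calling fread() until $n bytes have been
// collected or the stream ends. fread() may legitimately return fewer
// bytes than requested, so a single call is not enough.
function read_exact($stream, $n) {
  $data = '';
  while ($n > 0 && !feof($stream)) {
    $chunk = fread($stream, $n);
    if ($chunk === false || $chunk === '') {
      break;
    }
    $data .= $chunk;
    $n -= strlen($chunk);
  }
  return $data;
}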