A media file can be huge, so you cannot process it with techniques that load the whole file into RAM, let alone keep several copies of the data in memory at the same time. Your code does exactly that, several times over:
$json_str = file_get_contents('php://input');
$json_obj = json_decode($json_str);
$Video = $json_obj->Video;
$video_decode = base64_decode($Video);
At this point you have 4 copies of the video loaded in RAM. For a 50 MB media file that means that $video_decode holds 50 MB of data and the other variables are even larger. You also need to add up the internal memory used by functions like json_decode(). To get an idea of the RAM being used, you can insert this between each operation:
var_dump(memory_get_usage(true));
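For example, instrumenting the original sequence (the same code as above, just with the memory check after each step):
var_dump(memory_get_usage(true));
$json_str = file_get_contents('php://input');
var_dump(memory_get_usage(true));
$json_obj = json_decode($json_str);
var_dump(memory_get_usage(true));
$Video = $json_obj->Video;
var_dump(memory_get_usage(true));
$video_decode = base64_decode($Video);
var_dump(memory_get_usage(true));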
Additionally, modern video codecs are highly optimised binary streams; converting that binary data to plain text makes it grow. Base64 inflates data by a factor of 4/3 (about 33%), and JSON adds a little overhead on top, though nothing serious at this point.
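A quick standalone check of that ratio (a sketch; the 3 MB of random bytes just stand in for a real video):
$binary  = random_bytes(3 * 1024 * 1024);   // stand-in for binary video data
$encoded = base64_encode($binary);
printf("binary: %d bytes, base64: %d bytes, ratio: %.2f\n",
    strlen($binary), strlen($encoded), strlen($encoded) / strlen($binary));
// prints a ratio of about 1.33, i.e. 4/3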
If I had to design this from scratch I'd use a POST request with a binary format (either multipart/form-data or a custom encoding, maybe just the file as-is) and then read the input in chunks:
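Something along these lines (a minimal sketch assuming the client POSTs the raw file as the request body; the destination path is a placeholder):
$chunk_size = 1024 * 1024;
$input  = fopen('php://input', 'rb') or die(1);
$output = fopen('/path/to/storage/video.mp4', 'wb') or die(2);   // placeholder destination
while (!feof($input)) {
    $chunk = fread($input, $chunk_size);
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($output, $chunk) or die(3);
}
fclose($input);
fclose($output);
With multipart/form-data you would not even need this: PHP streams the upload to a temporary file for you and move_uploaded_file() takes it from there.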
If you prefer or need to stick to the current design, you need to do all the file reads and decoding operations in chunks as well:
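For the base64 part, PHP's built-in convert.base64-decode stream filter can decode chunk by chunk; a minimal sketch, assuming the base64 payload has already been separated from the JSON wrapper into its own file (both paths are placeholders):
$in  = fopen('/tmp/video.b64', 'rb') or die(1);               // base64 text, JSON wrapper already stripped
$out = fopen('/path/to/storage/video.mp4', 'wb') or die(2);
stream_filter_append($out, 'convert.base64-decode', STREAM_FILTER_WRITE);
// 1 MB chunks; keeping the size a multiple of 4 means every write is a whole number of base64 quanta
while (!feof($in)) {
    $chunk = fread($in, 1024 * 1024);
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($out, $chunk) or die(3);
}
fclose($in);
fclose($out);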
Last but not least, don't ever create SQL code by injecting untrusted external input:
$sql = "UPDATE tblCaf SET Video = '$video_dbfilename' WHERE CAFNo = '$CAFNo'";
Use prepared statements instead. The hordes of tutorials that suggest otherwise should have been burnt long ago.
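For example, with PDO (mysqli prepared statements work just as well; $pdo is assumed to be your existing database connection):
$stmt = $pdo->prepare('UPDATE tblCaf SET Video = :video WHERE CAFNo = :caf_no');
$stmt->execute([
    ':video'  => $video_dbfilename,
    ':caf_no' => $CAFNo,
]);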
Update: It seems that the concept of processing data in small chunks (vs. loading everything in memory) is not as straightforward as I thought, so I'll provide runnable code to illustrate the difference.
First of all, let's produce a 100 MB file to test:
$test = __DIR__ . '/test.txt';
$fp = fopen($test, 'wb') or die(1);
for ($i = 0; $i < 1839607; $i++) { // 1,839,607 lines × 57 bytes ≈ 100 MiB
fwrite($fp, "Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n") or die(2);
}
fclose($fp);
If you resort to easy-to-use one-liner helper functions:
ini_set('memory_limit', '300M');
$test = __DIR__ . '/test.txt';
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
$data = file_get_contents($test);
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
file_put_contents(__DIR__ . '/copy.txt', $data);
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
printf("Maximum memory usage: %d MB\n", memory_get_peak_usage(true) / 1024 / 1024);
... it comes at the price of requiring a potentially large amount of memory (there's no such thing as a free lunch):
Current memory usage: 0 MB
Current memory usage: 100 MB
Current memory usage: 100 MB
Maximum memory usage: 201 MB
If you split the file into small pieces:
$test = __DIR__ . '/test.txt';
$chunk_size = 1024 * 1024;
$input = fopen($test, 'rb') or die(1);
$output = fopen(__DIR__ . '/copy.txt', 'wb') or die(2);
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
while (!feof($input)) {
$chunk = fread($input, $chunk_size) or die(3);
fwrite($output, $chunk) or die(4);
}
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
fclose($input);
fclose($output);
printf("Current memory usage: %d MB\n", memory_get_usage(true) / 1024 / 1024);
printf("Maximum memory usage: %d MB\n", memory_get_peak_usage(true) / 1024 / 1024);
... you can use a fixed amount of memory for files of any size:
Current memory usage: 0 MB
Current memory usage: 1 MB
Current memory usage: 1 MB
Maximum memory usage: 3 MB