
Apparently large files do not work with PHP's `file_get_contents()`. What is the alternative if it is apparently a "stack overflow" (i.e. out-of-memory) issue?

HTML File

<form id='test'>
 <input type='file' name='Receipt' />
</form>
<button onClick='save_Receipt();'>Save</button>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<script>
function save_Receipt(){
 // Wrap the form in FormData so the file is sent as multipart/form-data
 var ticketData = new FormData($('form#test')[0]);
 $.ajax({
  cache: false,
  processData: false, // don't serialize the FormData object
  contentType: false, // let the browser set the multipart boundary
  url: "cgi-bin/php/post/testimage.php",
  method: "POST",
  data: ticketData,
  success: function(code){}
 });
}
</script>

PHP File

ini_set('memory_limit', '512M');
// Read the whole uploaded file into memory
$img = file_get_contents($_FILES['Receipt']['tmp_name']);

echo base64_encode($img);
echo $img;

sqlsrv_query($Portal,
  "INSERT INTO Portal.dbo.[File]([Name], [Type], [Data], [User], Ticket)
   VALUES(?, ?, ?, ?, ?);",
  array(
    $_FILES['Receipt']['name'],
    $_FILES['Receipt']['type'],
    array(base64_encode($img), SQLSRV_PARAM_IN,
          SQLSRV_PHPTYPE_STREAM(SQLSRV_ENC_BINARY),
          SQLSRV_SQLTYPE_VARBINARY('max')),
    $_SESSION['User'],
    2094378
  )
);

To clarify: the echo statements aren't outputting anything.
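
One thing worth ruling out first is that the upload never reached PHP at all. A minimal check (just a sketch, using the same `Receipt` field as the form above):

// If the file exceeds the php.ini limits, PHP sets ['error'] (e.g. UPLOAD_ERR_INI_SIZE)
// and leaves ['tmp_name'] empty, so file_get_contents() has nothing to read.
// If post_max_size is exceeded, $_FILES is empty entirely, hence the null coalesce.
if (($_FILES['Receipt']['error'] ?? UPLOAD_ERR_NO_FILE) !== UPLOAD_ERR_OK) {
    die('Upload failed, error code ' . $_FILES['Receipt']['error']
        . ' (upload_max_filesize=' . ini_get('upload_max_filesize')
        . ', post_max_size=' . ini_get('post_max_size') . ')');
}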

coder42
  • https://stackoverflow.com/questions/6409462/downloading-a-large-file-using-curl might help – Chris Nov 28 '18 at 19:02
  • The only limit here should be local memory, you could try to adjust the `memory_limit` php param – john Smith Nov 28 '18 at 19:03
  • also there are many reasons not to store large binary files/blobs in a database; just store the path to the file in your db – john Smith Nov 28 '18 at 19:05
  • @Chris not quite, as that's writing to disk, not to a SQL database (which is so wrong I don't want to talk about it, but it has to be done this way, and it works for small files) – coder42 Nov 28 '18 at 19:05
  • https://stackoverflow.com/questions/5775571/what-is-the-maximum-length-of-data-i-can-put-in-a-blob-column-in-mysql – john Smith Nov 28 '18 at 19:06
  • @johnsmith It's varchar(max), which has a 2GB limit, and the large file is not "echoing", which is the reason it's not working, not the sqlsrv_query -- also, not that it should matter, but it's an MS SQL 2008 server – coder42 Nov 28 '18 at 19:08
  • @johnSmith "increase memory_limit" should not be the first answer to every question, it should be the last resort. – Sammitch Nov 28 '18 at 19:09
  • @Sammitch You are right, that was just a comment on `Apparently large files do not work with php's file_get_contents.` I didn't read the rest, and the question title says nothing about storing to a DB – john Smith Nov 28 '18 at 20:59

1 Answer


Pretty much every DB driver has a method to deal with oversize BLOB data without needlessly bloating your memory requirements. Sqlsrv's approach appears to be that you can specify a PHP stream as an input type.

I don't have a SQL server to test this with, but your code would look something like:

// Open the file as a read-only binary stream instead of slurping it into memory
$fh = fopen($theFile, 'rb');
$stmt = sqlsrv_prepare(
  $conn,
  'INSERT INTO blobTable VALUES (?)',
  [
    // Bind the stream itself; the driver reads and sends it in chunks
    [$fh, SQLSRV_PARAM_IN, SQLSRV_PHPTYPE_STREAM(SQLSRV_ENC_BINARY), SQLSRV_SQLTYPE_VARBINARY('max')]
  ]
);
sqlsrv_execute($stmt); // sqlsrv is procedural, so no $stmt->execute()
fclose($fh);
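
If you have to keep the base64-into-varchar(max) schema from your question, a variation (still a sketch, and still untested) is to attach PHP's built-in `convert.base64-encode` stream filter to the handle, so the encoding also happens chunk by chunk rather than in memory:

$fh = fopen($_FILES['Receipt']['tmp_name'], 'rb');
// Encode on the fly as the driver reads from the stream
stream_filter_append($fh, 'convert.base64-encode', STREAM_FILTER_READ);
$stmt = sqlsrv_prepare(
  $Portal,
  'INSERT INTO Portal.dbo.[File]([Name], [Type], [Data], [User], Ticket)
   VALUES(?, ?, ?, ?, ?);',
  [
    $_FILES['Receipt']['name'],
    $_FILES['Receipt']['type'],
    // varchar(max) column, so send as a character stream
    [$fh, SQLSRV_PARAM_IN, SQLSRV_PHPTYPE_STREAM(SQLSRV_ENC_CHAR), SQLSRV_SQLTYPE_VARCHAR('max')],
    $_SESSION['User'],
    2094378
  ]
);
sqlsrv_execute($stmt);
fclose($fh);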


Also, I don't know if you need to output it as base64, but you can stream that output as well. Base64 encodes 3 bytes of input into 4 bytes of output without padding, so as long as your input chunks are sized at a multiple of 3 you can safely concatenate the encoded chunks together.

e.g.:

$fh = fopen($theFile, 'rb');
// 3072 is a multiple of 3, so each chunk base64-encodes without padding
while( $buf = fread($fh, 3072) ) {
  echo base64_encode($buf);
}
fclose($fh);
Sammitch
  • This is what I was looking for on Stack Overflow; however, the answer to my problem turned out to be the php.ini upload size limit ("image_upload_size" or whatever it's called), since I was already doing the streaming into sqlsrv – coder42 Nov 28 '18 at 19:44
  • @coder42 ah, all the fun bits got lopped off by the scrollbar on your code. You should really switch that VARCHAR to a VARBINARY or BLOB type and ditch the base64. You can clearly manage the data types, and you'll save 25% on your storage requirements. – Sammitch Nov 28 '18 at 19:48
  • I wish I could however the system I've created is running "in parallel" with another which forces us to use this specific schema :( – coder42 Nov 28 '18 at 20:35