I have searched Stack Overflow for a solution to this, but none of the answers seems to work on my shared live server. I have about 20,000 rows of data that I want to insert into a MySQL database. Whenever I try to insert close to 20k records, I often get the error "Request Entity Too Large". Is there a way I can break the insertion into batches, say 5,000 records at a time, until the whole set is inserted? Please, I need your help with this, as I don't know how to split the insert into batches. See my ini settings:
memory_limit = 5G
max_execution_time = 10000
max_input_time = -1
post_max_size = 5G
max_input_vars = 100000
file_uploads = On
max_file_uploads = 35
upload_max_filesize = 5G
max_allowed_packet=100000
Also, see my insertion code:
<?php
if (isset($_POST['exportBtn']) && isset($_POST['sflt'])) {
    set_time_limit(0); // only needs to be called once, not on every iteration
    $arr = array();
    foreach ($_POST['sflt'] as $key => $value) {
        $eflt    = mysql_prep($_POST['sflt'][$key]);
        $emodel  = mysql_prep($_POST['smodel'][$key]);
        $eengine = mysql_prep($_POST['sengine'][$key]);
        $eloc    = mysql_prep($_POST['sloc'][$key]);
        $estye   = mysql_prep($_POST['sstye'][$key]);
        $ensvr   = mysql_prep($_POST['snsvr'][$key]);
        $eehd    = mysql_prep($_POST['sehd'][$key]);
        $epname  = mysql_prep($_POST['spname'][$key]);
        $epn     = mysql_prep($_POST['spn'][$key]);
        $ecu     = mysql_prep($_POST['scu'][$key]);
        $eqty    = mysql_prep($_POST['sqty'][$key]);
        $ett     = mysql_prep($_POST['stt'][$key]);
        $mtyp    = mysql_prep($_POST['sstye'][$key]);
        $mtyp2   = $mtyp == 'T' ? 'T' : 'S';
        $cby     = $_SESSION['username'];
        $ct      = date('Y-m-d H:i:s');
        $arr[] = "('$eflt','$emodel','$eengine','$eloc','$estye','$ensvr','$eehd','$epname','$epn','$ecu','$eqty','$ett','$cby','$ct','$mtyp2')";
    }
    $inExp = mysqli_query($link, "INSERT INTO tab_mydbtrans(fltno,model,engine,loc,serviceType,nextSvr,usageHr,partName,partNo,costUnit,qty,total,createdBy,created_at,mtype) VALUES " . implode(',', $arr));
}
?>
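Here is the kind of batching I am imagining but am not sure is correct: split $arr into chunks of 5,000 with array_chunk() and run one INSERT per chunk. In this standalone sketch the 20,000 rows are placeholder tuples and the mysqli_query() call is commented out, since the real $arr and $link only exist in my script above.

```php
<?php
// Sketch: batch INSERTs of 5,000 rows each, assuming $arr holds the
// "('v1','v2',...)" value strings built by the loop in my code above.
$arr = array();
for ($i = 0; $i < 20000; $i++) {
    $row   = array_fill(0, 15, 'x');              // 15 placeholder column values
    $arr[] = "('" . implode("','", $row) . "')";  // stand-in for a real row
}

$batchSize = 5000;                        // rows per INSERT statement
$batches   = array_chunk($arr, $batchSize);

foreach ($batches as $batch) {
    $sql = "INSERT INTO tab_mydbtrans(fltno,model,engine,loc,serviceType,nextSvr,"
         . "usageHr,partName,partNo,costUnit,qty,total,createdBy,created_at,mtype)"
         . " VALUES " . implode(',', $batch);
    // In the real script this would run against the live connection:
    // mysqli_query($link, $sql) or die('Batch failed: ' . mysqli_error($link));
}

echo count($batches), "\n";     // 4 batches
echo count($batches[0]), "\n";  // 5000 rows in the first batch
```

Is this the right approach, and would 5,000 rows per statement keep each query small enough?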
Please note that I have searched and none of the existing answers solved my problem, hence this question; please do not mark it as a duplicate.