
I have checked Stack Overflow for a solution to the above, but none of the solutions seems to work on my shared live server. I have about 20,000 rows of data that I want to insert into a MySQL database. Any time I try to insert close to 20k records, I get the error "Request Entity Too Large". Is there a way I can break the insertion into batches, say 5,000 records at a time, until the whole set is done? Please, I need your help, because I don't know how to break it into batches for insertion. See my ini settings:

memory_limit = 5G
max_execution_time = 10000
max_input_time = -1
post_max_size = 5G
max_input_vars = 100000
file_uploads = On
max_file_uploads = 35
upload_max_filesize = 5G
max_allowed_packet=100000
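
One note on the list above: max_allowed_packet is a MySQL server variable, not a PHP directive, so setting it in php.ini has no effect. A minimal sketch of where it would normally live, assuming you (or your host) can edit the MySQL server configuration (my.cnf):

[mysqld]
# Largest single packet / generated statement the MySQL server will accept
max_allowed_packet = 64M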

Also, see my insertion code:

<?php 
       if(isset($_POST['exportBtn']) && isset($_POST['sflt'])){
        $arr = array();
        foreach($_POST['sflt'] as $key => $value) {
         set_time_limit(0);
         $eflt = mysql_prep($_POST['sflt'][$key]);
         $emodel = mysql_prep($_POST['smodel'][$key]);
         $eengine = mysql_prep($_POST['sengine'][$key]);
         $eloc = mysql_prep($_POST['sloc'][$key]);
         $estye = mysql_prep($_POST['sstye'][$key]);
         $ensvr = mysql_prep($_POST['snsvr'][$key]);
         $eehd = mysql_prep($_POST['sehd'][$key]);
         $epname = mysql_prep($_POST['spname'][$key]);
         $epn = mysql_prep($_POST['spn'][$key]);
         $ecu = mysql_prep($_POST['scu'][$key]);
         $eqty = mysql_prep($_POST['sqty'][$key]);
         $ett = mysql_prep($_POST['stt'][$key]);
         $mtyp = mysql_prep($_POST['sstye'][$key]);
         $mtyp2 = $mtyp=='T'?'T':'S'; 
         $cby = $_SESSION['username'];
         $ct = date('Y-m-d H:i:s');

         array_push($arr,"('$eflt','$emodel','$eengine','$eloc','$estye','$ensvr','$eehd','$epname','$epn','$ecu','$eqty','$ett','$cby','$ct','$mtyp2')");
           }

        $inExp = mysqli_query($link,"INSERT INTO tab_mydbtrans (fltno,model,engine,loc,serviceType,nextSvr,usageHr,partName,partNo,costUnit,qty,total,createdBy,created_at,mtype) VALUES ".implode(',', $arr));
       }

Please note that I have searched and nothing has solved my problem, hence this question; please do not mark it as a duplicate. These are the related questions I already checked:

insert multiple rows via a php array into mysql

Request Entity Too Large PHP

David Mukoro
  • Your script is wide open to [SQL Injection Attack](http://stackoverflow.com/questions/60174/how-can-i-prevent-sql-injection-in-php). Even [if you are escaping inputs, it's not safe!](http://stackoverflow.com/questions/5741187/sql-injection-that-gets-around-mysql-real-escape-string) Use [prepared parameterized statements](http://php.net/manual/en/mysqli.quickstart.prepared-statements.php) – RiggsFolly Jan 31 '18 at 13:18
  • 1
    Insert the data `Row by Row` and dont try an build a HUGE single query – RiggsFolly Jan 31 '18 at 13:19
  • _Small Note_ Throwing random big numbers at `php.ini` would normally indicate you are doing something wrong – RiggsFolly Jan 31 '18 at 13:21
  • Or at least break your current query mechanism into smaller manageable nibbles. Like 200 rows at a time, then try 300, etc. if speed is an absolute necessity – RiggsFolly Jan 31 '18 at 13:23
  • Try to insert 100 records in one query. That way you will have to run 200 queries for 20000 records instead of 20000 queries. – Simrat Pal Singh Jan 31 '18 at 13:24
  • @SimratSingh: please, can you show me how to accomplish this with my code? – David Mukoro Jan 31 '18 at 14:13
  • @RiggsFolly: Please, show me how to do this. – David Mukoro Jan 31 '18 at 14:14
  • Please, I am still not able to solve the issue. Any help from anyone? All the above are good suggestions, but I don't know how to implement them – David Mukoro Jan 31 '18 at 14:34
  • Surely you can code a simple loop in a loop? – RiggsFolly Jan 31 '18 at 15:27
  • @RiggsFolly: Please, show me. If I knew, I would not ask. Show me using the code above: refactor it and show me how to implement it. – David Mukoro Jan 31 '18 at 15:43
  • No one is going to do it for you. Have a go and if you fail, ask another question – RiggsFolly Jan 31 '18 at 19:15
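
For reference, a minimal sketch of the row-by-row, prepared, parameterized insert the comments suggest; it assumes the same `$link` mysqli connection, table columns and `$_POST` fields as the question's code:

<?php
// Sketch only: one prepared statement, executed once per row.
// Values are bound, not concatenated, so they never need escaping
// and the query text sent to MySQL stays small.
if (isset($_POST['exportBtn']) && isset($_POST['sflt'])) {
    set_time_limit(0);

    $stmt = mysqli_prepare($link,
        "INSERT INTO tab_mydbtrans
           (fltno, model, engine, loc, serviceType, nextSvr, usageHr,
            partName, partNo, costUnit, qty, total, createdBy, created_at, mtype)
         VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)");

    foreach ($_POST['sflt'] as $key => $value) {
        $mtyp2 = ($_POST['sstye'][$key] == 'T') ? 'T' : 'S';
        $cby   = $_SESSION['username'];
        $ct    = date('Y-m-d H:i:s');

        // 15 string parameters, one per column
        mysqli_stmt_bind_param($stmt, 'sssssssssssssss',
            $_POST['sflt'][$key],  $_POST['smodel'][$key], $_POST['sengine'][$key],
            $_POST['sloc'][$key],  $_POST['sstye'][$key],  $_POST['snsvr'][$key],
            $_POST['sehd'][$key],  $_POST['spname'][$key], $_POST['spn'][$key],
            $_POST['scu'][$key],   $_POST['sqty'][$key],   $_POST['stt'][$key],
            $cby, $ct, $mtyp2);

        mysqli_stmt_execute($stmt);
    }

    mysqli_stmt_close($stmt);
}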

1 Answer


The code below inserts 100 records per query. Use prepared parameterized statements in your own code, as this query is still open to SQL Injection Attacks.

<?php 
if(isset($_POST['exportBtn']) && isset($_POST['sflt'])){
    set_time_limit(0);
    $arr = array(); // buffer of value tuples for the current batch
    $i = 0;         // number of rows currently in the buffer
    foreach($_POST['sflt'] as $key => $value) {     
        $eflt = mysql_prep($_POST['sflt'][$key]);
        $emodel = mysql_prep($_POST['smodel'][$key]);
        $eengine = mysql_prep($_POST['sengine'][$key]);
        $eloc = mysql_prep($_POST['sloc'][$key]);
        $estye = mysql_prep($_POST['sstye'][$key]);
        $ensvr = mysql_prep($_POST['snsvr'][$key]);
        $eehd = mysql_prep($_POST['sehd'][$key]);
        $epname = mysql_prep($_POST['spname'][$key]);
        $epn = mysql_prep($_POST['spn'][$key]);
        $ecu = mysql_prep($_POST['scu'][$key]);
        $eqty = mysql_prep($_POST['sqty'][$key]);
        $ett = mysql_prep($_POST['stt'][$key]);
        $mtyp = mysql_prep($_POST['sstye'][$key]);
        $mtyp2 = $mtyp=='T'?'T':'S'; 
        $cby = $_SESSION['username'];
        $ct = date('Y-m-d H:i:s');

        array_push($arr,"('$eflt','$emodel','$eengine','$eloc','$estye','$ensvr','$eehd','$epname','$epn','$ecu','$eqty','$ett','$cby','$ct','$mtyp2')");

        $i++;
        if($i == 100){
            // Flush a full batch: one INSERT carrying 100 value tuples
            $inExp = mysqli_query($link,"INSERT INTO tab_mydbtrans (fltno,model,engine,loc,serviceType,nextSvr,usageHr,partName,partNo,costUnit,qty,total,createdBy,created_at,mtype) VALUES ".implode(',', $arr));
            $arr = array();
            $i=0;
        }
    }

    // Insert any rows left over after the last full batch
    if(count($arr) > 0){
        $inExp = mysqli_query($link,"INSERT INTO tab_mydbtrans (fltno,model,engine,loc,serviceType,nextSvr,usageHr,partName,partNo,costUnit,qty,total,createdBy,created_at,mtype) VALUES ".implode(',', $arr));
    }
}
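
A small follow-up to the answer above: mysqli_query() returns false when a statement fails, and the code ignores that return value, so a failing batch disappears silently. A minimal sketch of checking the result right after each mysqli_query() call (same $link and $inExp as above):

if (!$inExp) {
    // mysqli_error() returns the message for the last failed query on $link
    error_log('Batch insert failed: ' . mysqli_error($link));
}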
  • Thanks for your response. I have tried this, but it is no longer inserting any records. – David Mukoro Feb 02 '18 at 08:02
  • Print your query and try to run it in phpMyAdmin. – Simrat Pal Singh Feb 02 '18 at 10:30
  • I tried on the live server and it still shows Request Entity Too Large. I thought breaking it into small units, with all those configuration settings, would enable the data to be saved bit by bit, but the "Request Entity Too Large" error persists. Any suggestion on the way forward? – David Mukoro Feb 02 '18 at 14:57