
I have a huge JSON array, and some of its elements are themselves serialized JSON arrays. The problem is that I have a large series of table inserts to do using this array, and I am using PDO.

How can I quickly turn this JSON array, whose elements include serialized JSON arrays, into table inserts using PDO?

It just seems like it's taking too much time, and it doesn't seem like a clean way of doing things.

Basically, I have an interface that inserts into tables using the JSON array, and I need to convert it into table inserts using columns and placeholders with PDO. There are just too many elements in the array to insert.

I tried to automate the creation of inserts with the code below, and it works fine and I like the way it came out, but I can't use it (or automate anything with it) for the JSON arrays whose elements are themselves serialized JSON arrays.

function create_query( $table_name, $cols, $placeholders )
{
    // Build a parameterized INSERT for the given table, columns and placeholders.
    $query= "INSERT INTO $table_name ( 
                                          $cols
                                       )
             VALUES                    (
                                          $placeholders
                                       )";

    return $query;
}

/*
    The first key/value pair of $argums is the table name;
    the remaining pairs map column names to keys of $location.
*/
function insert_into_table( $argums, $location )
{

    $num_classes= $location[ "num_classes" ];

    // Expose every entry of $location as a local variable ($$key) so the
    // mapping below can look values up by name.
    foreach( $location as $key => $value )
    {
        $$key= $value;
    }

    $keys= array_keys( $argums );

    // The first key of $argums is the table name; the rest are column names.
    $table_name     = $keys[ 0 ];

    array_shift( $keys );
    array_shift( $argums );
    $cols= implode( ", ", $keys );


    $bound_array= array();

    var_dump( $keys ); // debug output

    // Map each column to its value from $location and turn the column
    // names into :placeholders.
    foreach( $keys as &$value )
    {
        $bound_array[ $value ]= ${ $argums[ $value ] };
        $value= ":".$value;
    }
    unset( $value );

    $placeholders= implode( ", ", $keys );

    $query= create_query( $table_name, $cols, $placeholders );

    return array( "query" => $query, "bind" => $bound_array );

}

function do_insert( $arr )
{
    global $db;

    var_dump( $arr ); // debug output
    $query= $arr[ "query" ];
    $bind=  $arr[ "bind"  ];

    $stmt= $db->prepare( $query );
    var_dump( $stmt ); // debug output

    if( !$stmt->execute( $bind ) )
    {
        var_dump( $stmt->errorInfo() );
    }

    $affected_rows= $stmt->rowCount();

    return $affected_rows;
}

    $assoc_arr= insert_into_table( array( "company" => "company", 'companyName'=> "companyName", 'pkid' => 'pkid', 'reminderEmailTop' => 'reminderEmailTop', 'reminderEmailBottom' => 'reminderEmailBottom', 'reminderLink' => 'reminderLink' ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "basiclife_type", "page_nav" => "basiclife_pagenav", "orderby" => "basiclife_orderby" ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "bene_type", "page_nav" => "bene_pagenav", "orderby" => "bene_orderby" ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "beneper_type", "page_nav" => "beneper_pagenav", "orderby" => "beneper_orderby" ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "dent_type", "page_nav" => "dent_pagenav", "orderby" => "dent_orderby" ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "eflex_type", "page_nav" => "eflex_pagenav", "orderby" => "eflex_orderby" ), $location );

    do_insert( $assoc_arr );

    $assoc_arr= insert_into_table( array( "companyNav" => "companyNav", "companyId" => "pkid", 'type'=> "hsa_type", "page_nav" => "hsa_pagenav", "orderby" => "hsa_orderby" ), $location );

    do_insert( $assoc_arr );

...and about 50 more inserts like these...

However, for the inserts whose elements are themselves JSON arrays, I couldn't use the code above.

So I created a function that is just one manually typed insert...
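
Roughly, what I want to end up with for the nested case, without typing each one by hand, is something like this ($json, the companyNavJson element, and the sub-row keys are placeholders for my real data):

// Placeholders only: $json is the big JSON string, and "companyNavJson"
// stands in for whichever element holds a serialized JSON array of its own.
$location = json_decode( $json, true );
$nested   = json_decode( $location[ "companyNavJson" ], true );

$stmt = $db->prepare(
    "INSERT INTO companyNav ( companyId, type, page_nav, orderby )
     VALUES                 ( :companyId, :type, :page_nav, :orderby )"
);

foreach( $nested as $row )
{
    // One execute per decoded sub-row; the statement is prepared only once.
    $stmt->execute( array(
        ":companyId" => $location[ "pkid" ],
        ":type"      => $row[ "type" ],
        ":page_nav"  => $row[ "page_nav" ],
        ":orderby"   => $row[ "orderby" ],
    ) );
}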

Masu
  • You may be doing the inserts one by one instead of a single multi-row insert. You could reconstruct the query as a multiple insert, and if that becomes a bottleneck you should add chunking as well. If you post a code sample I'll take a peek. – Vladimir Ramik Mar 15 '15 at 01:12
  • Sorry, by "taking forever" I mean it's taking forever to manually type the SQL inserts, not that the code is taking too long to execute. – Masu Mar 15 '15 at 01:24
  • The problem is mapping the JSON data to values for the table columns. – Masu Mar 15 '15 at 01:27

1 Answer


Here's a prototype. Just feed it a data array, CSV, or JSON file and it'll run through all the data entries. Flavor as needed:

class MySql
{
    private $sDbName     = 'play';
    private $sUsername   = 'root';
    private $sPassword   = '';
    private $sHost       = 'localhost';
    private $oConnection = null;

    public function __construct()
    {
        $this->oConnection = new PDO( 
            'mysql:host=' 
            . $this->sHost 
            . ';dbname=' 
            . $this->sDbName, 
            $this->sUsername, 
            $this->sPassword 
            );
    }
    public function getDb()
    {
        return $this->oConnection;
    }

    public function bindVariables( &$oStmp, $aBinds )
    {
        foreach( $aBinds as $sVariable => $vValue )
        {
            // Ensure we have a colon prepended for PDO.
            if( substr( $sVariable, 0, 1 ) !== ':' )
            {
                $sVariable = ':' . $sVariable;
            }
            $oStmp->bindValue( $sVariable, $vValue );
        }
    }
}
$oMySql = new MySql;
$oDb = $oMySql->getDb();

// $sCsvExampleLine = '1,2,3,4,5';
$aCsv = file( 'infile6.csv' );
$iCountlines = count( $aCsv );
for( $j = 0; $j < $iCountlines; ++$j )
{
    $aRaw = explode( ',', $aCsv[ $j ] );
    $aData[ $j ][ 'companyNav' ] = $aRaw[ 0 ];
    $aData[ $j ][ 'companyId' ]  = $aRaw[ 1 ];
    $aData[ $j ][ 'type' ]       = $aRaw[ 2 ];
    $aData[ $j ][ 'page_nav' ]   = $aRaw[ 3 ];
    $aData[ $j ][ 'orderby' ]    = $aRaw[ 4 ];
}

// $aData[] = array( 'companyNav' => '1', 'companyId' => '2', 'type'=> '3', 'page_nav' => '4', 'orderby' => '5' );

// Create the sql outside of the loop.
$sSql = "
    INSERT INTO `1` 
    ( companyNav, companyId, type, page_nav, orderby )
    VALUES 
    ( :companyNav, :companyId, :type, :page_nav, :orderby )
    ";
$oStmp      = $oDb->prepare( $sSql );
$iCountData = count( $aData );
for( $k = 0; $k < $iCountData; ++$k )
{
    $oMySql->bindVariables( $oStmp, $aData[ $k ] );
    $oStmp->execute();
    // var_dump( $oStmp->rowCount() );
}
$oErrors = $oStmp->errorInfo();
var_dump( $oErrors );

One can leverage binds to reuse the same SQL for different data. Wrapping the database connection in a class, as above, keeps things convenient and avoids using globals.

The O/P's main concern was the daunting task of replicating inserts. If one can instead pull the data out of an external source in the right order/format, or generate it on demand, then the task becomes much less of an overhead.

So long as the format remains the same you can continue to append to $aData from various sources and it will insert as desired.
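
For example, rows from a JSON source, including one whose elements are themselves serialized JSON strings, could be appended the same way; the file name and field names below are just placeholders:

// Placeholder file/field names; adjust to match the real JSON structure.
$aJson = json_decode( file_get_contents( 'infile.json' ), true );

foreach( $aJson as $aEntry )
{
    // If an element is itself serialized JSON, decode it a second time
    // before mapping it onto the same five columns.
    $aNav = is_string( $aEntry[ 'companyNav' ] )
          ? json_decode( $aEntry[ 'companyNav' ], true )
          : $aEntry[ 'companyNav' ];

    $aData[] = array(
        'companyNav' => $aNav[ 'name' ],
        'companyId'  => $aEntry[ 'companyId' ],
        'type'       => $aEntry[ 'type' ],
        'page_nav'   => $aEntry[ 'page_nav' ],
        'orderby'    => $aEntry[ 'orderby' ],
    );
}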

All of this could be further improved by implementing multiple binds (a multi-row insert), but that is slightly outside the scope of the question, even though it would be more efficient.
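
A rough sketch of what that multi-row variant might look like, reusing $oDb and $aData from above (the chunk size of 500 and the column list are arbitrary here):

// Sketch: build one INSERT with many VALUES groups per chunk of rows.
$aColumns = array( 'companyNav', 'companyId', 'type', 'page_nav', 'orderby' );

foreach( array_chunk( $aData, 500 ) as $aChunk )
{
    $aGroups = array();
    $aValues = array();

    foreach( $aChunk as $aRow )
    {
        // One "( ?, ?, ?, ?, ? )" group per row, bound positionally.
        $aGroups[] = '( ' . implode( ', ', array_fill( 0, count( $aColumns ), '?' ) ) . ' )';

        foreach( $aColumns as $sColumn )
        {
            $aValues[] = $aRow[ $sColumn ];
        }
    }

    $sSql  = 'INSERT INTO `1` ( ' . implode( ', ', $aColumns ) . ' ) VALUES ' . implode( ', ', $aGroups );
    $oStmp = $oDb->prepare( $sSql );
    $oStmp->execute( $aValues );
}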

Vladimir Ramik
  • Absolutely right. Added some comments for why one would want to do this vs. something else. Multiple binds would push efficiency. http://stackoverflow.com/questions/12344741/binding-multiple-values-in-pdo – Vladimir Ramik Mar 15 '15 at 02:28