
I am trying to insert data into a MySQL table from a CSV file using PHP.

Please see this image of my CSV file: [screenshot of the CSV file]

The problem I face is that while loading the CSV file, the net_sales column gets rounded and is inserted into the table that way.

Please see the sample image below of the MySQL table after inserting: [screenshot of the MySQL table]

FYI, here I am showing only the net_sales column to explain my problem; the table and the CSV actually have more columns.

For certain reasons I can't use a static insert statement like this: `"insert into tran_detail (tran_id,tran_datetime,net_sales) values (...)";`

so I prefer this statement: `"INSERT INTO tran_detail (".implode(',', array_keys($data)).") VALUES('".implode('\',\'', array_map("convert",array_values($data)))."')";`

This is the PHP I am using for inserting. Please help me insert the value with its decimals into the table, exactly as it is in the CSV.

    function convert($string)
    {
        return htmlspecialchars($string,ENT_QUOTES);
    }

    $columnArray    = array();
    $dataArray      = array();
    $firstRule      = true;

    while ($data = fgetcsv ($source, 1000, ","))
    {
        if($firstRule)
        {
            foreach($data as $columnName)
            {
                $columnArray[] = $columnName;
            }

            $firstRule = false;
        }
        else
        {
            $rule = array();
            for($i = 0; $i < count($data) ; $i++)
            {
                $rule[$columnArray[$i]] = $data[$i];
            }
            $dataArray[] = $rule;
        }
    }

    foreach($dataArray as $data)
    {
        $query = "INSERT INTO `tran_detail` (".implode('`,`', array_keys($data))."`) VALUES('".implode('\',\'', array_map("convert",array_values($data)))."')";

        mysql_query($query) or mysql_error();
    }

    fclose($source);
davidb
  • Use field type `decimal(15,2)`; that's what I use for money. – ArtisticPhoenix Nov 23 '17 at 12:53
  • Possible duplicate of [Best data type to store money values in MySQL](https://stackoverflow.com/questions/13030368/best-data-type-to-store-money-values-in-mysql) – ArtisticPhoenix Nov 23 '17 at 12:57
  • @ArtisticPhoenix, if I insert like this, it works with field type float: `"insert into tran_detail (tran_id,tran_datetime,net_sales) values (12345,'2017-11-21 15:11:26',155.5)";` – davidb Nov 23 '17 at 12:59
  • Then convert it from a string using `(float)` or `floatval($value)`, but I would still use decimal as it has fixed-point precision. – ArtisticPhoenix Nov 23 '17 at 13:02
  • @ArtisticPhoenix, agreed, I will change the field type to decimal. – davidb Nov 23 '17 at 13:05
  • @ArtisticPhoenix, some of the columns are strings, like product description, etc., so while inserting through the loop, how do I specifically convert the net_sales field with floatval($value)? – davidb Nov 23 '17 at 13:06
  • How should I know? Your array keys are not shown; in your loop it would be something like `$data['net_sales'] = (float)$data['net_sales'];`. That said, your code is wide open to SQL injection, so I would use prepared statements or someone will hack your database. Not to mention `mysql_*` functions are deprecated and removed in PHP 7; look into using PDO instead, it's far superior. – ArtisticPhoenix Nov 23 '17 at 13:10
  • The problem is, the CSV will not have a static number of columns. Each upload may have more or fewer columns, so I am taking the headers from the first line of the CSV file itself; it will exactly match the table column names. – davidb Nov 23 '17 at 13:29
  • 1
    Can you show what you get when you `echo $query` right before it runs? – random_user_name Nov 23 '17 at 17:17
  • That's not as big a problem as someone hacking your database, the DB has a fixed schema, no? – ArtisticPhoenix Nov 23 '17 at 18:57
  • your also missing a backtic ( `{bt}` ) `INSERT INTO {bt}tran_detail{bt} ({here}".implode({bt}` in your sql, I'll hook you up though. – ArtisticPhoenix Nov 23 '17 at 19:26
  • Also, keep in mind to use mysqli_query or PDO instead of mysql_query, as the latter is deprecated, and you should always escape data. – zeeks Nov 23 '17 at 20:53
  • @cale_b, I have checked `echo $query`; the value appears with its decimals in the insert statement. – davidb Nov 24 '17 at 09:26
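
A minimal sketch of the cast suggested in these comments, assuming the CSV header row contains a `net_sales` column and the table column is changed to `DECIMAL(15,2)`; it would sit inside the existing insert loop:

    // (suggested in the comments) switch the column to fixed-point decimal first, e.g.:
    // ALTER TABLE `tran_detail` MODIFY `net_sales` DECIMAL(15,2);

    foreach ($dataArray as $data)
    {
        // cast net_sales so a numeric value is used instead of the raw CSV string
        if (isset($data['net_sales'])) {
            $data['net_sales'] = (float) $data['net_sales'];
        }

        // ... build and run the INSERT exactly as in the original code ...
    }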

1 Answer


This is the way I would do this:

<?php
/*
   PDO named placeholders require that the array keys are matched
   so we have to prefix them with a colon : as in 'field' becomes ':field'
   the benefit here is the array key order is irrelevant,
   so your csv could have the headers in any order.
*/
function prefixPdoArray(array $array){
    return array_map(function($item){
        return ':'.$item;
    }, $array);
}

//PDO database driver
$dsn = 'mysql:dbname=testdb;host=127.0.0.1';
$user = 'dbuser';
$password = 'dbpass';

$db = new PDO($dsn, $user, $password);

//header array should match DB fields
$header_map = [
    'field1',
    'field2',
    'field3',
    'field4',
];

$placeholders = prefixPdoArray($header_map);

//prepare the query outside of the loop
$stmt = $db->prepare('INSERT INTO `tran_detail` (`'.implode('`,`', $header_map).'`)VALUES('.implode(',', $placeholders).')');

/*
    we can dynamically build the query because $header_map and $placeholders
    are "canned" data, but you could just type it out as well.

    If you do the SQL manually you can drop $header_map and $placeholders
    and manually create $default_map. You could also drop the
    prefixPdoArray() function and just apply the array_map inline to $headers,
    which would be a bit more efficient, but I thought I would show you
    a proper way to build the query dynamically.
*/

$default_map = array_fill_keys($placeholders, '');

//read the first line of the CSV ($source is the open file handle, as in the question)
$headers = fgetcsv($source, 1000, ",");
//$header_count = count($headers); //for error checking if needed

//prefix csv headers
$headers =  prefixPdoArray($headers);

while ($data = fgetcsv($source, 1000, ",")){      
    /*
        array_combine will throw an error if the header length
        is different than the data length.
        this indicates a missing or extra delimiter in the csv file.
        you may or may not have to check for this condition  
        -------------------------------------
        if( $header_count != count($data) ){ //do something on error }
    */
    //map file data to file headers
    $csv_mapped = array_combine( $headers, $data);

    //map file row to database query
    $csv_mapped = array_replace($default_map, $csv_mapped );

    //execute the query
    $stmt->execute($csv_mapped);
}

fclose($source);
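
Neither the question nor the snippet above shows how `$source` is opened; a minimal sketch, assuming the CSV is a local file (the filename here is only a placeholder):

    <?php
    // open the CSV for reading; stop early if the file cannot be opened
    $source = fopen('transactions.csv', 'r');
    if ($source === false) {
        die('Could not open the CSV file');
    }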

Note that I can only do limited testing on this (no DB or files), so for testing and explanation purposes, here is code that exercises the basic functionality with canned data.

<?php
function prefixPdoArray(array $array){
    return array_map(function($item){
        return ':'.$item;
    }, $array);
}

//header array should match DB fields
$header_map = [
    'field1',
    'field2',
    'field3',
    'field4',
];

$placeholders = prefixPdoArray($header_map);
echo str_pad(' Placeholders ', 45, '-', STR_PAD_BOTH)."\n";
var_dump($placeholders);

//prepare the query
echo "\n".str_pad(' Raw SQL ', 45, '-', STR_PAD_BOTH)."\n";
echo 'INSERT INTO `tran_detail` (`'.implode('`,`', $header_map).'`)VALUES('.implode(',', $placeholders).')';

$default_map = array_fill_keys($placeholders, '');
echo "\n\n".str_pad(' Default Map ', 45, '-', STR_PAD_BOTH)."\n";
var_dump($default_map);

//(CANNED TEST DATA) read the first line
//example data for testing ( missing field1 ), and field3 out of order
$headers =  [
    'field3',
    'field2',
    'field4',
];
//prefix headers with placeholders
$headers =  prefixPdoArray($headers);

echo "\n".str_pad(' CSV headers ', 45, '-', STR_PAD_BOTH)."\n";
var_dump($headers);

//while ($data = fgetcsv($source, 1000, ",")){
    //(CANNED TEST DATA) read the data line(s)
    //example data for testing ( missing field1 ), and field3 out of order
    $data = [
        'value3',
        'value2',
        'value4',
    ];
    echo "\n".str_pad(' CSV data ', 45, '-', STR_PAD_BOTH)."\n";
    var_dump($data); 

    $csv_mapped = array_combine( $headers, $data);
    echo "\n".str_pad(' CSV mapped data ', 45, '-', STR_PAD_BOTH)."\n";
    var_dump($csv_mapped); 

    $csv_mapped = array_replace($default_map, $csv_mapped );
    echo "\n".str_pad(' CSV filled data ', 45, '-', STR_PAD_BOTH)."\n";
    var_dump($csv_mapped); 
//}

Outputs

--------------- Placeholders ----------------
array(4) {
  [0]=>   string(7) ":field1"
  [1]=>   string(7) ":field2"
  [2]=>   string(7) ":field3"
  [3]=>   string(7) ":field4"
}

------------------ Raw SQL ------------------
INSERT INTO `tran_detail` (`field1`,`field2`,`field3`,`field4`)VALUES(:field1,:field2,:field3,:field4)

---------------- Default Map ----------------
array(4) {
  [":field1"]=>   string(0) ""
  [":field2"]=>   string(0) ""
  [":field3"]=>   string(0) ""
  [":field4"]=>   string(0) ""
}

---------------- CSV headers ----------------
array(3) {
  [0]=>   string(7) ":field3"
  [1]=>   string(7) ":field2"
  [2]=>   string(7) ":field4"
}

----------------- CSV data ------------------
array(3) {
  [0]=>   string(6) "value3"
  [1]=>   string(6) "value2"
  [2]=>   string(6) "value4"
}

-------------- CSV mapped data --------------
array(3) {
  [":field3"]=>   string(6) "value3"
  [":field2"]=>   string(6) "value2"
  [":field4"]=>   string(6) "value4"
}

-------------- CSV filled data --------------
array(4) {
  [":field1"]=>   string(0) ""
  [":field2"]=>   string(6) "value2"
  [":field3"]=>   string(6) "value3"
  [":field4"]=>   string(6) "value4"
}

You can check it out here.

http://sandbox.onlinephpfunctions.com/code/ab868ac6c6fbf43d74cf62ef2907b0c72e1f59bf

The most important part of the output is the last two arrays: you can see how we map the data to the file headers, and then use the $default_map to fill in any missing columns. You can put whatever defaults in there you need, such as null or what have you, but then you'll have to build the map manually instead of using $default_map = array_fill_keys($placeholders, '');
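
For instance, a manually built default map could look like this (the keys are just the placeholder names from the example above, and the particular defaults are an assumption):

    // manually built defaults: null or 0 where an empty string is not appropriate
    $default_map = [
        ':field1' => null,
        ':field2' => '',
        ':field3' => 0,
        ':field4' => '',
    ];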

This should make it pretty self-explanatory; if not, feel free to ask.

Hopefully I got everything matched up between the two snippets and for the DB and file handling; if not, it should be really close. This is quite a bit of fairly complex code, though, so it's not inconceivable that I missed something.

The important thing is that this lets you map out the CSV data in an elegant way and avoid any SQL injection nastiness.

ArtisticPhoenix
  • Thank you so much for your time; it is a great solution. – davidb Nov 24 '17 at 09:27
  • Sure, glad I could help – ArtisticPhoenix Nov 24 '17 at 10:12
  • Thanks. I actually built a database class that uses `SHOW table` to get the schema and does something similar. It also helps that I work with CSV files on a daily basis ... I also have an SplFileObj class that maps CSVs. I've got to get some of that stuff published on Composer ... – ArtisticPhoenix Nov 24 '17 at 22:02
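
As a rough sketch of that last idea (not the commenter's actual class; it assumes the `$db` PDO connection and the raw, un-prefixed `$headers` array from the answer above), the table's real column names can be read with `SHOW COLUMNS` and used to whitelist the CSV headers, which also removes any risk of column-name injection from the header row:

    <?php
    // read the actual column names of tran_detail from the database
    $columns = $db->query('SHOW COLUMNS FROM `tran_detail`')->fetchAll(PDO::FETCH_COLUMN);

    // keep only CSV headers that are genuine table columns
    // (do this before prefixPdoArray(); if headers are dropped, the matching
    // data columns must be dropped as well before array_combine())
    $headers = array_values(array_intersect($headers, $columns));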