
Update: my code works if I put the database code in the same file as the script, so I'm not sure why it doesn't work when it's in a separate file.

I have a CSV file whose path I pass as a command-line argument, and I use the values in that file (name, surname and email) to insert rows into MySQL. When I run my code it keeps printing `Connected to Database` over and over until it runs out of memory. What am I doing wrong? Any help is appreciated.

script.php

<?php 
require "database.php";


$options = ['file:'];

$values = getopt(null, $options);

$lines = file($values['file'], FILE_IGNORE_NEW_LINES|FILE_SKIP_EMPTY_LINES);


$csv = array_map('str_getcsv', $lines);
$col_names = array_shift($csv);

$users = [];
foreach ($csv as $row) {
    $name    = ucfirst(strtolower($row[0]));
    $surname = ucfirst(strtolower($row[1]));
    $email   = filter_var(strtolower($row[2]), FILTER_VALIDATE_EMAIL);

    $users[] = [
        $col_names[0] => $name,
        $col_names[1] => $surname,
        $col_names[2] => $email,
    ];

    // Bind the row's values, not the header/column names.
    $stmt = $dbh->prepare("INSERT INTO users(name, surname, email)
            VALUES(:name, :surname, :email)");
    $stmt->bindParam(':name', $name);
    $stmt->bindParam(':surname', $surname);
    $stmt->bindParam(':email', $email);
    $stmt->execute();
}
$dbh = null;

?>

database.php

<?php 

$hostname = 'localhost';
$username = 'root';
$password = 'yourpasswordhere';

try {
    $dbh = new PDO("mysql:host=$hostname;dbname=userDetails", $username, $password);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo 'Connected to Database';

} catch (PDOException $e) {
    echo $e->getMessage();
}


?>
iccea
  • Do you need `$users` variable? If this code is running until some point, maybe you have to increase the `memory_limit` and/or don't use `$users` array to store the data – Felippe Duarte Oct 27 '20 at 22:36
  • My CSV file is just 12 lines, so I don't think I need more memory. And yes, I do need the $users variable, since I'm storing the CSV values in it so I can insert them into MySQL. But if you think my code is wrong, I'm new to this and would appreciate any help. I just need to fetch the data from the CSV file passed on the command line and insert it into MySQL. @FelippeDuarte – iccea Oct 27 '20 at 22:39
  • @FelippeDuarte Also, if I don't use the $users array to store the CSV data, how should I store it? Can you suggest something, please? – iccea Oct 27 '20 at 22:41
  • I would add a print_r or var_dump before your foreach loop just to ensure that the data is still what you are expecting. –  Oct 27 '20 at 22:48
  • @jeff Could you please tell me what I am supposed to write inside var_dump()? – iccea Oct 27 '20 at 22:51
  • @jeff I tried printing with var_dump() and the values I have, but it can't get past `Connected to Database`, so nothing else runs – iccea Oct 27 '20 at 22:52
  • It works if I put my database code in the same file – iccea Oct 27 '20 at 22:54
  • Consider scrapping all that and using an ```external table``` instead. – alexherm Oct 27 '20 at 23:03
  • You won't actually need the `$users` variable for pooling your CSV rows (as in your [previous question](https://stackoverflow.com/questions/64530592/how-to-import-parse-csv-file-to-maria-db)) if you're doing inserts in the same loop. I would personally pool the CSV into a desirable format first and run a second, separate routine for the inserts. (Consider writing functions to handle one job per function.) Will make debugging a lot easier. – Markus AO Oct 27 '20 at 23:26
  • If you want to do it all in one loop, instead of adding the cleaned fields to the `$users` array, simply assign as `$name = ucfirst(strtolower($row[0]));` etc. and use those variables for your binds, like `$stmt->bindParam(':name', $name);`. The `$col_names[0]` etc. routine you've used from my previous answer is there to keep it abstract, i.e. if the field names on your header row change, the CSV row pooler will still turn the CSV into an associative array without needing to change hard-coded field names. – Markus AO Oct 27 '20 at 23:29
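
Putting the comments above together, here is a minimal sketch of the "one job per function" layout Markus AO suggests. The function names (`clean_row`, `insert_users`) are illustrative, not from the original post; the `users` table and CSV column order are assumed to match the question.

```php
<?php

// Normalise one CSV row into [name, surname, email].
// filter_var() returns false when the email is invalid.
function clean_row(array $row): array
{
    return [
        ucfirst(strtolower($row[0])),
        ucfirst(strtolower($row[1])),
        filter_var(strtolower($row[2]), FILTER_VALIDATE_EMAIL),
    ];
}

// Insert all rows with a single prepared statement, binding the
// row values (not the column names) on each iteration.
function insert_users(PDO $dbh, array $csv): void
{
    $stmt = $dbh->prepare(
        "INSERT INTO users(name, surname, email) VALUES(:name, :surname, :email)"
    );
    foreach ($csv as $row) {
        [$name, $surname, $email] = clean_row($row);
        if ($email === false) {
            continue; // skip rows with an invalid email address
        }
        $stmt->execute([':name' => $name, ':surname' => $surname, ':email' => $email]);
    }
}
```

Preparing the statement once outside the loop also avoids re-parsing the same SQL for every row, and keeping the cleanup separate from the inserts makes each half easy to var_dump and debug on its own.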

0 Answers