
I'm going to explain my goal here as best I can. Nothing I've found online has been relevant enough to give me an idea of how to approach this.

First off, this is a PHP assignment where we have to load CSV files into a MySQL database.

Now, each table (4 in total) has exactly the same fields. What I am trying to accomplish is using a foreach loop that populates each table with the data from its CSV file. I know I could do this with a separate while loop for each table and CSV file, but I'm trying to go beyond the requirements and learn more about PHP. Here is my code for what I'm trying to accomplish:

$files = glob('*.txt'); // .txt required extension 
foreach($files as $file) {

    if (($handle = fopen($file, "r")) !== FALSE) {
        while (($data = fgetcsv($handle,4048, ",")) !== FALSE) {
            echo $data[0]; // making sure data is correct
            $import = "INSERT INTO".basename($file)."(id,itemName,price) VALUES('$data[0]','$data[1]','$data[2]')";
            multi_query($import) or die (mysql_error());
        }
        fclose($handle);
    }
    else {
        echo "Could not open file: " . $file;
    }
}

Each CSV file contains the id, itemName and price. Hopefully this is understandable enough. Thank you

  • multi_query is not called that way. If you are using the procedural variant, then the first argument is the connection, then the query. – Palantir Oct 16 '15 at 17:23
  • Aside from the obvious syntax error (you need a space in `INTO "...`), this sounds like a very bad database design. The best way to do this is to normalize your data and not repeat yourself. – elixenide Oct 16 '15 at 17:24
  • Also, try and print your $import to check that it has the correct syntax and variables inside it (hint: it has not). – Palantir Oct 16 '15 at 17:25
  • If you need to insert the same data into 2 different tables, just execute the same query twice, each time with a different table name. But as @EdCottrell noticed, you should avoid having the same data in the DB more than once. – David Ferenczy Rogožan Oct 16 '15 at 17:34
  • Thank you for pointing out the spelling mistakes. What I eventually noticed is that basename($file) returns (for example) cellphone.txt, while the table is named "cellphone". I used substr to remove the .txt and it works, but calling mysqli_error() seemed to stop it from looping through; removing mysqli_error() fixed the issue. – Steven Marrocco Oct 16 '15 at 17:43
  • Also, please [don't use `mysql_*`](http://stackoverflow.com/questions/12859942/why-shouldnt-i-use-mysql-functions-in-php?rq=1); the `mysql_*` functions are outdated, [deprecated](http://us3.php.net/manual/en/intro.mysql.php), and insecure. Use [`MySQLi`](http://us3.php.net/manual/en/book.mysqli.php) or [`PDO`](http://us3.php.net/manual/en/intro.pdo.php) instead. And you are wide open to [**SQL injection**](https://www.owasp.org/index.php/SQL_Injection). – elixenide Oct 16 '15 at 17:50
  • I'll switch to mysqli! Thank you. I don't know much about SQL injection, so thank you for the link; I'll read up on it. Would an injection be possible in my code by editing the CSV files it reads in? – Steven Marrocco Oct 16 '15 at 17:53
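
Pulling the comment suggestions together, here is a minimal sketch of how the loop might look with mysqli prepared statements and the .txt extension stripped from the table name. The connection details are placeholders, and the column names are taken from the question. Because the CSV values are bound rather than concatenated into the query string, a crafted row in one of the files can no longer break or inject into the SQL.

<?php
// Sketch only: placeholder connection details, column names assumed from the question.
$mysqli = new mysqli('localhost', 'user', 'password', 'assignment_db');

$files = glob('*.txt');
foreach ($files as $file) {
    if (($handle = fopen($file, 'r')) === false) {
        echo "Could not open file: " . $file;
        continue;
    }

    // "cellphone.txt" -> "cellphone"; table names cannot be bound as parameters,
    // so only derive them from files you control.
    $table = basename($file, '.txt');

    $stmt = $mysqli->prepare("INSERT INTO `$table` (id, itemName, price) VALUES (?, ?, ?)");

    while (($data = fgetcsv($handle, 4048, ',')) !== false) {
        // CSV values are bound, not concatenated into the SQL string.
        $stmt->bind_param('sss', $data[0], $data[1], $data[2]);
        $stmt->execute();
    }

    $stmt->close();
    fclose($handle);
}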

1 Answer


The way you are importing data into MySQL is fine for a small volume of data. However, if you are importing huge volumes (thousands of rows), the best way is to load the file directly into MySQL using LOAD DATA INFILE. For example:

LOAD DATA LOCAL INFILE '/path/to/your_file.csv'
INTO TABLE your_table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, itemName, price)

That's a smarter way to import your CSV data :)
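
If you want to issue that statement from the PHP script itself, here is a minimal sketch using mysqli. The connection details, file path and table name are placeholders, and LOCAL INFILE has to be enabled on both the client and the server for this to work.

<?php
// Minimal sketch: running LOAD DATA LOCAL INFILE from PHP with mysqli.
// Connection details, file path and table name are placeholders.
$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true); // allow LOCAL INFILE on the client side
$mysqli->real_connect('localhost', 'user', 'password', 'assignment_db');

// Nowdoc keeps the \n terminator literal so MySQL sees it exactly as written.
$sql = <<<'SQL'
LOAD DATA LOCAL INFILE '/path/to/your_file.csv'
INTO TABLE your_table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, itemName, price)
SQL;

if (!$mysqli->query($sql)) {
    echo 'Import failed: ' . $mysqli->error;
}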
