
I am trying to insert data exported from an Excel pivot table into MySQL using C#, with phpMyAdmin as my front end. I was able to insert the data with individual INSERT statements, but the performance is far too slow for my purposes: each data file will contain at least 15,000 rows, and this operation will run frequently. Currently I hold the data in an ArrayList before insertion.

Reading online, I found that I should be using LOAD DATA INFILE with my data in a CSV file. However, my CSV file does not have the same number of columns as my SQL table, I am not sure how to pass the path of the CSV file, and I don't want the whole operation to fail if one of the rows is a duplicate. I would like to use the infile method if it can help my situation.
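For reference, LOAD DATA INFILE can address both concerns: the IGNORE keyword skips rows that would violate a unique key instead of aborting, and a trailing column list maps a CSV with fewer columns onto the table. A sketch of building such a command; the column names are placeholders, not the real schema:

```csharp
// Sketch only: employee_id / work_date / hours are placeholder column names.
// IGNORE skips duplicate-key rows instead of failing the whole load, and
// the trailing column list tells MySQL which table columns the CSV fills.
static string BuildLoadCommand(string csvPath)
{
    return "LOAD DATA INFILE '" + csvPath + "' " +
           "IGNORE INTO TABLE timesheet " +
           "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' " +
           "(employee_id, work_date, hours)";
}
```

Columns of the table that are absent from the list are set to their defaults, so the CSV does not need to match the table shape exactly.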

The error I am getting now is: File not found (Errcode: 22 "Invalid argument")

Here is my code so far for attempting the infile method.

    OpenFileDialog ofd3 = new OpenFileDialog();

    if (ofd3.ShowDialog() == DialogResult.OK)
    {
        DirectoryInfo di = new DirectoryInfo("../../files/");

        string hours = ofd3.FileName;

        string cmd = "LOAD DATA INFILE '" + hours + "' INTO TABLE timesheet FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'";

        mysql_insert_infile(cmd);
    }

    private void mysql_insert_infile(string query)
    {
        using (var connection = new MySqlConnection("Server=localhost;Database=projectdash;Uid=root;Pwd=;allow zero datetime=yes;Allow User Variables=True"))
        using (var cmd = connection.CreateCommand())
        {
            connection.Open();

            cmd.CommandText = query;

            try
            {
                cmd.ExecuteNonQuery();
            }
            catch (Exception w)
            {
                //   MessageBox.Show(w.Message);
            }
        }
    }
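One common cause of the "Errcode: 22" file-not-found on Windows, assuming that is what is happening here, is that the backslashes in the path returned by OpenFileDialog are interpreted as escape sequences inside the quoted SQL literal. A hedged sketch of normalizing the path before building the command (the helper name is hypothetical):

```csharp
// Hypothetical helper: takes the path from OpenFileDialog.FileName and
// returns a LOAD DATA command whose file path MySQL can actually parse.
static string BuildInfileCommand(string dialogPath)
{
    // MySQL treats backslashes inside a quoted literal as escapes
    // ('\U', '\m', ...), so either double them or switch to forward
    // slashes, which the Windows MySQL server also accepts.
    string safePath = dialogPath.Replace('\\', '/');
    return "LOAD DATA INFILE '" + safePath + "' INTO TABLE timesheet " +
           "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'";
}
```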
Isaac Bennetch
NoobCoder
  • I'm pretty sure your question is off topic as it is formulated now. What have you done to try? What errors did you encounter? Show some effort; don't expect us to write code for you for free. Please read [ask] and [mcve] and edit your question accordingly. – davejal Feb 21 '17 at 14:23
  • Without altering your process too much, you should be able to open up the connection once, perform a bunch of nonquery executions, then close the connection...I don't have the exact syntax handy, but this should speed things up considerably without much refactoring. – user2366842 Feb 21 '17 at 15:23
  • Also consider [this](http://stackoverflow.com/questions/14330314/bulk-insert-in-mysql) – TaW Feb 21 '17 at 16:51
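The suggestion to open the connection once can be taken a step further by wrapping all the inserts in a single transaction with a reused parameterized command, so there is one commit instead of ~15,000. A sketch under assumed table and column names:

```csharp
using System.Collections.Generic;
using MySql.Data.MySqlClient;   // MySQL Connector/NET

static class TimesheetLoader
{
    // Sketch: one connection, one transaction, one prepared command reused
    // for every row. INSERT IGNORE skips duplicate-key rows silently.
    // Table/column names are placeholders for the real schema.
    public static void InsertRows(IEnumerable<string[]> rows, string connString)
    {
        using (var conn = new MySqlConnection(connString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            using (var cmd = conn.CreateCommand())
            {
                cmd.Transaction = tx;
                cmd.CommandText =
                    "INSERT IGNORE INTO timesheet (employee_id, work_date, hours) " +
                    "VALUES (@id, @date, @hrs)";
                cmd.Parameters.Add("@id", MySqlDbType.VarChar);
                cmd.Parameters.Add("@date", MySqlDbType.VarChar);
                cmd.Parameters.Add("@hrs", MySqlDbType.VarChar);

                foreach (var r in rows)
                {
                    cmd.Parameters["@id"].Value = r[0];
                    cmd.Parameters["@date"].Value = r[1];
                    cmd.Parameters["@hrs"].Value = r[2];
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();   // one commit for the whole batch
            }
        }
    }
}
```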

1 Answer


My bet would be to load the entire file into memory and then build "bulk" INSERT commands containing at least 1000 rows each.
You can clean your data to avoid duplicates and then create the command using only the columns you need.
Just remember to insert multiple rows per statement, as described here:

Inserting multiple rows in mysql
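The batching described above can be sketched as follows; the table and column names are assumptions, and INSERT IGNORE stands in for whatever duplicate handling the schema needs:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Text;

static class BulkInsertBuilder
{
    // Builds multi-row INSERT statements of up to batchSize rows each.
    // Column names are placeholders; each rows[i] is one CSV row's values.
    public static IEnumerable<string> Build(IList<string[]> rows, int batchSize = 1000)
    {
        for (int i = 0; i < rows.Count; i += batchSize)
        {
            var sb = new StringBuilder(
                "INSERT IGNORE INTO timesheet (employee_id, work_date, hours) VALUES ");
            var batch = rows.Skip(i).Take(batchSize)
                .Select(r => "('" + string.Join("','", r.Select(Escape)) + "')");
            sb.Append(string.Join(",", batch));
            yield return sb.ToString();
        }
    }

    // Minimal escaping for the sketch; parameterized inserts are safer in practice.
    static string Escape(string s) =>
        s.Replace("\\", "\\\\").Replace("'", "''");
}
```

Each returned string is one statement, so 15,000 rows become roughly 15 round trips instead of 15,000.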

Balbinator