
I am uploading a file through a file upload control; a StreamReader then reads the file into a DataTable, and SqlBulkCopy copies the DataTable to my SQL database, filling the appropriate columns. Does anyone see anything wrong with the code below? I don't get an error message, but something seems to be getting hung up in the IIS process: I can't delete the CSV file from the folder afterwards because it says the file is still in use by another process.

    protected void btnUpload_Click(object sender, EventArgs e)
    {

        //upload file to the gencouploadfiles folder
        UploadFile();

        //fetch CSV file from the folder
        string strFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";

        //perform sql bulk copy
        PerformBulkCopy(GencoUpload(strFilePath));

        //delete the file from the folder
    }



    public void UploadFile()
    {
        if (fileUpload1.HasFile)
        {
            FileInfo fileinfo = new FileInfo(fileUpload1.PostedFile.FileName);
            string strCsvFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";
            fileUpload1.SaveAs(strCsvFilePath);
        }
    }


    public static DataTable GencoUpload(string filepath)
    {
        StreamReader sr = new StreamReader(filepath);
        string line = sr.ReadLine();
        string[] value = line.Split('|');
        DataTable dt = new DataTable();
        DataRow row;

        foreach (string dc in value)
        {
            dt.Columns.Add(new DataColumn(dc));
        }

        while (!sr.EndOfStream)
        {
            value = sr.ReadLine().Split('|');
            if (value.Length == dt.Columns.Count)
            {
                row = dt.NewRow();
                row.ItemArray = value;
                dt.Rows.Add(row);
            }
        }
        return dt;
    }


    public void PerformBulkCopy(DataTable dt)
    {
        SqlConnection conStr = new SqlConnection(ConfigurationManager.ConnectionStrings["EDI"].ConnectionString);

        using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conStr.ConnectionString))
        {
            bulkcopy.DestinationTableName = "dbo.GencoUploadTempTable";
            bulkcopy.BatchSize = dt.Rows.Count;
            conStr.Open();
            bulkcopy.WriteToServer(dt);
            bulkcopy.Close();
            conStr.Close();
        }
    }
Jeff
  • Can you use a SqlDataReader instead of a DataTable..? – MethodMan Feb 23 '12 at 21:53
  • SQL Bulk copy should work fine, but I've never used it in a web page, only in a windows service. Can you give us an idea of the file size? It looks like you are basically batching the entire file in one push to the server. Have you examined the memory utilization of the server when doing this? This is basically storing the entire file in memory when you load it into the data table, so that could be degrading the server if the file is huge. – swannee Feb 23 '12 at 22:03
  • It should work fine; the file is uploaded to the server and then the SqlBulkCopy is done from the local server file system to the db - provided the file has actually been uploaded, of course (which it has in this case as the file is locked server side ;-) – dash Feb 23 '12 at 22:10
  • I don't think bulk copy is your problem. It looks fine, but maybe try taking out the BatchSize. – Induster Feb 23 '12 at 22:23

1 Answer


Simplify your code; there are plenty of ways to get a CSV into a DataTable - see How to read a CSV file into a .NET Datatable for example. In your example above, you do not appear to be closing the stream after you have finished reading it... add sr.Close() before the return, or better still, wrap the declaration in a using statement:

    public static DataTable GencoUpload(string filepath)
    {
        DataTable dt = new DataTable();

        using (StreamReader sr = new StreamReader(filepath))
        {
            string line = sr.ReadLine();
            string[] value = line.Split('|');

            DataRow row;

            foreach (string dc in value)
            {
                dt.Columns.Add(new DataColumn(dc));
            }

            while (!sr.EndOfStream)
            {
                value = sr.ReadLine().Split('|');
                if (value.Length == dt.Columns.Count)
                {
                    row = dt.NewRow();
                    row.ItemArray = value;
                    dt.Rows.Add(row);
                }
            }
        }

        return dt;
    }

This should prevent the file from becoming locked.
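Once the reader is disposed, the file handle is released, so the "delete the file from the folder" step stubbed out in btnUpload_Click becomes possible. A minimal sketch, assuming the same path construction used elsewhere in the question (File.Exists and File.Delete come from System.IO):

    // Runs after PerformBulkCopy; the path mirrors the question's code.
    string strFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";
    if (File.Exists(strFilePath))
    {
        // Will only succeed once no open handle (e.g. the StreamReader) remains.
        File.Delete(strFilePath);
    }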

Next thing to look at is whether the data table actually has any data in it. Put a breakpoint in and test whether your load code is actually working; you are adding rows to a data table, but you haven't defined the column structure (i.e. you've only supplied column names, not data types, so you may get conversion issues). It's definitely going to be easier to use one of the other methods for loading your file :-)
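For illustration, here is a hedged sketch of what a typed DataTable plus explicit SqlBulkCopy column mappings might look like. The column names and types are made up for this example (your file's real columns will differ), and connectionString stands in for the EDI connection string from the question:

    // Hypothetical columns - replace with the actual structure of your file.
    DataTable dt = new DataTable();
    dt.Columns.Add("ItemNumber", typeof(string));
    dt.Columns.Add("Quantity", typeof(int));
    dt.Columns.Add("ShipDate", typeof(DateTime));

    using (SqlBulkCopy bulkcopy = new SqlBulkCopy(connectionString))
    {
        bulkcopy.DestinationTableName = "dbo.GencoUploadTempTable";

        // Explicit mappings avoid relying on source/destination column order.
        bulkcopy.ColumnMappings.Add("ItemNumber", "ItemNumber");
        bulkcopy.ColumnMappings.Add("Quantity", "Quantity");
        bulkcopy.ColumnMappings.Add("ShipDate", "ShipDate");

        bulkcopy.WriteToServer(dt);
    }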

dash