I have an MVC 5 application using EF6 against a ClearDB MySQL database on Azure. I have a CSV reader class, which works pretty fast in my console app. In my MVC app, I have a model like this:
Provider.cs
public class Provider
{
    public int ProviderId { get; set; }
    public string ProviderName { get; set; }
    public string Email { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    [ForeignKey("State")]
    public int StateID { get; set; }
    public virtual State State { get; set; }
    public string Zip { get; set; }
    public string Phone { get; set; }
    public string Fax { get; set; }
    public string SICCode { get; set; }
    public string Description { get; set; }
    public string Website { get; set; }
    public string RefId { get; set; }
}
Then, in my Configuration.cs file, I have something like this:
string[] csvs = Directory.GetFiles(@"C:\Users\User\Downloads\db\test\");
foreach (string csv in csvs)
{
    using (CsvReader reader = new CsvReader(csv))
    {
        foreach (string[] values in reader.RowEnumerator)
        {
            var provider = new Provider();
            provider.ProviderName = values[0];
            provider.Email = values[1];
            provider.Address = values[2];
            provider.City = values[3];
            provider.StateID = 491;
            provider.Zip = values[5];
            provider.Phone = values[6];
            provider.Fax = values[7];
            provider.SICCode = values[8];
            provider.Description = values[9];
            provider.Website = values[10];
            provider.RefId = "";
            context.Providers.Add(provider);
        }
    }
}
context.SaveChanges();
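From what I've read, EF's change tracker slows `Add` down more and more as the context fills up, so one direction I'm considering (untested on my side, reusing the same `CsvReader` and `context` as above) is turning off automatic change detection and validation and saving in batches, something like:

```csharp
// Untested sketch: disable per-Add change detection and validation,
// then flush in fixed-size batches instead of one giant SaveChanges.
context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;

const int batchSize = 1000;
int count = 0;

foreach (string csv in csvs)
{
    using (CsvReader reader = new CsvReader(csv))
    {
        foreach (string[] values in reader.RowEnumerator)
        {
            var provider = new Provider();
            provider.ProviderName = values[0];
            // ... same column mapping as above ...
            context.Providers.Add(provider);

            if (++count % batchSize == 0)
            {
                context.SaveChanges();
            }
        }
    }
}
context.SaveChanges(); // flush whatever is left in the last partial batch
```

Is that the right direction, or is there a fundamentally better approach (bulk insert, raw SQL, etc.) for this volume of data?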
The point is that I have about 50 CSV files with a total size of 400 MB and around 9 million entries in total. For testing I used the smallest file in the list, a 2 MB CSV with around 15k entries, and running Update-Database took around 40 minutes to insert them all. At that rate, you can imagine how long 9 million entries would take. Any idea how to speed up the process?