I have a method that executes many SQL scripts. The scripts create tables, views, stored procedures, and functions in the database. I came up with the following code, which works fine with 100 files.
foreach (var folder in ActivityConstant.SourceFolders)
{
    var destination = context.GetValue(this.DestinationPath) + folder;
    // TODO: Execute all sql folders instead of each sql file.
    // The cmd equivalent is: for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"
    foreach (var file in Directory.GetFiles(destination))
    {
        var begin = DateTime.Now;
        context.TrackBuildWarning(string.Format("Start exec sql file at {0}.", begin));

        Process process = new Process();
        process.StartInfo.UseShellExecute = false;
        process.StartInfo.RedirectStandardOutput = true;
        process.StartInfo.RedirectStandardError = true;
        process.StartInfo.CreateNoWindow = true;
        process.StartInfo.FileName = "sqlcmd.exe";
        process.StartInfo.Arguments = string.Format("-S {0} -d {1} -i {2} -U {3} -P {4}",
            sqlServerName,
            databaseName,
            file,
            sqlUserName,
            sqlPassword);
        process.StartInfo.WorkingDirectory = @"C:\";
        process.Start();
        //process.WaitForExit();

        context.TrackBuildWarning(string.Format(
            "Finished exec sql file at {0}, total time {1} milliseconds.",
            DateTime.Now,
            DateTime.Now.Subtract(begin).TotalMilliseconds));
    }
}
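For reference, the batched approach from the TODO comment would, I think, look roughly like this in C#: one cmd.exe per folder running the for loop over *.sql, instead of one sqlcmd process per file. This is only an untested sketch; it reuses the same fields (sqlServerName, databaseName, sqlUserName, sqlPassword) as above.

foreach (var folder in ActivityConstant.SourceFolders)
{
    var destination = context.GetValue(this.DestinationPath) + folder;

    Process process = new Process();
    process.StartInfo.UseShellExecute = false;
    process.StartInfo.CreateNoWindow = true;
    process.StartInfo.FileName = "cmd.exe";
    // Runs the for-loop from the TODO comment inside the folder; -b makes sqlcmd
    // exit with a non-zero code when a script fails.
    process.StartInfo.Arguments = string.Format(
        "/c for %f in (*.sql) do sqlcmd -S {0} -d {1} -U {2} -P {3} -b -i \"%f\"",
        sqlServerName, databaseName, sqlUserName, sqlPassword);
    process.StartInfo.WorkingDirectory = destination;
    process.Start();
    process.WaitForExit();
}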
Now, I am moving on to the next step: testing it against our database, which has around 600 files (tables, views, stored procedures, and functions), and it looks like the current code cannot handle that many scripts.
From my records, it takes between 3 and 5 minutes to run around 100 files. As I write this question, it has been running for 40 minutes on those 600 files.
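One thing I suspect: since I redirect standard output and standard error but never read them, and WaitForExit is commented out, the build agent may end up with hundreds of concurrent sqlcmd processes whose output pipes fill up and block them. The variant I am about to try replaces the body of the inner loop with something like the sketch below (untested; file and the other variables are the ones from the loop above, and I also quoted the -i path in case a file name contains spaces). It requires using System.Text; for StringBuilder.

using (var process = new Process())
{
    process.StartInfo.UseShellExecute = false;
    process.StartInfo.RedirectStandardOutput = true;
    process.StartInfo.RedirectStandardError = true;
    process.StartInfo.CreateNoWindow = true;
    process.StartInfo.FileName = "sqlcmd.exe";
    process.StartInfo.Arguments = string.Format("-S {0} -d {1} -i \"{2}\" -U {3} -P {4}",
        sqlServerName, databaseName, file, sqlUserName, sqlPassword);

    // Collect output as it arrives so the redirected pipes never fill up.
    var output = new StringBuilder();
    process.OutputDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };
    process.ErrorDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };

    process.Start();
    process.BeginOutputReadLine();  // drain stdout asynchronously
    process.BeginErrorReadLine();   // drain stderr asynchronously
    process.WaitForExit();          // run the files one at a time

    if (process.ExitCode != 0)
        context.TrackBuildWarning(string.Format("sqlcmd failed for {0}: {1}", file, output));
}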
I would like to know how I can improve my code, and I also welcome any suggestions if there is a better way to achieve the goal.