0

I have a method that has to execute SQL scripts many times. These scripts are used to create tables, views, stored procedures, and functions on the database. I came up with this code, which works fine with 100 files.

foreach (var folder in ActivityConstant.SourceFolders)
{
    var destination = context.GetValue(this.DestinationPath) + folder;

    // TODO: Execute all sql folders instead of each sql file.
    // this is the cmd equivalent: for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"

    foreach (var file in Directory.GetFiles(destination))
    {
        var begin = DateTime.Now;
        context.TrackBuildWarning(string.Format("Start exec sql file at {0}.", 
                                                DateTime.Now));

        Process process = new Process();
        process.StartInfo.UseShellExecute = false;
        process.StartInfo.RedirectStandardOutput = true;
        process.StartInfo.RedirectStandardError = true;
        process.StartInfo.CreateNoWindow = true;
        process.StartInfo.FileName = "sqlcmd.exe";
        process.StartInfo.Arguments = string.Format("-S {0} -d {1} -i {2} -U {3} -P {4}", 
                                                    sqlServerName, 
                                                    databaseName, 
                                                    file, 
                                                    sqlUserName, 
                                                    sqlPassword);
        process.StartInfo.WorkingDirectory = @"C:\";
        process.Start();
        //process.WaitForExit();

        context.TrackBuildWarning(
           string.Format(
                "Finished exec sql file at {0} total time {1} milliseconds.", 
                DateTime.Now, 
                DateTime.Now.Subtract(begin).TotalMilliseconds));
    }
}

Now, I am moving on to the next step. I am testing it with our database, which has around 600 files (tables, views, stored procedures, and functions), and it looks like the current code cannot deal with this large a number of queries.

From my records, it takes between 3 and 5 minutes to run around 100 files. At the time I am writing this question, it has been running for 40 minutes on those 600 files.

I would like to know how I can improve my code. I also welcome any suggestions if there is a better way to achieve the goal.

sehe
Anonymous

4 Answers

2

You are starting 600 processes at the same time. That is not good for performance; they probably take a LOT of RAM, and swapping may occur.

It will also create a lot of load on your SQL Server; there might be a memory issue there too.

Try to run your queries sequentially or at least only a few at a time.
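For illustration, a minimal sketch of the sequential variant, reusing the sqlcmd arguments from the question (the helper name RunSqlCmd is my own, not from the answer): the next script only starts once WaitForExit returns.

using System;
using System.Diagnostics;

static void RunSqlCmd(string server, string database, string user,
                      string password, string scriptFile)
{
    var startInfo = new ProcessStartInfo
    {
        FileName = "sqlcmd.exe",
        Arguments = string.Format("-S {0} -d {1} -i \"{2}\" -U {3} -P {4}",
                                  server, database, scriptFile, user, password),
        UseShellExecute = false,
        RedirectStandardError = true,
        CreateNoWindow = true
    };

    using (var process = Process.Start(startInfo))
    {
        // Read the redirected stream before waiting, so a full pipe cannot block the child.
        string errors = process.StandardError.ReadToEnd();

        // Block here: the next file is only processed after this one has finished.
        process.WaitForExit();

        if (process.ExitCode != 0)
            throw new InvalidOperationException(
                string.Format("sqlcmd failed for {0}: {1}", scriptFile, errors));
    }
}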

Edit
As a side note, you should consider using SqlCommand instead of launching external processes.

Albin Sunnanbo
  • The SQL files contain CREATE commands (tables, views, stored procedures). From what I understand, SqlCommand works only with SELECT, INSERT, UPDATE, and DELETE. Do you have any alternative ideas? – Anonymous Oct 03 '12 at 09:10
  • `SqlCommand` works with DDL too (CREATE, ALTER, etc.), but does not handle the `GO` statement. You can either split your files on each `GO` and run each part in a separate SqlCommand, or you can look [here](http://stackoverflow.com/a/40830/401728) – Albin Sunnanbo Oct 03 '12 at 10:50
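A rough sketch of the splitting approach described in the comment above, assuming `GO` appears on a line of its own (the helper name and the regex are mine, not from the answer):

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

static void ExecuteScript(string connectionString, string scriptFile)
{
    string script = File.ReadAllText(scriptFile);

    // GO is a batch separator understood by sqlcmd/SSMS, not by SQL Server itself,
    // so split on lines that contain only GO and send each batch separately.
    string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                                   RegexOptions.Multiline | RegexOptions.IgnoreCase);

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        foreach (string batch in batches)
        {
            if (string.IsNullOrWhiteSpace(batch))
                continue;

            using (var command = new SqlCommand(batch, connection))
            {
                command.ExecuteNonQuery();
            }
        }
    }
}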
0

If this is only one database per set of files, you shouldn't log in for every file/query.
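In other words, something along these lines (a sketch; `connectionString` and `destination` stand for the values already used in the question, and scripts containing `GO` would still need the splitting described in the comments above): open one connection for the whole folder and reuse it.

using (var connection = new SqlConnection(connectionString))
{
    connection.Open(); // one login for the whole set of files

    foreach (var file in Directory.GetFiles(destination))
    {
        using (var command = new SqlCommand(File.ReadAllText(file), connection))
        {
            command.ExecuteNonQuery();
        }
    }
}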

Kamil
0

I suspect the most useful way to improve performance would be to move the execution of the actual SQL commands into the application.

This avoids

  • creating many processes (600, in parallel ?!)
  • repeated connections (at least one per process)
  • suboptimal lock contention (due to nondeterministic execution order)

I'd look at storing the scripts as resources, then executing them on an SqlConnection.

Think about transaction management, when applicable. The easiest way to handle this would be to still open a new connection for each 'script'. As long as you do this using the same connection string each time, the connection will be pooled automatically by the Framework.
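A sketch of what that could look like (the transaction scope and the helper name are illustrative assumptions, and `GO` separators would still need to be split out first): each script gets its own pooled connection and its own transaction.

using System.Data.SqlClient;

static void RunScript(string connectionString, string scriptText)
{
    // Same connection string every time, so the physical connection
    // is served from the pool rather than re-established.
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var transaction = connection.BeginTransaction())
        using (var command = new SqlCommand(scriptText, connection, transaction))
        {
            command.ExecuteNonQuery();
            transaction.Commit();
        }
    }
}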

sehe
0

You can try to merge the files before executing them:

using (var merged = File.Create("merged.sql")) // Create truncates any existing merged.sql
{
    foreach (var file in Directory.GetFiles(destination))
    {
        var data = File.ReadAllBytes(file);
        merged.Write(data, 0, data.Length);

        // add a line break between files so statements from consecutive
        // scripts cannot end up on the same line
        merged.WriteByte((byte)'\n');
    }
}
// continue with "merged.sql" as the source file for sqlcmd

Also, there is a multitude of existing tools for this task; google for "Database Deployment Tool" or some such.

Tar