I have a very large SQL insert file that throws "out of memory" errors when I run it in SQL Enterprise Manager.
The advice I have seen is to add a "GO" command every X rows so the inserts are "batched".
I am trying to write a small function that reads the file and inserts a line containing "GO" after every 50 lines.
However, the code I have written also throws a System.OutOfMemoryException when I run it.
Can anyone suggest a better way of writing my code to fix this problem, please?
This is what I have written:
public static void AddGo()
{
    int currentline = 0;
    string FilePath = @"C:\Users\trevo_000\Desktop\fmm89386.sql";
    var text = new StringBuilder();

    foreach (string s in File.ReadAllLines(FilePath))
    {
        // add the current line
        text.AppendLine(s);

        // increase the line counter
        currentline += 1;
        if (currentline == 50)
        {
            text.AppendLine("GO");
            currentline = 0;
        }
    }

    using (var file = new StreamWriter(File.Create(@"C:\Users\trevo_000\Desktop\fmm89386Clean.sql")))
    {
        file.Write(text.ToString());
    }
}
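
I suspect the problem is that both `File.ReadAllLines` and the `StringBuilder` hold the entire file in memory at once. One direction I'm considering is streaming the file line by line and writing the output as I go, so only one line is held at a time. A minimal sketch of that idea (same paths as above, method name `AddGoStreaming` is just mine; I haven't verified this against the full file):

```csharp
using System.IO;

public static void AddGoStreaming()
{
    string inputPath = @"C:\Users\trevo_000\Desktop\fmm89386.sql";
    string outputPath = @"C:\Users\trevo_000\Desktop\fmm89386Clean.sql";
    int currentline = 0;

    // File.ReadLines is lazy: it yields one line at a time instead of
    // loading the whole file into an array like File.ReadAllLines does.
    using (var writer = new StreamWriter(File.Create(outputPath)))
    {
        foreach (string s in File.ReadLines(inputPath))
        {
            // write the current line straight to the output file
            writer.WriteLine(s);

            // after every 50 lines, emit a batch separator
            currentline += 1;
            if (currentline == 50)
            {
                writer.WriteLine("GO");
                currentline = 0;
            }
        }
    }
}
```

Is this the right approach, or is there a better pattern for processing a file this large?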