I have a series of objects, each with its own helper functions to prepare it for an SQL INSERT. As originally coded, this was done with ordinary string manipulation to build the SQL command text, and the resulting string was passed to a generic SQL execute module that handles the database communication. I am now converting these methods to use parameters, as a best practice. My select query method looks like this:
public static DataTable ExecuteSelectSQLQuery(string SQLString,
    List<SqlParameter> myParameters)
{
    // Prefix the caller's query with the target database.
    string fullSQLString = "USE " + dbName + "\n" + SQLString;
    string myConnString = ConnectionStringHelper.GetConnectionString();
    Console.WriteLine("Connecting to SQL Server for Parameterized Query...");

    using (SqlConnection connection = new SqlConnection(myConnString))
    using (SqlDataAdapter myAdapter = new SqlDataAdapter(fullSQLString, connection))
    {
        // Attach the caller-supplied parameters to the select command.
        foreach (SqlParameter para in myParameters)
            myAdapter.SelectCommand.Parameters.Add(para);

        // Fill opens and closes the connection on its own.
        DataSet mySet = new DataSet();
        myAdapter.Fill(mySet, "myTable");
        DataTable dt = mySet.Tables["myTable"];

        Console.WriteLine("Connection Closed. Operation Complete.");
        return dt;
    }
}
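For reference, a typical call looks something like this (the table and column names here are placeholders, not my real schema):

List<SqlParameter> parameters = new List<SqlParameter>
{
    // Hypothetical parameter; my real code builds these from the object's helpers.
    new SqlParameter("@CustomerId", SqlDbType.Int) { Value = 42 }
};
DataTable results = ExecuteSelectSQLQuery(
    "SELECT * FROM Customers WHERE CustomerId = @CustomerId",
    parameters);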
I am now seeking to parameterize the insert query in the same fashion. At times I will want to upload a few hundred records at once, so my initial method just built the SQL accordingly. When I try to parameterize the query, it seems the preferred bulk-processing technique is to run a loop within the connection, as shown here. I want my execute function to be object-agnostic: I'm happy to submit strings and parameters, but I don't want to be looping through my data source within the SQL execute method.
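To be concrete, the pattern I'm referring to looks roughly like this (a sketch only; the Customer type and the column names are made up for illustration):

string myConnString = ConnectionStringHelper.GetConnectionString();
using (SqlConnection connection = new SqlConnection(myConnString))
using (SqlCommand command = new SqlCommand(
    "INSERT INTO Customers (Name, Email) VALUES (@Name, @Email)", connection))
{
    connection.Open();
    command.Parameters.Add("@Name", SqlDbType.NVarChar, 100);
    command.Parameters.Add("@Email", SqlDbType.NVarChar, 100);
    foreach (Customer c in customers)   // the loop over my data source
    {
        command.Parameters["@Name"].Value = c.Name;
        command.Parameters["@Email"].Value = c.Email;
        command.ExecuteNonQuery();
    }
}

That works, but it pushes the loop over my objects into the execute method, which is exactly what I want to avoid.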
Short of calling a new single insert function for each row of data, is there a best-practice way of efficiently passing multiple rows of parameterized data?
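To make that concrete, I'd be happy with something shaped roughly like the sketch below, where the caller flattens each object into its own list of parameters and the execute method only ever sees strings and SqlParameters (the name and signature are hypothetical, not code I've committed to):

public static void ExecuteInsertSQLQuery(string SQLString,
    List<List<SqlParameter>> rowsOfParameters)
{
    string fullSQLString = "USE " + dbName + "\n" + SQLString;
    string myConnString = ConnectionStringHelper.GetConnectionString();
    using (SqlConnection connection = new SqlConnection(myConnString))
    using (SqlCommand command = new SqlCommand(fullSQLString, connection))
    {
        connection.Open();
        // Loop over pre-built parameter sets, not over the original objects.
        foreach (List<SqlParameter> row in rowsOfParameters)
        {
            command.Parameters.Clear();
            command.Parameters.AddRange(row.ToArray());
            command.ExecuteNonQuery();
        }
    }
}

That is the rough shape I have in mind, but I don't know whether calling ExecuteNonQuery repeatedly like this is considered efficient for a few hundred rows.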