I'm working on a project in C# that converts a database table to an XML file with base64-encoded contents. Please bear with me, because C# is not my day-to-day programming language.
The code I've managed to come up with is this:
OdbcCommand DbCommand = DbConnection.CreateCommand();
DbCommand.CommandText = "SELECT * FROM " + dbTable;
OdbcDataReader DbReader = DbCommand.ExecuteReader();

int fCount = DbReader.FieldCount;
string[] colnames = new string[fCount];

output += "<" + dbTable + ">\n";

// Cache the column names once, before reading the rows.
for (int i = 0; i < fCount; i++)
{
    colnames[i] = DbReader.GetName(i);
}
while (DbReader.Read())
{
    output += "\t<export_row>\n";
    for (int i = 0; i < fCount; i++)
    {
        string col = "";
        try
        {
            // GetString can throw on NULLs or non-string columns,
            // in which case the value simply stays "".
            col = DbReader.GetString(i);
        }
        catch (Exception) { }

        if (col.Length > 0 || i == 0)
        {
            output += "\t\t<" + colnames[i] + ">" + Base64Encode(col) + "</" + colnames[i] + ">\n";
        }
    }
    output += "\t</export_row>\n";
}
output += "</" + dbTable + ">\n";
The problem is that, even with a relatively small table, this causes the application to choke up and run extremely slowly. The obvious suspect is the enormous number of iterations involved for each row, so I have been looking for a way around that. I have tried using a DataSet, which seemed to improve performance slightly, but not significantly enough:
connection.Open();
adapter.Fill(dataSet);

output += "<" + dbTable + ">\n";
foreach (DataTable table in dataSet.Tables)
{
    foreach (DataRow row in table.Rows)
    {
        output += "\t<export_row>\n";
        foreach (DataColumn column in table.Columns)
        {
            // Look up the cell value by column for every row.
            output += "\t\t<" + column.ColumnName + ">"
                    + Base64Encode(row[column].ToString())
                    + "</" + column.ColumnName + ">\n";
        }
        output += "\t</export_row>\n";
    }
}
output += "</" + dbTable + ">\n";
However, the problem remains that I still have to iterate over every column for every single row, which raises the question: isn't there a more efficient way to do this? I'm not going to write a model for every table, because there are hundreds of tables in this database, and the whole appeal of this approach is the flexibility of transferring data generically.
Can someone help me out, or point me in the right direction? For example, is there a way to extract both the column name and the value at the same time? As in: foreach (row as key => value), or something along those lines. That would drastically reduce the number of iterations required.
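Roughly what I'm picturing is something like the sketch below, which uses LINQ to pair each column name with its value in a single pass. I have no idea whether this would actually be any faster; Base64Encode is the same helper as above, and it would need using System.Data; and using System.Linq;.

// Sketch only: pair each column name with its value in one pass,
// roughly the equivalent of foreach (row as key => value).
foreach (DataRow row in table.Rows)
{
    output += "\t<export_row>\n";
    foreach (var pair in row.Table.Columns
                            .Cast<DataColumn>()
                            .Select(c => new { Key = c.ColumnName, Value = row[c] }))
    {
        output += "\t\t<" + pair.Key + ">"
                + Base64Encode(pair.Value.ToString())
                + "</" + pair.Key + ">\n";
    }
    output += "\t</export_row>\n";
}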
Thanks in advance for thinking along with me! There must be something (obvious) that I've missed.