I have this function in my C# app:
public static string SafeTrim(object str)
{
    if (str == null || str == DBNull.Value)
        return string.Empty;
    else
        return str.ToString().Trim();
}
It works fine, but in my import utility it is called millions of times while processing hundreds of thousands of records. The ANTS profiler shows that this function consumes a significant share of CPU time simply because it is called so often.
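One idea I'm going to try (the class name here is mine, just for illustration): since the `object` parameter forces every string through a virtual `ToString()` call, adding a string-specific overload should let the compiler bind the common case directly with no boxing and no dispatch. A minimal sketch:

```csharp
using System;

public static class StringUtil
{
    // Overload for callers that already have a string in hand:
    // no boxing, no virtual ToString() dispatch.
    public static string SafeTrim(string str)
    {
        return string.IsNullOrEmpty(str) ? string.Empty : str.Trim();
    }
}
```

Overload resolution picks this version automatically wherever the argument is statically typed as `string`, so no call sites need to change.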
Edit 1
I neglected to mention that a very common usage of SafeTrim() in my app is for DataRow/DataColumn values, e.g. SafeTrim(dt.Rows[0]["id"]). It's common for that value to be DBNull.Value, and it's also common for it to contain leading or trailing spaces that need to be trimmed.
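For that DataRow case another variant I plan to benchmark (again, sketch only, my own naming): the boxed value coming out of the row is almost always either DBNull.Value or already a string, so testing for those first skips the virtual `ToString()` call in the common path.

```csharp
using System;

public static class StringUtil
{
    // Sketch for the DataRow case: check the two overwhelmingly common
    // cases (DBNull and string) before falling back to ToString().
    public static string SafeTrim(object value)
    {
        if (value == null || value == DBNull.Value)
            return string.Empty;

        string s = value as string;   // avoids a virtual ToString() call
        if (s != null)
            return s.Trim();

        return value.ToString().Trim();
    }
}
```

A side benefit I believe applies here: when the string has nothing to trim, current .NET implementations of `Trim()` return the original instance rather than allocating a copy, so the `as string` path can be allocation-free for already-clean data.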
Can it be optimized in any way?
Edit 2
I'll be trying these different approaches, under load, and reporting back tomorrow.