I decided not to use an ORM and am going to use straight ADO.NET for my project. I know, I know, it's going to take longer to program, but I just want pages to load at high speed, even at peak times.
-
What database (I assume SQL Server)? – MusiGenesis Aug 28 '09 at 23:57
-
I would avoid premature optimization. – Alex Aug 29 '09 at 00:15
-
+1 for using *plain old* ADO.NET :) – Alfred Myers Aug 29 '09 at 00:32
-
I will be using SQL Server 2008 – user161433 Aug 29 '09 at 00:52
-
I agree with Alex on premature optimization. It's not only going to take longer to program, it's going to take more time and effort to MAINTAIN it. Modern O/RMs can be very smart about SQL generation, and (as Axl implied) may have built-in caching. – TrueWill Aug 29 '09 at 01:12
-
@TrueWill - What ORM would you suggest? – user161433 Aug 29 '09 at 01:43
-
I am curious why you think that plain old ADO.NET would be faster than any modern ORM? You are creating more work for yourself in the short term and much greater maintenance costs in the long run, and assuming you could actually achieve all the performance optimizations modern ORMs provide... they will keep improving and eventually leave you in the dust again. Sorry, but I think you are making a massive error in judgement by trying to prematurely optimize before there even IS anything to optimize. – jrista Aug 29 '09 at 01:46
-
Suggested ORMs, in order of precedence: LINQ to SQL (simple/moderate project), NHibernate (complex project), Entity Framework v4.0 (with .NET 4.0), LLBLGen, then a wide variety of other alternatives. – jrista Aug 29 '09 at 01:47
-
@jrista: all ORMs ultimately pass SQL to an underlying database for processing. ADO.NET also passes SQL to an underlying database for processing. It is theoretically possible for an ORM to be just as fast as ADO.NET, but it cannot possibly be faster than ADO.NET unless the SQL passed to the database via ADO.NET is poorly written. – MusiGenesis Aug 29 '09 at 03:34
-
I wish more programmers had a basic understanding of the concept of engineering trade-offs. Rarely (and I mean very rarely) does one option outperform another in every possible dimension of comparison, and this is certainly true of ORMs. ORMs have many advantages over straight SQL, but a higher speed ceiling is definitely not one of them. The asker is completely correct in assuming that (properly written) ADO.NET will not be outperformed. – MusiGenesis Aug 29 '09 at 03:39
5 Answers
One error I see repeated over and over again:
Instantiating and setting up everything (DbConnection, DbCommand, DbParameters) inside a method that is called repeatedly in a tight loop.
Most of the time you can refactor those local variables into class members, instantiate them only once, and keep only the updates to the DbParameters inside the method.
UPDATED to include the sample code asked for in the comments.
Disclaimer: This is a quickly assembled sample with the sole intent of demonstrating the point about moving repetitive setup out of the loop. Other best practices aren't necessarily implemented.
public static void Show() {
    List<Person> people = new List<Person>();

    // Everything is done inside the loop.
    PersonDal personDal = new PersonDal();
    foreach (Person person in people) {
        personDal.Insert(person);
    }

    // Things are set up once, outside the loop.
    using (DbConnection connection = SqlClientFactory.Instance.CreateConnection()) {
        // set up the connection (connection string, etc.)
        BetterPersonDal betterPersonDal = new BetterPersonDal(connection);
        connection.Open();
        foreach (Person person in people) {
            betterPersonDal.Insert(person);
        }
    }
}

class Person {
    public int Id { get; set; }
    public string Name { get; set; }
}
In this first implementation, everything is set up every time the method is called:
class PersonDal {
    public int Insert(Person person) {
        DbProviderFactory factory = SqlClientFactory.Instance;
        using (DbConnection connection = factory.CreateConnection()) {
            // The connection string must be set before the connection is opened.
            connection.ConnectionString = "Whatever";
            connection.Open();
            using (DbCommand command = connection.CreateCommand()) {
                command.CommandText = "Whatever";
                command.CommandType = CommandType.StoredProcedure;

                DbParameter id = command.CreateParameter();
                id.ParameterName = "@Id";
                id.DbType = DbType.Int32;
                id.Value = person.Id;

                DbParameter name = command.CreateParameter();
                name.ParameterName = "@Name";
                name.DbType = DbType.String;
                name.Size = 50;
                name.Value = person.Name;

                command.Parameters.AddRange(new DbParameter[] { id, name });

                return (int)command.ExecuteScalar();
            }
        }
    }
}
Now we move the setup into the object's constructor, leaving only the parameter updates inside the loop:
class BetterPersonDal {
    private DbConnection connection;
    private DbCommand command;
    private DbParameter id;
    private DbParameter name;

    public BetterPersonDal(DbConnection connection) {
        this.connection = connection;
        this.command = connection.CreateCommand();
        this.command.CommandText = "Whatever";
        this.command.CommandType = CommandType.StoredProcedure;

        this.id = command.CreateParameter();
        this.id.ParameterName = "@Id";
        this.id.DbType = DbType.Int32;

        this.name = command.CreateParameter();
        this.name.ParameterName = "@Name";
        this.name.DbType = DbType.String;
        this.name.Size = 50;

        this.command.Parameters.AddRange(new DbParameter[] { id, name });
    }

    public int Insert(Person person) {
        // Only the parameter values change per call.
        this.id.Value = person.Id;
        this.name.Value = person.Name;

        return (int)command.ExecuteScalar();
    }
}

Have a look at the Improving .NET Application Performance and Scalability book (available online for free on MSDN). There's a whole chapter about improving ADO.NET performance.

-
Excellent links. The only thing I'd disagree with is their emphasis on using stored procedures for performance. As I understand it, modern versions of SQL Server cache the execution plans for ad hoc SQL queries, so after an ad hoc query is run once it should be as fast as a stored procedure. However, the linked article is actually from Microsoft, so I'm not going to claim that I know better than them. In all the benchmarking I've done lately, however, the performance of stored procedures and ad hoc queries is virtually identical. – MusiGenesis Aug 29 '09 at 00:07
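For what it's worth, a minimal sketch of the kind of parameterized ad hoc query that can reuse a cached plan (the connection string, table, and column names are placeholders, not anything from the question):

using System.Data;
using System.Data.SqlClient;

static class AdHocQuerySample {
    public static string GetPersonName(string connectionString, int personId) {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT Name FROM dbo.Person WHERE Id = @Id", connection)) {
            // Because the query text stays identical across calls (only the
            // parameter value changes), SQL Server can reuse the cached plan.
            command.Parameters.Add("@Id", SqlDbType.Int).Value = personId;
            connection.Open();
            return (string)command.ExecuteScalar();
        }
    }
}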
-
A very good book indeed. I have a printed version of it. Unfortunately it hasn't been updated since 2004 and lots of stuff has changed. The DataRow's backing store, for instance, was totally rewritten in .NET 2.0, which dramatically changed the performance profile of using DataSets between versions 1.x and 2.0. – Alfred Myers Aug 29 '09 at 00:46
Be very smart about connection management. Opening a database connection can be very expensive, so keep that in mind when writing the data access layer.
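A minimal sketch of that idea, assuming SQL Server and the default connection pooling (the connection string, table name, and query are placeholders): open the connection as late as possible and dispose it as soon as you're done, so it goes straight back to the pool.

using System.Data.SqlClient;

static class ConnectionSample {
    public static int CountPeople(string connectionString) {
        // Open late, dispose early; with pooling enabled (the default),
        // Dispose just returns the underlying connection to the pool.
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM dbo.Person", connection)) {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}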

-
This shouldn't really be an issue so long as connection pooling is enabled. Instantiating a connection object and calling its `Open` method will usually just grab an available connection from the pool, rather than establishing a completely new connection to the database. – LukeH Aug 28 '09 at 23:42
-
Agreed with Luke. Connections haven't been worth worrying about (much) for more than 10 years. – MusiGenesis Aug 28 '09 at 23:56
-
I've got legacy code written in VB.NET that has everything mixed together as if it were written in VB6: no layers, no classes, all the code inside the forms and the .bas modules... – Broken_Window Aug 29 '09 at 02:05
If you're not going to use an ORM, are you also not going to cache your data? That's a big advantage of using an ORM. If there's no data cache, you'll need to look at ways to cache the HTML/JavaScript instead. That can be accomplished with the OutputCache directive and SqlDependency, or by publishing static HTML and JavaScript files. Either way, you'll be able to handle a higher load if you're not hitting the database on every request.
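A rough sketch of the data-cache variant (the cache key, the "MyDb" cache-database entry, the Products table, and the loader delegate are all placeholders, and SQL cache notifications have to be enabled separately, e.g. with aspnet_regsql):

using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache {
    public static object GetProducts(Func<object> loadFromDatabase) {
        Cache cache = HttpRuntime.Cache;
        object products = cache["Products"];
        if (products == null) {
            products = loadFromDatabase();
            // The cached copy is invalidated when the Products table changes.
            cache.Insert("Products", products,
                new SqlCacheDependency("MyDb", "Products"));
        }
        return products;
    }
}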
Some links:
ASP.NET Web Site Performance Improvement http://www.codeproject.com/KB/aspnet/aspnetPerformance.aspx
10 ASP.NET Performance and Scalability Secrets http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx
Omar AL Zabir blog on ASP.NET Ajax and .NET 3.5 http://msmvps.com/blogs/omar/archive/tags/performance/default.aspx

The article Martin links to is excellent. I would just add that you definitely want to use a DataReader instead of a DataSet (I love DataSets, but not for performance reasons).
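A minimal sketch of the DataReader approach (the table and column names are placeholders): the reader streams rows forward-only instead of buffering the whole result set the way a DataSet/DataAdapter does.

using System.Collections.Generic;
using System.Data.SqlClient;

static class ReaderSample {
    public static List<string> GetNames(string connectionString) {
        List<string> names = new List<string>();
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("SELECT Name FROM dbo.Person", connection)) {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader()) {
                while (reader.Read()) {
                    // Read each row as it streams in; nothing is buffered client-side.
                    names.Add(reader.GetString(0));
                }
            }
        }
        return names;
    }
}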

-
Unless you have to go back to previous records, in which case using a DataSet can be a good idea. – Alfred Myers Aug 29 '09 at 00:42
-
@Alfred: he said he "want[s] pages to load at high speed", so I assume he's talking about ASP.Net. In this case letting a user go back to previous records in a DataSet would mean caching the DataSet, which probably wouldn't be a good idea. – MusiGenesis Aug 29 '09 at 00:53
-
@MusiGenesis: I'm not disagreeing with you. You said you love DataSets and I’m giving an example of when they’re suitable. Even on a *super duper* high speed webpage, someone could have logic requiring access to previous records for whatever business reason. Instead of going back to the database, use a DataSet. No DataSets on cache – no nothing. Just a DataSet being used during the lifetime of the request. – Alfred Myers Aug 29 '09 at 01:56