
Are there any ORM's which provide decent methods for high speed bulk inserts?

The reason I ask is that Entity Framework is fantastic for pulling data out, but it completely falls over if you want to do inserts faster than about 10 rows per second.

Update

I am currently inserting at 80,000 rows per second using a DataTable and T-SQL, but it'd be cleaner if I could use an ORM across the board.
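For reference, the DataTable approach described above is typically done with SqlBulkCopy; here is a rough sketch (the connection string and the table/column names are placeholders, and BatchSize is illustrative, not tuned):

```csharp
using System.Data;
using System.Data.SqlClient;

// Build a DataTable whose columns match the destination table.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
for (int i = 0; i < 80000; i++)
{
    table.Rows.Add(i, "Row " + i);
}

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.MyTable";
        bulkCopy.BatchSize = 10000; // rows sent per round trip
        bulkCopy.WriteToServer(table);
    }
}
```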

Contango
  • Rows/sec is not a useful measure since you don't specify how wide they are. I can say my car can go "1,000,000 length-units per hour", which is meaningless without knowing how long the length unit is. – JNK Jun 28 '11 at 19:08
  • There are 25 columns in each row. Most tables don't have more than about 30 columns. I found the row/second measurement to be the same, regardless of whether I was inserting rows with 5 columns, or 25 columns. – Contango Jul 06 '11 at 22:30

5 Answers


Blimey, asked 4 years ago?

Well....

In case anyone's still looking for an answer here, Jon Wagner's Insight.Database (an extremely nifty Micro-ORM) supports SQL BULK INSERT.

I can easily write 200,000 rows per second using the following syntax -

Database.Connection().BulkCopy("TableName", myListOfItemsToInsert);

Would that work for you?

Insight.Database on GitHub

Rich Bryant

I'm afraid you're going to find ORMs a poor choice for 80K rows per second. It just cannot be done, unless you bypass the LINQ-to-T-SQL transformation and go directly against the database; but then you are defeating the purpose of an ORM. Even then I'd be hard-pressed to see how you're going to achieve 80K rows per second (although you didn't say how wide each row is).

ORMs like EF and LINQ to SQL will issue a single INSERT statement per row. This is not conducive to 80K/sec throughput rates. IMO, you should be using a bulk insert tool like SSIS; this is what it was designed for.

Randy Minder
  • If they're fairly simple bulk inserts then it's possible to make a stored procedure that does say 50 at a time and hook that up in EF, which would boost performance. But going beyond that you're totally right, this is just not the best tool for the job. – Tridus Jun 28 '11 at 19:44

You could work around EF for the inserts, but one of the common arguments against using an ORM is performance.

ORMs are designed to improve readability and maintainability and to decrease development time. If you care about insert performance, use native SQL.

Nix
  • This is not a limitation of ORMs per se, but of most existing ORMs. And to use ‘native SQL’ from application code you need something in between, like ADO.Net, or a performant ORM. – DvS Jun 01 '21 at 21:07

No, this isn't what ORMs are for. ORMs support OLTP, not bulk data processing. Use the right tool for the job.

(If you insist on using an ORM, look at setting the batch size for inserts.)
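As one concrete example of the batch-size hint: NHibernate exposes an ADO.NET batch size setting that groups multiple INSERTs into a single round trip (the value 100 here is illustrative, not a recommendation):

```csharp
using NHibernate.Cfg;

var cfg = new Configuration();
cfg.Configure(); // reads hibernate.cfg.xml
// Group up to 100 statements per database round trip.
cfg.SetProperty("adonet.batch_size", "100");
var sessionFactory = cfg.BuildSessionFactory();
```

This won't reach SqlBulkCopy speeds, but it cuts the per-row round-trip overhead considerably.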

Contango
jason
  • Thank you, I did not know that ORMs were originally designed for OLTP like environments. – Contango Jun 28 '11 at 19:28
  • This isn’t correct in principle. “Object–relational mapping is a programming technique for converting data between incompatible type systems using object-oriented programming languages.” There’s no usage limitation implied by the definition, only by the implementation. https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping – DvS Jun 01 '21 at 21:12

There are now full ORMs that can get very close to this performance, such as Nfinity.Data (https://www.nfinity.software), which inserts 80,000 rows in around 2 seconds. I just tested this against a table around 5 columns wide, with a mix of data types.

Windows 10, i7 CPU, 32 GB RAM, SQL Server 2017, .Net Core 3.1

DvS