
I work on a project at a financial institution. In this company, databases are distributed across separate branches. We want to set up a data center and consolidate all of these databases into one database. But in that case we would have a database table with more than 100 million records. I think SQL operations (e.g. insert, update, select) on this table will be too slow and costly. Which approaches can help me? We use the code-first approach of Entity Framework in our project.

ArMaN

2 Answers


a) 100 million rows is not too much for SQL Server. With appropriate indexes, disk topology, memory and CPU allocation, plus a good DBA to oversee things for a while, it should be fine.

b) The initial data migration is NOT an EF topic. I would not recommend EF for that task. EF can create the DB, but use dedicated tools to load the data. Sample SO post

c) Test and/or do some research on expected insert/select times on SQL Server with 100 million rows.

d) The trick to getting good performance with EF is holding as FEW records in a context as possible. Good EF code-first code is the key to it working well. See the sketch after this list.
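To illustrate point d, here is a minimal sketch of batched inserts that keeps the change tracker small by saving in chunks and disposing the context between them. The `Transaction` entity and `BranchContext` are invented for illustration; `AddRange` and `AutoDetectChangesEnabled` are standard EF6 APIs.

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;

// Hypothetical entity and context, stand-ins for your real model.
public class Transaction
{
    public long Id { get; set; }
    public decimal Amount { get; set; }
    public DateTime CreatedAt { get; set; }
}

public class BranchContext : DbContext
{
    public DbSet<Transaction> Transactions { get; set; }
}

public static class BatchImporter
{
    // Insert records in small batches so the context never tracks
    // more than batchSize entities at a time.
    public static void InsertInBatches(IEnumerable<Transaction> records, int batchSize = 1000)
    {
        var batch = new List<Transaction>(batchSize);
        foreach (var record in records)
        {
            batch.Add(record);
            if (batch.Count == batchSize)
            {
                SaveBatch(batch);
                batch.Clear();
            }
        }
        if (batch.Count > 0)
        {
            SaveBatch(batch);
        }
    }

    private static void SaveBatch(List<Transaction> batch)
    {
        // A fresh context per batch keeps change tracking cheap and
        // releases tracked entities as soon as the batch is saved.
        using (var context = new BranchContext())
        {
            // Skip automatic change detection during the bulk AddRange;
            // the setting dies with the context when it is disposed.
            context.Configuration.AutoDetectChangesEnabled = false;
            context.Transactions.AddRange(batch);
            context.SaveChanges();
        }
    }
}
```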

phil soady

Take a look at the BULK INSERT and bcp commands. They are used to copy large amounts of data.

http://technet.microsoft.com/en-us/library/ms130809(v=sql.110).aspx

http://technet.microsoft.com/en-us/library/ms190923(v=sql.105).aspx

If you don't use MS SQL Server, look for the corresponding feature in your server.
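If you need to drive the load from .NET code, the same bulk-load path that bcp and BULK INSERT use is also exposed through SqlBulkCopy. A minimal sketch, with a placeholder table name and batch size you would tune for your environment:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkLoader
{
    // Streams an in-memory DataTable into SQL Server via the bulk-load API.
    public static void Load(string connectionString, DataTable rows)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Transactions"; // placeholder table
                bulkCopy.BatchSize = 10000;   // rows sent per round trip
                bulkCopy.BulkCopyTimeout = 0; // no timeout for large loads
                bulkCopy.WriteToServer(rows);
            }
        }
    }
}
```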

Note that 100 million records may not be a really large amount of data. I recommend you run some performance tests to see whether it will really be an issue.

Caffé