
I have an 18 GB database .sql file containing over 200 million records. My server has a Xeon E5 processor, 16 GB of ECC RAM and SSDs, and runs SQL Server 2014 Web Edition on Windows Server 2012.

I understood that I would need sqlcmd to run such a large script, so my command is this:

sqlcmd -S SQL-Server2014 -i C:\main_db.sql -d newtest -o D:\main_db_log.txt

When I start this command, the sqlcmd utility starts working its magic, but after a few minutes it uses the full 16 GB of RAM and the server crashes.

main_db_log.txt shows this error:

Sqlcmd: Error: Scripting error.

I tried to restrict SQL Server to 10 GB by limiting memory under Server Instance > Properties > Memory tab, but it didn't help.
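
For reference, the same cap can also be set in T-SQL with sp_configure. A minimal sketch, assuming the 10 GB figure above is what you intend; note that this only caps SQL Server's own memory use and may not stop the script run from exhausting RAM:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 10240;  -- 10 GB, the value from the question
RECONFIGURE;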

I researched around, but all I found was the advice to use sqlcmd for large scripts, and it isn't working in my case.

Please help, because our old server crashed and I can't create a .bak file from it anymore.

aadi1295
  • One doesn't "execute" a database (although I have a few I wouldn't mind taking out hunting....). You've not given enough information about the database or the query(ies) you're running or even what you're really trying to achieve for people to guess here. – alroc Dec 05 '16 at 18:53
  • @alroc: Sorry about the word `execute`, I am a newbie with databases. I just want to add the whole database's records into `newtest` in MS SQL Server 2014, but I only used the single `sqlcmd` line above. Still having memory issues and the scripting error. – aadi1295 Dec 05 '16 at 19:03
  • Again, without seeing what your query(ies) are **actually doing**, it's impossible to even guess. You've got an error related to your script - so show the script and maybe we can find the error. How big is `main_db.sql`? Also look at https://stackoverflow.com/questions/24665757/sqlcmd-scripting-error and https://dba.stackexchange.com/questions/16763/how-do-i-run-a-large-script-with-many-inserts-without-running-out-of-memory/ – alroc Dec 05 '16 at 19:22
  • There are only `insert` commands, one per line, with just two columns: the 1st is `id` and the second is `sales_id`, and the database size is `18GB`. I saw the questions you recommended, should I use `GO` every 1000 lines? – aadi1295 Dec 05 '16 at 19:33
  • If you have tens of thousands of `INSERT` statements, yes, it would make sense to break them into smaller batches (see the sketch after these comments). – alroc Dec 05 '16 at 19:40
  • @alroc: Thanks so much for help, I will try to do it now. Will update here as well. Thanks again – aadi1295 Dec 05 '16 at 19:43
  • @TT. yeah let me delete that one. – aadi1295 Dec 05 '16 at 21:57
  • I just deleted the other one :) Please don't flag this one. I also thought dba was the place for this, but no one was replying there. – aadi1295 Dec 05 '16 at 21:59
  • now I have another issue, let me update my question. – aadi1295 Dec 05 '16 at 22:00
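
For illustration, here is a minimal sketch of what the batched script could look like. The table and column names (`dbo.sales`, `id`, `sales_id`) are assumptions based on the comments above; the idea is simply that a `GO` after every block of roughly 1,000 `INSERT` statements makes sqlcmd send each block as a separate batch instead of one huge batch:

-- Hypothetical excerpt of main_db.sql after splitting into batches.
-- dbo.sales and its columns are assumed names, not taken from the real script.
INSERT INTO dbo.sales (id, sales_id) VALUES (1, 1001);
INSERT INTO dbo.sales (id, sales_id) VALUES (2, 1002);
-- ... up to roughly 1,000 INSERT statements ...
GO

INSERT INTO dbo.sales (id, sales_id) VALUES (1001, 2001);
-- ... next block of roughly 1,000 INSERT statements ...
GO

Inserting the GO lines by hand is impractical for an 18 GB file, so they would normally be added with a small script or text-processing tool rather than in an editor.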
