
I have saved a big table as a generated script (like this), so the resulting SQL file has roughly a billion lines of INSERT INTO statements. Say the table is about 30 GB and my computer has 64 GB of RAM; I don't think we should load all 30 GB of data into memory in this case.

So my question is: when I execute this SQL file (of many INSERT INTO statements), does SQL Server try to load everything into RAM, or does it automatically execute it in batches?

If SQL Server doesn't automatically split it into batches, how do I do that myself? Thank you.


Big picture: I need to save a big table to a hard drive, go to another computer, and import the table into the SQL Server instance there.

YJZ
  • Better to use bcp to export and then bcp to import a table this large, e.g. http://www.sqlserver-training.com/how-to-backup-sql-table/ (assuming creating a true backup is not an option for you) – Alex K. Nov 06 '17 at 16:56
  • As Alex said, [bcp](https://learn.microsoft.com/en-us/sql/tools/bcp-utility) is your friend here. Couple of questions though. Is the new table in the same database? As far as memory management is concerned, you can limit what's available to SQL Server by editing the "Maximum Server memory" (right-click your server in SSMS and select "properties") – pim Nov 06 '17 at 17:14
  • *DON'T* do that. Especially if you have a lot of data. SQL Server has very strong ETL tools. Create the schema then use the Import Wizard to load the *data* in a streaming fashion. Underneath the same bulk import functionality is used. This way you can skip writing out the data to a file. You can *save* the generated SSIS package and edit it or schedule it to run with SQL Server Agent – Panagiotis Kanavos Nov 06 '17 at 17:24
  • Why do you want to *script* the table in the first place? Are you trying to create a "backup"? No need, since SQL Server also has real backups. Are you trying to *deploy* a database to production? You can take a backup from the original location and restore it to production or detach the original DB, copy the files and reattach it. Periodically load data? That's what SSIS is for. Add change tracking and you can only load the few thousand rows that changed since the last load – Panagiotis Kanavos Nov 06 '17 at 17:28
  • Thank you @PanagiotisKanavos, please see the update in the OP. I need to save a big table to a hard drive, go to another computer, and import the table into the SQL Server there. For the Import/Export Wizard you mention, the two computers have to be connected over the same network, right? – YJZ Nov 06 '17 at 18:25
  • @PanagiotisKanavos about the `backup` you mentioned, can you give a link with details? Are you talking about something in this post? Thank you – YJZ Nov 06 '17 at 18:28
  • @YZhang google it? Or just check the [documentation](https://learn.microsoft.com/en-us/sql/relational-databases/backup-restore/backup-overview-sql-server)? Or go to the Database context menu and select `Tasks > Backup ...` ? I can't stress how *fundamental* this is. Perhaps you should look for a course on SQL Server before proceeding? Otherwise you'll end up using hacks suitable for other databases when a super easy alternative is available – Panagiotis Kanavos Nov 07 '17 at 08:33
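To illustrate the backup/restore route the commenters suggest, here is a minimal command-line sketch using sqlcmd. The server names (SourceServer, TargetServer), the database name (MyDatabase), the file paths, and the logical file names are all hypothetical placeholders; verify the actual logical names with RESTORE FILELISTONLY before restoring.

```
:: On the source machine: back up the whole database to a single .bak file
sqlcmd -S SourceServer -E -Q "BACKUP DATABASE MyDatabase TO DISK = N'D:\Backups\MyDatabase.bak' WITH INIT"

:: Copy MyDatabase.bak to the target machine (external drive, network share, etc.),
:: then restore it there. The logical file names ('MyDatabase', 'MyDatabase_log') are
:: assumptions -- check them first with:
::   RESTORE FILELISTONLY FROM DISK = N'D:\Backups\MyDatabase.bak'
sqlcmd -S TargetServer -E -Q "RESTORE DATABASE MyDatabase FROM DISK = N'D:\Backups\MyDatabase.bak' WITH MOVE 'MyDatabase' TO N'D:\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO N'D:\Data\MyDatabase_log.ldf'"
```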

1 Answer


I would recommend bcp for an operation like this.

This would involve two steps:

  1. Export the data by issuing a command like `bcp dbo.someTable out "C:\someTable.txt" -c -T`, where dbo.someTable is the table containing your data.
  2. Import the data by issuing a command like `bcp dbo.someNewTable in "C:\someTable.txt" -c -T`, where dbo.someNewTable is the new table that will store the exported data (a fuller command-line sketch follows below).
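A minimal end-to-end sketch of those two steps, assuming Windows authentication (-T) and hypothetical server and database names (SourceServer, TargetServer, MyDatabase). Using the three-part table name avoids depending on the login's default database, and the destination table must already exist on the target server before the import runs:

```
:: Step 1 -- export from the source server to a character-format data file
bcp MyDatabase.dbo.someTable out "C:\someTable.txt" -c -T -S SourceServer

:: Step 2 -- import the file into the pre-created table on the target server
bcp MyDatabase.dbo.someNewTable in "C:\someTable.txt" -c -T -S TargetServer
```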

If you're keen on managing the batch size explicitly, you would use the -b parameter. From the MS docs:

By default, all the rows in the data file are imported as one batch. To distribute the rows among multiple batches, specify a batch_size that is smaller than the number of rows in the data file.
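For example, to commit the import in batches of 50,000 rows instead of one single batch (same hypothetical names as above):

```
:: -b 50000 makes bcp commit every 50,000 rows as a separate transaction
bcp MyDatabase.dbo.someNewTable in "C:\someTable.txt" -c -T -S TargetServer -b 50000
```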

pim