I have saved a big table as a generated script (like this), so the resulting .sql file has roughly a billion lines of INSERT INTO statements. The table is about 30 GB and my computer has 64 GB of RAM, so I don't think all 30 GB of data should be loaded into memory at once in this case.
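For reference, the generated file is just one INSERT statement per row, something like this (the table and column names here are made up for illustration):

    INSERT INTO dbo.BigTable (Id, Name, CreatedAt)
    VALUES (1, N'first row', '2015-01-01');
    INSERT INTO dbo.BigTable (Id, Name, CreatedAt)
    VALUES (2, N'second row', '2015-01-02');
    -- ...and so on, one INSERT per row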
So my question is: when I execute this .sql file (with its many INSERT INTO statements), does SQL Server try to load the whole file into RAM, or does it automatically execute it in batches?
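(By "execute" I mean running the file from the command line with something like this; the server, database, and file names are placeholders:

    sqlcmd -S .\SQLEXPRESS -d TargetDb -E -i BigTable.sql

I'm not committed to sqlcmd specifically; it's just what I'd reach for, since I assume SSMS won't open a 30 GB file in the editor anyway.)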
If SQL Server doesn't split it into batches automatically, how do I do that myself? For example, would inserting GO separators every few thousand rows, as sketched below, be the right approach?
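This is the kind of batching I have in mind, with hypothetical rows again (the batch size of 10,000 is an arbitrary guess):

    INSERT INTO dbo.BigTable (Id, Name) VALUES (9999, N'...');
    INSERT INTO dbo.BigTable (Id, Name) VALUES (10000, N'...');
    GO  -- batch separator: everything since the previous GO is sent as one batch
    INSERT INTO dbo.BigTable (Id, Name) VALUES (10001, N'...');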
Big picture: I need to save a big table to my hard drive, take it to another computer, and import the table into the SQL Server instance there. Thank you!