
I have a table with some 5 Million rows and above 400 columns in a test level database.

I want to temporarily hold a copy of this table in the production database.

I know I can use Tasks -> Generate Scripts and choose to script both the schema and the data. This seems to take a very long time.

I could use the Tasks -> Generate Scripts option to copy just the table definition and handle the data copy via another method.

Can I use an `INSERT ... SELECT` across servers with SQL Server?

I could add the test DB and server as a linked server.
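A minimal sketch of that linked-server approach (all names here are hypothetical placeholders: `TESTSRV`, `TestDb`, `dbo.BigTable`):

```sql
-- One-time setup on the production server: register the test server as a linked server.
EXEC sp_addlinkedserver
    @server = N'TESTSRV',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'test-host\INSTANCE';

-- Then copy using a four-part name; TABLOCK helps enable minimal logging
-- when the target table is empty.
INSERT INTO dbo.BigTable_Copy WITH (TABLOCK)
SELECT * FROM TESTSRV.TestDb.dbo.BigTable;
```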

I could use SSIS.

I am just asking the forum for a recommendation on the best (or a better) approach, i.e. one that is faster than the Generate Scripts option.

I am trying the Generate Scripts option but this has been running all afternoon.

I am hoping for an approach which will run in under an hour or two.

Allan F
  • I would use a linked server and `INSERT INTO ... SELECT ...` with some partitioning, e.g. 100K rows at a time – Michał Turczyn Jun 12 '19 at 06:30
  • Are the production and test databases on the same or separate machines? – Squirrel Jun 12 '19 at 06:34
  • you may find this useful: https://stackoverflow.com/a/4198009/891715 - backup the table using bcp to a file, then import the file. This way there's no network bottleneck if you have physical access to the data. – Arie Jun 12 '19 at 10:29
  • Production and test databases are on separate machines .. – Allan F Jun 12 '19 at 23:08
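The bcp route Arie links could look roughly like this (server, database, and table names are hypothetical; `-T` uses Windows authentication, `-n` keeps SQL Server's native format so no format file is needed, and `-b` sets the batch size on import):

```
bcp TestDb.dbo.BigTable out BigTable.dat -n -S TestServer -T

# copy BigTable.dat to the production machine, then:
bcp ProdDb.dbo.BigTable_Copy in BigTable.dat -n -S ProdServer -T -b 100000
```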

3 Answers


If you are on SQL Server, I would tend to use the Import/Export Wizard as the quick and easy method. It will fall over gracefully if there are issues.

1) Create the table in the destination database

2) Right click on the destination database and then Tasks-> Import Data

3) Connect to the Source server when prompted and then keep following the prompts

Hope it helps

Gerard

One option is to right-click the database, then select Tasks -> Export data.

However, if you know SSIS, I'd think it's the easiest way to go, as moving data is exactly what it's intended for. You can increase the batch size there to better suit your data volume.

As for how long it takes, that really depends on your hardware. The biggest bottlenecks most likely are your network and disks.

SchmitzIT
  • Thanks ... A work colleague mentioned this also .. i.e. create the table and use Tasks -> Import ... (which I guess is same as Export ... depends on pulling or pushing ..) This at least gives progress indication .. i.e. looks to be doing about 500 records per second or better .. thus estimate 10,000 seconds ... equals 166 minutes ... equals about 2.7 hours ... Better than the other scripting method !. – Allan F Jun 12 '19 at 06:54
  • @AllanF Yup, it's the exact same process, really. I generally stick to starting from the source system, but that's just personal preference, I guess. – SchmitzIT Jun 12 '19 at 09:10
  • @AllanF, I'd expect 5M rows to take minutes rather than hours. Are you running it on the source or target server? Otherwise, the data will flow through the client machine. Make sure there are no non-clustered indexes on the target table - create those afterwards, if needed. – Dan Guzman Jun 12 '19 at 09:52
  • Many thanks for the comments and help SchmitzIT and Dan Guzman ! .. I ran it on my own PC ... i.e. I did an import .. I had a PK clustered index but no non-clustered indexes. It worked successfully .. I noted that the execution report that is generated does not contain any datetime stamp info ... Maybe Microsoft improved this execution report in a more recent version post 2012 version to include datetime stamps? – Allan F Jun 12 '19 at 23:04

In a similar case, I would use SSIS since it gives you more control over the transfer operation. There are many tips and tricks to achieve higher performance, such as importing data in chunks, configuring the buffer size, and using bulk insert.
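For the chunked-import idea, a rough T-SQL sketch over a linked server (all names are hypothetical, and it assumes the source table has an increasing integer key column `Id`):

```sql
DECLARE @BatchSize bigint = 100000,
        @MaxId bigint, @CurId bigint = 0;

SELECT @MaxId = MAX(Id) FROM TESTSRV.TestDb.dbo.BigTable;

-- Copy one key range per iteration so each insert is a small transaction.
WHILE @CurId < @MaxId
BEGIN
    INSERT INTO dbo.BigTable_Copy WITH (TABLOCK)
    SELECT *
    FROM TESTSRV.TestDb.dbo.BigTable
    WHERE Id > @CurId AND Id <= @CurId + @BatchSize;

    SET @CurId += @BatchSize;
END
```

Smaller batches keep the transaction log manageable at the cost of more round trips; 100K rows per batch (as suggested in the comments) is a reasonable starting point to tune from.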

I have posted many SSIS related answers that you can refer to in order to obtain more information:

Hadi