
I need to back up a couple of tables from my prod database.

Each table has 250 million rows.

The complete backup is almost 1 TB.

I don't want the full backup because of space: it needs double the space, for both the backup and the restore.

I can't use BCP because of the volume and the heavy inserts involved.

DHAR
  • SQL Server backups (`.bak`) are always **all or nothing** - the only "selection" you get is if you had multiple filegroups - then you could back up just one filegroup (and ignore others). But you cannot selectively back up just a handful of tables into a `.bak` file – marc_s Dec 09 '19 at 10:03
  • Does this answer your question? [Backup a single table with its data from a database in sql server 2008](https://stackoverflow.com/questions/19698310/backup-a-single-table-with-its-data-from-a-database-in-sql-server-2008) – Shekar Kola Dec 09 '19 at 10:21
  • What is your *actual* problem and why do you want to backup specific tables only? If your database fails, you need the entire backup. Are you trying to use backups to move data around? Is that an *actual* requirement, or did it come up because there was no real data movement mechanism? What do those tables contain - it matters. You could use partition switching in some cases. – Panagiotis Kanavos Dec 09 '19 at 10:23
  • @ShekarKola definitely not. That link doesn't even talk about backups, and SQL Server 2008 was far more limited than modern SQL Server versions – Panagiotis Kanavos Dec 09 '19 at 10:23
  • @DHAR there are *MANY* features that can help, provided you explain what the actual problem is - recovery? Data transfer? You can compress backups. You can store tables in different filegroups, and backup/restore the *filegroups*. You can use partitions combined with filegroups. Or you could create a proper ETL job to move data from production to reporting databases so you *don't* have to move 1TB of data every day, just the modified rows every 1 hour or 15 minutes. Or, you can configure mirroring/replication and have the data moved automatically. You could read from a read-only replica ... – Panagiotis Kanavos Dec 09 '19 at 10:27
  • @DHAR ... you could configure log shipping. 1TB is not big data for a standalone SQL Server database. – Panagiotis Kanavos Dec 09 '19 at 10:28
  • @DHAR `I can't use BCP due to volume and heavy inserts involved.` on the contrary, if you create a proper ETL job that only pulls modifications from the source database, you can have it run frequently and pull only the modified data. Instead of pulling 1TB of mostly stale data, you could pull only the few thousand rows modified in the last N minutes – Panagiotis Kanavos Dec 09 '19 at 10:30
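The incremental-ETL idea from the comments can be sketched in T-SQL. This is only an illustration of the pattern, not code from the question: the table, column, and watermark names (`BigTable`, `ModifiedDate`, `dbo.EtlWatermark`) are all assumptions, and it presumes the source rows carry a change timestamp.

```sql
-- Hypothetical incremental pull: copy only rows modified since the last run,
-- instead of moving the whole 250M-row table every time.
DECLARE @LastRun datetime2 =
    (SELECT LastRunTime FROM dbo.EtlWatermark WHERE TableName = 'BigTable');

INSERT INTO Reporting.dbo.BigTable (Id, Payload, ModifiedDate)
SELECT Id, Payload, ModifiedDate
FROM Prod.dbo.BigTable
WHERE ModifiedDate > @LastRun;           -- only the delta, not 1 TB of stale data

UPDATE dbo.EtlWatermark
SET LastRunTime = SYSUTCDATETIME()
WHERE TableName = 'BigTable';
```

Run on a short schedule (say every 15 minutes), each execution moves only a few thousand rows, which sidesteps the volume problem that rules out a full BCP export.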

2 Answers


SQL Server backups (.bak) are always all or nothing.

The only "selection" you get is if you have multiple filegroups - then you can back up just one filegroup and ignore all the others.

But you cannot selectively back up just a handful of tables into a .bak file. There's simply no feature for that.
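For completeness, the filegroup route mentioned above looks roughly like this. All object and path names here are illustrative assumptions, not from the question; moving an existing 250M-row table means rebuilding its clustered index on the new filegroup, which is itself a heavy operation.

```sql
-- Create a dedicated filegroup and a data file for it:
ALTER DATABASE ProdDb ADD FILEGROUP FG_Archive;
ALTER DATABASE ProdDb
    ADD FILE (NAME = 'ProdDb_Archive',
              FILENAME = 'D:\Data\ProdDb_Archive.ndf')
    TO FILEGROUP FG_Archive;

-- Move the table by rebuilding its clustered index on the new filegroup:
CREATE UNIQUE CLUSTERED INDEX PK_BigTable
    ON dbo.BigTable (Id)
    WITH (DROP_EXISTING = ON)
    ON FG_Archive;

-- Back up only that filegroup, not the whole 1 TB database:
BACKUP DATABASE ProdDb
    FILEGROUP = 'FG_Archive'
    TO DISK = 'E:\Backup\ProdDb_FG_Archive.bak';
```

Note that restoring a filegroup backup still requires the database's primary filegroup and log chain, so this narrows the backup size but is not a standalone per-table export.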

marc_s

Create an auxiliary database, select from your 250-million-row tables into corresponding columnstore tables (in batches of about a million rows), back up the auxiliary database, and you have your data "suitcase".
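A minimal sketch of this answer's approach, assuming SQL Server 2016+ for the inline clustered columnstore syntax and an integer key to batch on; the database, table, column, and path names are placeholders:

```sql
-- Auxiliary database holding a compressed copy of the big table:
CREATE DATABASE BackupAux;
GO
USE BackupAux;
CREATE TABLE dbo.BigTable_Copy
(
    Id      bigint       NOT NULL,
    Payload varchar(200) NULL,
    INDEX CCI_BigTable CLUSTERED COLUMNSTORE  -- columnstore compresses heavily
);
GO
-- Copy in ~1M-row batches to keep each transaction (and the log) small:
DECLARE @MinId bigint = 0, @BatchSize bigint = 1000000, @MaxId bigint;
SELECT @MaxId = MAX(Id) FROM Prod.dbo.BigTable;

WHILE @MinId <= @MaxId
BEGIN
    INSERT INTO dbo.BigTable_Copy WITH (TABLOCK) (Id, Payload)
    SELECT Id, Payload
    FROM Prod.dbo.BigTable
    WHERE Id > @MinId AND Id <= @MinId + @BatchSize;

    SET @MinId += @BatchSize;
END
GO
-- The "suitcase": a backup containing only the chosen tables:
BACKUP DATABASE BackupAux
    TO DISK = 'E:\Backup\BackupAux.bak'
    WITH COMPRESSION;
```

Because the copy is columnstore-compressed and the backup uses `WITH COMPRESSION`, the resulting `.bak` should be far smaller than a full 1 TB backup of the source database.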

lptr