
I need to create an hourly .SQB backup file of some specific tables, each filtered with a WHERE clause, from a SQL Server database. As an example, I need this data:

SELECT * FROM table1 WHERE pk_id IN (2,5,7)
SELECT * FROM table2 WHERE pk_id IN (2,5,7)
SELECT * FROM table3 WHERE pk_id IN (2,5,7)
SELECT * FROM table4 WHERE pk_id IN (2,5,7)

The structure of the tables on the source database may change over time, e.g. columns may be added or removed, indexes added, etc.

One option is to do some kind of export, script generation, etc. into a staging database on the same instance of SQL Server. Efficiency aside, I have no problem dropping or truncating the tables on the destination database each time. In short, I'm looking to have both the schema and data of the tables duplicated to the destination database. That's completely acceptable.

Another is to just create a .SQB backup from the source database. Since the .SQB file is all that I really need (it's going to be sent via SFTP), that would be fine, too.

What's the recommended approach in this scenario?

Eliezer
  • In summary, you're looking to back up a subset of your data? – David Atkinson Feb 08 '18 at 21:22
  • Uh, yeah, I guess... :-/ Since your comment, I've rephrased my question to try to make it clearer. The only thing is that it needs to be restorable as a complete database. In other words, if the source database has 100 tables, I'm only looking to back up 10 of those, but I want that .SQB to be able to be restored to a complete, functional database containing only 10 tables. – Eliezer Feb 08 '18 at 21:39
  • I've confirmed that neither RedGate SQL Backup nor any of RedGate's other products is able to do selective backups. – Eliezer Feb 09 '18 at 19:59

1 Answer


Well, if I understand your requirement correctly, you want data from some tables in your database to be shipped somewhere else periodically.

  1. One thing that is not possible in SQL Server is taking a backup of a subset of tables from your database, so this is not an option.

  2. Since you have mentioned you will be using SFTP to send the data, using the bcp command to extract data is one option, but bcp may not perform very well and it definitely will not scale out very well (a rough bcp sketch follows this list).

  3. Instead of using bcp, I would prefer an SSIS package; you will be able to do everything (extract files, add WHERE clauses, drop files on the SFTP server, tune your queries, logging, monitoring, etc.) in your SSIS package.
  4. Finally, SQL Server replication can be used to create a subscriber; only publish the articles (tables) you are interested in, and you can also add WHERE clauses to your publications (a filtered-article sketch follows the list of subscriber options below).
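
As a rough illustration of option 2, a minimal T-SQL sketch that shells out to the bcp utility via xp_cmdshell might look like the following; the server name, database, and file path are placeholders, and xp_cmdshell has to be enabled on your instance:

-- Minimal sketch: export one filtered table with bcp from inside T-SQL.
-- Placeholder names: SourceDb, MYSERVER, C:\Exports; xp_cmdshell must be enabled.
DECLARE @cmd varchar(4000);

SET @cmd = 'bcp "SELECT * FROM SourceDb.dbo.table1 WHERE pk_id IN (2,5,7)" '
         + 'queryout "C:\Exports\table1.dat" -S MYSERVER -T -n';

-- One bcp call per table/extract file; the resulting files can then be pushed over SFTP.
EXEC master..xp_cmdshell @cmd;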

Again, there are a few options with the replication subscriber database.

  • Give your data clients access to the subscriber database directly; no need for extracts.
  • Use BCP on the subscriber database to extract data, without putting load on your production server.
  • Use SSIS Package to extract data from the subscriber database.
  • Finally, create a backup of this subscriber database and ship the whole backup (.bak) file over SFTP (a backup sketch follows below).
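
To give an idea of how a filtered publication (option 4) might be defined, here is a minimal sketch using sp_addarticle with a horizontal filter; the publication name is a placeholder, and the publication and subscriber setup are assumed to already exist:

-- Minimal sketch: one horizontally filtered article in an existing transactional publication.
-- 'FilteredTablesPub' is a placeholder publication name; repeat for table2..table4.
EXEC sp_addarticle
    @publication   = N'FilteredTablesPub',
    @article       = N'table1',
    @source_owner  = N'dbo',
    @source_object = N'table1',
    @filter_clause = N'pk_id IN (2,5,7)';  -- the WHERE condition from the question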
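
And for the last bullet, backing up the subscriber is a plain BACKUP DATABASE; the database name and path below are placeholders:

-- Minimal sketch: full backup of the subscriber database, ready to ship over SFTP.
BACKUP DATABASE SubscriberDb
TO DISK = N'C:\Backups\SubscriberDb.bak'
WITH INIT, COMPRESSION;  -- COMPRESSION requires an edition/version that supports it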

In short, there is more than one way to skin a cat; now you have to decide which one suits your requirements best.

M.Ali
  • M.Ali, your ideas are very good and I will definitely give them more consideration. But what I'd *really* like to do is somehow accomplish a "Generate Scripts" action of both schema and data for the desired tables. Is it possible to schedule such an action? And prior to executing those statements in the destination database, I could drop all the tables therein. – Eliezer Feb 08 '18 at 23:42
  • Shipping a backup of the subscriber and restoring over the existing older database will be more efficient, rather than dropping and creating database objects and inserting data. – M.Ali Feb 08 '18 at 23:50