
Only part of the data in the database is processed by the application; the rest is needed for reporting purposes, but it causes poor application performance. I would like to archive the historical data without modifying the database schema.

Is it possible to replicate the database, delete old data from the primary instance, and regularly synchronise new changes into the replicated database? That way the primary "transactional" database stays lightweight, while the replicated database contains the full set of both current and historical data for reporting purposes.

Could you recommend some tools, or give some tips on how to achieve this on Oracle?

edit: I'm wondering if I could use Streams and somehow make a DML handler ignore DELETE operations on rows (docs.oracle.com/cd/B28359_01/server.111/b28321/…) so that during data replication historical rows are preserved despite being deleted from the transactional db.
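The idea from the edit could be sketched roughly as below. This is only an illustration of the Streams DML-handler mechanism, not a tested configuration: the schema, table, and procedure names (`STRMADMIN.IGNORE_DELETES`, `APP.ORDERS`) are hypothetical, and it assumes a Streams apply process is already set up on the historical database.

```sql
-- Hypothetical sketch: a Streams DML handler that silently discards
-- DELETE LCRs on the apply (historical) side. It is registered only
-- for the DELETE operation, so INSERTs and UPDATEs apply as usual.
CREATE OR REPLACE PROCEDURE strmadmin.ignore_deletes (in_any IN ANYDATA) IS
  lcr SYS.LCR$_ROW_RECORD;
  rc  PLS_INTEGER;
BEGIN
  rc := in_any.GETOBJECT(lcr);  -- extract the row LCR
  -- Return WITHOUT calling lcr.EXECUTE, so the DELETE is never
  -- applied and the historical row is preserved.
  NULL;
END;
/

BEGIN
  DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name    => 'APP.ORDERS',              -- hypothetical table
    object_type    => 'TABLE',
    operation_name => 'DELETE',                  -- fire only for deletes
    error_handler  => FALSE,
    user_procedure => 'STRMADMIN.IGNORE_DELETES');
END;
/
```

One such handler registration would be needed per replicated table whose deletes should be ignored.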

kar8o
    You mean [a data warehouse](https://docs.oracle.com/cd/E11882_01/server.112/e25554/concept.htm#DWHSG8063)? – Alex Poole Jan 13 '16 at 13:21
  • Do you want to archive the data? E.g. you have 100 million rows in a table and you want to move half of them (depending on conditions) to another table and delete them from the primary database? If yes, I don't think you can do that with replication. I usually have a procedure that does the archiving work; that increases performance a lot. – Moudiz Jan 13 '16 at 13:26
  • Unfortunately, I don't have the luxury of modifying the schema, neither for current nor for historical data. Both the reporting and transactional functionalities are implemented against the existing schema. I'm aiming for an improvised solution that doesn't interfere with the current implementation. – kar8o Jan 13 '16 at 13:50
  • Honestly, your exact situation isn't clear to me, but moving table data from one database to another won't affect your schema, right? – Moudiz Jan 13 '16 at 13:57
  • What I'm worried about is the continuous synchronisation between the transactional and historical databases. All changes made to the transactional database during everyday processing eventually need to be applied to the historical database; otherwise the generated reports won't contain current data. Doing that manually seems like a lot of work. – kar8o Jan 13 '16 at 14:13
  • Set up another instance of your current DB; call it REP. Create a dblink to the live one. Write a job that periodically copies records from live to REP. Write another job that periodically deletes records from live based on their age. You might think that some sync tool can save you from the manual work, but deleting records in almost any case requires a deep understanding of how your app works. – Gergely Bacso Jan 13 '16 at 14:13
  • Create a procedure that does all the work; a procedure that moves the data from, say, 50 tables to another database won't take long to write. And, as Gergely suggests, a job is good if you want it to run at a specific time. – Moudiz Jan 13 '16 at 14:48
  • I'm wondering if I could use Streams and somehow make a DML handler ignore DELETE operations on rows (https://docs.oracle.com/cd/B28359_01/server.111/b28321/capappdemo.htm#BCGIBEIF) so that during data replication historical rows are preserved despite being deleted from the transactional db. – kar8o Jan 13 '16 at 15:30
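Gergely's dblink-and-jobs suggestion might look roughly like this. All names are hypothetical (the `LIVE` TNS alias, the `orders` table, its `order_id` key, and the 12-month retention window), and the copy/purge logic would need to match the real schema and its foreign keys:

```sql
-- On REP: create a link to the live database (hypothetical credentials/alias).
CREATE DATABASE LINK live_db CONNECT TO app IDENTIFIED BY app_pw
  USING 'LIVE';

-- Copy job body on REP: pull over rows not yet present here.
INSERT INTO orders
  SELECT * FROM orders@live_db s
  WHERE NOT EXISTS (SELECT 1 FROM orders t WHERE t.order_id = s.order_id);
COMMIT;

-- Purge job body on LIVE: delete rows older than the retention window,
-- run only after the copy has been verified on REP.
DELETE FROM orders WHERE order_date < ADD_MONTHS(SYSDATE, -12);
COMMIT;

-- Scheduling sketch with DBMS_SCHEDULER; sync_orders_from_live is a
-- hypothetical procedure wrapping the copy statement above.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'COPY_FROM_LIVE',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN sync_orders_from_live; END;',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
END;
/
```

As the comment warns, the delete side is the risky part: which rows are safe to purge depends on the application's own referential logic, not just on age.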

1 Answer


You don't need two separate databases. Just create one transactional database where you save all your transactions, and then create views on those tables to show the required data. That way you only have to maintain one database.
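A minimal sketch of this approach, with hypothetical table and column names: all rows stay in one `orders` table, the application reads recent data through a view, and reports query the base table directly.

```sql
-- One table holds both current and historical rows.
-- The application queries this view, which restricts it to a
-- hypothetical 12-month "current" window; reporting queries
-- go straight to the underlying orders table.
CREATE OR REPLACE VIEW current_orders AS
  SELECT *
  FROM   orders
  WHERE  order_date >= ADD_MONTHS(SYSDATE, -12);
```

Note that this keeps the full data volume in the transactional database, so by itself it addresses convenience rather than the performance problem the question describes.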