Times are tough nowadays. You can’t rely on a single instance of your database; redundancy is probably the word of the century. The truth is: you have to have a copy of all your data somewhere, and you need to keep it up to date. Of course, there are solutions like Oracle Streams, Data Guard and so on, but sometimes you need a much simpler, yet still powerful and reliable, way to get a copy of your data. For example, you may need a separate analytic database for reports, because you don’t want to let muggles into your main database and you are not happy with all the load they put on your precious instance. Then this story is for you.
Well, you happen to have tables in one Oracle database that you want to duplicate into another. This second database will be your workplace, and the source database will be the remote one.
Also, the source database keeps its tables in several different schemas, while the target database has a single schema, different from all of them, and you want to store all these tables in that one schema. Moreover, the source DB objects live in various tablespaces, but locally you want to use your own.
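Data Pump can do both kinds of renaming on import. As a sketch (assuming a database link to the source DB already exists; the link name, credentials, schema, table and tablespace names below are all placeholders, not real ones):

```shell
# Build the impdp call that pulls a table straight over a database link
# and remaps both the schema and the tablespace. Everything here is a
# placeholder -- substitute your own names and credentials.
SRC_SCHEMA="HR"          # schema on the remote (source) database
DST_SCHEMA="REPORTS"     # the single local schema we import into
SRC_TBS="USERS"          # tablespace on the source side
DST_TBS="REPORTS_DATA"   # our local tablespace

IMPDP_CMD="impdp reports/secret \
network_link=SRC_DB_LINK \
tables=${SRC_SCHEMA}.EMPLOYEES \
remap_schema=${SRC_SCHEMA}:${DST_SCHEMA} \
remap_tablespace=${SRC_TBS}:${DST_TBS} \
logfile=imp_employees.log"

echo "$IMPDP_CMD"
```

With `network_link` there is no intermediate dump file at all: the data travels over the link. The rest of this post uses the file-based variant, which is closer to the step-by-step plan below.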
The outline of the process will be:
- Connect to the remote DB
- Export the needed table using Data Pump
- Check whether a local table with the same name exists, along with a temporary table for the import
- If the local table does not exist, create its structure from the definitions in the exported data, and copy that structure to the temporary table using Data Pump
- Truncate the temporary table
- Import the data from the export file into the temporary table using Data Pump
- Merge the temporary table into the persistent one using the ‘id’ column or, if ‘id’ does not exist, a given column
- And, of course, log everything
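The steps above can be sketched as a shell script. This is only an illustration, not the final script: the credentials, the `DP_DIR` directory object, the remote TNS alias and the column list in the MERGE are all made up, and the metadata-only import that creates a missing table, plus the logging, are left out for brevity.

```shell
#!/bin/sh
# Sketch of the refresh flow for one table. All names and credentials
# are placeholders. With DRY_RUN=1 (the default here) the commands are
# only printed, so nothing touches a real database.
SCHEMA="HR"              # normally taken from $1
TABLE="EMPLOYEES"        # normally taken from $2
LOCAL_SCHEMA="REPORTS"   # the single local schema
TMP_TABLE="TMP_${TABLE}"
DRY_RUN="${DRY_RUN:-1}"

run() {                  # print or execute a command
    if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi
}

# Steps 1-2: connect to the remote DB and export the table with Data Pump
run expdp "reports/secret@srcdb" "tables=${SCHEMA}.${TABLE}" \
    directory=DP_DIR "dumpfile=${TABLE}.dmp" "logfile=exp_${TABLE}.log"

# Step 3: check whether the table already exists locally (the
# metadata-only import that would create it is omitted from this sketch)
run sqlplus -s "reports/secret" <<SQL
SELECT COUNT(*) FROM user_tables WHERE table_name = '${TABLE}';
SQL

# Steps 4-6: load the dump into the temporary table. remap_table renames
# the table on the way in, and table_exists_action=truncate empties the
# temporary table first, covering the truncate step.
run impdp "reports/secret" "tables=${SCHEMA}.${TABLE}" \
    directory=DP_DIR "dumpfile=${TABLE}.dmp" \
    "remap_schema=${SCHEMA}:${LOCAL_SCHEMA}" \
    "remap_table=${TABLE}:${TMP_TABLE}" \
    table_exists_action=truncate "logfile=imp_${TABLE}.log"

# Step 7: merge the temporary table into the persistent one by 'id'
# (the column list is illustrative -- a real script would generate it)
run sqlplus -s "reports/secret" <<SQL
MERGE INTO ${TABLE} t
USING ${TMP_TABLE} s
ON (t.id = s.id)
WHEN MATCHED THEN UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
SQL
```

The MERGE is what lets us refresh the persistent table without dropping it: existing rows are updated in place and new ones inserted, so readers of the reporting table never see it empty.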
The final result will be a script that takes a schema and a table name as parameters and performs all the other steps on its own. We’ll run it from the cron daemon, and it will create our tables and keep them up to date.
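For example, a crontab entry for such a script might look like this (the script name, arguments and paths are all hypothetical):

```shell
# Refresh HR.EMPLOYEES every night at 02:30
30 2 * * * /home/oracle/scripts/refresh_table.sh HR EMPLOYEES >> /home/oracle/logs/refresh.log 2>&1
```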