Carel has the right question: why? That said, we do this at my site, and here is how I did it.

1. I went to JaredStill.com and shamelessly copied the scripts that dump the contents of any table to a sqlldr dump file and automatically generate a control file.
2. I incorporated this into a cron job that fires at 4am every day for two tables, creating dated dump and control files.
3. At 5am, another script wakes up and runs against a different database. It checks whether the files exist, then loads the data from that dated dump file into the database using the external table method.

On average I move about 3M rows a day; the whole process (dump and reload) takes about 15 minutes. But at 5am, who cares? This has been running fine for over two years (three cheers for "cron"), and I have never had to change the scripts. Oh yeah ... thanks Jared.

rjamya

On 1/31/07, Carel-Jan Engel <cjpengel.dbalert@xxxxxxxxx> wrote:
Archiving is a solution. What is the problem you're trying to solve?

Best regards,

Carel-Jan Engel

===
If you think education is expensive, try ignorance. (Derek Bok)
===

On Tue, 2007-01-30 at 10:59 -0500, Luc Demanche wrote:

Hi,

We are thinking of having a process that will archive data from our production database to another database, or somewhere else ... For example, data of an old customer, info of an old product, etc.
-- ---------------------------------------------- Got RAC?
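Steps 1-2 of rjamya's process (a dated dump file plus a matching sqlldr control file) can be sketched roughly as below. This is a minimal illustration, not the actual JaredStill.com scripts: the table name, columns, and rows here are hypothetical, and in the real job the rows would come out of Oracle (e.g. via sqlplus) rather than being passed in.

```python
import csv
import datetime

def dump_table(table, columns, rows, day=None):
    """Write a dated, pipe-delimited dump file and a matching
    sqlldr control file for the given (hypothetical) table."""
    day = day or datetime.date.today()
    stamp = day.strftime("%Y%m%d")
    dat_file = f"{table}_{stamp}.dat"
    ctl_file = f"{table}_{stamp}.ctl"

    # Data file: one pipe-delimited line per row.
    with open(dat_file, "w", newline="") as f:
        writer = csv.writer(f, delimiter="|", lineterminator="\n")
        writer.writerows(rows)

    # Minimal SQL*Loader control file pointing at the data file.
    with open(ctl_file, "w") as f:
        f.write(
            f"LOAD DATA\n"
            f"INFILE '{dat_file}'\n"
            f"INTO TABLE {table}\n"
            f"FIELDS TERMINATED BY '|'\n"
            f"({', '.join(columns)})\n"
        )
    return dat_file, ctl_file

if __name__ == "__main__":
    # Illustrative data only; a real run would pull rows from the source db.
    dat, ctl = dump_table(
        "customers",
        ["cust_id", "cust_name"],
        [(1, "Acme"), (2, "Initech")],
        day=datetime.date(2007, 1, 31),
    )
    print(dat, ctl)
```

The 5am loader on the other database would then either feed the .dat/.ctl pair to sqlldr, or (as rjamya does) point an Oracle external table at the dated data file and INSERT ... SELECT from it.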