Re: Backup for large number of archived logs per hour

  • From: Martin Bach <development@xxxxxxxxxxxxxxxxx>
  • To: mdinh@xxxxxxxxx
  • Date: Sat, 06 Mar 2010 21:25:31 +0000

Hi Michael!

I suggest you take the advice given in the other replies to your
thread; a 200G database generating 20G of archived logs per hour seems
a lot...

On 04/03/10 20:31, Michael Dinh wrote:
> As an example, if we have a 200G DW generating 20G archived log per
> hour, then what would be a more efficient way for backup?
>  
> It does not make sense to backup archived logs when the entire DW is
> less than the amount of archived logs generated.

In your case incrementally updated (RMAN) backups could be just what the
doctor ordered; see section 4.4.3 of the 10g Backup and Recovery
Basics guide.

In a nutshell, you take a level 0 image copy of your database, then take
level 1 incrementals and merge them into your original level 0 copy.
This is a really cool strategy that can cut your recovery time
considerably, since far fewer archived logs need to be applied. Of
course it's something to test thoroughly before putting into
production...
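The nightly job for this strategy follows the pattern from the Oracle
docs; a sketch (the tag name 'incr_update' is just an illustration, pick
your own):

```
RUN {
  # roll the previous incrementals into the level 0 image copy
  RECOVER COPY OF DATABASE WITH TAG 'incr_update';
  # take today's level 1 incremental for use by tomorrow's recover
  BACKUP INCREMENTAL LEVEL 1
    FOR RECOVER OF COPY WITH TAG 'incr_update'
    DATABASE;
}
```

On the first run there is no copy yet, so RMAN creates the level 0
image copy instead of a level 1; from then on the copy is never more
than one incremental behind, which is what keeps the number of archived
logs needed for recovery so small.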

Regards

Martin
-- 
Martin Bach
OCM 10g
http://martincarstenbach.wordpress.com
http://www.linkedin.com/in/martincarstenbach
--
//www.freelists.org/webpage/oracle-l
