Re: Backup for large number of archived logs per hour

  • From: Jared Still <jkstill@xxxxxxxxx>
  • To: development@xxxxxxxxxxxxxxxxx
  • Date: Sat, 6 Mar 2010 18:46:34 -0800

On Sat, Mar 6, 2010 at 1:25 PM, Martin Bach wrote:

> Hi Michael!
> I suggest you take on the advice given by the other replies to your
> thread - 200G database generating 20G/day seems a lot...
I'm not sure that hard-and-fast rules, or even rules of
thumb, can really be trusted for such things.

The size of the database can be a fair predictor of how much
redo may be generated, but the volatility of the data and
the number of processes making changes must also be
considered.
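As a rough illustration of that point (all figures below are made up, not
taken from the original thread): two databases of identical size can
generate wildly different redo volumes, because redo tracks change
activity rather than data volume.

```python
# Hypothetical sketch: redo volume scales with change rate, not DB size.
# The helper name and all numbers here are assumptions for illustration.

def daily_redo_gb(changed_rows_per_day: int, redo_bytes_per_row: int) -> float:
    """Rough daily redo estimate from change volume alone."""
    return changed_rows_per_day * redo_bytes_per_row / 1024**3

# Two databases, both 200 GB on disk, very different volatility:
quiet = daily_redo_gb(1_000_000, 500)    # low-churn reporting DB
busy = daily_redo_gb(40_000_000, 500)    # high-churn OLTP DB

print(f"quiet: {quiet:.2f} GB/day")  # well under 1 GB/day
print(f"busy:  {busy:.2f} GB/day")   # roughly the 20 GB/day in question
```

Under these (invented) numbers, the same 200 GB database plausibly produces
anywhere from under 1 GB to around 20 GB of redo per day, which is why
database size alone is a weak predictor.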

Jared Still
Certifiable Oracle DBA and Part Time Perl Evangelist
Oracle Blog:
Home Page:
