Ram,
BCT has been mature since 11.2.0.4 at least, but there are always
issues, even in the latest versions. If you're using RMAN incremental
backups, use BCT.
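For reference, enabling BCT is a one-statement operation; a minimal sketch (the file path below is just an example, adjust for your environment or rely on DB_CREATE_FILE_DEST to place it automatically):

```sql
-- Enable block change tracking; the destination path is an example.
ALTER DATABASE ENABLE BLOCK CHANGE TRACKING
  USING FILE '/u01/app/oracle/oradata/ORCL/bct.chg';

-- Verify that tracking is now active.
SELECT status, filename FROM v$block_change_tracking;
```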
One thing to consider is the limit on the number of incremental backups,
which is explained in Oracle Support doc #452455.1
<https://support.oracle.com/epmos/faces/DocumentDisplay?id=452455.1>
(entitled "/How Many Incremental Backups Can Be Taken When BCT Is
Enabled?/"). The title of this support document is a bit misleading,
and should read "/How Many Incremental Backups Will Be Optimized By
BCT?/" instead, but nobody asked for my opinion.
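You can check whether a given incremental backup actually benefited from BCT by querying V$BACKUP_DATAFILE: USED_CHANGE_TRACKING = 'YES' together with BLOCKS_READ being a small fraction of DATAFILE_BLOCKS means the optimization kicked in. A quick sketch:

```sql
-- Compare blocks read to total blocks for each incremental datafile
-- backup; a low percentage with USED_CHANGE_TRACKING = 'YES' means
-- BCT did its job, 'NO' means a full scan of that datafile occurred.
SELECT file#,
       completion_time,
       used_change_tracking,
       blocks_read,
       datafile_blocks,
       ROUND(blocks_read / datafile_blocks * 100, 1) AS pct_read
FROM   v$backup_datafile
WHERE  incremental_level > 0
ORDER  BY completion_time DESC;
```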
The support note references documentation HERE
<https://docs.oracle.com/database/121/RCMRF/rcmsynta006.htm#RCMRF107>
which states...
/The change tracking file maintains bitmaps that mark changes in the
data files between backups. The database performs a bitmap switch
before each backup. Oracle Database automatically manages space in
the change tracking file to retain block change data that covers the
8 most recent backups. After the maximum of 8 bitmaps is reached,
the most recent bitmap is overwritten by the bitmap that tracks the
current changes./
/The first level 0 incremental backup scans the entire data file.
Subsequent incremental backups use the block change tracking file to
scan only the blocks that have been marked as changed since the last
backup. An incremental backup can be optimized only when it is based
on a parent backup that was made after the start of the oldest
bitmap in the block change tracking file./
/Consider the 8-bitmap limit when developing your incremental backup
strategy. For example, if you make a level 0 database backup
followed by 7 differential incremental backups, then the block
change tracking file now includes 8 bitmaps. If you then make a
cumulative level 1 incremental backup, RMAN cannot optimize the
backup because the bitmap corresponding to the parent level 0 backup
is overwritten with the bitmap that tracks the current changes./
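In practice, the lesson of that last paragraph is to keep every incremental within 8 bitmap switches of its parent. A weekly cycle like the following (a sketch, not a prescription) stays safely inside the window:

```sql
-- Sunday: level 0 -- full scan, seeds the bitmap window.
-- RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE;

-- Monday through Saturday: differential level 1 -- each backup's
-- parent is the previous day's backup, only 1 bitmap switch away,
-- so every one of them is optimized by BCT.
-- RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE;
```

The failure mode the documentation describes is a cumulative level 1 whose parent is the level 0: after 7 intervening backups, the level 0's bitmap has been overwritten and RMAN falls back to a full scan.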
Otherwise, consider using storage-level snapshots instead of RMAN to back
up the growing database. This means going back to the user-managed backup
techniques that were widely used prior to the introduction of RMAN, but
bear in mind that RMAN is entirely based on the concept of streaming
backups to (and restores from) sequential media (i.e. tape or virtual
tape library). As sequential media obsolesces, so does RMAN. When a
database grows large enough, there comes a point where there are no more
clever tricks to complete a full streaming backup or restore operation
within a given RTO.
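For completeness, a user-managed snapshot backup loosely looks like the following. This is only a sketch; the snapshot step itself is storage-vendor specific and happens outside the database:

```sql
-- Put the database in backup mode so datafile copies taken while
-- the database is open remain recoverable.
ALTER DATABASE BEGIN BACKUP;

-- <take the storage-level snapshot here: an array CLI call or cloud
--  API call, vendor-specific, not an Oracle command>

ALTER DATABASE END BACKUP;

-- Archive the current redo so recovery through the snapshot works.
ALTER SYSTEM ARCHIVE LOG CURRENT;
```

Newer releases can also recover snapshots taken without BEGIN/END BACKUP when the storage meets certain criteria, but the classic begin/end backup bracket above is the conservative approach.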
This is especially important to consider when migrating into a fully
virtualized environment such as a public cloud, because in addition to
the RTO, there are limits on the number of IOPS and the volume of I/O
throughput for each virtual machine, and streaming backups consume
resources from that budget, stealing those resources from "normal"
database activities. Be cognizant of the I/O limits on virtual
machines when moving into the cloud, and be equally aware of how
ancillary activities like backups are impacted by those limits.
Hope this helps...
-Tim
On 3/2/2021 4:19 PM, Ram Raman wrote:
List,
I am trying to come up with a backup strategy for a multi TB database, say 18Tb. We are going to be backing it up on disks (over NFS). I am thinking of enabling BCT in the database so backups will be quicker and use less resources. This DB is going to have nightly loads of several million rows and also have Data guard. v19c on Linux.
I have had issues with BCT in older versions of oracle. Is anyone using BCT in a similar environment (big DB with loads and DG) and things are running good?
We are not looking into backing up the DB from the standby site as things may change in that area in future.
Thanks,
Ram.
--