RE: transporting 3 TBs of data to a new database

  • From: "H.Tonguç YILMAZ" <tonguc.yilmaz@xxxxxxxxx>
  • To: oracledbaquestions@xxxxxxxxx
  • Date: Fri, 3 Oct 2008 12:46:32 +0300

I remember asking a similar question on this list:
//www.freelists.org/archives/oracle-l/04-2008/msg00330.html

But first of all, you didn't mention the details of your Oracle release or the
source and target databases' OS versions.

There are several limitations (especially related to datatypes) on
Streams-based options, depending on your release. Please search Metalink for
the relevant notes.
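
For example, on 10gR2 and later you can get a quick, release-dependent picture
of which tables Streams cannot capture (APP_OWNER below is just a placeholder
for your application schema):

    -- tables Streams cannot capture, with the reason (10gR2+)
    SELECT owner, table_name, reason
      FROM dba_streams_unsupported
     WHERE owner = 'APP_OWNER'
     ORDER BY table_name;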

If you go for a row-by-row option (Data Pump, or an insert-into-select kind of
migration), then testing the migration's validity and performance, as well as
post-migration application performance, is important.
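
As a rough sketch of the insert-into-select variant over a dblink (table and
dblink names are made up, and the degree of parallelism is something you have
to test against your own hardware):

    -- direct-path, parallel load over a database link;
    -- BIG_TABLE and SOURCE_DB are placeholders
    ALTER SESSION ENABLE PARALLEL DML;

    INSERT /*+ APPEND PARALLEL(t, 8) */ INTO big_table t
    SELECT * FROM big_table@source_db;

    COMMIT;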

So here is a brief summary of our experience: from HP Tru64 to IBM AIX,
~6.5 hours for ~3.8 TB of table data transferred over a dblink, plus the index
builds. And I think even better performance is possible depending on your
downtime constraints; the performance of this kind of migration depends highly
on your hardware capabilities (network, disk, memory and CPU), the
configuration of these resources, and the parallelization of the tasks.
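
Index builds are one of the easiest places to apply that parallelism; a
typical pattern (index name, column and degree are again placeholders) is:

    -- build the index in parallel without redo, then reset the
    -- attributes so later DML and the optimizer behave normally
    CREATE INDEX big_table_ix ON big_table (customer_id)
      PARALLEL 8 NOLOGGING;

    ALTER INDEX big_table_ix NOPARALLEL LOGGING;

Remember that after NOLOGGING operations the affected segments are not
recoverable until your next backup.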

But the success of this kind of migration depends highly on the quality of
your tests and on your DBA team's dedication, since there are a lot of issues
to be checked and handled manually.

For more details of our experience, please check my answers to a similar
question on the OTN forums here:
http://forums.oracle.com/forums/thread.jspa?threadID=651607&start=15&tstart=0

2008/10/3 FreeLists Mailing List Manager <ecartis@xxxxxxxxxxxxx>

    oracle-l Digest Thu, 02 Oct 2008        Volume: 05  Issue: 271

    Dba DBA wrote:
    > I am on ASM; the group moving the data is not. I think transportable
    > tablespaces is the best way to do this. So I have to use RMAN to put
    > the data in ASM. Is this a copy to an ASM tablespace? Any idea how
    > this works? I am trying to figure out how long this will take. We may
    > be able to do it in pieces.
    >
    > I don't think data pump is good for this.
    >
    > I also don't know a good way to write comma-delimited files out of
    > this much data for SQL*Loader.
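
Regarding the RMAN-into-ASM question above: if the endian formats of the two
platforms differ, RMAN CONVERT can write the transported datafiles straight
into a diskgroup; if they match, a plain copy does it. A sketch only, the
diskgroup, staging path and platform string are placeholders
(V$TRANSPORTABLE_PLATFORM lists the valid platform names):

    RMAN> CONVERT DATAFILE '/stage/users01.dbf'
          FROM PLATFORM 'HP Tru64 UNIX'
          FORMAT '+DATA';

    -- same-endian case: just copy the file into ASM
    RMAN> BACKUP AS COPY DATAFILE '/stage/users01.dbf' FORMAT '+DATA';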

-- 
Best Regards,
H.Tonguç YILMAZ

http://tonguc.yilmaz.googlepages.com/
