I did not do the import. We dropped the idea and decided to take a different route. I had the same feeling about the size. To make it worse, the only place available was an NFS mount point. Thanks for the help.

On 1/18/08, Jared Still <jkstill@xxxxxxxxx> wrote:
> 200G is a mighty large export.
> Any idea how long the import will take?
>
> On Jan 18, 2008 12:13 PM, Ram Raman <veeeraman@xxxxxxxxx> wrote:
> > Thanks all who replied.
> >
> > I was able to see the total estimated size using expdp in a small
> > database, but it was hanging for over an hour for a DB of size ~200G.
> > I had to kill it. Jared's script helped too.
> >
> > On 1/17/08, Tony Sequeira <tony@xxxxxxxxxxxxxxx> wrote:
> > > On Thu, 2008-01-17 at 13:27 -0600, Ram Raman wrote:
> > > > Hi,
> > > >
> > > > Is there a way to determine the dump file size before the export
> > > > starts? I plan to do a full export of a database. ver 10.2.
> > > >
> > > > Thanks.
> > >
> > > What I used to do back in the days of Oracle7. Unix or Unix tools
> > > required.
> > >
> > > Create a "named pipe" (mknod), pipe the export to it, and wc -c
> > > the result.
> > >
> > > Have a Google search.
> > > --
> > > S. Anthony Sequeira
> > > ++
> > > Smear the road with a runner!!
> > > ++
> > >
> > > --
> > > //www.freelists.org/webpage/oracle-l

--
Jared Still
Certifiable Oracle DBA and Part Time Perl Evangelist
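For the archives, the named-pipe trick Tony describes can be sketched as below. The idea: `exp` writes its dump into a FIFO while `wc -c` reads the other end and counts the bytes, so the size is measured without the dump ever landing on disk. The `exp` connect string shown in the comment is a hypothetical placeholder, and a `printf` stands in for the real export so the sketch runs anywhere; on 10.2, `expdp ... ESTIMATE_ONLY=YES` (what Ram tried) is the supported Data Pump alternative.

```shell
#!/bin/sh
# Measure export size without writing the dump file to disk.
PIPE=/tmp/exp_size.pipe
rm -f "$PIPE"
mkfifo "$PIPE"            # or: mknod "$PIPE" p  on older Unixes

# Reader: count bytes as they flow through the pipe.
wc -c < "$PIPE" &

# Writer: the real command would be something like (credentials hypothetical):
#   exp system/manager full=y file="$PIPE"
# Stand-in writer for illustration:
printf 'pretend this is dump output' > "$PIPE"

wait                      # wc prints the byte count once the writer closes
rm -f "$PIPE"
```

Note the ordering: the background `wc` must be started before the writer, because opening a FIFO for writing blocks until a reader has it open.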