Download a trial of WisdomForce FastReader:
http://www.wisdomforce.com/products-FastReader.html

It should do everything you need, including the ability to compress, split, and parallelize the export.

On Wed, Oct 26, 2011 at 2:59 PM, Steve Wales <sjwales@xxxxxxxxxxx> wrote:
> Hello,
>
> I've been tasked with generating a CSV file with the contents of a view in
> an old database as an extra copy before the DB is decommissioned.
>
> I've done this in the past for smaller tables or views without issue.
>
> There are 80 million rows in the view; the full row length is 5500 bytes
> (most rows are probably shorter, since the columns on the base tables are
> mostly VARCHAR2).
>
> Doing the math on that, that's a maximum possible size of 440 GB.
>
> I had seen a stored procedure on a website somewhere that read through a
> cursor and dumped the records out to a text file via UTL_FILE.PUT. Does
> anyone have a better method, or is that my best option? I've had some TEMP
> tablespace failures, so I may just need to allocate a bunch of space in
> TEMP, kick it off one afternoon, and let it run overnight.
>
> I could also shrink the overall output size of the file by running TRIM on
> the columns.
>
> Any ideas appreciated.
>
> Environment: Oracle 9.2.0.7 on HP-UX 11.11

--
Regards,
Greg Rahn
http://structureddata.org
--
//www.freelists.org/webpage/oracle-l
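
[For reference, the cursor-plus-UTL_FILE approach Steve describes could be sketched roughly as below. This is a minimal sketch, not a tested implementation: the view name (big_view), the column names, and the directory object (DUMP_DIR) are placeholders that would need to match the real schema, and DUMP_DIR must be created and granted beforehand. Note that UTL_FILE.PUT_LINE is limited by the max_linesize given to FOPEN, which caps at 32767 bytes; with a 5500-byte maximum row that limit is comfortable here.]

```sql
DECLARE
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  -- Open for writing with an explicit max_linesize so long rows
  -- don't hit the 1023-byte default and raise a write error.
  v_file := UTL_FILE.FOPEN('DUMP_DIR', 'big_view.csv', 'w', 32767);

  -- Stream the view through a cursor FOR loop; TRIM shrinks the
  -- output as suggested in the original post.
  FOR r IN (SELECT col1, col2, col3 FROM big_view) LOOP
    UTL_FILE.PUT_LINE(v_file,
      TRIM(r.col1) || ',' || TRIM(r.col2) || ',' || TRIM(r.col3));
  END LOOP;

  UTL_FILE.FCLOSE(v_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Make sure the handle is released if the export dies mid-run.
    IF UTL_FILE.IS_OPEN(v_file) THEN
      UTL_FILE.FCLOSE(v_file);
    END IF;
    RAISE;
END;
/
```

One design note: because this reads the view through a plain cursor, it avoids a large sort unless the view itself contains ORDER BY, GROUP BY, or similar; the TEMP failures mentioned above would come from the view's own query, not from the file-writing loop.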