RE: Copying table from one database to another db which have long raw column & huge data

  • From: "Mark W. Farnham" <mwf@xxxxxxxx>
  • To: "'Oracle-L Freelists'" <Oracle-L@xxxxxxxxxxxxx>
  • Date: Tue, 23 Apr 2013 11:24:44 -0400

And if you need network connectivity for the move, you can use the SQL*Plus
COPY command. If you do, make sure you SET LONG in your session to at least
the length of the longest LONG you have, or it will silently truncate the
data. With an ARRAYSIZE in the range 1K to 10K and a COPYCOMMIT in the range
1 to 10, aimed so their product (rows per commit) lands in the range 1K to
20K or so (your mileage may vary), this can run very fast, and you get
reports on incremental progress. You can also put ordering and predicates
in the sourcing query if that seems useful.
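
A minimal sketch of that, assuming a table BIGTAB owned by SCOTT and TNS
aliases SRC and DEST (every name here is a placeholder, not from the
original thread):

    REM fetch up to 2 GB of each LONG value; too small a LONG setting
    REM silently truncates
    SET LONG 2000000000
    REM 2000 rows per fetch, commit every 5 batches = 10000 rows/commit
    SET ARRAYSIZE 2000
    SET COPYCOMMIT 5
    REM the trailing hyphen is the SQL*Plus continuation character; use
    REM CREATE instead of INSERT if the target table does not exist yet
    COPY FROM scott/tiger@src TO scott/tiger@dest -
    INSERT bigtab USING -
    SELECT * FROM bigtab WHERE created_date >= DATE '2013-01-01' -
    ORDER BY id;

Keeping arraysize times copycommit modest bounds both undo usage and the
amount of work lost if the copy dies mid-stream.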

Very old-fashioned. It avoids the export-everything/import-everything rhythm
and starts getting rows into the destination sooner. Still worth a look.
Being an old tool that they stopped upgrading some time ago (unless they
started again when I wasn't looking), it does not support a lot of newer
types. But since you mentioned LONG RAW, this vintage tool seems likely to
fit.

mwf

-----Original Message-----
From: oracle-l-bounce@xxxxxxxxxxxxx [mailto:oracle-l-bounce@xxxxxxxxxxxxx]
On Behalf Of Guillermo Alan Bort
Sent: Tuesday, April 23, 2013 8:23 AM
To: Raja Subramaniyan
Cc: oracle-l-freelists
Subject: Re: Copying table from one database to another db which have long
raw column & huge data

Well, I just told you: the restrictions are the buffer size in traditional
exp, and the fact that you can't use a network link with Data Pump when
LONG data is involved.
Cheers

Alan.-
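
Something along these lines ought to work (table, file, and link names are
made up; test before trusting it):

    # 1) classic exp just for the LONG RAW table; BUFFER is in bytes,
    #    sized generously relative to the widest row
    exp scott/tiger FILE=long_tabs.dmp LOG=exp_long.log \
        TABLES=bigtab BUFFER=20971520

    # ...then load that dump into the target with imp
    imp scott/tiger FILE=long_tabs.dmp LOG=imp_long.log FULL=Y

    # 2) Data Pump pulls everything else directly over a database link;
    #    a parameter file sidesteps shell quoting on the EXCLUDE clause
    cat > dp_rest.par <<'EOF'
    NETWORK_LINK=src_link
    SCHEMAS=scott
    EXCLUDE=TABLE:"IN ('BIGTAB')"
    LOGFILE=dp_rest.log
    EOF
    impdp system/manager PARFILE=dp_rest.par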


On Tue, Apr 23, 2013 at 3:09 AM, Raja Subramaniyan
<raja.s28@xxxxxxxxx> wrote:

> Thanks Alan. I would prefer to proceed with the exp / expdp methods.
> Please let me know if there are any constraints on using them with the
> LONG RAW data type.
>
> Regards,
> Raja.S
>
>
> On Tue, Apr 23, 2013 at 10:45 AM, Guillermo Alan Bort
> <cicciuxdba@xxxxxxxxx> wrote:
>
>> If you are going with traditional export (the one with exp), then you
>> need to ensure you have a large enough buffer (think along the lines
>> of your largest row). Data Pump is your best bet from 10g on. Though
>> db-link replication won't work on LONG data, it's easy enough to
>> export only the tables containing LONGs to a dump and duplicate the
>> rest with Data Pump over a network link.
>>
>> hth
>>
>> Alan.-
>>
>>
>> On Tue, Apr 23, 2013 at 1:13 AM, Raja Subramaniyan
>> <raja.s28@xxxxxxxxx> wrote:
>>
>>> Hi,
>>> I need to safely transfer databases / tables with LONG RAW columns
>>> from one machine to another.
>>>
>>> What parameters have to be used to make exp work with the LONG RAW
>>> data type, and what steps have to be taken?
>>>
>>> Thanks
>>> Raja
>>>
>>>


--
//www.freelists.org/webpage/oracle-l

