Re: Data extraction over dblink

  • From: "David Lord" <dlordster@xxxxxxxxx>
  • To: dubey.sandeep@xxxxxxxxx
  • Date: Thu, 13 Apr 2006 20:43:02 +0100

Sandeep,

I'd second the comments about dblinks not scaling well.  A while back
I needed to do something like this to populate a data warehouse.  I
don't have accurate figures to hand, and it did not involve blobs, but
I found that spooling data to CSV files, FTPing them across the network
and loading them with external tables was of the order of 2-4 times
faster than insert...select across a dblink.
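
For what it's worth, the external-table end of that was along these
lines (the directory, table and column names here are invented, and
the access parameters would need to match however you spool the CSV):

    CREATE DIRECTORY ext_data_dir AS '/u01/app/oracle/ext_data';

    -- illustrative names only
    CREATE TABLE orders_ext (
      order_id    NUMBER,
      customer_id NUMBER,
      order_date  VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('orders.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- direct-path load into the real table
    INSERT /*+ APPEND */ INTO orders
    SELECT order_id, customer_id, TO_DATE(order_date, 'YYYY-MM-DD')
    FROM   orders_ext;

The APPEND hint gives you a direct-path insert, and keeping the
network hop as a plain file transfer is where most of the speedup
came from.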

Have you tried transportable tablespaces?  I believe Oracle recommend
them as the fastest transport method, but of course in 9i you are
limited to both databases being on the same platform architecture.
I'm not sure whether there are any restrictions in 10g.
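
If you do go that route, the 9i sequence is roughly as follows
(tablespace, password and file names invented for illustration):

    -- on the source database: check the set is self-contained, then freeze it
    EXEC DBMS_TTS.TRANSPORT_SET_CHECK('APP_DATA', TRUE);
    SELECT * FROM transport_set_violations;
    ALTER TABLESPACE app_data READ ONLY;

    -- at the OS prompt: export the metadata only (9i exp shown;
    -- 10g also has expdp with TRANSPORT_TABLESPACES)
    exp \"sys/password as sysdba\" transport_tablespace=y \
        tablespaces=app_data file=app_data.dmp

    -- copy the datafile(s) and dump file across, then on the target:
    imp \"sys/password as sysdba\" transport_tablespace=y \
        datafiles='/u01/oradata/app_data01.dbf' file=app_data.dmp

    -- finally, back to read write on both sides
    ALTER TABLESPACE app_data READ WRITE;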

Regards
David Lord

On 12/04/06, Sandeep Dubey <dubey.sandeep@xxxxxxxxx> wrote:
> I need to extract data from a remote database to a local database.
> One of the columns to extract is a BLOB.  Each extract will fetch
> approx. 1 GB of data.  Connectivity between the databases is a 10 Mbps
> VPN connection over the internet.  For security reasons the local
> database can have a db link and ssh connectivity to the remote, but
> not the other way around.  So it will be a pull from the local side,
> not a push from remote to local.
>
> I am trying to compare the following approaches to extracting the data:
>
> 1. Extract the data from the remote database using a dblink.  Is it
> feasible to fetch 50K rows / 1 GB of data over an internet connection
> using a db link?  Is anyone using this approach?
>
> 2. Write the data to a temp table on the remote database, then export
> and import the data into the local database.
>
> 3. Create a flat file for the data to be moved, transfer the file and
> load it.  I have a BLOB column.  How difficult is it to write BLOB
> data along with other data to a flat file and load it using SQL*Loader?
>
> 4. Any other ideas?
>
>
> Thanks
>
> Sandeep
> --
> //www.freelists.org/webpage/oracle-l
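
PS - on your point 3: SQL*Loader can load LOB columns from secondary
data files via the LOBFILE clause, so the usual approach is to extract
each BLOB to its own OS file (via a small OCI or Java program, say)
alongside the CSV.  A minimal control-file sketch, with the table,
column and file names all invented:

    LOAD DATA
    INFILE 'docs.dat'
    INTO TABLE documents
    FIELDS TERMINATED BY ','
    (
      doc_id    INTEGER EXTERNAL,
      fname     FILLER CHAR(200),   -- holds the per-row BLOB file name
      doc_body  LOBFILE(fname) TERMINATED BY EOF
    )

where docs.dat holds lines like

    1,doc_0001.bin
    2,doc_0002.bin
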
--
//www.freelists.org/webpage/oracle-l

