I have a table of roughly 65 million rows that I need to replicate from an Oracle 18
database to SQL Server 2008.
Source: Oracle 18.5, GoldenGate 18.1
Target: SQL Server 2008 R2, GoldenGate 12.3 (this system is being migrated to
SQL Server 2017 later this year - different story)
I am trying to do this via the FILE to REPLICAT method:
-- Extract to trail files on the remote server
-- Start a SPECIALRUN replicat to read through the trail files and load the
target table
The problem I am running into is that the replicat seems to be able to read only a
*single* trail file at a time. I have not found any way, wildcards or otherwise,
to specify multiple EXTFILE data files.
-------------- file list (partial) ----------------
Directory: C:\GGate\dirdat
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 5/10/2019 8:54 AM 2041669405 initld000000
-a--- 5/10/2019 8:56 AM 2041669563 initld000001
-a--- 5/10/2019 8:59 AM 2041669310 initld000002
-a--- 5/10/2019 9:01 AM 2041669276 initld000003
-a--- 5/10/2019 9:03 AM 1119899237 initld000004
...
------------------------------------------------------------
----------- replicat param file -------------------------
specialrun
abortonfatalerror
BATCHSQL
discardfile ./dirrpt/initld.dsc, purge
targetdb ers_gg, useridalias ggadmin
extfile ./dirdat/initld
sourcecatalog vsopdb
MAP vsop.masterline_repl, TARGET dbo.masterline_repl, colmap(usedefaults);
end runtime
------------------------------------------------------------
I get the error:
2019-05-10 09:36:14 ERROR OGG-01091 Unable to open file
"C:\GGate\dirdat/initld" (error 2, The system cannot find the file specified.).
Is there any way to do this? Or am I supposed to create a PowerShell loop of
some sort to process the full 57 data files one by one? I have tried a
wildcard (error) and multiple EXTFILE parameters (only the last one is read).
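For what it's worth, the fallback loop I am considering would look something like the sketch below: generate a one-off param file per trail file (based on the param file above) and invoke the replicat binary once per file. This is only a sketch of the idea, not a tested solution; the `run_all` driver, its paths, and the assumption that `replicat paramfile <file>` can be driven this way from a script are all mine, not from any GoldenGate documentation.

```python
import subprocess
from pathlib import Path

# Template mirrors the replicat param file in the question; only the
# EXTFILE line varies per trail file. Note that "purge" on the discard
# file would wipe it on every iteration; "append" may be preferable
# after the first run.
PARAM_TEMPLATE = """specialrun
abortonfatalerror
BATCHSQL
discardfile ./dirrpt/initld.dsc, purge
targetdb ers_gg, useridalias ggadmin
extfile {trail}
sourcecatalog vsopdb
MAP vsop.masterline_repl, TARGET dbo.masterline_repl, colmap(usedefaults);
end runtime
"""

def make_param(trail_file: str) -> str:
    """Render a replicat param file targeting a single trail file."""
    return PARAM_TEMPLATE.format(trail=trail_file)

def run_all(gg_home: str, trail_dir: str) -> None:
    """Hypothetical driver: one special-run replicat per trail file,
    processed in sequence-number order."""
    for trail in sorted(Path(trail_dir).glob("initld*")):
        prm = Path(gg_home, "dirprm", f"{trail.name}.prm")
        prm.write_text(make_param(f"./dirdat/{trail.name}"))
        # Assumed invocation; stop the whole load if any file fails.
        subprocess.run(
            [str(Path(gg_home, "replicat")), "paramfile", str(prm)],
            check=True,
        )
```

Running 57 sequential special-run replicats obviously loses BATCHSQL efficiency across file boundaries and feels like working around the tool rather than with it, which is why I am asking whether there is a supported way first.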