Hi list,

Does anyone have experience using a MERGE statement with a CLOB column? We are trying to load a table from an external table with a MERGE statement on a daily basis. Here are the table definitions (DESC is a reserved word in Oracle, so the CLOB column is named DESCR):

create table test
( id       number,
  position varchar2(10),
  descr    clob
);

create table test_xt
( id       number,
  position varchar2(10),
  descr    clob
)
organization external
( type oracle_loader
  default directory xt_dir
  access parameters
  ( records delimited by newline
    skip 1
    characterset US7ASCII
    badfile 'test_xt.bad'
    logfile 'test_xt.log'
    fields terminated by ',' optionally enclosed by '"'
    missing field values are null
    reject rows with all null fields
    ( id,
      position,
      descr char(32000)
    )
  )
  location ('test.csv')
)
reject limit unlimited
noparallel
nomonitoring;

There are 98,000 records in the test_xt table. The MERGE statement finishes in 7 seconds when I change the CLOB to VARCHAR2(4000); with the CLOB it seems to never finish. I let the MERGE run for more than 3 hours and then killed it. The CLOB in the test table has CACHE set. Is there any way to improve the performance?

Another question: how can I load multi-line data from a csv file into the external table? For example, one record in the csv file is the following:

2, "DBA", "this position requires a certificate of OCP and 10 years experience in Oracle."

This record cannot be loaded into the external table. I changed "records delimited by newline" to "records delimited by '|'" and modified the csv file accordingly, but it still does not work. I know I could use the LOBFILE method, but exporting the CLOB data to a separate file for each record is not an option.

David
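For reference, a minimal sketch of the kind of daily MERGE the post describes. The join key (ID), the column list, and the column name DESCR (DESC is a reserved word in Oracle) are assumptions, since the original statement is not shown:

```sql
-- Hypothetical sketch of the daily load; join condition and column
-- names are assumptions -- the poster's actual MERGE is not shown.
merge into test t
using test_xt x
on (t.id = x.id)
when matched then
  update set t.position = x.position,
             t.descr    = x.descr
when not matched then
  insert (id, position, descr)
  values (x.id, x.position, x.descr);
```

With a CLOB target, every matched row forces LOB storage writes even when the value is unchanged, so one common tuning step is to restrict the WHEN MATCHED branch (e.g. with a WHERE clause comparing the scalar columns) so unchanged rows are not rewritten.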
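To make the multi-line record problem concrete, here is a hedged sketch of the custom-record-terminator approach the post alludes to. The exact placement of the embedded line break and the choice of '|' as terminator are illustrative assumptions; for this to work, every record in the csv (including the last one) must end with the terminator character:

```sql
-- Sketch only: a '|' record terminator lets a quoted field span
-- physical lines.  The data file would have to look like:
--
--   2, "DBA", "this position requires a certificate of OCP and 10
--   years experience in Oracle."|
--
-- (note the trailing '|' after every record, including the final one).
access parameters
( records delimited by '|'
  skip 1
  characterset US7ASCII
  badfile 'test_xt.bad'
  logfile 'test_xt.log'
  fields terminated by ',' optionally enclosed by '"'
  missing field values are null
)
```

A frequent gotcha with a non-newline terminator is that the newline following each '|' in the edited file becomes part of the next record's first field, so it may need to be trimmed or the '|' placed at the start of the following line instead.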