RE: Ridiculously high number of commits

  • From: "Khemmanivanh, Somckit" <somckit.khemmanivanh@xxxxxxxxxxxxxxxx>
  • To: <scott.hutchinson@xxxxxxxxxxxxxxxxxxxxx>, <oracle-l@xxxxxxxxxxxxx>
  • Date: Mon, 10 Oct 2005 14:23:05 -0700

Some general thoughts:

1) Have you tried loading into the PSA first?
2) Can you expand further on the InfoPack you're using -- I may be able
to set up a test on my end...
3) Can you e-mail me any relevant ST03N, ST05 and/or SE30 traces?
4) Are the flat files ASCII or CSV files? SAP recommends ASCII format.

FYI: We use Ascential for our ETL loading and don't have this particular
issue.

-----Original Message-----
From: oracle-l-bounce@xxxxxxxxxxxxx
[mailto:oracle-l-bounce@xxxxxxxxxxxxx] On Behalf Of
scott.hutchinson@xxxxxxxxxxxxxxxxxxxxx
Sent: Monday, October 10, 2005 4:58 AM
To: oracle-l@xxxxxxxxxxxxx
Subject: Ridiculously high number of commits

All,

I have a performance problem while loading data into SAP/BW from flat
files. The load process is a standard SAP routine, and it issues a
COMMIT after each record is inserted - we have about 20 million rows to
insert, so this is a lot of commits!

We've broken the load process into 10 jobs that run concurrently;
however, they spend the majority of their time waiting on "log file
sync", which is no great surprise. I have a target of 4 hours for
loading this data into SAP's "InfoCubes", but the load is currently
taking 8 hours.
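
(As a rough back-of-the-envelope check - the 2 ms average "log file
sync" time below is an assumed figure for illustration, not something
we've measured - the per-row commits on their own account for over an
hour of pure commit wait in each job:)

    rows = 20_000_000       # total rows to load
    jobs = 10               # concurrent load jobs
    sync_ms = 2.0           # assumed average "log file sync" wait per commit (ms)

    commits_per_job = rows // jobs                           # 2,000,000 commits per job
    wait_hours = commits_per_job * sync_ms / 1000.0 / 3600   # ms -> s -> hours
    print(f"~{wait_hours:.1f} hours of commit wait per job") # ~1.1 hours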

Does anyone have any smart ideas for lessening the impact on the
database of issuing such a high number of commits?
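
(To make the kind of change I mean concrete - the following is only a
sketch in Python with the python-oracledb driver against a hypothetical
STAGE_ROWS table, not the actual SAP/ABAP load routine - committing
once per batch instead of once per row turns millions of "log file
sync" waits into a few thousand:)

    import oracledb  # python-oracledb driver; connection is created by the caller

    BATCH_SIZE = 10_000  # rows per commit - tune against undo/redo capacity

    def load(rows, conn):
        # Insert into a hypothetical STAGE_ROWS table, committing per batch
        # rather than per row, so each batch incurs a single "log file sync".
        cur = conn.cursor()
        sql = "INSERT INTO stage_rows (id, payload) VALUES (:1, :2)"
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) >= BATCH_SIZE:
                cur.executemany(sql, batch)  # array insert: one round trip per batch
                conn.commit()                # one commit (one log sync) per batch
                batch.clear()
        if batch:                            # flush the final partial batch
            cur.executemany(sql, batch)
            conn.commit()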

btw - the DB server is a 12 CPU HP box running at 12% utilisation
during the load. And yes - we are also engaging SAP to see if they can
improve their load process.

Thanks,
Scott Hutchinson
Interact Analysis Ltd.

--
//www.freelists.org/webpage/oracle-l
