Re: Import speed falls off at 2 million rows

  • From: Riyaj Shamsudeen <riyaj.shamsudeen@xxxxxxxxx>
  • To: david.cheyne@xxxxxxxxx
  • Date: Thu, 12 Mar 2009 12:07:36 -0500

David
  Can you please share Statspack data for this time frame? Or can you
enable SQL trace on the import process to see why it is slowing down?
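If it helps, something along these lines (just a sketch; the v$session
filter and the sid/serial# are placeholders you would need to adjust):

    -- find the import session
    select sid, serial#, program
      from v$session
     where upper(program) like '%IMP%';

    -- switch SQL trace on for that session, let it run a while, then off
    exec dbms_system.set_sql_trace_in_session(<sid>, <serial#>, true);
    exec dbms_system.set_sql_trace_in_session(<sid>, <serial#>, false);

and then run tkprof on the trace file in udump to see where the time goes.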

  BTW, what is the underlying column type for the XML data?
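Something like this would tell us (a rough sketch; owner and table name
are placeholders):

    select column_name, data_type
      from dba_tab_columns
     where owner = '<OWNER>' and table_name = '<AUDIT_TABLE>';

    -- if the XML columns are LOB-backed, the LOB storage settings matter too
    select column_name, segment_name, chunk, cache, in_row
      from dba_lobs
     where owner = '<OWNER>' and table_name = '<AUDIT_TABLE>';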

-- 
Cheers

Riyaj Shamsudeen
Principal DBA,
Ora!nternals -  http://www.orainternals.com
Specialists in Performance, Recovery and EBS11i
Blog: http://orainternals.wordpress.com

Old message:

On Thu, Mar 12, 2009 at 12:00 PM, David Cheyne <david.cheyne@xxxxxxxxx> wrote:
Hi,

I've really cut the import parameters to the bone: indexes=n triggers=n
constraints=n buffer=20971520 commit=y

. . . and just for the hell of it I switched on feedback=20000 the last time
around.
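
So the command line looks roughly like this (connect details and file names
changed to placeholders):

  imp <user>/<password> file=audit_table.dmp log=audit_imp.log \
      tables=<AUDIT_TABLE> indexes=n triggers=n constraints=n \
      buffer=20971520 commit=y feedback=20000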

As no indexes are being built, will an increase in sort area size change
anything?

Redo logs are now 1 GB each and there are 5 groups. All data is on NetApp.
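
In case it points at anything, I can run a couple of quick checks while the
import is crawling (the sid below is a placeholder for the import session):

  -- log switch rate, to see whether 1 GB x 5 groups is keeping up
  select to_char(first_time, 'YYYY-MM-DD HH24') hour, count(*) switches
    from v$log_history
   group by to_char(first_time, 'YYYY-MM-DD HH24')
   order by 1;

  -- what the import session is actually waiting on
  select event, total_waits, time_waited
    from v$session_event
   where sid = <sid>
   order by time_waited desc;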

This is a stand-alone audit table being imported into an instance that was
built from standard scripts and will house only this table.

Any input would be appreciated.

Thanks in advance!
David Cheyne
B.A.(hons)
Oracle DBA
Odd spacing and typos courtesy of my iPhone

On 12 Mar 2009, at 15:14, Vamshi D wrote:


>
>
>>
>> Did you try the options below?
>>
>> The FILESIZE option during the export,
>>
>> and during the import the COMMIT and BUFFER options.
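>>
>> Roughly like this (file names and sizes are just placeholders; the
>> parentheses are escaped for the shell):
>>
>>   exp <user>/<password> tables=<AUDIT_TABLE> filesize=2048M \
>>       file=\(exp01.dmp,exp02.dmp,exp03.dmp,exp04.dmp\)
>>
>>   imp <user>/<password> tables=<AUDIT_TABLE> commit=y buffer=20971520 \
>>       file=\(exp01.dmp,exp02.dmp,exp03.dmp,exp04.dmp\)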
>>
>>
>> What is the exact error you are seeing?
>>
>> thanks,
>> Vamshi .D
>>
>> --------------------------------------------------
>> From: "David Cheyne" <david.cheyne@xxxxxxxxx>
>> Sent: Thursday, March 12, 2009 10:47 AM
>> To: "oracle-l" <oracle-l@xxxxxxxxxxxxx>
>> Subject: Import speed falls off at 2 million rows
>>
>>> Hi list!
>>> I'm trying to import a table with about 8 million rows into Oracle
>>> 9.2.0.6 EE on Solaris 10.
>>> The import starts as you would expect but seems to hit a wall at about
>>> 2 million rows, and the rows trickle in after that point. I've tried
>>> exporting the data again, expanding the datafiles beforehand, and
>>> increasing the redo log size. The only unusual thing about this table
>>> is that it contains 2 columns of an XML type which will be populated.
>>> Archiving is switched off and no users or developers are logged in.
>>> Any ideas?
>>> David Cheyne
>>> B.A.(hons)
>>> Oracle DBA
>>> Odd spacing and typos courtesy of my iPhone
>>> --
>>> //www.freelists.org/webpage/oracle-l
>>>
>> --
> //www.freelists.org/webpage/oracle-l
>
>
>
