RE: When stats trash your performance

  • From: "John Kanagaraj" <john.kanagaraj@xxxxxxx>
  • To: <stephenbooth.uk@xxxxxxxxx>, <oracle-l@xxxxxxxxxxxxx>
  • Date: Wed, 7 Dec 2005 11:12:57 -0800

Stephen,
 
What is the database version? Any clues as to *what* programs/functions
are slow, and by how much? How were the stats collected? Do you have
baseline figures (from STATSPACK before and after) that will give us a
clue?
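 
(A quick sanity check, assuming you have DBA access and with 'SCOTT'
standing in for the real application schema, is LAST_ANALYZED in the
dictionary:

    -- Show whether and when stats were last gathered on each table
    SELECT table_name, num_rows, last_analyzed
      FROM dba_tables
     WHERE owner = 'SCOTT'
     ORDER BY last_analyzed NULLS FIRST;

Anything with a NULL last_analyzed has never had stats gathered.)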
 
Cheers,
John Kanagaraj

________________________________

From: oracle-l-bounce@xxxxxxxxxxxxx
[mailto:oracle-l-bounce@xxxxxxxxxxxxx] On Behalf Of stephen booth
Sent: Wednesday, December 07, 2005 10:59 AM
To: oracle-l@xxxxxxxxxxxxx
Subject: When stats trash your performance


I've heard mention in a number of talks that you shouldn't gather stats
very often, perhaps not at all once a system is bedded in, as doing so
can cause your database to start performing really badly.  I don't
recall anyone ever saying, or ever reading, an explanation as to why
this might happen.  Perhaps I'm looking at it from the wrong
perspective, but it seems axiomatic that the fresher your stats, the
better the decisions the CBO is likely to make.

I've Googled and can't really find anything that seems relevant, other
than some references to gathering stats invalidating existing plans and
so forcing Oracle to do a hard parse.
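
(As an aside, and assuming a 9i-or-later database, DBMS_STATS does let
you defer that invalidation via its no_invalidate parameter; a rough
sketch, with SCOTT/EMP as placeholder names:

    -- Gather table and index stats without immediately invalidating
    -- the cursors that depend on them
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname       => 'SCOTT',
        tabname       => 'EMP',
        cascade       => TRUE,   -- include the table's indexes
        no_invalidate => TRUE);  -- leave existing plans in place
    END;
    /

Existing cursors then keep their old plans until they age out of the
shared pool.)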

Can anyone point me to any documentation that might explain why this
might happen?  It's not something I've really had to deal with in the
past (getting the system running and occasionally trying to tune the
server to reduce/eliminate waits before it goes live has been most of
what I've been involved in; we don't write code in house, so SQL tuning
hasn't really featured).  Something with pointers as to how to resolve
such problems would be very useful as well.

What has triggered this question is that I've just been hit by a
situation where one of our systems was running OK (performance wasn't
great, but it was acceptable and wasn't noticeably worse than when the
system was put in).  Then one of the sysadmins (not a DBA) was told by
our FM supplier to analyze half a dozen tables and associated indexes
that were used by a new module that's due to go live soon and had been
loaded with some historical information that was required.  He did so
and, so he claims, the system started running noticeably slower shortly
after.  He then decided to analyze all the rest of the tables and their
indexes.  Performance got worse.  Four days later they notified me (for
political reasons I have no involvement with this system, or any other
system that has gone live) and basically said "it's running slow, fix
it".  I don't know if there were any existing stats which got
overwritten (I've asked, and am still waiting for a response); if there
weren't, I'm considering just deleting all the stats and seeing if that
helps.
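
If I go that route I'd want to save the current stats first so they can
be put back; something along these lines, where SCOTT and STATS_BACKUP
are placeholders for the real schema and backup stat table:

    -- Back up the schema's stats to a stat table, then delete them
    BEGIN
      DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT',
                                   stattab => 'STATS_BACKUP');
      DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'SCOTT',
                                     stattab => 'STATS_BACKUP');
      DBMS_STATS.DELETE_SCHEMA_STATS(ownname => 'SCOTT');
    END;
    /
    -- If things get worse, DBMS_STATS.IMPORT_SCHEMA_STATS from
    -- STATS_BACKUP restores the old stats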

This system is shut down every night for backup, so the plans get wiped
out anyway.

Stephen

-- 
It's better to ask a silly question than to make a silly assumption.

