Question re System Statistics

  • From: "William Wagman" <wjwagman@xxxxxxxxxxx>
  • To: <oracle-l@xxxxxxxxxxxxx>
  • Date: Fri, 15 Jun 2007 11:23:01 -0700

Greetings,

I would be interested in folks' thoughts on how frequently system
statistics should be gathered and what criteria should be used to make
that determination. My sense is that this depends on the environment,
but I'm not sure to what extent. My expectation is that the system
statistics would remain fairly consistent over time, but I may be way
off base here. This is based on my assumption that the hardware, more
than the code, is the determining factor: once the optimizer knows how
the hardware is behaving, it will take that into consideration when
building execution plans (I'm not sure I'm stating this clearly). By
that reasoning, one really only needs to gather system statistics once,
initially, and then leave them alone.

But that leaves the question, and this is where I get a bit confused:
if I were to run a very well-tuned, efficient piece of code and gather
system statistics while it is running, and then run a horribly
inefficient piece of code and gather system statistics while it is
running, would I see great differences in the system statistics? (I
haven't tried to test this because I only write well-tuned, highly
efficient code <g>.) In other words, can really horrible code give a
skewed view of system statistics, and is it really necessary to
regather them on a regular basis? I have never been able to find an
explanation for this and am looking for some guidelines.
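
For what it's worth, here is a rough sketch of the experiment I have in
mind, using the standard DBMS_STATS calls (the stats table name
MY_SYS_STATS and the statid values are just names I made up). Gathering
into a user stats table leaves the published statistics in
sys.aux_stats$ alone, so the comparison shouldn't affect any plans:

-- Current published system statistics, for reference
SELECT pname, pval1
  FROM sys.aux_stats$
 WHERE sname = 'SYSSTATS_MAIN';

-- Workload mode: bracket the run of the "well tuned" job
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('START');
--   ... run the workload here ...
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('STOP');

-- Or gather over a fixed window (60 minutes here) into a user stats
-- table, once per workload, and compare the two sets afterwards
EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => USER, stattab => 'MY_SYS_STATS');
EXEC DBMS_STATS.GATHER_SYSTEM_STATS(gathering_mode => 'INTERVAL', -
       interval => 60, stattab => 'MY_SYS_STATS', statid => 'GOOD_CODE');
EXEC DBMS_STATS.GATHER_SYSTEM_STATS(gathering_mode => 'INTERVAL', -
       interval => 60, stattab => 'MY_SYS_STATS', statid => 'BAD_CODE');

The two captured sets could then be compared side by side, or one of
them published with DBMS_STATS.IMPORT_SYSTEM_STATS if it looks sensible.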

Thanks.

Bill Wagman
Univ. of California at Davis
IET Campus Data Center
wjwagman@xxxxxxxxxxx
(530) 754-6208
--
//www.freelists.org/webpage/oracle-l

