adaptive threshold significance level

  • From: Ls Cheng <exriscer@xxxxxxxxx>
  • To: Oracle Mailinglist <oracle-l@xxxxxxxxxxxxx>
  • Date: Tue, 28 Oct 2014 15:28:55 +0100


I am trying to use adaptive thresholds and set the threshold based on a
significance level, but as I read up on what this does, the definition of
significance level is a bit hard to understand.

I am reading the performance guide from the 12c database documentation.
When it talks about the significance level it states this:

*Significance level thresholds are most useful for metrics that exhibit
statistically stable behavior when the system is operating normally, but
might vary over a wide range when the system is performing poorly. For
example, the response time per transaction metric should be stable for a
well-tuned OLTP system, but may fluctuate widely when performance issues
arise. Significance level thresholds are meant to generate alerts when
conditions produce both unusual metric values and unusual system
performance. Significance level thresholds can be set to one of the
following levels:*

  *High (.95): Only 5 in 100 observations are expected to exceed this value.*
  *Very High (.99): Only 1 in 100 observations are expected to exceed this value.*
  *Severe (.999): Only 1 in 1,000 observations are expected to exceed this value.*
  *Extreme (.9999): Only 1 in 10,000 observations are expected to exceed this value.*
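To make sure I understand the percentile arithmetic in that table, here is a
small sketch in plain Python. It is only my reading of the documentation, not
Oracle's published algorithm: it treats each significance level as a simple
percentile of a hypothetical metric history (simulated "response time per
transaction" samples) and checks what fraction of observations exceed the
resulting threshold.

```python
# Illustration only (not Oracle's internal algorithm): a significance
# level of .95 corresponds to the 95th percentile of the metric's
# history, so about 5 in 100 normal observations exceed it.
import random

random.seed(42)
# Hypothetical history of "response time per transaction" samples (ms).
history = [random.gauss(20, 4) for _ in range(10_000)]

def threshold(observations, significance):
    """Value exceeded by roughly (1 - significance) of the observations."""
    ordered = sorted(observations)
    idx = int(significance * len(ordered))
    return ordered[min(idx, len(ordered) - 1)]

for level, name in [(0.95, "High"), (0.99, "Very High"),
                    (0.999, "Severe"), (0.9999, "Extreme")]:
    t = threshold(history, level)
    exceeding = sum(x > t for x in history) / len(history)
    print(f"{name:9s} ({level}): threshold = {t:.1f} ms, "
          f"fraction exceeding = {exceeding:.4f}")
```

Running this, the fraction exceeding the .95 threshold comes out near 0.05
and the fraction exceeding the .99 threshold near 0.01, which matches the
"5 in 100" / "1 in 100" wording in the guide.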

In OEM, by default, when adaptive thresholds are configured it sets the
0.99 percentile for the critical level and 0.95 for warning. By reading the
above statement it seems to me that it is saying a warning alert will be
raised when 5% of observations are above the threshold, and a critical
alert when only 1% are. Should it not be the opposite? When a higher number
of observations exceed the threshold, should not a critical alert be raised?
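To put my confusion in numbers, here is a trivial sketch (again assuming the
thresholds are plain percentiles of the metric history, which is just my
reading of the doc) of how many observations would be expected above each
threshold under normal behavior:

```python
# Hypothetical: 10,000 metric observations under normal load.
# The .95 (warning) threshold is expected to be exceeded by ~500 of
# them, the .99 (critical) threshold by only ~100 -- the higher
# significance level corresponds to the rarer, more extreme values.
observations = 10_000

for significance, severity in [(0.95, "warning"), (0.99, "critical")]:
    expected_exceedances = round(observations * (1 - significance))
    print(f"{severity}: ~{expected_exceedances} of {observations} "
          f"observations expected above the {significance} threshold")
```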

