RE: "Performance Counters"

  • From: "Cary Millsap" <cary.millsap@xxxxxxxxxx>
  • To: <jeffthomas24@xxxxxxxxx>
  • Date: Sat, 28 Apr 2007 18:30:13 -0500

Jeff,

 

Ted Coyle's answer to you is the light. One metric: response time.
Everything else is unreliable, meaning it's at best a waste of time.
Every other metric can motivate effort that actually runs counter to
your real goals.
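To make the point concrete: the one number that maps directly to what the user experiences is how long the user's task takes, end to end. A minimal sketch in Python (the workload function here is a hypothetical stand-in for the real business task, not anything from Oracle or SQL Server):

```python
import time

def run_task():
    # Hypothetical stand-in for the real user action
    # (e.g., submitting a report). In practice you time
    # the actual business task end to end.
    return sum(i * i for i in range(100_000))

start = time.perf_counter()
result = run_task()
elapsed = time.perf_counter() - start

# The metric that matters: elapsed response time for the task.
print(f"response time: {elapsed:.4f} s")
```

However the number is collected (stopwatch, application log, SQL trace), the principle is the same: measure the duration of the task the user cares about, not a system counter that may or may not correlate with it.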

Cary Millsap

Hotsos Enterprises, Ltd.

http://www.hotsos.com

Nullius in verba

 

Hotsos Symposium 2008 / March 2-6 / Dallas

Visit www.hotsos.com for curriculum and schedule details...

________________________________

From: oracle-l-bounce@xxxxxxxxxxxxx
[mailto:oracle-l-bounce@xxxxxxxxxxxxx] On Behalf Of Jeffery Thomas
Sent: Friday, April 27, 2007 10:02 AM
To: Oracle-L Freelists
Subject: "Performance Counters"

 

We have a SQL Server team that is going to manage a new vendor-built
application on 10g RAC (Solaris). They have zero experience with
Oracle. They are asking the Oracle team for 'performance counters'
similar to what they use with SQL Server, so that they can identify
"satisfactory performance criteria" to be used as non-functional
requirements for the vendor building the application.

The ones identified for SQL Server were:

- SQL Server: SQL Statistics: SQL Compilations/Sec must not exceed
  100 per second over a 2-minute period.
- Average Disk Queue Length, averaged over a 2-minute period, must not
  exceed 2.5.
- SQL Server: Buffer Cache Hit Ratio should exceed 90%, measured over a
  2-minute period.
- CPU usage must not exceed 35%, averaged over a 2-minute period.
- SQLServer:LockWaitTime must not exceed 2 ms, measured over a 2-minute
  period.
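Mechanically, every rule above is the same thing: average a sampled counter over a window and compare it to a threshold. A sketch of that check (the sample values and the helper are hypothetical, for illustration only):

```python
from statistics import mean

# Hypothetical CPU-busy samples (percent), collected once per
# second over a 2-minute window; four shown for brevity.
cpu_samples = [30.0, 40.0, 35.0, 28.0]

def within_threshold(samples, limit):
    """True if the window average is at or below the limit."""
    return mean(samples) <= limit

# Example rule: CPU usage must not exceed 35% averaged over the window.
ok = within_threshold(cpu_samples, 35.0)
print("CPU rule satisfied:", ok)  # prints: CPU rule satisfied: True
```

Note that a check like this says nothing about whether any user's task completed in an acceptable time; it only says the counter stayed inside an arbitrary band.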

If you've got a vendor building an application for you -- what kind of
benchmarks -- if any -- actually make sense with 
respect to performance criteria that the vendor must meet?

Thanks,
Jeff
