Re: Capacity Planning

  • From: MARK BRINSMEAD <mark.brinsmead@xxxxxxx>
  • To: exriscer@xxxxxxxxx
  • Date: Thu, 23 Feb 2006 11:26:02 -0700

I am also no expert on capacity planning, but I
do know that it is at once much more complex than
that, and yet (sometimes) surprisingly easy.

The first thing to realise is that growth rates are
almost *never* linear.  It's been my experience that
this is the number one error in capacity planning: the
foolish notion that if it takes X CPU cycles to
achieve Y units of work, it will take 10 * X CPU
cycles to achieve 10 * Y units of work.

Depending on the task you are trying to achieve, the
growth curves may be logarithmic, quadratic, 
exponential, or whatever.  Now, here's the trick:
if you know even just a *little* bit about the 
critical tasks being performed, you can make a very
good guess about which kind of growth curve to use.

Reporting tasks are commonly dominated by sorting,
so a curve of 'N * log(N)' (the Big-Oh complexity of
most sorting algorithms) might be a good guess.
(Here, N is really the number of records being 
sorted, but it's often equally valid to consider it
to be something like your daily transaction volume.)
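
Just to show how non-linear even that modest curve
is, here's a little back-of-the-envelope sketch in
Python -- my own made-up numbers, nothing measured
from a real system:

import math

# Toy illustration: how far off the "10x work => 10x CPU"
# assumption is when the work really grows as N * log(N).
def nlogn(n):
    return n * math.log(n)

baseline = 100_000   # e.g., rows sorted per day today
for factor in (2, 5, 10):
    ratio = nlogn(factor * baseline) / nlogn(baseline)
    print(f"{factor:>2}x the work -> about {ratio:.1f}x the CPU")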

Order entry tasks generally involve inserting 
rows into tables indexed with B-Tree indexes.
Each insert into a B-tree index costs roughly
log(N), so the Big-Oh complexity for inserting N
rows is about 'N * log(N)'; if you know nothing
else, this can be a very good first guess for
your growth curve.
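
If you want to see where the log(N) per insert
comes from, here is another rough sketch of mine;
the ~300 entries per index block is just a typical
ballpark I'm assuming, not a measured figure:

import math

# Rough sketch: the work for one insert into a B-tree index
# tracks the height of the tree (index blocks visited), which
# grows as log_F(N) for a fanout of roughly F entries per block.
def btree_height(n_rows, fanout=300):
    return max(1, math.ceil(math.log(n_rows, fanout)))

for n in (10_000, 1_000_000, 100_000_000):
    print(f"{n:>11,} rows -> ~{btree_height(n)} index blocks per insert")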

The more you know about the implementation of the
application, the more reliably you can model the
growth curves.

Once you have chosen the best curves you can, all
you need is two good samples, and you can project
growth with surprising accuracy...
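
For what it's worth, the projection step might look
roughly like the sketch below (my own illustration,
not the actual plan I did back then; the sample
figures are invented).  The only real idea is to
solve for the single constant k in
'CPU = k * N * log(N)' from your samples and then
extrapolate along the curve:

import math

# Model a sort-dominated workload as  cpu = k * n * log(n),
# where n is the daily transaction volume.  Two measured
# samples pin down the constant k (least squares here).
def nlogn(n):
    return n * math.log(n)

def fit_constant(samples, curve=nlogn):
    num = sum(curve(n) * cpu for n, cpu in samples)
    den = sum(curve(n) ** 2 for n, cpu in samples)
    return num / den

# (daily transactions, CPU seconds per day) -- invented figures
samples = [(50_000, 1_200.0), (80_000, 2_050.0)]
k = fit_constant(samples)

# Project the CPU needed at the volumes forecast for later months.
for n in (150_000, 300_000, 600_000):
    print(f"{n:>8,} txns/day -> ~{k * nlogn(n):,.0f} CPU seconds/day")

With only two samples this is obviously only as
good as your choice of curve, which is exactly why
knowing the dominant task matters so much.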

I have only had to do this once -- a *very* long
time ago -- but at the time I used this to plan
capacity for a Fortune-500 client implementing a
new mail-order line of business.  Based on only
moderate knowledge of the implementation of the
application (e.g., this task is dominated by 
sorting; that task is dominated by inserting into
indexed tables) and given accurate measurements
from (about) their 3rd and 6th months of operation,
I was able to project CPU capacity for 6, 12, 24,
and 60 months.

I was told after the fact that these had proven to
be the most accurate capacity plans ever done in
that organisation (they had a whole team of 
"professional" capacity planners) and that all of
my predictions up to 24 months had hit within
+/- 10% of observed actuals.  Of course, I might
have just been lucky...  ;-)

Anyway, I have found this technique to be useful
in the past, although I've never had cause to
revisit it.  If you're not already familiar with
'Big-Oh' growth curves (or algorithm analysis in
general) there are numerous books on the subject.
As far as I know, the definitive ones are by the
authors Aho, Hopcroft and Ullman [who often
work(ed) in collaboration]; chances are, you
can still find their texts in most University
book stores.


----- Original Message -----
From: LiShan Cheng <exriscer@xxxxxxxxx>
Date: Thursday, February 23, 2006 4:26 am
Subject: Re: Capacity Planning

> Hi
> 
> Thanks for the reply.
> 
> I am quite lost in this area, basically because I am quite
> reluctant about capacity planning: I have seen quite a few plans
> and I have seen zero success. Many times I would really like to
> ask the guy who did the plan how he did it without knowing so
> many factors.
> 
> In my last project, for example, an HP guy estimated 12 CPUs for
> a server; in practice we almost had to double that, otherwise the
> server would have been CPU bound.
> 
> I understand it is necessary but... whose job should it be? DBA?
> Hardware Vendor? Software Maker? System Admin? A mix of all is
> probably the answer, but then what information does a DBA need to
> perform such a job?
> 
> Shall I tkprof 10 sessions, get the CPU time, and multiply by the
> real number of users I will have? :-)
> 
> Cheers
> 
> --
> LSC
> 
> 
> 
> On 2/23/06, Jurijs Velikanovs <j.velikanovs@xxxxxxxxx> wrote:
> >
> > I'm interested in that question as well. I believe almost all
> > DBAs have had, have, or will have to answer it.
> > The most difficult for me was CPU power, I/O throughput and
> > memory planning for home-made (custom development) applications.
> > .
> > In my experience you can spend a tremendous amount of time (not
> > just your own time) creating a capacity plan for a particular
> > system during application development: working with analysts,
> > designers and developers, documenting detailed calculations and
> > all the assumptions. But at the end of the day you will get a
> > +/- 50% accurate result ;).
> > If in the end the application does not perform well enough, the
> > Developers will blame the HW and say something like "You need
> > double the HW resources to get XX sec response time". DBAs and
> > Admins would say "We already have something like top HW". It is
> > always difficult to say whether the developers have done their
> > work well without knowing the application's business.
> > .
> > At the moment I think the most effective way to plan that type
> > of capacity is to make assumptions based on your own or other
> > project members' previous experience.
> > - If in the past you have worked with the kind of system you are
> > currently developing, you already know the most important parts
> > of the application to pay attention to. That is where you have
> > to concentrate your attention. In most cases it is something
> > like 2% of the overall application code. Describe, prototype and
> > play with that bit, and at the end of the day you will get +/-
> > 10% accurate planning.
> > - If in the past you have worked with much bigger systems, with
> > huge amounts of data processing, you can reason like this: this
> > system is 5 times smaller than the system I have worked with, so
> > presumably it will run successfully on 2-3 times less HW and we
> > will spend 2 times less time tuning it.
> > - If you have a chance to get information about a system like
> > the one you are going to implement (OEBS, SAP, or another
> > pre-developed application, or the same functional application,
> > or the old system you are going to replace), you can base your
> > assumptions on that information.
> > .
> > Later on, when the Developers say you haven't got enough HW
> > capacity, you will be able to base your conclusions on a
> > comparison of the developed application and other systems.
> > .
> > I don't think this approach is ideal, but at the moment I
> > haven't found a better one.
> > I would be glad to hear others' opinions.
> >
> > Thank you in advance,
> > Jurijs
> >
> >
> > On 2/23/06, LiShan Cheng <exriscer@xxxxxxxxx> wrote:
> > > Hi
> > >
> > > I was wondering if anyone knows any source about effective
> > > Oracle RDBMS Capacity Planning. I say effective because in the
> > > past I have seen many so-called "Capacity Plans" which failed
> > > miserably in practice. I am not sure how a DBA can perform
> > > Capacity Planning without knowing much about the application?
> > >
> > > Cheers
> > >
> > >
> > > LSC
> > >
> > >
> >
> >
> > --
> > Jurijs
> > +44 7738 013090 (GMT)
> > ============================================
> > http://otn.oracle.com/ocm/jvelikanovs.html
> >
> 

--
//www.freelists.org/webpage/oracle-l

