Question about CPU usage

I'm not a hardware guy or sysadmin person, so forgive me if this is a stupid 
question. Leaving out all other variables (such as I/O), should I expect 
database performance to be the same whether the server it runs on is at 90% 
CPU usage or at 10%, since there would still be spare cycles either way? Or 
are there diminishing returns as you get closer to the maximum available CPU?
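
To make the question concrete, here is a rough back-of-the-envelope sketch 
(assuming a simple M/M/1 queueing model, which may or may not fit a real 
database server) of how response time could stretch as CPU utilization 
climbs, even while spare cycles remain:

    # Rough illustration using the M/M/1 queueing formula (an assumption,
    # not a claim about any particular database): with average service
    # time S and CPU utilization U, expected response time is R = S / (1 - U).

    service_time_ms = 10.0  # hypothetical average CPU time per request

    for utilization in (0.10, 0.50, 0.90, 0.95):
        response_ms = service_time_ms / (1.0 - utilization)
        print(f"{utilization:>4.0%} busy -> ~{response_ms:6.1f} ms response")

Under that model, work completes in about 11 ms at 10% busy but takes roughly 
100 ms at 90% busy, since requests increasingly queue for the CPU rather than 
run immediately.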