Hi Ryan,

Just test at scale and measure it. Of course, CPU usage is affected by many things, so it will not be easy to express the results in a way that is meaningful to management. Nevertheless, in cases like this, I think that showing a non-trivial difference is enough.

Regards,
Steve Adams
http://www.ixora.com.au/ - For DBAs
http://www.christianity.net.au/ - For all

-----Original Message-----
From: Ryan [mailto:ryan_gaffuri@xxxxxxxxxxx]
Sent: Wednesday, 2 November 2005 3:17 PM
To: steve.adams@xxxxxxxxxxxx
Cc: 'Oracle-L Freelists'
Subject: Re: normalization

Steve,

Do you know a way to take the data you presented and run a test showing how much additional CPU this will require under specific conditions? If the trigger fires X times in a given period, how much extra CPU will I need? How much will this reduce response time and/or scalability?

If I were to present this to a manager, I would typically need to show something more tangible. Just because certain data elements that I gather from the data dictionary increase does not mean it will cause a problem for us. So I would need to show realistic estimates.

Any ideas on how to do this type of test?

--
//www.freelists.org/webpage/oracle-l
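[Editor's note: the "test at scale and measure the difference" approach discussed above can be sketched in a few lines. This is purely an illustrative Python harness, not Oracle-specific; `base_insert` and `insert_with_trigger` are hypothetical stand-ins for the real statement with and without the trigger, and the synthetic loop bodies merely simulate work.]

```python
import time

def measure_cpu(workload, iterations):
    """Return CPU seconds consumed by running workload() `iterations` times."""
    start = time.process_time()
    for _ in range(iterations):
        workload()
    return time.process_time() - start

# Hypothetical stand-ins: an operation without the trigger's extra work,
# and the same operation plus simulated trigger overhead.
def base_insert():
    sum(i * i for i in range(200))

def insert_with_trigger():
    base_insert()
    sum(i * i for i in range(200))  # simulated trigger overhead

iterations = 50_000
base = measure_cpu(base_insert, iterations)
with_trigger = measure_cpu(insert_with_trigger, iterations)
overhead_us = (with_trigger - base) / iterations * 1e6
print(f"baseline: {base:.2f}s CPU, with trigger: {with_trigger:.2f}s CPU, "
      f"overhead per firing: {overhead_us:.1f} us")
```

Scaling the per-firing overhead by the expected number of firings per period gives the kind of tangible CPU estimate Ryan asked for, while a visibly non-trivial gap between the two totals is the "difference" Steve suggests showing to management.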