Re: Question re statistics

  • From: "Finn Jorgensen" <finn.oracledba@xxxxxxxxx>
  • To: wjwagman@xxxxxxxxxxx
  • Date: Thu, 29 Nov 2007 22:35:13 -0500

Could you hint the SQL statements to force them into known/proven execution
paths? Or use stored outlines for the same purpose?
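
Something along these lines, as a rough sketch (the index name, bind
variable, and outline category are placeholders, not taken from your
schema):

-- Hinted statement forcing a known access path (index name is hypothetical):
SELECT /*+ INDEX(r requests_pk) */ *
  FROM org.requests r
 WHERE requestoid = :oid;

-- Or capture the good plans as stored outlines while the tuned statements
-- are run, then have sessions use that outline category instead of
-- re-optimizing against possibly stale statistics:
ALTER SESSION SET create_stored_outlines = app_outlines;
-- ... execute the statements whose plans you want to freeze ...
ALTER SESSION SET create_stored_outlines = false;
ALTER SESSION SET use_stored_outlines = app_outlines;

An outline is only used when the application's SQL text matches the captured
statement exactly, so this works best when the statements are fixed in the
application code.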

Finn

On Nov 29, 2007 8:11 PM, William Wagman <wjwagman@xxxxxxxxxxx> wrote:

> Greetings,
>
> I have come up against this situation before and have never been quite
> sure how to handle it. We are running 10gR2 on RHEL 4 EE. In a
> home-grown application there is a series of tables in which records are
> constantly being inserted or deleted, so the number of records in each
> table is always changing. These are not large tables; the row count at
> any one time varies between 0 and 5000. What they are seeing is that as
> updates to a table proceed, performance degrades, allegedly because the
> statistics become stale and unhelpful. So what they have done is create
> a job which runs every few minutes and analyzes these tables so the
> statistics stay fresh and performance stays up to their standards. Here
> is a description of one of the tables in question -
>
> SQL> desc org.requests
>  Name                                      Null?    Type
>  ----------------------------------------- -------- ----------------------------
>  REQUESTOID                                NOT NULL RAW(16)
>  REQUESTGROUPOID                           NOT NULL RAW(16)
>  REQUEST_TYPE                              NOT NULL CHAR(1)
>  REQUEST_PERSONOID                         NOT NULL RAW(16)
>  OBJECT_TABLE_CODE                                  CHAR(2)
>  ORIGINAL_OID                                       RAW(16)
>  PENDING_OID                                        RAW(16)
>  REQUEST_ACTION                                     CHAR(1)
>  REQUEST_DATE                                       DATE
>
> I haven't yet traced a session. I've looked at AWR reports but am not
> seeing anything helpful there yet.
>
> So, my question, and I realize it is an open-ended one: can someone
> offer suggestions for resolving this issue and things to look at to
> track down the problem? It just strikes me as very inefficient, silly
> almost, to analyze these tables every 5 minutes, but I don't know how
> to approach a solution.
>
> Thanks.
>
> Bill Wagman
> Univ. of California at Davis
> IET Campus Data Center
> wjwagman@xxxxxxxxxxx
> (530) 754-6208
> --
> //www.freelists.org/webpage/oracle-l
>
>
>
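
For reference, the kind of every-few-minutes job described above might look
roughly like this (the job name, interval, and gather options here are
assumptions, not Bill's actual job):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'GATHER_REQUESTS_STATS',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN
                          DBMS_STATS.GATHER_TABLE_STATS(
                            ownname => ''ORG'',
                            tabname => ''REQUESTS'',
                            cascade => TRUE);
                        END;',
    repeat_interval => 'FREQ=MINUTELY; INTERVAL=5',  -- every 5 minutes
    enabled         => TRUE);
END;
/

If the plans are made stable with hints or stored outlines as suggested
above, a job like this may not be needed at all.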
