Re: Star Transformation - 10g

  • From: "Thomas Day" <tomdaytwo@xxxxxxxxx>
  • To: oracle-l@xxxxxxxxxxxxx
  • Date: Wed, 5 Dec 2007 10:13:34 -0500

This is what we're doing -

star_transformation_enabled needs to be set to TRUE in the initialization
parameters.
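For instance, it can be enabled system-wide with ALTER SYSTEM (a sketch; SCOPE=BOTH assumes an spfile is in use):

```sql
-- Enable star transformation for all sessions (requires an spfile for SCOPE=BOTH)
ALTER SYSTEM SET star_transformation_enabled = TRUE SCOPE = BOTH;
```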


hash_join_enabled also needs to be TRUE.  Its default is FALSE.



_always_star_transformation  cannot be set in the initialization parameters.
It needs to be TRUE.  Its default is FALSE.  The command 'ALTER SESSION SET
"_always_star_transformation" = TRUE;' should be placed in a global logon
trigger.
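A global logon trigger along these lines would do it (the trigger name is illustrative; the owning schema needs ADMINISTER DATABASE TRIGGER):

```sql
-- Set the hidden parameter for every new session via a database-level logon trigger
CREATE OR REPLACE TRIGGER set_always_star_xform
AFTER LOGON ON DATABASE
BEGIN
  EXECUTE IMMEDIATE
    'ALTER SESSION SET "_always_star_transformation" = TRUE';
END;
/
```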



sort_area_size should equal sort_area_retained_size and both should be set
to the maximum amount of memory in bytes that a session can allocate.



bitmap_merge_area_size should also be set to the maximum amount of memory in
bytes that a session can allocate.  For data warehouses, 100MB to 250MB is
reasonable if enough memory is available.



create_bitmap_area_size should be set the same way.  If memory is limited it
can be set to 50% of bitmap_merge_area_size.
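Taken together, the memory settings above might look like this in the init.ora/pfile (the values are illustrative, not recommendations for any particular system):

```sql
-- init.ora fragment: manual work-area sizing (values illustrative)
sort_area_size          = 104857600   -- 100MB
sort_area_retained_size = 104857600   -- equal to sort_area_size
bitmap_merge_area_size  = 104857600   -- 100MB; 100MB-250MB typical for DW
create_bitmap_area_size = 52428800    -- 50MB; 50% of bitmap_merge_area_size
```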



pga_aggregate_target should NOT be used.



Primary keys and foreign keys are enforced with b-tree indexes.  They should
be set to RELY.  Oracle does a b-tree to bitmap transform on the fly.
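Marking an existing constraint RELY is a one-line ALTER (table and constraint names here are made up for illustration):

```sql
-- Tell the optimizer to trust the FK without re-validating existing rows
ALTER TABLE sales_fact
  MODIFY CONSTRAINT sales_fact_cust_fk RELY;
```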



Primary keys of fact tables consist of all the primary keys of the
participating dimensions.  Each of those should be foreign keyed to its
dimension.  I like to make fact tables IOTs but that's just an acquired
taste and has nothing to do with your question.
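A minimal fact-table sketch following that layout (all table and column names are hypothetical):

```sql
-- Fact table keyed on the dimension PKs, each foreign-keyed to its dimension,
-- built as an IOT per the preference above
CREATE TABLE sales_fact (
  time_id  NUMBER NOT NULL REFERENCES time_dim (time_id),
  prod_id  NUMBER NOT NULL REFERENCES product_dim (prod_id),
  cust_id  NUMBER NOT NULL REFERENCES customer_dim (cust_id),
  amount   NUMBER,
  CONSTRAINT sales_fact_pk PRIMARY KEY (time_id, prod_id, cust_id)
) ORGANIZATION INDEX;
```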



Any column that participates in a WHERE clause (except for already b-tree
indexed columns) should have a bitmap index.  Ignore guidelines on
cardinality.
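For example (again with made-up names):

```sql
-- Bitmap index on a fact-table column that appears in WHERE clauses
CREATE BITMAP INDEX sales_fact_cust_bx ON sales_fact (cust_id);
```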
