Hi folks,

A new project has come our way that will require a decent amount of SQL testing and tuning, so we thought we would ask the community for suggestions on the best approaches/solutions.

Among other things, it's looking like we will have to send result sets of several thousand records to user browsers every 30 seconds. Those few thousand records will be collected from multiple tables via UNIONs and JOINs. We are considering several approaches:

- Straight SELECT statements
- Stored procedures that return result sets
- PIPELINED functions
- Bulk operations
- A regular view (or views)
- Materialized views
- Possibly compressing the result set for network transmission (if this isn't already done automatically)
- Other ideas...

Further background: it's an 11.2.0.3 database with over 400 million records spread across several schemas, and it receives 3.5 million updates per day. Most records contain alarm data. The primary goal will be to retrieve uncleared alarms, plus alarms that were updated in the last 5 minutes. Right now there are 41,000 uncleared alarms, which are part of the 240,000 alarms that were updated in the last five minutes. We would be returning subsets of this data.

The developers (and we) are looking for a straightforward way to retrieve everything in one shot - though on our side that can mean morphing records from one table to look like those in another.

Thanks,
Lou Avrami
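P.S. For concreteness, here is roughly the shape of query we have in mind. All table and column names below are made up for illustration - the real schemas differ - but it shows the "uncleared plus recently updated" filter and the kind of column aliasing we'd use to morph one table's records to look like another's:

```sql
-- Illustrative sketch only; alarms_a / alarms_b and their columns are
-- hypothetical stand-ins for our real tables.
SELECT alarm_id,
       severity,
       alarm_text,
       last_update
  FROM alarms_a
 WHERE cleared = 'N'
    OR last_update >= SYSDATE - INTERVAL '5' MINUTE
UNION ALL
-- Second table has different column names; alias them to match the first.
SELECT alarm_id,
       severity,
       message  AS alarm_text,
       mod_time AS last_update
  FROM alarms_b
 WHERE cleared = 'N'
    OR mod_time >= SYSDATE - INTERVAL '5' MINUTE;
```

Whether something like this runs as a straight SELECT every 30 seconds, or sits behind a view, stored procedure, or materialized view, is exactly what we're trying to decide.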