Hello all,

My question relates to measuring output disable time (active-to-float time, Toff) for a PCI/PCI-X output buffer. The PCI-X rev 1.0 spec states that for active/float timing measurements, "the 'off' state is defined to be when the total current delivered through the component pin is less than or equal to the leakage current specification." The DC spec gives a max input leakage current of +/-10 uA, and the AC spec gives a max Toff of 7 ns.

My questions are:

1. Why is the tristate delay measured against the input leakage current?
2. Has anyone measured tristate delays for a PCI output buffer in simulation, and if so, how?
3. What characterization load, if any, should I use?

Any enlightenment on this issue will be appreciated.

Regards,
ADEEL AHMAD
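[Editor's note on question 2: one possible way to extract Toff from simulation output, sketched below in Python. It simply applies the criterion quoted above: find the first instant after the output enable deasserts at which the magnitude of the pin current stays at or below the 10 uA leakage limit. The waveform, the OE deassertion time, and all variable names are hypothetical stand-ins for data that would come from the circuit simulator, not anything specified by the PCI-X spec beyond the 10 uA / 7 ns numbers.]

import numpy as np

# Hypothetical sketch: extract Toff from a simulated pin-current waveform.
# PCI-X criterion: the buffer is "off" once the total current through the
# pin is at or below the leakage spec (+/-10 uA).

I_LEAK = 10e-6        # A, PCI-X max leakage current spec
T_OE_OFF = 2e-9       # s, instant OE is deasserted (assumed testbench value)

# --- synthetic waveform standing in for simulator output ----------------
time = np.linspace(0, 20e-9, 2001)                          # 0..20 ns
i_pin = np.where(time < T_OE_OFF, 5e-3,                     # 5 mA while driving
                 5e-3 * np.exp(-(time - T_OE_OFF) / 1e-9))  # decay after OE off

# --- Toff extraction -----------------------------------------------------
after = time >= T_OE_OFF
below = np.abs(i_pin[after]) <= I_LEAK
# First index after which |I_pin| stays within the leakage limit.
stays = np.flip(np.cumprod(np.flip(below)).astype(bool))
if stays.any():
    t_cross = time[after][np.argmax(stays)]
    print(f"Toff = {(t_cross - T_OE_OFF) * 1e9:.2f} ns (PCI-X max is 7 ns)")
else:
    print("Pin current never settled below the leakage spec")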