Hi there,

I think the classical definition of jitter transfer is how much of the jitter at the input of a receiver gets transferred to the output of the receiver. The classical definition of jitter tolerance is how much input reference clock jitter is acceptable for a given bit error rate.

What if I modify the definition of jitter tolerance slightly, to: how much jitter can be applied to the receiver's data input (instead of the reference clock) before the receiver makes an error? A practical way of doing this is to send a known pattern such as CJTPAT to the receiver and modulate it with a fixed-frequency jitter. The minimum jitter amplitude that triggers errors is then defined as the jitter tolerance at that frequency (a rough sketch of this sweep is appended after my signature).

So my question is: if one does the experiment above and plots jitter tolerance vs. frequency, how does that receiver jitter tolerance curve relate to the receiver jitter transfer curve? Can I interpret the relationship between receiver jitter tolerance and jitter transfer as:

a) For jitter tolerance > 1 UI, the jitter transfer is 1 (unity).

b) For jitter tolerance < 1 UI, the jitter transfer = 1 UI - jitter tolerance - "some constant", where "some constant" is probably related to the setup and hold requirements of the receiver.

Thanks in advance,

Chris Cheng
Distinguished Technologist, Electrical
Hewlett-Packard Company
+1 510 413 5977 / Tel
chris.cheng@xxxxxx / Email
4209 Technology Dr
Fremont, CA 94538 USA
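
P.S. In case it helps make the proposed measurement concrete, here is a minimal Python sketch of the sweep I have in mind. The instrument-control calls (set_sinusoidal_jitter, count_bit_errors) are hypothetical placeholders for whatever a real pattern generator / BERT exposes, not actual APIs, and would need to be replaced with your own control code.

# Sketch only: set_sinusoidal_jitter() and count_bit_errors() are
# hypothetical stand-ins for pattern-generator / BERT control calls.

def set_sinusoidal_jitter(freq_hz, amplitude_ui):
    raise NotImplementedError("replace with your jitter-source control code")

def count_bit_errors(pattern, bits):
    raise NotImplementedError("replace with your BERT error-counter readout")

def min_jitter_amplitude_ui(freq_hz, max_amp_ui=2.0, resolution_ui=0.01):
    """Binary-search the smallest sinusoidal jitter amplitude (in UI) at
    freq_hz that produces bit errors on a CJTPAT stream."""
    lo, hi = 0.0, max_amp_ui
    while hi - lo > resolution_ui:
        mid = 0.5 * (lo + hi)
        set_sinusoidal_jitter(freq_hz, amplitude_ui=mid)
        errors = count_bit_errors(pattern="CJTPAT", bits=1e12)
        if errors > 0:
            hi = mid   # errors seen: tolerance threshold is at or below mid
        else:
            lo = mid   # error free: push the amplitude higher
    return hi

# Sweep the jitter frequency and record tolerance vs. frequency
# (runs only once the two stubs above are filled in).
tolerance_curve = {f: min_jitter_amplitude_ui(f)
                   for f in (1e3, 1e4, 1e5, 1e6, 1e7)}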