All, I have a question that some IC design guru could help with. The issue is how much additional random jitter a receiver adds to an incoming signal. Assume the receiver is driven well beyond threshold, so incoming signal strength is not an issue, and that the signaling is binary amplitude (digital).

When such a circuit is built at the PCB level by an RF designer, a stability analysis is done to ensure there is no gain peaking or marginally stable behavior. Any noise generated by devices in the circuit is magnified by gain peaking that results from unintentional feedback. This feedback can occur within the devices themselves and depends on the reflections and matching within the circuit. In the extreme, a marginally stable circuit can break into brief oscillation as the signal transitions, making the exact timing of the edge more uncertain and thus adding jitter.

For IC design the situation is less clear. I've discussed methodologies with analog ASIC designers; the tools used are some form of SPICE, but none of the designers mentioned any consideration of stability. If an instability in the circuit is at a very high frequency, SPICE may not reveal it because of the finite time-step duration.

Is there a concern in the IC design case? Does this problem occur at the IC level? If it doesn't, why not? If it does, what is a typical design procedure for analyzing it?

Thanks,
Chuck Hill

------------------------------------------------------------------
To unsubscribe from si-list:
si-list-request@xxxxxxxxxxxxx with 'unsubscribe' in the Subject field

For help:
si-list-request@xxxxxxxxxxxxx with 'help' in the Subject field

List archives are viewable at:
//www.freelists.org/archives/si-list
or at our remote archives:
http://groups.yahoo.com/group/si-list/messages

Old (prior to June 6, 2001) list archives are viewable at:
http://www.qsl.net/wb6tpu
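P.S. The gain-peaking mechanism in the question can be illustrated numerically. Below is a minimal Python/numpy sketch, not a circuit simulation: a second-order low-pass biquad stands in for a receiver stage, with its Q factor controlling how much the response peaks near resonance (high Q approximating marginal stability from unintended feedback). All frequencies, Q values, and noise levels are illustrative assumptions. Measuring the spread of the 50% threshold-crossing time of a noisy step over many trials shows the peaked response converting the same input noise into more edge jitter than the well-damped one.

```python
import numpy as np

def biquad_lowpass(x, f0, q, fs=1.0):
    """Second-order low-pass (bilinear-transform biquad, unit DC gain).

    A high q gives gain peaking near f0, standing in for a receiver
    stage made nearly unstable by unintended feedback.
    """
    w0 = 2.0 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2.0 * q)
    cosw = np.cos(w0)
    a0 = 1.0 + alpha
    b = np.array([(1 - cosw) / 2, 1 - cosw, (1 - cosw) / 2]) / a0
    a = np.array([1.0, -2.0 * cosw / a0, (1.0 - alpha) / a0])
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = b[0] * x[n]
        if n >= 1:
            y[n] += b[1] * x[n - 1] - a[1] * y[n - 1]
        if n >= 2:
            y[n] += b[2] * x[n - 2] - a[2] * y[n - 2]
    return y

def edge_jitter(q, trials=200, f0=0.05, noise=0.05, seed=0):
    """RMS spread (in samples) of the 0.5-threshold crossing time of a
    noisy step passed through the filter, over many trials."""
    rng = np.random.default_rng(seed)
    n, edge = 400, 100
    step = np.zeros(n)
    step[edge:] = 1.0
    times = []
    for _ in range(trials):
        y = biquad_lowpass(step + noise * rng.standard_normal(n), f0, q)
        i = int(np.argmax(y > 0.5))  # first sample past threshold
        # linear interpolation for a fractional crossing time
        times.append(i - (y[i] - 0.5) / (y[i] - y[i - 1]))
    return float(np.std(times))

jit_damped = edge_jitter(q=0.7)   # flat, well-damped response
jit_peaked = edge_jitter(q=8.0)   # strong gain peaking near f0
print(f"damped: {jit_damped:.3f} samples rms, peaked: {jit_peaked:.3f}")
```

Same input noise, same edge, same nominal bandwidth; only the peaking differs. The high-Q case amplifies device and input noise near resonance and rings through the threshold region, so the crossing-time spread grows, which is exactly the jitter mechanism the question describes at board level.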