[SI-LIST] Re: IBIS, si simulators, modeling and other sources of correlation error

  • From: "Scott McMorrow" <scott@xxxxxxxxxxxxx>
  • To: si-list@xxxxxxxxxxxxx
  • Date: Wed, 02 Apr 2003 12:49:45 -0800

I'm resending my response to Weston, since the formatting seems to have 
gotten all messed up on the previous response.



Weston,

Fear is a good thing.  If used properly, it is a good motivation for 
change.

We should always seek to understand the limitations of the tools we use 
and the approximations involved, and perform some sort of engineering 
analysis of whether they are appropriate to the problem we are solving.  
Signal integrity is really about margins:  timing margins and noise 
margins.  The question you have to ask yourself is whether your 
approximation of the problem (as embodied by modeling and simulation) 
has enough accuracy to ensure that you can meet your margins. (Remember 
that there is a difference between precision and accuracy.  A precise 
answer delivered by a simulator may not be an accurate one.)  So, yes, 
do use the tools that you have ... but ... understand their fundamental 
limitations and do not trust their results implicitly. 
As a friend of mine often tells me, there is a fundamental operating 
principle in the universe: the Conservation of Misery.  The laws of 
physics do not change, but we do often bend them for our own 
convenience, either analytically, computationally, or politically. 
Whenever we do this, misery is conserved in the form of incorrect 
assumptions, errors in approximations, or just plain wrong answers.  To 
borrow from your carpenter's analogy, you either use the right tool for 
the job, or you suffer the consequences.  I'm sorry, but board-level EDA 
signal integrity tools can no longer claim to solve the customer's 
problems when they do not state the assumptions used and the compromises 
made, show correlation to real circuits in real systems, and disclose 
the actual errors introduced by their methods.  With sub-150 ps edge 
rates, single-ended busses operating at 200, 300, 400, and 500 MHz, and 
differential busses running at 2.5, 3.125, 5, and 10 Gbps, all these 
"little things" begin to (and do) matter.  If they didn't, why would 
chipset vendors be establishing trace-length matching rules of +/- 10 
mils?

So, what could a board-level signal integrity tool vendor do?  Well, for 
one, publish the known limitations of their products to their customers. 
Actually specify the accuracy of the algorithms used for field solutions 
and simulation.  State known problems in the extraction of signal and 
power/ground conductors.  State known problems and errors in the 
modeling of packages and connectors.  Show correlation to real 
measurements, and explain why differences exist.  State the accuracy of 
the via modeling solution.  Then ... provide easily supported and 
automated ways for the user to import externally derived models for 
pads, vias, connectors, transmission lines, and black boxes into the 
modeling and simulation engine from external tools, in useful formats 
like Spice and S-parameters.

Yes, in some cases 3D field solvers are necessary and must be used to 
obtain accurate results.  These can be applied where board-level tools 
have obvious limitations, such as vias, non-uniform trace geometries, 
package breakout regions, etc.  A good, accurate 2D field solver that 
handles lossy conductors and dielectrics, and that can predict 
frequency-dependent inductance along with resistance and conductance, is 
an absolute must for anyone serious about knowing what real signals 
really do on boards.  Even then, a field solver is useless without a 
method to actually simulate the lossy circuit in the time domain.  And 
... wouldn't it be nice if the methods had been shown to have some 
level of correlation with real measured results in the frequency and 
time domains?  For pre-route characterization, it is very useful to 
have all of these tools and methods at your disposal.  For post-route 
characterization, wouldn't it be nice if a board-level EDA tool could 
provide a heuristic for finding the worst cases for a bus on a board, 
and then automatically extract them for further refined analysis using 
more advanced modeling techniques?  The board could be passed through 
the conventional signal integrity and timing scans to find the "big 
chunks," and final verification could then be handed off to a more 
capable and accurate simulation and modeling environment to provide the 
final simulations for sign-off.


The conditions are many and varied.  In general, follow the rule that 
any feature whose delay exceeds 1/20 of the period of the highest 
frequency transmitted, 1/10 of the edge rate, or 1/10 of the timing 
margin that you are trying to achieve ought to be modeled accurately. 
Given that very loose rule of thumb, for a bus operating at a frequency 
of 500 MHz, with driver rise times of 150 ps, and a design margin of 100 
ps, in FR4 stripline, we have:

100 ps modeling error for 500 MHz
15 ps modeling error for 150 ps rise time
10 ps modeling error for 100 ps design margin.

In this case, the 100 ps timing margin is the limiting factor and 
requires 10 ps modeling accuracy.  This is equivalent to accurately 
modeling every feature that is longer than about 50 mils.  That would 
include all traces, vias, pads, connectors, and packages.  As you can 
see, had the timing design margin not been so tight, the same design 
could have been modeled to an accuracy of 15 ps (about 80 mils); and if 
slew-rate-controlled drivers were used to limit the rise time to 500 ps, 
the required modeling accuracy could be relaxed to 50 ps (about 300 
mils).
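The arithmetic above can be sketched in a few lines of Python.  The 
~180 ps/in FR4 stripline propagation delay is an assumed value chosen 
to reproduce the length figures in the text; adjust it for your actual 
stackup.

```python
# Rule-of-thumb modeling-error budget for a single-ended bus.
PS_PER_INCH = 180.0  # assumed FR4 stripline propagation delay (ps/inch)

def accuracy_budget_ps(bus_freq_hz, rise_time_ps, margin_ps):
    """Return the three rule-of-thumb error budgets (ps) and the
    name of the tightest one: 1/20 of the clock period, 1/10 of
    the rise time, and 1/10 of the timing margin."""
    period_ps = 1e12 / bus_freq_hz
    budgets = {
        "1/20 of period": period_ps / 20.0,
        "1/10 of rise time": rise_time_ps / 10.0,
        "1/10 of margin": margin_ps / 10.0,
    }
    return budgets, min(budgets, key=budgets.get)

def ps_to_mils(delay_ps):
    """Convert an electrical delay to a trace length in mils."""
    return delay_ps / PS_PER_INCH * 1000.0

budgets, limiter = accuracy_budget_ps(500e6, 150.0, 100.0)
print(limiter, budgets[limiter])            # the 100 ps margin limits: 10 ps
print(round(ps_to_mils(budgets[limiter])))  # ~56 mils ("about 50 mils")
```

Rerunning with a 500 ps rise time and a relaxed margin reproduces the 
50 ps / ~300 mil case above.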

So, for most common busses, it is either the rise time of the drivers, 
or the timing margin on the busses that is the limiting factor.

For differential signals, timing margin is usually not as severe, and 
losses begin to dominate.  In these cases it is usually the 3rd 
harmonic of the fundamental frequency of the fastest data pattern that 
is the limiting factor.  So, for a 2.5 Gbps serial link that uses 
encoding and clock recovery, the third harmonic of the fundamental 
(half the bit rate) is 3.75 GHz, which has a 266 ps period.  1/20 of 
this is 13 ps.  So any feature with a delay longer than 13 ps should be 
considered critical and modeled accurately.  This works out to a 
feature size in FR4 of about 70 mils: again, the size of some pads, and 
definitely within the range of sizes for vias and plated through-holes. 
In any of these calculations, stubs have a multiplication factor of 4 on 
feature size, since they form 1/4-wave resonators and require the signal 
to traverse the stub 4 times.  Thus, in the 2.5 Gbps differential 
example above, stub lengths longer than about 17.5 mils in FR4 are 
critical.  This means that we especially need to look at things like 
square pads (which form stubs), test points not in line with the signal 
traces, and vias, which almost always have stubs.
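The serial-link rule, with the factor-of-4 reduction for stubs, can be 
sketched the same way.  The 180 ps/in FR4 stripline delay is again an 
assumed value, which is why the results land slightly above the "70 
mils" and "17.5 mils" figures quoted above.

```python
# Critical feature and stub lengths for a serial link, per the
# 3rd-harmonic rule of thumb described in the text.
PS_PER_INCH = 180.0  # assumed FR4 stripline propagation delay (ps/inch)

def critical_lengths_mils(bit_rate_bps, stub_factor=4.0):
    """Return (critical feature length, critical stub length) in mils."""
    fundamental_hz = bit_rate_bps / 2.0        # alternating 1010... pattern
    period_ps = 1e12 / (3.0 * fundamental_hz)  # period of the 3rd harmonic
    critical_delay_ps = period_ps / 20.0       # the 1/20 rule of thumb
    feature_mils = critical_delay_ps / PS_PER_INCH * 1000.0
    return feature_mils, feature_mils / stub_factor

feature, stub = critical_lengths_mils(2.5e9)
print(round(feature), round(stub))  # roughly 74 and 19 mils at 180 ps/in
```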

best regards,

scott


-- 
Scott McMorrow
Teraspeed Consulting Group LLC
2926 SE Yamhill St.
Portland, OR 97214
(503) 239-5536
http://www.teraspeed.com


