[SI-LIST] intra-pair skew and jitter

  • From: "Eric Bogatin" <eric@xxxxxxxxxxxxxxx>
  • To: <si-list@xxxxxxxxxxxxx>, <jim.nadolny@xxxxxxxxxx>
  • Date: Thu, 10 May 2012 12:07:44 -0500

Jim-
 

Your comment about mode conversion and jitter is timely. I've been trying to
understand the problem with common signals in diff channels and thought I
would add a follow-on to your observation.

 

My question is, "why are common signals at the RX bad?" and the follow-up
question is, "how much is too much?"  And for those who attempt to provide
an answer, let me start your answer for you. 

 

"it depends."  I think we are all interested in what are the factors that
influence the decisions?

 

I wrote an article with some of the answers for PCD&F magazine a year or so
ago. A copy is available for download on my web site, www.beTheSignal.com,
as BTS-329 in the SI Library menu. To summarize the article, the answer I
usually hear breaks down into three categories:

 

1.      Distortion of the diff signal. What Jim observed is that even with
gross line-to-line skew and a large fraction of comm signal generated, the
diff signal may have its rise time degraded, but the jitter created just
from the distortion in the edge is pretty small. An eye is pretty robust to
a distorted diff signal. (A short numerical sketch of this trade-off follows
after this list.)

 

2.      Generation of EMI. IF (caps intended) the comm signal gets out on
unshielded twisted pairs (UTP), then even 1 mV of comm signal can cause an
FCC Class B failure. But what if it stays inside the box? If the common
signal stays on a board with its return in the adjacent plane, it does not
radiate. Bruce Archambeault has pointed out that if the common signal passes
through a poorly engineered connector to a daughter card, so the common
returns are not adjacent to the diff lines, the "ground" (my apologies,
Bruce) bounce on the connector can drive the daughter card as a patch
antenna and generate EMI inside the box which can leak out. But this does
not impact the RX.

 

3.      If the common signal generated by a local asymmetry rattles around
because it is not terminated at the RX or the TX, then each time it passes
through the asymmetry it can re-convert to diff signal, and this will appear
as asynchronous diff noise at the RX, which will increase vertical collapse
and jitter in the eye. An asymmetry of 10% of the UI can generate 10% of the
UI in jitter. I've always assumed this was the worst problem with common
signals and why the typical recommendation in specs is to limit the
line-to-line skew to 10% of the UI. This problem is dramatically reduced if
the comm signal is terminated at either end of the line.
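
To put some numbers on items 1 and 3, here is a back-of-the-envelope sketch
in Python (my own arithmetic, not from the BTS-329 article; the 10 Gb/s and
10% UI values are just assumptions for illustration). For an ideal,
lossless, uncoupled pair, a pure intra-pair skew tau gives |SDD21| =
cos(pi*f*tau) for the diff path and |SCD21| = sin(pi*f*tau) for the
conversion into the comm path:

import math

# Sketch only: ideal, lossless, uncoupled pair with a pure skew "tau".
bit_rate = 10e9                 # assume 10 Gb/s NRZ
ui = 1.0 / bit_rate             # 100 ps unit interval
tau = 0.1 * ui                  # 10% UI of line-to-line skew
f_nyq = bit_rate / 2.0          # 5 GHz Nyquist frequency

print("diff transfer at Nyquist: %.3f" % math.cos(math.pi * f_nyq * tau))    # ~0.988
print("comm conversion at Nyquist: %.3f" % math.sin(math.pi * f_nyq * tau))  # ~0.156

With 10% UI of skew, the diff amplitude at Nyquist drops only about 1% (the
slower edge of item 1), but roughly 16% of the Nyquist amplitude ends up as
comm signal. Whether that 16% matters is exactly the question in items 2
and 3.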

 

 

At DesignCon 2012, I learned from Scott McMorrow that a poor common-mode
rejection ratio (CMRR) at the RX can translate asynchronous comm signal into
diff noise, randomly added across the UI. I have heard from other buddies of
mine who work on chip design that he is correct: the CMRR drops off quickly
above ~5 GHz. 
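
As a quick, hypothetical numbers sketch (the 100 mV comm signal and the 20
dB CMRR below are assumptions for illustration, not measured RX values):

# Sketch: how much comm signal shows up as equivalent diff noise for a
# given RX CMRR. Both numbers below are assumed placeholders.
cmrr_dB = 20.0        # assumed RX CMRR at the frequency of interest
v_comm_mV = 100.0     # asynchronous comm signal at the RX pins
v_diff_noise_mV = v_comm_mV / (10 ** (cmrr_dB / 20.0))
print("%.0f mV comm -> %.0f mV equivalent diff noise"
      % (v_comm_mV, v_diff_noise_mV))
# 100 mV comm -> 10 mV equivalent diff noise

Against a 400 mV diff swing, 10 mV is in the noise; if the CMRR really heads
toward 0 dB above ~5 GHz, that same 100 mV of comm signal takes 100 mV out
of the eye.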

 

Is this the real problem with mode conversion? Is the reason mode conversion
causes a problem that, because of the reduced CMRR, any asynchronous comm
signal will be perceived as diff noise by the RX and add to the collapse of
the eye?

 

Does anyone have examples of the CMRR of an RX they can share, and how it
drops off with frequency?

 

If this is the problem, why isn't the comm signal magnitude at the RX
spec'd, rather than the asymmetry? After all, comm signal will also be
generated on a channel by channel-to-channel crosstalk.

 

And while I am throwing questions to the group to crowd-source answers, why
do so many specs have an SCC11 or even an SCC21 spec, when these have
nothing to do with the amount of common signal present, mode conversion, or
EMI?

 

Of course, spec writers (and anyone who takes my S-parameter or Channel
Design class) know that specifying both an SDD11 and an SCC11 also defines
the coupling between the two lines that make up the diff pair. If the spec
is going to define a coupling, why not explicitly say tightly coupled or
loosely coupled and use just the SDD11 spec? (A short sketch of how SDD11
and SCC11 together pin down the coupling follows below.)
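
Here is that sketch, my own back-of-the-envelope and not from any spec:
assume a uniform, symmetric pair and the usual 100-ohm differential and
25-ohm common reference impedances, and take SDD11 and SCC11 as real,
low-frequency reflection values. The example reflection numbers are made up
for illustration.

# Sketch: SDD11 and SCC11 back out the odd- and even-mode impedances,
# and those two together fix the coupling of the pair.
sdd11 = -0.05     # diff return loss, 100-ohm reference (assumed value)
scc11 = 0.05      # comm return loss, 25-ohm reference (assumed value)

z_diff = 100.0 * (1 + sdd11) / (1 - sdd11)   # differential impedance
z_comm = 25.0 * (1 + scc11) / (1 - scc11)    # common impedance
z_odd = z_diff / 2.0
z_even = 2.0 * z_comm
k = (z_even - z_odd) / (z_even + z_odd)      # backward coupling coefficient

print("Zodd = %.1f ohm, Zeven = %.1f ohm, coupling k = %.3f"
      % (z_odd, z_even, k))
# Zodd = 45.2 ohm, Zeven = 55.3 ohm, coupling k = 0.100

Two reflection numbers, two mode impedances, one coupling: which is why
specifying both is really a coupling spec in disguise.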

 

If someone wants to contact me offline with answers or comments, I will
keep your identity secret. If I get good answers, I will share them in a
future blog post. 

 

You can read my blog at www.beTheSignal.com/blog.

 

Thanks for reading this far and I welcome comments and answers.

 

--eric

 

 

 

*******************************************************
Dr. Eric Bogatin, Signal Integrity Evangelist

Bogatin Enterprises

Setting the Standard for Signal Integrity Training
web site: www.beTheSignal.com

Blog: www.beTheSignal.com/blog

Twitter @beTheSignal
e: eric@xxxxxxxxxxxxxxx

26235 W 110th Terr
Olathe, KS 66061
v: 913-393-1305 cell: 913-424-4333  skype: eric.bogatin
*********************************************** 

Msg: #8 in digest

From: Jim Nadolny <jim.nadolny@xxxxxxxxxx>

Subject: [SI-LIST] intra-pair skew and jitter

Date: Tue, 8 May 2012 15:26:45 +0000

 

We all know skew is the bane of differential signaling...at least I always
thought so.  But some simulations have me re-thinking this a bit.

First - this is a test application with phase-matched coax cables sampling a
10 Gb/s signal.  The question is, "How tightly phase matched should these
cables be?"  Conventional wisdom says that they should be phase matched to
within a few ps.  In PCB design we match trace lengths to within a few mils
or less for EMI/crosstalk reasons.  This design practice is transferred into
coax cable specs (for test applications), which have very tight phase-match
requirements and add test cost.
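
For scale, here is a quick conversion (my own arithmetic; the delay-per-inch
numbers are typical assumed values for FR-4 stripline and PTFE coax, not
measurements):

# What a skew budget in ps means in physical length.
ui_ps = 100.0                   # 10 Gb/s -> 100 ps unit interval
skew_ps = 0.1 * ui_ps           # a 10% UI skew budget = 10 ps

strip_ps_per_in = 170.0         # ~FR-4 stripline (Dk ~ 4)
coax_ps_per_in = 85.0 / 0.7     # ~PTFE coax, velocity factor ~0.7

print("10 ps of skew is ~%.0f mil of stripline"
      % (1000 * skew_ps / strip_ps_per_in))
print("10 ps of skew is ~%.2f in (%.1f mm) of coax"
      % (skew_ps / coax_ps_per_in, 25.4 * skew_ps / coax_ps_per_in))
# ~59 mil of stripline; ~0.08 in (2.1 mm) of coax

So a few mils of trace match is well under 1 ps, far tighter than a 10% UI
(10 ps) budget at 10 Gb/s - which is part of why the question of how much
phase match a test cable really needs is worth asking.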

 

I wanted to look into this a bit deeper, so I ran some sims in ADS.  I'm
looking at the ideal case with a simple timing shift in an uncoupled,
lossless system.  I'm working at 10 Gbps, but let's normalize everything.
The rise time is 0.2 UI (20-80%).  The results were surprising to me in that
jitter was not affected by even gross levels of intra-pair skew.

 

With 0 UI skew we have 0 UI of total jitter (again, a lossless, ideal system
is the focus).

With 0.05 UI skew, we have 0 UI of jitter and the rise time degrades to
0.203 UI.

With 0.1 UI skew (10 ps), we have 0 UI of jitter and the rise time degrades
to 0.21 UI.

With 0.4 UI skew, we have 0 UI of jitter and the rise time degrades to 0.46
UI (this is "ludicrous" intra-pair skew and still no jitter).
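
For anyone who wants to poke at this without firing up ADS, here is a
stripped-down reconstruction of the experiment in Python (my own sketch, not
Jim's ADS deck): a tanh-shaped edge with a 0.2 UI (20-80%) rise time on each
leg, a pure time shift between the legs, and the differential formed from
the two.

import numpy as np

ui = 1.0
t = np.linspace(-2 * ui, 2 * ui, 20001)
a = 0.2 * ui / (2 * np.arctanh(0.6))            # sets the 20-80% rise time

def edge(t0):
    return 0.5 * (1.0 + np.tanh((t - t0) / a))  # 0 -> 1 rising edge at t0

def rise_20_80(v):                              # 20-80% rise time
    lo = t[np.searchsorted(v, v[0] + 0.2 * (v[-1] - v[0]))]
    hi = t[np.searchsorted(v, v[0] + 0.8 * (v[-1] - v[0]))]
    return hi - lo

def crossing(v):                                # 50% crossing time
    return t[np.searchsorted(v, 0.5 * (v[0] + v[-1]))]

for skew in [0.0, 0.05, 0.1, 0.4]:
    p = edge(0.0)                               # true leg
    n = 1.0 - edge(skew * ui)                   # complement leg, delayed by the skew
    d = p - n                                   # differential signal, -1 -> +1
    print("skew %.2f UI: rise(20-80) = %.3f UI, crossing shift = %.3f UI"
          % (skew, rise_20_80(d), crossing(d)))

Every transition's 50% crossing moves by the same skew/2, so it is a fixed
delay, not jitter, while the 20-80% rise time grows roughly as Jim reports.
Only when the skew approaches 0.5 UI does the shelf land near the crossing
level and turn into real, data-dependent jitter.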

 

Once we get to 0.5 UI of skew we get "huge" jitter because of a shelf in
the transitions.

 

Clearly, as intra-pair skew increases, the differential rise time degrades
(increases).  This is consistent with increased differential insertion loss
due to mode conversion as skew increases.  But the eye pattern does not show
any increased jitter, which is counterintuitive.

 

Before we get all giddy about these conclusions, let's bear in mind a
couple of things:

 

*        EMI/crosstalk is sensitive to mode conversion and is a good
motivator to keep things matched

 

*        Coupled systems (twisted pairs, twinax) are a bit of a different
animal than this coax cable test application.  Mode conversion and then
re-conversion is a different effect that does impact jitter.

 

Have others observed this lack of jitter with increasing intra-pair skew?

 


