[opendtv] Re: Overscanning on LCD TVs

  • From: Kilroy Hughes <Kilroy.Hughes@xxxxxxxxxxxxx>
  • To: "opendtv@xxxxxxxxxxxxx" <opendtv@xxxxxxxxxxxxx>
  • Date: Fri, 13 Nov 2009 17:26:45 +0000

There are two parts to "setup": the encoding of the images and the wire format.
DVI assumes no setup in the wire format, so (1, 1, 1) = 7.5 IRE black, and anything 
below that is "electronically clipped".
HDMI "video modes" do assume setup, so 8-bit (16, 16, 16) = 7.5 IRE black, and anything 
less than that is "sub-black", which should be "visually clipped" rather than 
"electronically clipped".
NTSC analog assumes setup in the content; the wire format includes setup in the 
US but not in Japan (other analog wire formats are in a similar situation).
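To make the two black-level conventions concrete, here is a minimal Python sketch 
(function names and rounding policy are mine, not from any HDMI or DVI spec) of 
mapping 8-bit codes between full range and video range:

```python
def full_to_video_range(v):
    """Map a full-range 8-bit code (0..255) into video range (16..235)."""
    return round(16 + v * (235 - 16) / 255)

def video_to_full_range(v):
    """Map a video-range code (16..235) back to 0..255.

    Note the policy choice: sub-black (below 16) is clipped here, i.e.
    "electronically clipped" in the terms used above.
    """
    scaled = (v - 16) * 255 / (235 - 16)
    return min(255, max(0, round(scaled)))

print(full_to_video_range(0))    # black lands on the video-range floor: 16
print(video_to_full_range(8))    # sub-black does not survive this mapping: 0
```

The point of the sketch is only that each conversion direction embeds an 
assumption about where black lives, which is exactly where mismatches show up.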

Whether or not encoded video goes over the DVI wire with setup depends on how 
it was encoded and the decoder/GPU used.  Other graphics, like the desktop and 
JPEGs, can be mapped to "video" setup, primaries, matrix, gamma, etc. by the GPU 
so they can go out a DVI connector and into an HDMI input ... or they could be 
mapped to a visible range of 254, with a wider 24-bit gamut and a different color 
point and gamma appropriate for accurate bright-room "computer monitor" viewing.

You can get it wrong at: 1. encoding; 2. decoding (YCbCr 4:2:2 to RGB 4:4:4); 
3. RGB > DVI or HDMI wire format; 4. wire format to IRE light level in the 
display.  Various image sources make different assumptions about several of 
these parameters, and each step in the signal chain makes its own assumptions.  
Unfortunately, our "closed system" analog roots are showing: the metadata 
capturing those assumptions falls on the floor at each interface rather than 
traveling to the display point, where the original "render intent" could be 
implemented as appropriate for the viewing conditions.  An end-to-end system 
design would have been the right way to do this, but the reality today is that 
differing assumptions and conventions allow only some combinations of image 
source and display to work accurately, some of the time, with professional 
calibration rather than automatic system response to encoded format information.
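Step 2 in that list is a common place for damage. As an illustration only (the 
matrix coefficients are the standard Rec.709 ones, but the function name and 
clipping policy are my own), a single-sample Y'CbCr to R'G'B' conversion looks 
like this:

```python
def ycbcr709_to_rgb(y, cb, cr):
    """Convert one 8-bit video-range Y'CbCr (Rec.709) sample to full-range R'G'B'."""
    yn  = (y - 16) / 219.0        # normalize luma to 0..1
    cbn = (cb - 128) / 224.0      # normalize chroma to -0.5..+0.5
    crn = (cr - 128) / 224.0
    r = yn + 1.5748 * crn                      # Rec.709 inverse-matrix coefficients
    g = yn - 0.1873 * cbn - 0.4681 * crn
    b = yn + 1.8556 * cbn
    clip = lambda v: min(255, max(0, round(v * 255)))
    return clip(r), clip(g), clip(b)

print(ycbcr709_to_rgb(16, 128, 128))    # video black -> (0, 0, 0)
print(ycbcr709_to_rgb(235, 128, 128))   # video white -> (255, 255, 255)
```

Use the Rec.601 coefficients instead for SD material and you get a visibly 
different answer from the same bytes, which is one concrete way step 2 goes wrong.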

The thread started on "overscan", but evolved to other stuff that is often 
wrong:  setup/black level, sample aspect ratio > pixel aspect ratio, cropping, 
primaries/color temp, matrix, gamut, gamma, progressive encoded as interlace 
(with 3:2 pulldown), etc.  With 8-bit video, the only way to avoid stripes in 
the blue sky is to leave quantization alone until the display point, and then 
make any gamma, gamut, setup, etc. adjustments when translating to light.
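The quantization point can be shown with a quick sketch (the rounding policy 
here is my assumption): squeezing 8-bit codes into video range and stretching 
them back out merges neighboring codes, which is where the stripes come from.

```python
def to_video(v):
    """Squeeze a full-range 8-bit code into video range 16..235."""
    return round(16 + v * 219 / 255)

def to_full(v):
    """Stretch a video-range code back out to full range 0..255."""
    return round((v - 16) * 255 / 219)

# Round-trip every 8-bit code and count how many distinct codes survive.
survivors = {to_full(to_video(v)) for v in range(256)}
print(len(survivors))  # 220 of 256 codes survive; the lost ones merge into bands
```

Do that twice in a signal chain and a smooth blue-sky gradient has visibly 
fewer steps than it started with; hence the advice to leave quantization alone 
until the display point.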

Kilroy Hughes

From: opendtv-bounce@xxxxxxxxxxxxx [mailto:opendtv-bounce@xxxxxxxxxxxxx] On 
Behalf Of Stessen, Jeroen
Sent: Thursday, November 12, 2009 11:14 AM
To: opendtv@xxxxxxxxxxxxx
Subject: [opendtv] Re: Overscanning on LCD TVs

Hi,

dan.grimes@xxxxxxxx wrote:
> I was under the impression that HDMI and DVI use the same standard
> and are electronically equivalent.  Of course, just because the
> signals mate doesn't mean the color space is the same.
Duh... no !
DVI is always R'G'B', 4:4:4, 8 bits, 0..255, and sRGB (Rec.709) color 
primaries, no audio.
HDMI can be that, but it can also be Y'CbCr in both Rec.601 (up to 576p) and 
Rec.709 (HD)
flavors, 4:4:4 or 4:2:2, 8 or 10 or 12 bits, 16..235(240) or 0..255 (times 4 or 
16 dep. on # bits),
can be xvYCC extended color space, and in the future even other color primaries 
(e.g.
AdobeRGB). Plus many flavors of digital audio, of course, and CEC command 
channel.
Normally all these standards are published and negotiated over the EDID 
channel. In the
absence of EDID, it should default to DVI. Which should mean 0..255, but for 
video it is
actually better to stay with 16..235, so IMO a source could be forgiven for 
leaving the
amplitudes as is. Only problem is: you don't know that it does. You could guess 
it from
the resolution, i.e. 1280x720 and 1920x1080 are 16..235, and PC resolutions are 
0..255.
Or you leave it to the viewer to adjust contrast and brightness until the 
display is fully
driven. Which in the case of a digital cinema projector is not so easy. Duh...
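That resolution-based guess can be written down directly (a heuristic sketch 
only, nothing any HDMI document mandates; the mode list is my assumption):

```python
def guess_quantization_range(width, height):
    """Heuristic: TV timings are usually limited range, PC timings full range.

    Only a guess, for use in the absence of EDID/negotiated information.
    """
    tv_modes = {(720, 480), (720, 576), (1280, 720), (1920, 1080)}
    if (width, height) in tv_modes:
        return "limited (16..235)"
    return "full (0..255)"

print(guess_quantization_range(1920, 1080))  # limited (16..235)
print(guess_quantization_range(1024, 768))   # full (0..255)
```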


> So any suggestions what one might use to analyze or discover the color spaces 
> of an HDMI and DVI-D signal?  The same question might be had for HD-SDI.

The color space, if you mean the color primaries, is almost always Rec.709 
a.k.a. sRGB.
If you mean which flavor of Y'CbCr (for HDMI): 601 for SD, 709 for HD, 
irrelevant for
R'G'B' from a PC. The rest can be inferred from the EDID data, which is plain 
I2C that
can be observed on the PC with the proper software. But it is never really easy.
OTOH, if you can just measure the 24 levels in a plain color bar, you would know
practically everything. It should be mandatory to send it during the VBI....
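For reference, those 24 levels (8 bars times 3 components) can be generated 
rather than memorized. A rough Rec.709 sketch, with my own choice of rounding 
and 75% amplitude:

```python
def rgb_to_ycbcr709(r, g, b):
    """Forward Rec.709 matrix: full-range 8-bit R'G'B' -> video-range Y'CbCr."""
    rn, gn, bn = r / 255, g / 255, b / 255
    y  = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn   # Rec.709 luma coefficients
    cb = (bn - y) / 1.8556
    cr = (rn - y) / 1.5748
    return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

# 75% color bars: white, yellow, cyan, green, magenta, red, blue, black
# (191 is roughly 75% of 255); 8 bars x 3 components = 24 levels.
bars = [(191, 191, 191), (191, 191, 0), (0, 191, 191), (0, 191, 0),
        (191, 0, 191), (191, 0, 0), (0, 0, 191), (0, 0, 0)]
for rgb in bars:
    print(rgb, "->", rgb_to_ycbcr709(*rgb))
```

Measure the received bars against the computed values and any level, matrix, or 
range error in the chain falls straight out of the differences.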

> We are converting between HDMI, DVI and HD-SDI to use different types and
> levels of equipment.  It is impossible to use just one standard these days.
> But in the conversion, things get messed up and I am experiencing this.
> I would like to be able to straighten it out.
Can't the customer rep for the analyzer help you with some documentation ?

Groeten,
-- Jeroen

  Jeroen H. Stessen
  Specialist Picture Quality

  Philips Consumer Lifestyle
  Advanced Technology  (Eindhoven)
  High Tech Campus 37 - room 8.042
  5656 AE Eindhoven - Nederland

