[opendtv] Re: 1080P Question

  • From: "Stessen, Jeroen" <jeroen.stessen@xxxxxxxxxxx>
  • To: "opendtv@xxxxxxxxxxxxx" <opendtv@xxxxxxxxxxxxx>
  • Date: Wed, 23 Sep 2009 13:41:12 +0200

Hello,



I wrote:
>> I would like to put it another way: a more versatile interface makes it
>> possible to shift your problems to (the guy on) the other side of the link.



Mike Tsinberg wrote:

> Yes, the conversion to the old format is now the responsibility of the TV or
> the STB. This is always the case when the incoming signal is "better" than
> the display format. It is also driving TV makers to upgrade TVs to better
> capabilities.

In my example of uncompressed audio I shifted the "problem" of decoding to the 
source.
So it can work both ways. Other than that, I agree.


> 1080p/24 is not exactly in this category. Interpolation or simple frame
> repetition of incoming 1080p/24 in TVs capable of 72 or 120 Hz refresh will
> result in better motion than a 60 Hz-refresh TV that utilizes traditional
> 3:2 pull-down.

As you may know, Philips has been advocating "Natural Motion", i.e.
motion-compensated frame rate up-conversion, for some 15 years now. Yes, 3:3
pull-down to 72 Hz is better than 3:2 pull-down to 60 Hz (24 Hz judder versus
12 Hz judder), but conversion to a true 60 / 120 / 240 Hz frame rate is much
better yet. And once you have that capability, it does not really matter
whether the source is 24 Hz or 60 Hz 3:2 pull-down (film mode): we can figure
out which fields belong together and reconstruct the original 24 frames per
second.
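
For readers curious how a TV can "figure out which fields belong together",
here is a minimal Python sketch of 3:2 cadence (film-mode) detection. The
function names, the NumPy representation of fields, and the difference
threshold are my illustrative assumptions, not Philips' actual Natural Motion
detector.

```python
import numpy as np

def field_diff(a, b):
    """Mean absolute difference between two same-parity fields."""
    return float(np.mean(np.abs(a.astype(np.float64) - b.astype(np.float64))))

def detect_32_cadence(fields, threshold=1.0):
    """Heuristic: is this 60 Hz field stream a 3:2 pulled-down 24 Hz film?

    With 3:2 pull-down, every 5th field duplicates the same-parity field
    two positions earlier, so the same-parity difference sequence looks like
    big, big, big, big, ~zero, big, big, big, big, ~zero, ...
    Once the cadence (and its phase) is locked, the repeated fields can be
    dropped and the remaining top/bottom pairs woven back into 24 fps frames.
    """
    repeats = [i for i in range(2, len(fields))
               if field_diff(fields[i], fields[i - 2]) < threshold]
    # In a locked cadence the near-zero differences land exactly 5 fields apart.
    return len(repeats) >= 2 and all(b - a == 5
                                     for a, b in zip(repeats, repeats[1:]))
```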


> Deep Color may also create a better result for TVs capable of more than
> 8 bits.


True. Even if the display is still only 8 bits, it is worthwhile to eliminate
unnecessary rounding of intermediate results. IMO a video chain can tolerate
at most one 8-bit bottleneck, and then only if the signal at that position is
properly coded (i.e. full-range and perceptually uniform, and with just enough
noise to dither the quantisation steps).
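
As a rough illustration of that last condition, this Python sketch compares
plain rounding with dithered rounding at a 10-bit to 8-bit bottleneck. The bit
depths, the ~1 LSB dither amplitude, and the ramp test signal are assumptions
chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantise_to_8bit(signal_10bit, dither=True):
    """Reduce a 10-bit signal (codes 0..1023) to 8 bits (codes 0..255)."""
    x = signal_10bit / 4.0                            # rescale 10-bit -> 8-bit
    if dither:
        x = x + rng.uniform(-0.5, 0.5, size=x.shape)  # ~1 LSB of dither noise
    return np.clip(np.round(x), 0, 255).astype(np.uint8)

# A shallow 10-bit ramp: without dither it collapses into visible bands of
# four identical 8-bit codes (banding); with dither the local average still
# tracks the ramp, and the error becomes fine noise instead of contours.
ramp = np.arange(0, 64, dtype=np.float64)
print(quantise_to_8bit(ramp, dither=False)[:12])
print(quantise_to_8bit(ramp, dither=True)[:12])
```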





Tom Barry wrote:

> We talk about interpolation or repeating frames for 24p, but it seems simpler
> if a multisync display just displayed them at 24 fps.  On a non-flickering
> (non-interpolating) fixed-pixel display there shouldn't be any difference
> between 24, 48, or 72 Hz display when the source is 24p.  So why don't the
> displays just sync to 24?


It is true that there should not be any perceptual difference between 24 Hz
1:1, 48 Hz 2:2 or 72 Hz 3:3. In all cases a film frame lasts 42 ms. But there
are properties of LCD panels that demand refreshing at a higher rate.
Specifically: polarity inversion. If an LCD runs at 60 Hz, then the polarity
is inverted every frame, for an inversion frequency of 30 Hz. As the behaviour
for the two polarities is not exactly the same, this results in a minor amount
of flicker at 30 Hz. This is just not visible. It can be further suppressed by
line or dot inversion patterns. If one relies too much on line inversion, then
a line crawl can become visible, reminding me of interlaced displays. Now
imagine that we reduce the display refresh rate to 24 Hz: the polarity
inversion will happen at 12 Hz, and the problems will be much more visible.
Also, the capacitors of the LC cells are not perfect; the leakage current may
become an issue. And, due to a finite response speed, the transfer function
(voltage to light output) of an LCD depends on the frame rate. This is why the
refresh frequency of an LCD must be limited to very few values, typically 50
and 60 Hz, or 100 and 120 Hz.
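
The arithmetic behind the inversion argument is simple enough to sketch in
Python; the ~25 Hz visibility threshold used below is my rough assumption for
large-area flicker, not a Philips figure.

```python
def inversion_flicker_hz(refresh_hz):
    """Frame-inversion flicker: polarity alternates once per refresh,
    so any asymmetry between the two polarities flickers at half rate."""
    return refresh_hz / 2.0

for refresh in (24, 48, 60, 120):
    f = inversion_flicker_hz(refresh)
    note = "clearly visible" if f < 25.0 else "essentially invisible"
    print(f"{refresh:3d} Hz refresh -> {f:5.1f} Hz inversion flicker ({note})")
# 24 Hz refresh puts the flicker at 12 Hz, where the eye is most sensitive;
# 60 Hz puts it at 30 Hz, which the text above calls "just not visible".
```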

And of course we (as Philips) do not want to show you a 24 Hz frame rate in
the first place; we think that the picture after up-conversion to 50..120 Hz
is far more enjoyable.
See: http://www.philips.com/about/company/healthandwellbeing.page
TV is part of Consumer Lifestyle, and we must contribute to the customers'
well-being.





Kilroy Hughes wrote:

> 3-blade shutters on film projectors evolved because a 72 Hz blink rate
> gave better motion fusion than a 24 Hz blink rate,


Than 48 Hz, surely?

72 Hz gives less perceived flicker (at higher screen brightness) than 48 Hz.
A 48 Hz shutter wheel ("butterfly", translated from Dutch) gives more
(blanking) time for the film transport than a 72 Hz shutter, so I suppose that
72 Hz was not possible until the film transport became faster or continuous,
or until 72 Hz flashing xenon lamps, which have a longer dark time, were used.
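
The blink-rate arithmetic behind the blade counts is trivial; a quick Python
sketch (the per-blade commentary summarizes the discussion above, it is not a
measured claim):

```python
# Blink rate of a film projector shutter: film rate times blade openings
# per frame. 24 fps is the standard sound-film rate.
FILM_FPS = 24
for blades in (1, 2, 3):
    print(f"{blades}-blade shutter: {FILM_FPS * blades:3d} Hz blink rate")
# 1 blade  ->  24 Hz: intolerable flicker
# 2 blades ->  48 Hz: the "butterfly", more blanking time for film transport
# 3 blades ->  72 Hz: less perceived flicker at higher screen brightness
```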



> but I'm hoping someone on the list will explain the physiology behind that,
> and then Jeroen will explain how that applies to the different refresh
> "fields", etc. used in plasma, DLP, LCOS, LCD, etc.


TV is much brighter than cinema, so the minimum flicker frequency is now on
the order of 70 Hz. We introduced 100 Hz CRT TV in 1987 (IFA) - 1988 (1000
pieces in the market) because the 50 Hz flicker was becoming unbearable. This
was still based on field repetition, so it did not exactly improve the motion
portrayal or the perception of line flicker (25 Hz).



> I know some LCDs insert black frames to simulate CRT/film blink, but it
> costs light level, so it's probably not used by most of those displays
> burning at about 9000 kelvin with terrible black level and gamma on the
> showroom floors so they look "brighter" than the displays around them.


Many of the displays advertising "200 Hz" or "240 Hz" motion portrayal,
including our own Cinema 21:9 TV, use a scanning backlight (at 100-120 Hz
frequency and 50% duty cycle) to achieve the same motion portrayal as a true
200-240 Hz display. Ideally the backlight must produce double the light output
in half the time, and then there are no light losses. In shop mode the
backlight is always on, and then the motion portrayal will be slightly worse.
You need a good (synthetic) test picture to see the difference, because
normally the motion blur of the camera (long shutter time) dominates over the
motion smear of the display.
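
A small Python sketch of the brightness bookkeeping described above; the
450 nit target is an illustrative number, not a measured Cinema 21:9 value.

```python
def backlight_drive(target_avg_nits, duty_cycle):
    """Instantaneous backlight output needed so that
    duty_cycle * output equals the target average luminance."""
    return target_avg_nits / duty_cycle

print(backlight_drive(450.0, 1.0))  # always on (shop mode): 450 nits
print(backlight_drive(450.0, 0.5))  # 50% duty cycle: 900 nits while on
# The 50% duty cycle halves the hold time per pixel, which is what roughly
# halves the perceived motion blur versus an always-on backlight, at the
# cost of needing double the instantaneous light output.
```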



> Looks to me like manufacturers (and maybe consumers) have chosen interpolated
> frames and 120/240 Hz update to solve the LCD motion problem without blinking
> the backlight.


True, but preferably they are combined. Scanning the backlight when it is 
dimmed anyway
costs almost nothing and can't hurt the picture either.



> PS. IMO, devices that "deinterlace" 24p to output 60p should be recycled
> ASAP.  Forcing the display to guess whether the real sample period was 42 ms
> or 16 ms, and to interpolate whatever mixed frames came from the STB
> deinterlacer for 120 Hz update (vs. just starting with a clean 24p signal),
> is very bad system design, and one reason display interpolation often looks
> bad.


I wasn't sure what you meant, until I read later that you are referring to
"blending" as a form of temporal interpolation. This is indeed pretty useless.
Only vector-based motion-compensated up-conversion makes any sense. That is
complex and expensive, and it can fail on occlusion, de-occlusion, repetitive
structures, small objects, etc. Still, judder is much, much worse to watch.
A good solution is to send 1080p 24 Hz (25 Hz) from the Blu-ray player to the
TV, and then let the TV up-convert this to 120 Hz (100 Hz). I watch this every
day; it is beautiful.
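
To make the difference concrete, here is a toy one-dimensional Python example:
a blend produces a double image of a moving object, while shifting along the
motion vector puts it in the right place. The motion vector is given here by
assumption; estimating it is exactly the hard, failure-prone part of a real
up-converter.

```python
import numpy as np

def blend(f0, f1, phase):
    """Naive temporal 'interpolation' by cross-fading two frames."""
    return (1.0 - phase) * f0 + phase * f1

def motion_compensated(f0, phase, vector):
    """Shift frame 0 along a (known) motion vector to the target phase."""
    return np.roll(f0, int(round(vector * phase)))

f0 = np.zeros(32)
f0[4:8] = 1.0        # object at positions 4..7 in the first frame
f1 = np.roll(f0, 8)  # same object 8 pixels further in the next frame

print(np.flatnonzero(blend(f0, f1, 0.5) > 0))              # [ 4 5 6 7 12 13 14 15]
print(np.flatnonzero(motion_compensated(f0, 0.5, 8) > 0))  # [ 8 9 10 11]
# The blend leaves two half-brightness ghosts; the motion-compensated frame
# shows one object exactly halfway along its trajectory.
```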

Regards,
-- Jeroen


  Jeroen H. Stessen
  Specialist Picture Quality

  Philips Consumer Lifestyle
  Advanced Technology  (Eindhoven)
  High Tech Campus 37 - room 8.042
  5656 AE Eindhoven - Nederland






