[opendtv] Re: ITU-T/MPEG, H.264/AVC, H.265/HEVC

  • From: "TLM" <TLM@xxxxxxxxxx>
  • To: <opendtv@xxxxxxxxxxxxx>
  • Date: Thu, 14 Feb 2013 15:37:20 -0800

We know that all of these coding standards are asymmetric in the sense that
the real complexity & cost burden is on the ENcoder.  There are a huge
number of tools available in the respective Standards, and the encoder can
choose to leverage them in all kinds of combinations depending on the
business goals of whoever is producing (and purchasing) the resulting
bitstreams.  The type of content being ingested by the ENcoder may also
have an effect - progressive versus interlaced, film (with grain) versus
electronically captured.  Another example is the quality dimension - a
really expensive ENcoder might be able to generate higher quality at the
same bitrate, or equivalent quality at a lower bitrate.  Lower bitrate may
mean less bandwidth and buffering required at the decoder end of the pipe,
and lower CPU usage.  And so on.  Yet another factor is the various
profiles and levels.  Depending on who is purchasing the bitstream for XYZ
delivery pipe, they could order it up in any one of a number of forms.  A
side effect of this could be that their product might be easily usable on
lower-end platforms, perhaps even H.265/HEVC for mobile devices (you will
see more press on that shortly).
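
To make the cost/quality trade-off concrete, here is a rough sketch in
Python driving ffmpeg.  It assumes an ffmpeg build with libx265 on the
PATH; the filename, preset names and CRF value are placeholders, not a
recommendation:

import subprocess

SOURCE = "input.mp4"  # hypothetical source clip

# Cheap, fast encode: minimal mode/motion search effort at the encoder.
subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265",
                "-preset", "ultrafast", "-crf", "28", "cheap.mp4"],
               check=True)

# Expensive encode: far more encoder effort at the same quality target,
# which typically buys a smaller file (or better quality at the same size).
subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265",
                "-preset", "veryslow", "-crf", "28", "expensive.mp4"],
               check=True)

The "veryslow" run burns vastly more ENcoder cycles searching modes and
motion, which is exactly the asymmetry I'm describing - the DEcoder cost of
the two outputs is roughly the same.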

So the fact that you saw some troubles at the time might have a number of
root causes that have nothing to do with the Standard per se.  Or maybe you
had a slow PC at the time.  Or maybe there were other processes running in
the background.  Who knows.

Safe to say that the newer CPUs can easily handle all of the HD resolutions
and frame rates.  Some of the new DEcoder APIs and apps are also able to
sniff out the availability of a GPU and, if one is present, offload a good
percentage of the work onto that chip or chipset (some of the new GPUs are
outright awesome, even the ones on low-end platforms).
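
A trivial way to see the "sniff out the GPU" idea in action (Python again;
assumes ffmpeg is on the PATH, and the method names it reports vary by
platform and build):

import subprocess

out = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                     capture_output=True, text=True, check=True).stdout

# Skip the "Hardware acceleration methods:" header line and keep the
# back-end names (e.g. dxva2, vaapi, cuda) that a player could hand work to.
methods = [ln.strip() for ln in out.splitlines()
           if ln.strip() and not ln.strip().endswith(":")]

if methods:
    print("Hardware decode paths available:", ", ".join(methods))
else:
    print("Nothing reported - decode falls back entirely to the CPU.")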

I hate to bring this up again, but interlaced versus progressive also may
have an effect.  Having to deinterlace at the receiver end (whether in a PC
or in a flat panel) adds processing burden and consumes additional CPU
cycles - no way around it.  As with the above, there is a range of
deinterlacer designs, some more costly than others.  But we all now know
what the right solution to that problem is... especially now that all new
CE displays are progressive.
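
Just to show that even the crudest deinterlacer is real work, here is a
sketch of a plain "bob" (line-doubling) deinterlacer in Python/NumPy.  The
frame size and the interpolation choice are illustrative only; the
motion-adaptive and motion-compensated designs cost considerably more:

import numpy as np

def bob_deinterlace(frame):
    """Split an interlaced frame into two fields and line-double each."""
    top = frame[0::2, :]     # even lines (top field)
    bottom = frame[1::2, :]  # odd lines (bottom field)
    # Doubling each field's lines back to full height means every output
    # pixel gets written for every field, 50 or 60 times a second.
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

interlaced = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
field_a, field_b = bob_deinterlace(interlaced)
print(field_a.shape, field_b.shape)  # (1080, 1920) each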

And finally, I'll paraphrase what went on on another thread - 720p versus
1080i.  Some were asserting that 1080i ALWAYS looked better than 720p.  So I
asked the following question:  "Does anyone ever consider the ENcoders?
There are varying degrees of quality at the ENcoding end, independent of the
720p60 and 1080i30 numerology that people so often want to talk about.  So
let's go back to my original post and the issue that started this - if you
have two 1080i ENcoders, A and B, where A is a relatively inexpensive
encoder and B is a very expensive encoder from a company that's known for
pulling out all the stops, is it likely that both 1080i emission streams
have identical perceptual quality? 

Similarly, if you have a really good 720p encoder and compare its decoded
imagery to that from A or B mentioned above, will A and B always have better
perceptual quality simply because they are 1080i? 

Finally, since we know definitively that interlace encodes less efficiently
than progressive, if you have a fixed bitrate (broadcast) channel, the
interlaced encoders mentioned above are shackled by having to do
prefiltering in maybe H or V or both, and maybe coarser quantization and
entropy coding.  So are encoders A and B both making exactly the same
decisions in those respects?  Rather unlikely, I would expect, if they are
coming from different manufacturers and targeting different price points."
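
To put rough numbers on that fixed-bitrate point, here is a small Python
back-of-the-envelope.  The 12 Mb/s channel rate is purely illustrative, not
any particular emission spec:

CHANNEL_BPS = 12_000_000  # hypothetical fixed emission bitrate

pixel_rates = {
    "720p60":  1280 * 720 * 60,        # progressive: full frames, 60 per second
    "1080i30": 1920 * 1080 // 2 * 60,  # interlaced: half-height fields, 60 per second
}

for name, pps in pixel_rates.items():
    print(f"{name}: {pps / 1e6:.1f} Mpixel/s delivered, "
          f"{CHANNEL_BPS / pps:.3f} bits per pixel at 12 Mb/s")

And that is before the interlace coding-efficiency penalty; whether an
ENcoder takes that back through prefiltering, coarser quantization, or
something else is exactly the kind of per-vendor decision I'm talking
about.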

Sorry for getting slightly off topic.  But as John Muir supposedly said,
"When we try to pick out anything by itself, we find it hitched to
everything else in the Universe."  I would hope that the people on this
list are a bit more open-minded than to be saying things like "X is
*always* better than Y", especially when it comes to video formats, video
coding, or the UX, UI & human-factors design issues that are currently
being discussed here.

-----Original Message-----
From: opendtv-bounce@xxxxxxxxxxxxx [mailto:opendtv-bounce@xxxxxxxxxxxxx] On
Behalf Of Manfredi, Albert E
Sent: Thursday, February 14, 2013 2:06 PM
To: opendtv@xxxxxxxxxxxxx
Subject: [opendtv] Re: ITU-T/MPEG, H.264/AVC, H.265/HEVC

TLM wrote:

> It may also be useful to note that recent reports say Intel is 
> embracing H.265/HEVC for their new platforms.

I have a question on this.

When H.264 rolled out, I soon discovered that my PC at that time couldn't
really keep up. It seemed unable to decode all of the required frames in
time, which often resulted in a jerky presentation.

So my question is, will today's Core i3, i5, and i7 processors be able to
decode H.265, and/or will video adapter cards, using PCIe for instance, be
available in case the CPU can't hack it? Or is H.265 going to do for PC
sales what Win8 apparently couldn't achieve?

Thanks!

Bert

 
 
----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at 
FreeLists.org 

- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word 
unsubscribe in the subject line.
