[opendtv] Re: MPs back Ofcom stance on spectrum sale

  • From: Craig Birkmaier <craig@xxxxxxxxx>
  • To: opendtv@xxxxxxxxxxxxx
  • Date: Wed, 23 May 2007 08:57:49 -0400

At 3:16 PM -0400 5/22/07, Manfredi, Albert E wrote:
Craig Birkmaier wrote:
 > First, you cannot equate pixel density and bandwidth requirements.

I already showed you how to determine the max bandwidth available for
sending baseband info to the display. If you use that same scheme, you
will find that the 2.25X factor exists for 720 at 60p compared with 1080
at 60p. It is the maximum possible, so you can avoid going around in
circles wondering about the detail content in the 720p vs the 1080p
images, entropy, etc. All of that is canceled out of the equation. In
the worst case, 1080p transmission *is* indeed reasonable, if your
compression algorithm is 2:1, or slightly better, against H.262. Simple
statement, Craig.

Hmmmmm - I wonder how we can cancel Bert out of the equation. To say that the statement above is uninformed is being kind. It is pure rubbish.

We are not talking about baseband image streams and the bandwidth required to deliver uncompressed image streams to a display. You might be, but this completely misses the point.

The ability to push uncompressed 1080@60P to a display is not in question. We have been doing this for a number of years with computer monitors and moving uncompressed 1080@60P (4:2:0) and even 4:4:4 over multiple HD-SDI links for years as well. HDMI was designed to deal with this requirement in consumer equipment.
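To put some numbers on this, here is a quick back-of-the-envelope in Python (nothing assumed beyond the standard format dimensions):

```python
# Raw pixel rates for the two progressive HD formats under discussion.
def pixel_rate(width, height, fps):
    """Pixels per second for a progressive format."""
    return width * height * fps

rate_720p60 = pixel_rate(1280, 720, 60)    # 55,296,000 pixels/s
rate_1080p60 = pixel_rate(1920, 1080, 60)  # 124,416,000 pixels/s

# Bert's 2.25X factor is nothing more than the ratio of raw pixel rates.
print(rate_1080p60 / rate_720p60)  # 2.25
```

That ratio tells you about the baseband link to the display, and nothing about how the two formats behave inside a constrained emission channel.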

The problem is delivering high quality COMPRESSED versions of these formats through an emission channel that is constrained to less than 18 Mbps.

You are WRONG about the notion that the improvement in compression efficiency with H.264 is sufficient to enable the delivery of 1080@60P. Yes it is possible, just as it is possible to deliver crappy 1080@30i through the ATSC pipe. You can get away with it for source that is not too complex, but it falls apart when the information content causes the peak bit rate requirements to spike to levels of 30 Mbps or higher. Even 720P can cause the peak bit rate requirement to exceed 30 Mbps.

Since the channel cannot handle these peaks the encoder is forced to quantize the hell out of the source to make it fit, and we see compression artifacts. OR we must pre-filter the source, removing high frequency detail BEFORE it is encoded.
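The scale of the problem is easy to see with a little arithmetic (a sketch, assuming 8-bit 4:2:0 source and an 18 Mbps channel):

```python
# Average compression ratio needed to fit uncompressed 1080@60P 4:2:0
# into an 18 Mbps emission channel.
def raw_bitrate_420(width, height, fps, bit_depth=8):
    # 4:2:0 carries 1.5 samples per pixel (full-res luma plus two
    # quarter-resolution chroma planes).
    return width * height * fps * 1.5 * bit_depth

raw = raw_bitrate_420(1920, 1080, 60)  # 1,492,992,000 bits/s (~1.49 Gbps)
channel = 18e6                         # 18 Mbps emission channel
print(raw / channel)                   # ~83:1 average compression required
```

And that is the *average* requirement; complex scenes push the instantaneous requirement far higher, which is exactly where the quantizer starts doing its damage.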

If you don't believe me, look at the EBU demonstration results. Improvements in the performance of H.264 encoders are not going to change the basic fact that there is simply too much information to squeeze into the emission channel if you want to maintain decent image quality. We will see marginal improvements in the next few years, but we would need another 2:1 improvement or more to make this stuff fit. Perhaps in another 10 years when H.264 is replaced...



 next, you are making an invalid assumption about the adequacy of
 20 Mbps for 720P.

No. I am only saying that if it's adequate for 720p, it will be equally
adequate for 1080p, with the better compression algorithm. Again, *how*
adequate is canceled out, in the comparison.

And this statement is wrong as well.

720P will compress more efficiently UNLESS the information content of the 1080@60P source is equal to or less than the information content of the 720P source. You can compress both to the same bit rate, but you will throw away most of the higher frequency detail in the 1080@60P source, which cancels out the ONLY reason why we would want to deliver 1080@60P in the first place.

 > The U.S. broadcasters are protecting "their" spectrum and
 retransmission consent.

 Now the U.K. broadcasters are using HDTV to try to protect
 their spectrum.

They aren't. The spectrum they are asking for is not even as much as
they had in analog. Their SDTV programs take up less spectrum than they
used to need, and Ofcom is telling them they cannot have any of the
analog spectrum back for HDTV.

Check your facts, Bert.

The analog service includes the following channels

BBC one
BBC two
ITV (referred to by Ofcom as "Channel 3")
Channel 4
Channel 5 (branded simply as 'five').

Channel 5 was barely squeezed into the available broadcast spectrum, and in some areas of the UK it is carried on a UHF channel. In some areas of the UK there are additional local services in the UHF spectrum.

The digital service operates from five (that's 5) multiplexes, Bert.

Near as I can tell, this is the same amount of spectrum, Bert, given that they will be able to deliver Freeview to everyone without the need to use additional UHF channels to fill in the gaps that existed in the analog service.

If Ofcom really wanted to limit broadcasters to what they had with the analog service, this could have been done in one or a maximum of two digital multiplexes. Now THAT would have reduced the spectrum available to broadcasters.

But the SAME amount of spectrum was used for Freeview to deliver more than 30 channels of SDTV. Now the U.K. broadcasters are saying that they need MORE SPECTRUM to deliver HDTV.

If the broadcasters wanted to scrap Freeview and go back to the days when there were only five channels in the U.K. they could deliver HDTV via the five channels used for Freeview.

This is EXACTLY the same situation as for U.S. broadcasters, who are trading an analog channel for a digital channel that can deliver HDTV.

The difference is that the U.S. broadcasters have traditionally tied up more spectrum to deliver the analog service. In the U.S. the 700MHz spectrum (channels above 52) is being recovered. In the UK it appears that they are recovering the UHF spectrum that has been used to fill in the gaps in the analog VHF service.

I don't know the number of households actually viewing HDTV content, and
you haven't provided any numbers, but my bet is that the number is
rapidly growing. It's not for nothing that Verizon, DirecTV, and cable
companies are advertising their HD content so heavily these days.

Yes, the number is growing, primarily due to the multi-channel subscription services that are now pushing HDTV as a premium tier. I can now access 12 HDTV channels via Cox cable. Six of these offer HD programming 24/7. Two are premium channels (HBO and Showtime) that are HD 24/7. And four are broadcast channels that offer a few hours of HD programming during prime time.

Clearly broadcasters ARE NOT the driving force behind the take-up of HDTV services in the U.S.

It probably helps that analog content looks pretty darned bad on LCDs
and plasmas. At the store, the image looked wonderful. Take it home, it
looks crappy. Let's get the digital tier! Let's get HD! So it's a
completely different situation now than it was in the early 1990s, and
it is driven at least as much by consumer interest as it is by
broadcasters.

There are two main reasons that analog programming looks lousy on the new HD capable displays:

1. The source is interlaced. In order to be displayed, this means that the image processor in the display must de-interlace the source.

2. The quality of the source is marginal for many reasons. First is just the poor quality of the channel. For broadcast sources this includes multipath, noise and other distortions. For cable this includes noise and the poor performance of the analog plant.

Things improve with digital distribution of interlaced SD sources.

For DBS and the SD digital tiers delivered by cable, the SD source is typically very good when it is encoded for emission. This is analogous to the quality delivered by Freeview, as the source that is encoded is high quality digital component SD. The main problem with digital distribution is the artifacts that result because the signals are OVER COMPRESSED. When a display processor gets high quality samples it does a relatively good job with the de-interlacing and scaling. When the source is compromised by excessive quantization and blocking artifacts, the local display processor is going to further degrade the quality. This is why it is not only possible, but desirable, to back off on delivered resolution in order to give the local display processor the best quality samples possible.

And it SHOULD BE obvious that the proper place to deinterlace is PRIOR to encoding for emission. When this is done you can deliver the same or better quality as interlaced SD in the same bandwidth, and the local display processor DOES NOT need to de-interlace. We knew this in 1992 and made the recommendation to the Advisory Committee On Advanced Television Services that ONLY progressive formats should be used for the OTA DTV service. Unfortunately the people pulling the puppet strings in ACATS and at the ATSC did not agree, thus the delivered quality of most programming has suffered.
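The idea is simple enough to sketch in a few lines of Python (a toy illustration only - rows are plain lists standing in for scan lines, and a real plant deinterlacer would be motion-adaptive rather than a naive weave):

```python
# Toy illustration of "deinterlace before encoding": weave two fields
# (top = odd scan lines, bottom = even scan lines) into one progressive
# frame at the plant, so the receiver's display processor never has to
# guess at the missing lines.
def weave_fields(top_field, bottom_field):
    """Interleave two fields into a single progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # line from the top field
        frame.append(bottom_row)  # line from the bottom field
    return frame

# Two 2-line fields weave into one 4-line progressive frame.
frame = weave_fields([[10, 11], [30, 31]], [[20, 21], [40, 41]])
print(frame)  # [[10, 11], [20, 21], [30, 31], [40, 41]]
```

A naive weave is only correct when both fields come from the same instant; with motion the plant must interpolate, which is exactly why it is better done once, upstream, with good source, than in every receiver.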

It seems to me that the PROPER place to focus our attention in terms of improving the overall quality of DTV is at the lower end of the spectrum - improving the delivered quality of the SD content that represents the vast majority of what is delivered by broadcasters, cable and DBS today. The notion that we should be trying to push the limits at the high end with 1080@60P is completely absurd!

Regards
Craig


----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at FreeLists.org
- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word 
unsubscribe in the subject line.
