[opendtv] Re: Receiver costs too expensive in the Brazilian DTV system:

  • From: Tom Barry <trbarry@xxxxxxxxxxx>
  • To: opendtv@xxxxxxxxxxxxx
  • Date: Tue, 11 Sep 2007 12:50:14 -0400



Craig Birkmaier wrote:
> Cross fades and fade-to-black are the most difficult
> pathological cases for any digital compression algorithm. Cross
> dissolves would be better accomplished by sending two overlapping
> streams and doing the transition in the receiver. Fades should be done
> in the receiver as this only requires attenuation of a single stream.
> The fact that this is not being done is a strong indication that content
> providers are fighting the logical approach to do more with the
> processing power in the receiver/STB. But I digress...

I've suggested before that, even for MPEG-2, a good encoder could probably recognize and approximate fades by linearly decreasing the luma in all pixels of every block. After motion compensation and the DCT, the differences due to the fade would then be restricted to the average luma alone, 1 component out of 96 in 4:2:0 MPEG-2 encoding. A newer codec could even implement global luma compensation.
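To make the idea concrete, here is a rough numpy sketch of estimating a single global fade gain and predicting the current frame as an attenuated copy of the reference. The function names and the least-squares gain estimate are illustrative assumptions only, not code from any real encoder:

    import numpy as np

    def estimate_fade_gain(ref_luma, cur_luma):
        # Least-squares scale factor g minimizing ||cur - g * ref||^2
        # over the whole picture (a crude global luma/fade estimate).
        ref = ref_luma.astype(np.float64)
        cur = cur_luma.astype(np.float64)
        denom = np.sum(ref * ref)
        return float(np.sum(ref * cur) / denom) if denom else 1.0

    def fade_compensated_residual(ref_luma, cur_luma):
        # Predict the current frame as a uniformly attenuated copy of
        # the reference.  For an ideal linear fade the residual is
        # close to zero, so after the DCT most of what remains sits in
        # the per-block average (DC) luma coefficients.
        g = estimate_fade_gain(ref_luma, cur_luma)
        prediction = np.clip(g * ref_luma.astype(np.float64), 0, 255)
        return cur_luma.astype(np.float64) - prediction, g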

- Tom


At 9:04 PM -0400 9/10/07, Albert Manfredi wrote:

So just for grins, I just now conducted a close inspection of the WETA-DT multiplex, which is the most densely populated that I receive so far. One HD and three SD channels. I was standing no more than 2' from the 26" screen.


Bert has correctly identified the problem "he" is having relative to the visibility of compression artifacts. He is watching all this digital stuff on a screen that is incapable of resolving all of the detail (and artifacts) at ANY viewing distance. At least he acknowledges that at 2 feet he can see a difference - but at a normal viewing distance he might as well be watching NTSC.


The cuts seemed artifact-free to me, completely.


As they should be. The only time that one might expect to see artifacts at a cut transition is if both the incoming and outgoing streams are bit starved. Usually a new I-frame takes care of the problem.
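A minimal sketch of that decision, assuming a numpy luma frame and a made-up threshold (real encoders use their motion-estimation cost, but the idea is the same in spirit):

    import numpy as np

    def is_scene_cut(prev_luma, cur_luma, threshold=30.0):
        # Mean absolute luma difference between consecutive frames.
        # When prediction from the previous picture is hopeless, the
        # encoder can simply start a fresh I-frame at the cut.
        mad = np.mean(np.abs(cur_luma.astype(np.float64) -
                             prev_luma.astype(np.float64)))
        return mad > threshold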


They showed a closeup of the legs of many individuals on treadmills, filling the screen. From that close viewing distance, yes, I could see macroblocking on the legs as they moved quickly back and forth. But that was the only time I noticed anything that obvious, and I'm sure it would have been less obvious from a normal viewing distance.


Not on a 40" screen or larger, as is required to BEGIN to see the detail in an HD source.


In one fade, a man's head came on while it was moving to one side. I got an instantaneous macroblocking event, and then it was fine.


Cross fades and fade-to-black are the most difficult pathological cases for any digital compression algorithm. Cross dissolves would be better accomplished by sending two overlapping streams and doing the transition in the receiver. Fades should be done in the receiver as this only requires attenuation of a single stream. The fact that this is not being done is a strong indication that content providers are fighting the logical approach to do more with the processing power in the receiver/STB. But I digress...
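Purely as an illustration of what receiver-side transitions would look like (a hypothetical STB that has both streams decoded; nothing like this exists in current receivers), the blend itself is trivial:

    import numpy as np

    def receiver_dissolve(frame_out, frame_in, t):
        # Cross dissolve between two decoded frames: t runs from 0.0
        # (all outgoing program) to 1.0 (all incoming program).
        # Neither compressed stream ever contains the blended picture,
        # so neither one is hard to encode.
        a = frame_out.astype(np.float64)
        b = frame_in.astype(np.float64)
        return np.clip((1.0 - t) * a + t * b, 0, 255).astype(np.uint8)

    def receiver_fade_to_black(frame, t):
        # Fade-to-black is just attenuation of a single decoded stream.
        return np.clip((1.0 - t) * frame.astype(np.float64),
                       0, 255).astype(np.uint8)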


Okay, I wasn't watching football, but what I saw, to me, was very acceptable. And no comparison at all with NTSC. And remember, I was right up close to the screen.


Yes. "most of the time" compressed digital is better than NTSC, unless of course it is pre-filtered to make the stream fit into a multiplex, as is the case with the sub-channels here in Gainesville.


The SD channels were probably well pre-filtered, because they seemed to lack any obvious artifacts. And from up close, they were certainly not as sharp as the HD channel, but again, no comparison at all with NTSC, which is essentially unwatchable that close up.


Apples and oranges. NTSC has its own set of artifacts that are visible even with the very best reception. But it can deliver very good images. Digital SD streams may show many of these artifacts as well; interlacing is the major culprit here. But a prefiltered source may eliminate many of the higher frequency vertical details that make the interlacing artifacts visible. I have seen MANY, MANY digital streams that are far inferior to NTSC. Just tune into the CW sub-channel of the ABC affiliate here in Gainesville and you can see this 24/7.


It's not about better quality images, it's ALL about more channels
in less bandwidth to make more money. That's all it is about.


I think it is about both. People are demanding and will continue to demand better quality simply because they are buying plasmas and LCDs in droves, in sizes far bigger than their old CRTs. And I, for one, welcome the greater choice as well.


This may well drive the demand for more HD programming, but it is doing NOTHING to improve delivered image quality. The only digital sources that are properly encoded today are DVDs. When broadcasters learn that compression should be part of the production process - as the movie guys have learned - rather than part of the emission process, we may see better delivered image quality. Managing the emission multiplex is a data processing problem, not an image processing or compression problem. Programs should be delivered pre-compressed with good quality, and the broadcasters should just deliver the bits.

Imagine buying a software application by downloading the bits via the Internet, then having the seller tell you that some of the features of the application may not work because of IP packet errors. It is EASY to use digital compression to deliver high quality - the current approach being used by ALL program distributors is the problem - they do not understand how to properly implement a digital TV delivery system.


My conclusion is that with MPEG-2, while near-perfect 1080i might require an average 16 Mb/s, this is by no means a hard number. I'm not saying that a comparison with H.264 would show no difference. I'm simply saying that these are soft numbers. Yeah, if we try squeezing that down to less than 1.5 or so Mb/s, MPEG-2 will fall apart completely, and H.264 will not. But those images would not be so great on most TV-size screens anyway.


OF COURSE the numbers are all SOFT. This is an entropy coding system.

Bit rate requirements vary with scene complexity and motion content. The proper way to manage a multiplex is to assure that all needed bits are delivered by the time they are needed. If you know what the requirements are for each sub-stream in advance this is relatively easy to accomplish - it becomes a file delivery problem.

This is a bit more complex with live programming, in which case the only way to assure good delivered image quality is to allow enough overhead to deal with peak bit rate requirements. But most of what is broadcast is NOT live.
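A toy allocator along those lines (the pool size, the complexity numbers and the 20% headroom figure are all made-up assumptions, not anything from a real statmux):

    def allocate_mux_bits(complexities, pool_kbps, live_flags, headroom=0.2):
        # Split the multiplex bit pool across sub-streams in proportion
        # to their scene complexity.  A fixed fraction is held back as
        # headroom and handed to live programs, whose peak requirements
        # can't be known in advance.
        total = float(sum(complexities)) or 1.0
        n_live = sum(live_flags)
        reserve = pool_kbps * headroom if n_live else 0.0
        shared = pool_kbps - reserve
        rates = [shared * c / total for c in complexities]
        if n_live:
            bonus = reserve / n_live
            rates = [r + (bonus if live else 0.0)
                     for r, live in zip(rates, live_flags)]
        return rates

    # e.g. one live HD program and three pre-encoded SD programs in a
    # 19.39 Mb/s ATSC channel:
    # allocate_mux_bits([8, 2, 2, 1], 19390, [True, False, False, False])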

Regards
Craig



--
Tom Barry                  trbarry@xxxxxxxxxxx  



----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at FreeLists.org
- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word 
unsubscribe in the subject line.
