Is there a Moore's Law for codec efficiency, or is there a theoretical limit? It seems impossible to represent an entire 1920x1080 frame with a single bit (unless the whole screen is a flat color), so there must be a theoretical limit to how far you can compress an image and still have a usable picture. If so, how far from that limit is MPEG4/AVC? Is MPEG4/AVC close enough that it really could be a standard that lasts 20 years?

Personally, I have no quarrel with Europe's "problem" of obsolete MPEG2 receivers. They rolled out digital using very inexpensive boxes, and can slowly starve those boxes of bits to make room for AVC simulcasts in HD. Just as DVB-T allows an almost continuous sliding scale of bitrate vs. robustness, there is an inherent sliding scale of SD quality vs. HD quality and/or number of HD services. Only those who need HD will have to replace their tuners, and in many cases the tuner will be built into the display; or else, what is another $300US on top of a $2,000US HD display?

Australia got the worst of it by allowing SD-only boxes to be sold while still demanding that HD use MPEG2. Perhaps their HD penetration is so low that they could switch HD to MPEG4 and compensate those few HD adopters. Then they would be back in harmony with the Old Country.

John

----- Original Message -----
From: "Bob Miller" <bob@xxxxxxxxxx>

> The UK is starting to talk about the MPEG4 problem now.
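The theoretical-limit question above can be put in rough numbers. This is a back-of-the-envelope sketch, not anything from the post itself: the bit depth, frame rate, and the 8 Mbit/s AVC service bitrate below are all illustrative assumptions.

```python
# Rough compression ratio of an assumed AVC HD broadcast vs. raw 1080p video.
# All parameters are illustrative assumptions, not figures from the discussion.

WIDTH, HEIGHT = 1920, 1080
RAW_BPP = 24              # assumed 8-bit RGB; broadcast 4:2:0 would be 12 bpp
FPS = 30                  # assumed frame rate
AVC_BITRATE = 8_000_000   # assumed 8 Mbit/s for one HD service

raw_bits_per_frame = WIDTH * HEIGHT * RAW_BPP
coded_bits_per_frame = AVC_BITRATE / FPS     # average bits per coded frame
ratio = raw_bits_per_frame / coded_bits_per_frame

print(f"raw frame:   {raw_bits_per_frame:,} bits")
print(f"coded frame: {coded_bits_per_frame:,.0f} bits (average)")
print(f"compression: about {ratio:.0f}:1")
```

Under these assumptions a raw frame is roughly 50 Mbit, while the codec averages well under 300 kbit per frame, i.e. a ratio on the order of a couple of hundred to one. So the floor is nowhere near one bit per frame, but it shows how far below raw the practical operating point already sits.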