[opendtv] Re: 1080p @ 60 is Next?

  • From: Tom Barry <trbarry@xxxxxxxxxxx>
  • To: opendtv@xxxxxxxxxxxxx
  • Date: Sat, 19 May 2007 19:30:27 -0400



Craig Birkmaier wrote:
> This will happen in time. Most likely it will be the fall-out from using
> even higher resolution acquisition gear. I can easily imagine shooting
> in 4K x 2K with a Red Camera, then resampling to 1080P for production.
>
I think the Red Camera uses that weird but interesting kind of checkerboard sampling whose name I can never remember. But, if so, I don't think it really qualifies as something where you can effectively oversample at 4K.
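(The name, I believe, is the Bayer pattern.) A minimal sketch in Python, assuming a standard RGGB Bayer layout and the nominal 4K x 2K raster from the quote above, of why a single-chip mosaic sensor samples each color at less than the full photosite count:

import numpy as np

def bayer_mask(height, width):
    """Return per-channel boolean masks for an RGGB Bayer mosaic."""
    rows = np.arange(height)[:, None]
    cols = np.arange(width)[None, :]
    r = (rows % 2 == 0) & (cols % 2 == 0)   # R on even rows/cols
    g = (rows % 2) != (cols % 2)            # G on the checkerboard
    b = (rows % 2 == 1) & (cols % 2 == 1)   # B on odd rows/cols
    return r, g, b

h, w = 2048, 4096                           # nominal "4K x 2K" raster
r, g, b = bayer_mask(h, w)
total = h * w
print(f"green: {g.sum()/total:.0%}, red: {r.sum()/total:.0%}, "
      f"blue: {b.sum()/total:.0%}")
# -> green 50%, red 25%, blue 25%: full-resolution RGB must be
#    interpolated, so the effective oversampling versus 1080p is less
#    than the raw photosite count suggests.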

- Tom

At 9:17 AM -0700 5/17/07, dan.grimes@xxxxxxxx wrote:

Call me nuts and unscientific, but when I compare 1080i and 720p original source material from a camera, shooting a static scene, on two monitors side by side, I prefer the 1080i at almost any distance, for various reasons including perceived sharpness.

When I compare 1080i source material off an HDCAM tape and a 720p source off DVCProHD, I can tell the difference across the studio floor. The 1080i wins hands down, but not necessarily based on resolution. Color (and especially color resolution) has a lot to do with it.


Methinks you may have X-ray vision, Superman.

Comparisons like this are virtually meaningless without some calibration with respect to the systems being viewed.

I do not doubt that a 1080 line system can acquire more detail than a 720 line system; however, I do question your contention that the 1080 line system delivers more perceived sharpness, especially "across the studio floor."

As a starting point, you need to consider the variables that may be at work in the systems being compared. The HDCAM product line tends to be high quality, using 2/3 inch CCDs - it produces very good 1440 x 1080 images.

The DVCPro product line offers a wide range of cameras at varying levels of performance. The very popular HVX200 and the new HPX500 use 1/3 inch and 2/3 inch CCDs respectively - the sensors have only 960 x 540 resolution. Other DVCPro HD products offer significantly enhanced performance.
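A quick back-of-the-envelope comparison, in Python, of the photosite counts quoted above (a rough sketch; it ignores the pixel-shift tricks some of these cameras use to claw back resolution):

hdcam  = 1440 * 1080   # HDCAM-class sensor
hvx200 =  960 *  540   # HVX200 / HPX500-class sensor
print(f"HDCAM:  {hdcam / 1e6:.2f} Mpx")
print(f"HVX200: {hvx200 / 1e6:.2f} Mpx")
print(f"ratio:  {hdcam / hvx200:.1f}x")   # -> 3.0x more photosites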

So this alone could account for the differences you claim to have seen.

Then there is the issue of lenses.

Then there are the displays that were used to view the two sources.

What I am trying to get across here is that you cannot simply claim that these differences are due to the shooting format.

Do you remember the post from Mark Schubin about a Metropolitan Opera shoot using the Thomson Grass Valley WorldCams, which can output both 1080i and 720P? These cameras have an oversampling sensor, which helps the 720P output significantly. So much so that, while setting up for a shoot, they discovered that one camera was in 720P mode while the others were in 1080i mode. As I recall, Mark stated that there was virtually no perceptible difference through the switcher and the studio monitors.

When I compare 1080i and 720p encoded in JPEG2000 at 300Mb/s, the 1080i wins for numerous reasons (this was projected on a 35' screen, so resolution was certainly a factor).


You bet the 1080 will look better on a 35 foot screen. This is an application that needs the extra resolution. But were you really looking at 1080i, or were you looking at 1080@24P? If 24P, the images should have been much superior; if 1080i, only slightly superior, primarily in horizontal detail. And then there is the question of the projector used...
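For scale, a rough calculation in Python of the bit budget at 300 Mb/s, assuming a 1080i luma raster at 30 frames (60 fields) per second:

luma_rate = 1920 * 1080 * 30   # luma samples/s (60 fields of 540 lines)
print(f"{300e6 / luma_rate:.1f} bits per luma sample")   # -> ~4.8
# At nearly 5 bits per sample, compression artifacts all but vanish and
# raw resolution becomes the visible differentiator on a big screen.
# Compare ~0.3 bits per sample in a broadcast channel (sketch below).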


But, when I compare 1080i material and 720p material from transmitted sources (OTA, DBS, Cable, etc.), the 720p format wins every time at any distance, independent of the screen resolution (that is, 720 or higher) and especially if motion is involved. I say at any distance (within reason) because the artifacts are so gross that pixels don't matter; I'm seeing blocks.


THANK YOU!

The reality is that 720P is a superior emission format to 1080i, and the main reason is the "i."

There is not enough headroom in today's emission channels to handle the peak bit rate requirements for 1080i. By extension this is also true for 1080@60P, but for different reasons. The elimination of interlace improves the entropy situation, but a 1080@60P camera is less sensitive and more prone to noise. Under ideal conditions it can capture significantly greater detail, which in turn adds to the stress on the encoder. The best use of 1080@60P cameras is to resample to 720P, to gain the benefits of oversampling and to provide more headroom in the emission channel to handle the peak bit rate requirements.
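A rough sketch of the headroom argument, assuming a nominal ATSC-class payload of 19.4 Mb/s (actual video rates run lower once audio and overhead are subtracted):

FORMATS = {
    "720p60":  1280 * 720  * 60,   # progressive, 60 frames/s
    "1080i30": 1920 * 1080 * 30,   # 60 fields/s of 540 lines each
    "1080p60": 1920 * 1080 * 60,
}
CHANNEL_BPS = 19.4e6               # assumed ATSC-class channel payload

for name, px_per_sec in FORMATS.items():
    bpp = CHANNEL_BPS / px_per_sec
    print(f"{name}: {px_per_sec / 1e6:6.1f} Mpx/s "
          f"-> {bpp:.3f} channel bits/pixel")
# 720p60 leaves roughly 12% more bits per pixel than 1080i, and 2.25x
# as many as 1080p60 -- headroom the encoder can spend on motion peaks.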


My point? It doesn't really matter how high a resolution one produces in when the transmission medium can't handle it. The transmission compression is clearly the limiting factor, so one might as well down convert (if necessary) to make the best use of the transmission format.


Point well taken.

But this seems to be a reluctant conclusion. The major factor here is that resampling produces a higher quality HDTV raster, albeit at a lower spatial resolution. What we really want is to deliver the highest quality HDTV samples possible, and this suggests that a lower resolution emission format that can be encoded with fewer artifacts delivers the best pictures on displays of ALL resolutions.
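A minimal sketch of that resampling step (a plain area average for the 2/3 scale from 1080 to 720 lines; real downconverters use longer polyphase filters, so treat this as an illustration of the oversampling benefit, not broadcast-grade filtering):

import numpy as np

def downconvert_1080_to_720(frame):
    """Area-average a 1080-line luma frame to 720 lines (2/3 per axis).

    Exact 2/3 resampling: duplicate each sample (x2), then box-average
    groups of 3.
    """
    h, w = frame.shape
    up = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)  # 2160 x 3840
    return up.reshape(h * 2 // 3, 3, w * 2 // 3, 3).mean(axis=(1, 3))

# Synthetic noisy 1080p frame: averaging ~2.25 source pixels per output
# pixel cuts uncorrelated sensor noise roughly in half here.
rng = np.random.default_rng(0)
frame = 128 + rng.normal(0, 8, size=(1080, 1920))
out = downconvert_1080_to_720(frame)
print(out.shape, f"noise: {frame.std():.2f} -> {out.std():.2f}")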


However, broadcast is not the only application involved. I don't remember what format/resolution the Met is using for transmission (I know they are using 1080i for production), but clearly this is a case where resolution matters, and the transmission format needs to make use of it. I am doing a similar thing, where a scene is recorded and played back on a 35' screen for an auditorium full of people. Here, clearly, resolution makes a difference.


You are correct. The need for higher resolution formats is directly related to the screening requirements. I am fairly certain that the Met theatrical screenings are using 1080i. I am also fairly certain that they are using a significantly bigger pipe to get the pictures there without compression artifacts - probably in the range of 40 Mbps.

Perhaps Mark can provide the details.

I would like to build a case for why 1080p@60 makes sense to me. I believe this format combines the best spatial and temporal resolution of the current formats. I believe compression algorithms do better with progressively scanned material. I believe oversampling has always been a preferred way of producing, allowing for a better end result after processing, ESPECIALLY when downconverting to a lower resolution. And I believe archiving in a higher format is beneficial (clearly, if we had the choice between two recordings of a historical scene, we would use the better looking recording in the production). If one produces in 1080p@60, the final product can be more easily downconverted and/or compressed into the final transmission medium, all the while preparing for a future transmission medium that is better. I would even venture to say that producing in 4:4:4 would have great merit in the production chain.


Your argument falls apart based on one detail - oversampling.

It is highly unlikely that the 1080@60P images are oversampled. Furthermore, there is a sensitivity penalty when shooting this resolution at 60P.
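A rough photon-budget sketch of that penalty (assumptions: the same sensor area for both formats, light gathered proportional to photosite area times integration time, and a shutter of 1/frame-rate - real cameras vary all of these):

def relative_photons(h_pixels, v_pixels, fps, sensor_area=1.0):
    site_area = sensor_area / (h_pixels * v_pixels)  # area per photosite
    integration = 1.0 / fps                          # seconds per frame
    return site_area * integration

p720  = relative_photons(1280, 720, 60)
p1080 = relative_photons(1920, 1080, 60)
print(f"a 720p60 photosite collects ~{p720 / p1080:.2f}x the light "
      f"of a 1080p60 photosite")
# -> 2.25x: at equal frame rates the penalty is purely the smaller
#    photosites; shooting 1080 at 60P instead of 30P halves the light
#    per frame again.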

And then there is the huge penalty for storing and processing 1080@60P source. For some applications all of this can be justified. For use at a State University, I am not certain what your justification is.

You can buy 720P cameras today that use oversampling sensors. You will gain most of the benefit from this alone, and have fewer headaches on the back end with a 720P production chain.

Producing in 4:4:4 does have benefits; however, I am not certain how they would impact your applications. The primary benefits come from the ability to manipulate the source after it has been shot. This is very useful for color grading and for the use of the source in special effects.

These virtues parallel the way the movie industry has worked for years. If you are producing movies this may be important. If you are producing live programming it is irrelevant as the production chain is not in place to support it.

What this points out is that applications should drive the decisions as to the best tools for the job. My sense is that you want to build a state-of-the-art facility, but may not be making practical decisions in terms of the applications it will be used for.


To me, the decision whether to produce in 1080p@60 or one of the other formats comes down to the cost/benefit ratio. And since the production equipment is not even available (that I know of), it is currently too difficult (although not impossible) to do so. And it is impossible to do so in any live format.


See above.


So in conclusion, I believe producing in 1080p@60 would have a benefit to the viewer today and tomorrow, no matter what format they are viewing in their home. Unfortunately the argument is moot, since one can't produce in that format anyway. But I do think it would be beneficial to develop the equipment to do so.


This will happen in time. Most likely it will be the fall-out from using even higher resolution acquisition gear. I can easily imagine shooting in 4K x 2K with a Red Camera, then resampling to 1080P for production.

Regards
Craig



--
Tom Barry                  trbarry@xxxxxxxxxxx  



----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at FreeLists.org
- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word 
unsubscribe in the subject line.
