I'll make more slides when I get a chance, but be aware that there are many interacting parameters:
- Increase the number of samples on an imager and you can use less-restrictive optical filtering, so MTF goes up; but if the imager size is not increased, the line pairs/mm goes up too, so lens-based and diffraction-based MTF go down.
- Increase imager size, and lp/mm goes down, so diffraction-based MTF goes up; but the aperture probably goes down, so diffraction-based MTF goes down (not to mention lens weight issues).
- Increase the individual sensor-site size, and dynamic range goes up; but if done with pixel offset, as in the AG-HVX200, the possibility of color aliasing increases. Large sensor sites also introduce aliasing if the site size doesn't match the desired pixel size (for which the optical filtering should be designed).
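The dynamic-range point above scales roughly with sensor-site area. Here's a back-of-the-envelope sketch (my own toy model, not from the post — the photon flux number is made up, and read noise, fill factor, and quantum efficiency are all ignored) of shot-noise-limited SNR versus pixel pitch:

```python
import math

def shot_limited_snr(pitch_um, photons_per_um2):
    """Photon shot-noise-limited SNR of a square sensor site.

    Signal = collected photons (site area * flux); noise = sqrt(signal),
    so SNR = photons / sqrt(photons) = sqrt(photons).
    Read noise, fill factor, and quantum efficiency are all ignored.
    """
    photons = pitch_um ** 2 * photons_per_um2
    return math.sqrt(photons)

# Doubling sample density on a fixed die halves the pitch, quarters the
# photons collected per site, and halves the shot-limited SNR:
print(shot_limited_snr(10.0, 100.0), shot_limited_snr(5.0, 100.0))
```

Even this crude model shows why packing more samples onto the same die costs dynamic range.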
There's much more. Microlenses work great at narrow apertures but poorly at wide apertures. Larger imagers can cause vignetting at wide focal lengths (as on the Dalsa Origin).
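To put rough numbers on the diffraction side of these tradeoffs, here is a toy model (mine, not from the slides) of the MTF of an ideal diffraction-limited circular aperture; the 550 nm wavelength and f/8 are illustrative choices:

```python
import math

def diffraction_mtf(freq_lpmm, f_number, wavelength_mm=550e-6):
    """MTF of an ideal (diffraction-limited) circular aperture.

    freq_lpmm: spatial frequency at the imager in line pairs/mm.
    The cutoff frequency is 1 / (wavelength * f_number) cycles/mm;
    MTF = (2/pi) * (acos(x) - x * sqrt(1 - x^2)) for x = freq / cutoff.
    """
    cutoff = 1.0 / (wavelength_mm * f_number)
    x = freq_lpmm / cutoff
    if x >= 1.0:
        return 0.0
    phi = math.acos(x)
    return (2.0 / math.pi) * (phi - x * math.sin(phi))

# Halving the pixel pitch doubles the Nyquist frequency in lp/mm, so
# diffraction leaves less MTF at Nyquist (here at f/8, 550 nm):
for pitch_um in (10.0, 5.0):
    nyquist_lpmm = 1000.0 / (2.0 * pitch_um)
    print(pitch_um, round(diffraction_mtf(nyquist_lpmm, 8), 3))
```

Note how the same model captures the aperture point too: stopping down (raising the f-number) lowers the cutoff, pulling diffraction-based MTF down at every frequency.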
I'll try to put more together when I get a chance. I'm afraid there's no "the rest of the story," only "more of the story."
TTFN,
Mark

Craig Birkmaier wrote:
At 5:43 PM -0500 11/12/06, Manfredi, Albert E wrote:

> And increase the size of the sensor or increase the quality of the lens, or both.

Bert is on the right track here, but is missing the most important point.

Yes, increasing the sensor size can indirectly benefit the delivered image quality. But let's be careful when we talk about size...

We can produce larger sensors with the same number of sensor sites, thus increasing the ability to gather photons in each site. This is typically accompanied by the use of larger lenses that will collect those photons. Together they increase the cost of the acquisition device substantially, not to mention the physical size of the camera.

The other way to improve image quality is to increase the sample density of the sensor AND its size. Increasing the sample density without changing the physical size of the sensor reduces the number of photons that are collected at each site. And the increased sample density may require improved optics to deliver the benefits. If we increase the sample density AND the size of the sensor, we may be able to capture more light and use it effectively.

There is both good and bad news here: using CMOS sensors, we may be able to realize substantial increases in sample density, as has been the case for digital still image cameras. But we will also need to increase the cost of the lens to deliver the photons to the sensor.

To date, progress has been slow in the transition to larger, high-sample-density CMOS sensors for video acquisition. Bottom line: CCD sensors are still delivering better images, albeit from smaller sensors. Today, a 2 Mpixel CCD sensor with a 2/3" die seems to be the standard for comparison. Fortunately, cameras using these sensors are teaching us a great deal about the best way to deliver high-quality images.

It would be very helpful if Mark could produce one or more slides to tell "the rest of the story." These slides should deal with two subjects: 1. Oversampling 2.
Quantization effects.

THE MOST IMPORTANT issue in this discussion is oversampling relative to the raster that is being encoded for emission. As Mark - hopefully - will illustrate, when we take an oversampled image and resample it to a less dense raster (e.g. 1920 x 1080 to 1280 x 720), we increase the area under the MTF curve. I do not know if this is equal to the area under the curve for the higher density raster, or somewhat less, but it is quite significant. NTSC has benefited from horizontal oversampling for decades. This process of resampling ALSO reduces entropy in the image, which helps the next step in the process: compression.

Far too little attention is being paid to the negative impact of quantization on delivered image quality, and to a related "dirty little secret" about MPEG compression systems. Quantization removes image detail. If used minimally, these losses will simply remove some detail at the higher frequencies of luminance, and probably a bit more detail from the color difference signals. If used aggressively, we will lose actual image detail, but worse, we may add false detail, including samples that violate the Nyquist limits. This is especially noticeable at the edges of high-frequency detail. If you look at the quantized coefficients in these regions, it is not unusual to see a few nearly black samples sitting next to white samples.

As Mark might say, the human observer may prefer higher-contrast imagery at lower sample densities over an accurate HDTV image, but we do not prefer high-contrast distortions within the image, or blocking artifacts.

So the rest of the story is that we need to move to higher sampling densities if we want to deliver more high-contrast image detail to the viewer, but we ALSO need to tune the compression system to deliver this information.
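The oversampling point has a simple 1-D analog: decimating an oversampled signal without prefiltering folds energy above the target Nyquist back down as aliasing, while even a crude prefilter suppresses it. The rates and tone frequency below are made up for illustration, and a real resampler would use a much better filter than a 3-tap box:

```python
import math

# 1-D analog of horizontal oversampling: a tone above the target Nyquist
# aliases under naive decimation; a 3-tap box prefilter attenuates it.
SRC = 1920   # source samples per line (oversampled)
DST = 640    # target raster (3:1 decimation)
TONE = 800   # cycles per line, above the target Nyquist of 320

signal = [math.sin(2.0 * math.pi * TONE * n / SRC) for n in range(SRC)]

def rms(samples):
    return math.sqrt(sum(v * v for v in samples) / len(samples))

# Naive decimation keeps every 3rd sample: the tone folds down to an
# alias (at 160 cycles here) at full strength.
naive = signal[::3]

# Averaging each group of 3 samples before decimating attenuates the
# out-of-band tone substantially.
filtered = [(signal[3 * k] + signal[3 * k + 1] + signal[3 * k + 2]) / 3.0
            for k in range(DST)]

print(round(rms(naive), 3), round(rms(filtered), 3))
```

The naive path keeps essentially all of the tone's energy as false low-frequency detail; the prefiltered path keeps only a fraction of it.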
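And the "nearly black samples sitting next to white samples" effect can be demonstrated with a toy 1-D DCT: coarsely quantizing the coefficients of a hard edge adds ringing, and reconstructed samples overshoot the brightest original sample. The 8-sample block and quantizer step below are invented for illustration:

```python
import math

def dct(block):
    """Orthonormal DCT-II of a 1-D block."""
    n = len(block)
    return [math.sqrt((1 if k == 0 else 2) / n)
            * sum(block[i] * math.cos(math.pi * (i + 0.5) * k / n)
                  for i in range(n))
            for k in range(n)]

def idct(coeffs):
    """Inverse (DCT-III) of the transform above."""
    n = len(coeffs)
    return [sum(math.sqrt((1 if k == 0 else 2) / n)
                * coeffs[k] * math.cos(math.pi * (i + 0.5) * k / n)
                for k in range(n))
            for i in range(n)]

# A hard luma edge (8-bit video levels 16..235) and a deliberately
# coarse quantizer step, chosen large to exaggerate the effect.
edge = [16] * 4 + [235] * 4
q = 64
recon = idct([q * round(c / q) for c in dct(edge)])

# Coarse quantization adds false detail: the reconstruction rings,
# overshooting the brightest original sample at the edge.
print([round(v) for v in recon])
```

With a gentle quantizer the round trip is nearly exact; it is only aggressive quantization that manufactures this high-contrast false detail.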
Given the recent discussion about A-VSB and how much bandwidth may be left for HDTV programming, it should be obvious that the correct approach would be to deliver 720P or even 576P accurately, rather than trying to push 1080-line images through an emission system that is starved for bits.

Regards
Craig

----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at FreeLists.org

- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word unsubscribe in the subject line.