Quote:
Originally Posted by ppppenguin
...*It's not exactly half. There are additional artefacts caused by interlace that give a lower perceived vertical resolution than you might expect. The Kell factor is used to quantify this.
The "Kell Factor" relates to having scan lines (sampling in the vertical direction); there is an additional degradation if the lines are interlaced. But since all widely used TV standards were interlaced, the term Kell factor was applied to the net effect in interlaced pictures. This is taken to be roughly 0.7. This is a subjectively determined number and not a law of nature, and can vary greatly depending on the brightness of the picture, the viewing distance, the contrast of the test pattern details, and the refresh rate.
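The arithmetic here is simple enough to sketch. The ~0.7 figure is the subjective value quoted above; the active-line counts (480 for 525-line systems, 576 for 625-line systems) are illustrative assumptions on my part, not something stated in the post:

```python
# Back-of-the-envelope: perceived vertical resolution under a Kell factor.
# KELL_FACTOR = 0.7 is the rough subjective figure for interlaced pictures;
# the active-line counts below are assumed typical values, not standards text.

KELL_FACTOR = 0.7

def effective_lines(active_lines: int, kell: float = KELL_FACTOR) -> float:
    """Perceived vertical resolution in TV lines, given active scan lines."""
    return active_lines * kell

for system, active in [("525-line", 480), ("625-line", 576)]:
    print(f"{system}: {active} active lines -> ~{effective_lines(active):.0f} perceived lines")
```

So a nominal 480 active lines only buys you something like 336 lines of perceived vertical resolution, which is why the factor matters when comparing formats.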
When non-interlaced sampling was considered (the horizontal pixel sampling in digital versions of the 525- and 625-line systems, or the vertical resolution of a progressively scanned system), a higher factor could be applied. For the horizontal sampling, SMPTE and the ITU standardized on filters that are 3 dB down at 0.85 of the Nyquist rate. With these specs, the system was judged to be transparent to the analog signal. The limiting resolution is probably about 0.9 of Nyquist.
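To put numbers on those fractions, here is the same arithmetic using the Rec. 601 luma sampling rate of 13.5 MHz as the worked example (the 13.5 MHz rate is my choice of example; the 0.85 and 0.9 fractions are the ones quoted above):

```python
# Illustrative arithmetic for the horizontal-sampling figures: Nyquist rate,
# the standardized filter's -3 dB point at 0.85 x Nyquist, and the
# approximate limiting resolution at 0.9 x Nyquist.
# The 13.5 MHz sampling rate (Rec. 601 luma) is an assumed example value.

SAMPLE_RATE_MHZ = 13.5
nyquist = SAMPLE_RATE_MHZ / 2       # 6.75 MHz
filter_3db = 0.85 * nyquist         # filter response is 3 dB down here
limiting = 0.9 * nyquist            # approximate limiting resolution

print(f"Nyquist rate: {nyquist} MHz")
print(f"-3 dB point (0.85 x Nyquist): {filter_3db:.4f} MHz")
print(f"Limiting resolution (~0.9 x Nyquist): {limiting:.3f} MHz")
```

For 13.5 MHz sampling that puts the filter's -3 dB point around 5.74 MHz and the limiting resolution near 6.08 MHz, comfortably covering the bandwidth of the analog systems being digitized.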