#1
Interlacing and channel bandwidth
I've come across a curious statement in Albert Abramson's Zworykin: Pioneer of Television. Discussing the invention of odd-line interlacing to reduce flicker, he says "In addition [to reducing image flicker] doubling the field rate, which cut the number of lines in each field in half, afforded a considerable saving in channel bandwidth." (p. 118) But why would it? The number of lines scanned per second doesn't change, only the order does, and presumably this has no effect on the number of picture elements per line. Donald Fink in Television Engineering doesn't mention this as an advantage. On the contrary, he says "It must be understood that increasing the downward...velocities to twice the values they would have in progressive scanning does not mean that any more lines are scanned in the complete pattern." (p. 47, 1st edition, 1940) What am I missing here?
__________________
One Ruthie At A Time
#2
That is wrong; the bandwidth is unaffected by the scanning pattern.
I just hate it when someone is allowed to publish without intelligent editing.
#3
Two hypothetical progressive scan cases where the horizontal pixel resolution and the number of H lines in a complete image on the CRT are the same as with an interlaced system:
If the 525 H lines are scanned progressively 60 times a second, the term "field rate" would have no meaning except as a V deflection rate. The frame rate would double from 30Hz to 60Hz, the H freq. would double from about 15.75KHz to about 31.5KHz, and the required pixel rate or pixel clock, a form of bandwidth representation, would also double if the horizontal resolution were to remain the same. One solution would be to use more RF bandwidth. The above would be similar to the non-interlaced 640x480 VGA 'standard' running about 60Hz V and 31.5KHz H.

If the 525 H lines are scanned progressively 30 times a second, the term "field rate" would again have no meaning except as a V deflection rate. The frame rate would be the same at 30Hz, the H freq. would remain the same at about 15.75KHz, and the required pixel rate or pixel clock, for the same H resolution, would remain the same. There would likely be an annoying flicker due to the 30Hz vertical rate. It would be like a 24-frame film theater, except that a CRT display is a bit brighter, so the effect would be more pronounced, though offset somewhat by the 30Hz V rate. One solution would be to use longer-persistence phosphors.

Therefore the interlacing scheme represents a compromise solution taking into account the relationship between these two factors: 1.) the pixel clock frequency and 2.) the refresh rate for a given volume of pixels. In most analog video systems, MHz equals pixels per interval: resolution = pixels / time; bandwidth = information / time.

The NTSC scheme interlaces half the image every 1/60 second; it takes 1/30 second to present all the information. The first case above presents all of the information in 1/60 second. The second case presents the information in 1/30 second but in a progressive manner.

The author is correct in what he seems to have meant, but not in what was said. He may not have explained it completely or properly.
It is possible that the clarity of his statement relies upon information presented elsewhere in the volume. If what I have said is wrong, I am willing to consider rebuttals or corrections.
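The arithmetic above can be sketched numerically. This is a rough back-of-envelope comparison, not a broadcast spec; the 480 active lines and 640-pixel horizontal width are illustrative assumptions, not figures from the thread:

```python
# Rough pixel-rate comparison of the three scanning schemes discussed above.
# ACTIVE_LINES and ACTIVE_PIXELS are assumed illustrative values for a
# hypothetical 525-line, 4:3 system.

ACTIVE_LINES = 480      # visible lines out of 525 (assumption)
ACTIVE_PIXELS = 640     # horizontal pixels per line (assumption)

def pixel_rate(lines_per_frame, frames_per_sec):
    """Active pixels that must be transmitted each second."""
    return lines_per_frame * ACTIVE_PIXELS * frames_per_sec

interlaced_60i = pixel_rate(ACTIVE_LINES, 30)   # two 240-line fields -> 30 full frames/s
progressive_60p = pixel_rate(ACTIVE_LINES, 60)  # all 480 lines, 60 times a second
progressive_30p = pixel_rate(ACTIVE_LINES, 30)  # all 480 lines, 30 times a second

print(f"60i: {interlaced_60i / 1e6:.1f} Mpixel/s")   # same rate as 30p
print(f"60p: {progressive_60p / 1e6:.1f} Mpixel/s")  # double the bandwidth
print(f"30p: {progressive_30p / 1e6:.1f} Mpixel/s")  # same rate, but 30Hz flicker
```

The output shows the compromise: 60p doubles the pixel rate, 30p keeps the rate but flickers, and 60i gets the 60Hz refresh at the 30p pixel rate.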
__________________
Timeless Information for Retro-Tech Hobbyists and Hardware Hackers No Kowtow
Last edited by Opcom; 07-30-2012 at 08:51 PM.
#4
Isn't that the very reason TPTB chose interlace: to halve the channel size?
#5
Interlace improves the TRADEOFF between flicker and bandwidth and spatial resolution; so the author's statement is correct - it just mentions one side of this three-legged stool.
By the way, movie projectors do not operate at 24 Hz because the flicker would be intolerable even at lower brightness. They always (at least) double-shutter to get a 48 Hz flicker rate.
Audiokarma
#6
For stationary pictures, as Opcom has explained, interlace does halve the bandwidth needed for a given resolution and refresh rate*. For moving pictures it's more complex. Yes, there is better temporal resolution for moving objects, though vertical resolution in those objects is reduced. Also when you try to de-interlace the picture, as required for LCD panels etc, you soon find out that it's not easy to do well.
It is now simple to do the TV equivalent of multiblade shutters as used in movie projectors; framestores were a long way in the future in the 1930s.

*It's not exactly half. There are additional artefacts caused by interlace that give a lower perceived vertical resolution than you might expect. The Kell factor is used to quantify this.
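As a rough illustration of the footnote: a commonly quoted Kell factor is about 0.7, and interlace losses are sometimes modelled as a further multiplicative factor. Both numbers below are assumptions for illustration, not standardized values:

```python
# Effective perceived vertical resolution with the Kell factor.
# KELL = 0.7 is the commonly quoted subjective value (an assumption here);
# the interlace factor of 0.8 is likewise illustrative (~0.7-0.9 is cited).

KELL = 0.7

def effective_vertical_resolution(active_lines, kell=KELL, interlaced=True):
    """Estimate perceived vertical resolution in TV lines."""
    interlace_factor = 0.8 if interlaced else 1.0
    return active_lines * kell * interlace_factor

print(effective_vertical_resolution(480, interlaced=True))   # roughly 269 lines
print(effective_vertical_resolution(480, interlaced=False))  # roughly 336 lines
```

The gap between the two results is the "not exactly half" penalty the footnote describes.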
#7
Quote:
When non-interlaced sampling was considered (the horizontal pixel sampling in digital versions of 525- and 625-line systems, or vertical resolution of a progressively scanned system), a higher factor could be applied. For the horizontal sampling, SMPTE and ITU standardized on filters that are 3 dB down at 0.85 of the Nyquist rate. With these specs, the system was judged to be transparent to the analog signal. The limiting resolution is probably about 0.9 of Nyquist.
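Taking the quoted figures at face value, and assuming Rec. 601-style 13.5 MHz luma sampling as the example rate (my assumption; the quote does not name a sample rate):

```python
# The quote says the standardized filters are 3 dB down at 0.85 of Nyquist,
# with a limiting resolution of about 0.9 of Nyquist. For an assumed
# 13.5 MHz luma sample rate, that works out to:

fs = 13.5e6            # sample rate, Hz (assumed example)
nyquist = fs / 2       # 6.75 MHz

filter_3db = 0.85 * nyquist   # -3 dB point of the standard filter
limit = 0.90 * nyquist        # approximate limiting resolution

print(f"Nyquist: {nyquist / 1e6:.2f} MHz")
print(f"-3 dB:   {filter_3db / 1e6:.2f} MHz")
print(f"limit:   {limit / 1e6:.2f} MHz")
```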
#8
AFAIK there's nothing very scientific about the Kell factor. As oldtvnut says, it's determined subjectively without a great deal of theoretical backup. My feeling, FWIW, is that it depends significantly on the vertical scanning aperture. In tube cameras this is Gaussian, which gives a falling vertical spatial frequency response, usually corrected, at least partially, by vertical aperture correction. Likewise in CRT receivers, but without such correction. LCD displays and CCD cameras have a very different vertical aperture, much squarer.
The Kell factor has been used to justify decisions about choosing H bandwidth wrt the number of lines. Has any good experimental work been done with modern cameras and displays? In any case all HD systems we use have square pixels, so if there is still a Kell effect there is a shortfall of vertical resolution.
#9
Quote:
If you mean the display elements have square shapes, then, yes, this implies a certain vertical and horizontal spatial frequency response, different from that with a Gaussian CRT spot.
#10
There was some talk a while back that Europe, by delaying HD adoption, intended to avoid any interlaced format in their new (1080?) standard. What became of this?
#11
Another note: Dr. Schreiber at MIT proposed random scan (random pixel sequence) as part of a high-definition TV system in the late 80s/early 90s. With proper frequency pre-emphasis / de-emphasis, channel degradations would appear as an increased noise level near edges, where it would be masked by the human visual system. Still images looked promising, but I don't recall if full high def motion was ever achieved. Such partially analog systems (Zenith also proposed one) were overtaken by the development of all-digital systems using MPEG compression.
#12
Perhaps the EBU need not worry about interlace given the present availability of 50/60p cameras: with an alternate method of interlace generation, motion artifacts could be avoided in a similar manner to film scanning (see diagram).
#13
In digital broadcasting there isn't really any such thing as "an interlaced channel", just a coding standard and a bit rate. I don't have a citation to hand, but it's probably easier to get a good picture at a lower bit rate when you start with a progressive source. As 1080/50p (and 60p) equipment becomes more readily available I think it will become standard for originating material, hence removing the compromise between 1080/50i, 720/50p and 1080/25p, and the 60Hz-related equivalents.
In the set of SMPTE standards for handling full-bandwidth HD digits there are two basic bit rates, 1.5Gb/s and 3Gb/s. The latter is needed for 1080/50p and 1080/60p, where the pixel clock is 148.5MHz. All the others (1080/50i, 720/60p and lots more) fit happily in 1.5Gb/s with a 74.25MHz pixel clock. To add to the proliferation of standards, all the 30Hz- and 24Hz-related standards have a variant with the pixel clock multiplied by 1000/1001 to fit with the 59.94Hz NTSC field rate. This has always caused trouble with timecode. Now that NTSC is just about officially dead for broadcasting, I can't see any reason for originating programme material on these standards.

NewVista's sketch is not unlike 1080/24psF. This is effectively 48Hz interlaced in the channel but carrying 24Hz progressive. This allows material originated at 24Hz to be displayed on a CRT monitor without undue flicker.

Standards, don't you just love them? So let's have lots of them.
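The pixel-clock arithmetic above, including the 1000/1001 "NTSC-friendly" variant, is easy to sketch:

```python
from fractions import Fraction

# HD pixel clocks from the SMPTE interface standards discussed above,
# and the 1000/1001 variants used for 59.94Hz-related formats.

base_1g5 = 74.25e6     # Hz: pixel clock for 1.5Gb/s formats (1080i, 720p, ...)
base_3g = 148.5e6      # Hz: pixel clock for 3Gb/s formats (1080/50p, 1080/60p)

ntsc_ratio = Fraction(1000, 1001)

clock_5994 = base_1g5 * ntsc_ratio   # slowed pixel clock for 59.94Hz variants
field_rate = 60 * ntsc_ratio         # the familiar 59.94...Hz NTSC field rate

print(f"74.25 MHz x 1000/1001 = {float(clock_5994) / 1e6:.4f} MHz")
print(f"60 Hz x 1000/1001     = {float(field_rate):.5f} Hz")
```

Note that the 3Gb/s pixel clock is exactly double the 1.5Gb/s one, which is why doubling the frame rate of 1080i to get 1080p pushes the format onto the faster interface.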
#14
All right, I see they have a name for it: "30PsF". So my sketch, which I thought of a while back, will not earn a patent. But it would obviate the need for complex & flawed motion compensation in deinterlacers if broadcasts could somehow be flagged to switch off/bypass the motion processor, as for film-sourced programming.
#15
30psf isn't actually in the list of SMPTE standards. Implicitly, 25psf has been used for years in Europe for film material, where 24fps film has traditionally been shown at 25fps. The 4% faster running time is just accepted as normal, and the sound pitch likewise, or it can be corrected.
The US has suffered badly from its standards. 59.94Hz fouls up timecode for the production people. 3:2 pulldown fouls up transmission of 24fps film. This can be overcome with advanced standards converters, which can convert 24Hz material to 30Hz without significant quality loss. They can also recognise and remove 3:2 artefacts.
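A minimal sketch of the 3:2 pulldown cadence mentioned above. Frame indices stand in for film frames; real pulldown also alternates field parity (top/bottom), which is omitted here for simplicity:

```python
# 3:2 pulldown: 24fps film mapped onto 60Hz fields. Each pair of film
# frames becomes 5 fields (3 + 2), so 24 frames yield exactly 60 fields.

def pulldown_32(film_frames):
    """Return the field sequence as a list of source-frame indices."""
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 3 if i % 2 == 0 else 2   # alternate 3 fields, 2 fields
        fields.extend([frame] * copies)
    return fields

seq = pulldown_32(list(range(4)))   # film frames A, B, C, D as 0..3
print(seq)        # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
print(len(seq))   # 10 fields from 4 frames; 24 frames give 60 fields
```

The repeated-field pattern is exactly what a cadence-detecting deinterlacer or standards converter looks for when it "recognises and removes" 3:2 artefacts.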