Interlacing and channel bandwidth
I've come across a curious statement in Albert Abramson's Zworykin: Pioneer of Television. Discussing the invention of odd-line interlacing to reduce flicker, he says "In addition [to reducing image flicker] doubling the field rate, which cut the number of lines in each field in half, afforded a considerable saving in channel bandwidth." (p. 118)

But why would it? The number of lines scanned per second doesn't change, only the order does, and presumably this has no effect on the number of picture elements per line.

Donald Fink in Television Engineering doesn't mention this as an advantage. On the contrary, he says "It must be understood that increasing the downward...velocities to twice the values they would have in progressive scanning does not mean that any more lines are scanned in the complete pattern." (p. 47, 1st edition, 1940) What am I missing here?
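To spell out the arithmetic behind my objection, here is a rough sketch. (The 441-line, 30-frame figures are just the American standard of that period, which I'm assuming for concreteness; they aren't taken from either book.) Video bandwidth is set by the number of picture elements transmitted per second:

\[
B \;\propto\; (\text{elements per line}) \times (\text{lines per second})
\]

\[
\text{progressive: } 441\ \tfrac{\text{lines}}{\text{frame}} \times 30\ \tfrac{\text{frames}}{\text{s}} = 13{,}230\ \tfrac{\text{lines}}{\text{s}},
\qquad
\text{interlaced: } \tfrac{441}{2}\ \tfrac{\text{lines}}{\text{field}} \times 60\ \tfrac{\text{fields}}{\text{s}} = 13{,}230\ \tfrac{\text{lines}}{\text{s}}
\]

The line rate, and with it the highest video frequency, comes out identical either way, which seems to be exactly Fink's point.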
__________________
One Ruthie At A Time