Videokarma.org

  #1  
07-30-2012, 02:34 PM
Rinehart
VideoKarma Member
 
Join Date: Nov 2011
Posts: 129
Interlacing and channel bandwidth

I've come across a curious statement in Albert Abramson's Zworykin: Pioneer of Television. Discussing the invention of odd-line interlacing to reduce flicker, he says "In addition [to reducing image flicker] doubling the field rate, which cut the number of lines in each field in half, afforded a considerable saving in channel bandwidth." (p. 118) But why would it? The number of lines scanned per second doesn't change, only the order does, and presumably this has no effect on the number of picture elements per line.

Donald Fink in Television Engineering doesn't mention this as an advantage. On the contrary, he says "It must be understood that increasing the downward...velocities to twice the values they would have in progressive scanning does not mean that any more lines are scanned in the complete pattern." (p. 47, 1st edition, 1940) What am I missing here?
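To pin down what's bothering me, here's the back-of-envelope arithmetic as I understand it, as a Python sketch. The 441 lines and 60 fields per second are the RMA numbers of the period; the 500 picture elements per line is just a round figure I picked for illustration, not a historical spec.

[CODE]
# Back-of-envelope video bandwidth for the 441-line RMA system (~1940).
# Assumptions (mine, for illustration): 500 picture elements per line,
# and 2 picture elements per cycle of video signal.

LINES_PER_FRAME = 441
ELEMENTS_PER_LINE = 500  # assumed round number, not a historical spec

def video_bandwidth_hz(lines_per_frame, frames_per_sec, elements_per_line):
    """Highest video frequency: element rate / 2 (two elements per cycle)."""
    return lines_per_frame * frames_per_sec * elements_per_line / 2

# 2:1 interlace: 60 fields/s of 220.5 lines each = 30 complete frames/s.
interlaced_60_field = video_bandwidth_hz(LINES_PER_FRAME, 30, ELEMENTS_PER_LINE)

# Progressive at the same 30 frames/s: same line rate, so same bandwidth.
progressive_30 = video_bandwidth_hz(LINES_PER_FRAME, 30, ELEMENTS_PER_LINE)

# Progressive at 60 frames/s (60 Hz flicker without interlace): doubled.
progressive_60 = video_bandwidth_hz(LINES_PER_FRAME, 60, ELEMENTS_PER_LINE)

print(f"interlaced, 60 fields/s : {interlaced_60_field / 1e6:.2f} MHz")
print(f"progressive, 30 frames/s: {progressive_30 / 1e6:.2f} MHz")
print(f"progressive, 60 frames/s: {progressive_60 / 1e6:.2f} MHz")
[/CODE]

At the same 30 complete frames per second, interlaced and progressive come out identical, which seems to be Fink's point. The only way I can see a saving is if Abramson is comparing against progressive scanning at the full 60 frames per second you'd need for the same flicker rate without interlace — is that the comparison he means?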
__________________
One Ruthie At A Time