  #29  
Old 01-19-2023, 09:46 AM
old_tv_nut
See yourself on Color TV!
Quote:
Originally Posted by Electronic M
Back in the tube era they would change the horizontal and vertical scanning rates slightly for color TV. I believe one of the things that started to reduce that practice was VCR time code being used in broadcast automation... The time code was based on frame count, and it probably would have been hard to handle it accurately every time if the rate changed between color and monochrome.

I suspect color network logo watermarks in the programming contributed to the constant-burst practice, since if the CEO tells you the watermark has to be in color and must not change no matter what the program is, then the burst has to stay on.
IIRC, didn't PBS do a thing in the '70s where they made their bursts ultra-precise so NIST could use them as a lab calibration reference? If that was the case, they may have had to keep burst on during monochrome shows to make that work, and may have been doing constant burst before cable.
You've got a close idea, but you're missing some points.

The monochrome standard had wide tolerances for the scan frequencies. The FCC broadcast rules required the vertical and horizontal rates to be locked to each other for a precise interlaced line count, but the master rate could vary; in fact, the rate could be tied to the station's 60-cycle power line and still be within tolerance, since the power companies maintained the correct average frequency so that synchronous-motor electric clocks would tell the right time. In practice, stations adopted crystal references for the scanning frequencies, which were 60.00 Hz and 15750 Hz. An additional requirement was placed on the carrier frequencies: the undeviated (silence) frequency of the audio carrier had to be 4.5 MHz above the video carrier, with a rather tight tolerance. This was to allow TVs with separate audio IFs to fine-tune the audio and video correctly at the same time.
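As a quick sanity check on those monochrome numbers (just an illustrative sketch; the 525-line figure is the NTSC line count, and the variable names are mine):

```python
# Monochrome NTSC: 525 lines per frame, 2:1 interlace, so the line
# rate is locked to the field rate as fH = fV * 525 / 2
fv = 60.0           # field rate, Hz (crystal reference)
fh = fv * 525 / 2   # line rate, Hz
print(fh)           # 15750.0
```

That lock is why the FCC could tolerate a drifting master rate: as long as the two rates stayed in that exact ratio, the interlace held.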
When color came along, there was concern about the ~920 kHz video beat between the color subcarrier and the undeviated audio carrier being visible on monochrome sets. Because of the precise scan and audio frequencies used in monochrome, you could choose the color subcarrier to interleave with the video (making it less visible), but then the 920 kHz beat would be worst-case; or vice versa. Mathematically, you could not optimize both. The decision was made to change the scan rates and keep the 4.5 MHz audio spacing exactly as before, to prevent sound problems in legacy monochrome sets. So the scan rates were changed by the ratio 1000/1001. Now the color frame rate ran slow compared to a 60 Hz wall clock, necessitating the invention of time code to make the conversion from frame count to hours, minutes, and seconds for program durations.
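The frequency relationships above can be verified with a little exact arithmetic. A minimal sketch (variable names are mine; the 455/2 subcarrier multiple and the 1000/1001 ratio are the standard NTSC figures):

```python
from fractions import Fraction

# Monochrome values: 15750 Hz line rate, 4.5 MHz sound spacing (both exact)
FH_MONO = Fraction(15750)
AUDIO_SPACING = Fraction(4_500_000)

# Color scan rates: monochrome rates scaled by 1000/1001
RATIO = Fraction(1000, 1001)
fh_color = FH_MONO * RATIO            # ~15734.266 Hz lines
fv_color = Fraction(60) * RATIO       # ~59.94 Hz fields

# The subcarrier sits at an odd multiple of half the line rate (455/2),
# so its energy interleaves with the luminance spectrum
fsc = fh_color * Fraction(455, 2)     # ~3.579545 MHz

# With the shifted line rate, the 4.5 MHz spacing is exactly 286 line
# rates, so the subcarrier-to-sound beat is also an odd multiple of
# half the line rate (117/2) and interleaves too -- the compromise the
# 1000/1001 change bought
assert AUDIO_SPACING == 286 * fh_color
beat = AUDIO_SPACING - fsc            # ~920.455 kHz
assert beat == fh_color * Fraction(117, 2)

# The frame rate is now 30 * 1000/1001 ~ 29.97 Hz, so an hour's worth
# of frames counted at a nominal 30/s actually takes longer than an
# hour of wall-clock time -- the drift that time code must absorb
frames_per_hour_nominal = 30 * 3600                      # 108000 frames
seconds_actual = frames_per_hour_nominal / (30 * RATIO)  # 3603.6 s

print(float(fh_color), float(fsc), float(beat), float(seconds_actual))
```

The 3.6-second-per-hour discrepancy at the end is exactly what drop-frame time code was invented to paper over.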
__________________
www.bretl.com
Old TV literature, New York World's Fair, and other miscellany