NTSC Tint Errors (20+ years to fix?)
IIRC, the Magnavox Total Automatic Color system
was the most extreme "automatic" tint system (it didn't correct the tint; it just forced colors near the fleshtone phase to fleshtone). Why did it take from 1953-12 to ~1976-07 (GE color TVs w/VIR) to get the NTSC tint errors fixed? Kirk Bayne |
GE's VIR only provided a more accurate reference to the set, and VIR was often mishandled by local stations playing tape or network feeds, so sometimes it actually made things worse. Still, VIR was probably the best attempt at tint error correction ever applied to NTSC. If we had instituted the field-by-field color phase reversal that RCA experimented with (PAL later implemented it on a line-by-line basis), tint/phase error correction would have been baked in.
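The cancellation that line-alternating phase reversal buys can be sketched numerically. This is a minimal toy model, not a signal-accurate one; the function name and sample values are mine, and it assumes two successive lines carry the same color and pick up the same static phase (hue) error in transit:

```python
import cmath
import math

def demodulate_pal_pair(u, v, phase_error_deg):
    """Toy PAL-style line pair: the V (R-Y) axis is inverted on the
    second line, both lines suffer the same static hue error, and the
    receiver averages the pair (as a delay-line PAL set would)."""
    err = cmath.exp(1j * math.radians(phase_error_deg))
    line_a = (u + 1j * v) * err   # normal line, rotated by the error
    line_b = (u - 1j * v) * err   # V-inverted line, same rotation
    line_b = line_b.conjugate()   # receiver re-inverts V on this line
    avg = (line_a + line_b) / 2   # average the line pair
    return avg.real, avg.imag
```

Averaging the pair turns the hue error into a much less objectionable saturation loss (a factor of the cosine of the error): a 10-degree error leaves the hue exact and trims saturation by only about 1.5%.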
NTSC was initially specified around the dim but wide-gamut color phosphors available in 1954, but that dimness drove the development of brighter phosphors with a different color response and gamut range. At that point, broadcast sources started adjusting their cameras (which, back in the early '60s, were somewhat drifty and unstable) to a compromise between the standard and the new phosphors. Once we went far enough down that road, we could make new cameras and sets to the same compromise spec and achieve consistency in new content on new sets, but not perfect color consistency across all sets or program recordings.

Really, in the tube-and-film era, color TV was pushing the limits of what technology could reliably and affordably do. Films would fade unevenly, tube cameras would drift, and network feeds had phase and frequency distortion that compromised color signal integrity. The means of dealing with all this in real time, while keeping to broadcast schedules, were often somewhat subjective and somewhat inaccurate. |
Kfbkfb: You have a mistaken impression that the tint errors were "fixed" by VIR or anything else.

What happened in general was that more stable solid-state gear came into use over time, so the drifts along the delivery chain were gradually reduced (but not "fixed"). Stability between programs took a big step backwards when analog cable came into general use: every cable channel needed a proc amp to restore the burst and sync waveforms, but the cable company would set up the system and then check it only sporadically, unlike broadcast stations, which could check at least daily.

Regarding VIR, it was intended to carry a reference from the studio all the way to the receiver. The problem of mismatched color from program to program was partly due to the FCC requirement that burst and sync be restored to standard levels and waveforms before emission. Of course, the picture itself was not modified, so saturation errors persisted; and if the phase of the regenerated burst was wrong, the hue was off. VIR was intended to fix that. However, VIR never worked reliably, because local stations inserted it locally even on program material that didn't have it in the first place, and used it in a closed-loop system to correct transmitter distortions. As a result, many programs carried VIR that had not come all the way from the studio, and VIR often made the color worse instead of better. |
What could have been done to improve tint
accuracy during the vacuum-tube era in NTSC color TV production, national distribution, and home display (some sort of simple reference signal provided continuously by the [color] networks)? Kirk Bayne |
NTSC rejected color-alternating at the field rate because of flicker problems. I can't recall whether PAL was rejected over line flicker or not. Of course, line-delay memory was not available at the time.
If output luminance were constant as chroma amplitude and phase changed, 30 Hz flicker would be invisible. However, two things make this impossible, because they change the luminance of the reproduced color when the chroma phase or amplitude changes:

1) CRT gamma - saturated colors crank up the dominant primary's beam current more than they turn down the less-used guns' beam currents, so luminance is not constant vs. chroma signal level (color saturation).

2) The scaling of the R-Y and B-Y axes to prevent overmodulation (except in extreme cases like 100%-amplitude color bars), so luminance is not constant vs. phase.

The luminance flicker threshold is somewhere between 50 and 60 Hz at normal TV picture brightness, but more like 15 Hz for chroma-only changes at constant output luminance. By the way, this is why you can see luminance flicker on the COL-R-TEL rotating-wheel converter, yet see correct colors without visible switching from R to G to B. [Strictly true for stationary objects and stationary eyeballs; otherwise you see color fringing/breakup, but that is not exactly the same thing as flicker.] |
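The overmodulation that the R-Y/B-Y scaling guards against is easy to put numbers on. This is a rough sketch using the usual luma weights and the U/V-style scale factors of 0.492 and 0.877; it ignores setup, gamma, and I/Q bandwidth limits, and the function name is mine:

```python
def composite_excursion(r, g, b, scale_u=0.492, scale_v=0.877):
    """Peak/trough of a composite signal (luma plus chroma subcarrier
    envelope) for one fully saturated color, normalized so white = 1.0.
    Pass scale factors of 1.0 to model an unscaled R-Y/B-Y system."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = scale_u * (b - y)                   # scaled B-Y
    v = scale_v * (r - y)                   # scaled R-Y
    amp = (u * u + v * v) ** 0.5            # chroma envelope amplitude
    return y + amp, y - amp                 # peak and trough excursions
```

For a 100%-amplitude yellow bar (R=G=1, B=0), the scaled signal peaks near 1.33 (the familiar roughly 33% overshoot on full-amplitude bars), while unscaled R-Y/B-Y would peak near 1.78 - far beyond what the transmitter could handle, which is why the axes had to be scaled down at the cost of phase-dependent luminance.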
©Copyright 2012 VideoKarma.org, All rights reserved.