#1
That (as well as saturation control) was the idea.
It was not at all the same system as PAL, in which the red-minus-luminance (R-Y) signal switched phase with each horizontal scanning line. In NTSC with VIR, every line of the actual picture was transmitted exactly as it had been for the coverage of the 1954 Tournament of Roses Parade. The difference was one line in the vertical retrace interval: a consistent reference signal, the Vertical Interval Reference (VIR), which a TV set could use to set the hue and saturation automatically and exactly. I really don't know why it didn't catch on. I can only wonder: did its promoter want too much money in patent royalties? Was the inventor an outsider (see intermittent windshield wipers)? Did TV manufacturers believe their products produced fine color anyway? Was NTSC color reliable enough by the time VIR came around that setting the dials just once was enough to assure good color? Did the entertainment industry want any improvement in picture quality to be tied to selective availability and copy blocking?

Last edited by Robert Grant; 09-24-2016 at 12:49 PM.
#2
The reason it didn't catch on is that the local stations started using it to close the loop on their transmitter adjustments by inserting it locally. Of course, when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either. So, set manufacturers decided not to use it, as it often made the color worse instead of better. It could have succeeded if GE had managed to get the FCC to mandate its correct use.
Edit: By the way, GE was the inventor and promoter.

Last edited by old_tv_nut; 09-24-2016 at 10:49 PM.
#3
Quote:
Do you mean they (stations) used the actual VIR format for transmitter monitoring, or just snatched the assigned lines for similar test signals?
#4
Yes, they used the actual VIR format, newly inserted. They had to, for material that didn't contain it originally, and then just did it for all material.
#5
Quote:
So I'm guessing from this the networks used VIR with their feeds?
#6
Quote:
The local burst must be reinserted to standard amplitude and clean waveform per FCC rules. Therefore, it at least loses its relation to the chroma amplitude, which may have changed due to analog transmission distortion over the network. In practice, the reinserted phase is also adjustable and therefore can be misadjusted. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst.

The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. Unlike NTSC, the PAL burst, by alternating phase sequence from line to line, should average out to the correct phase even when phase distortion is present.

The whole idea of VIR was to insert it at the originating studio and not replace it anywhere along the chain, not even at network central control; but not every program had it inserted. I would note that VIR, occurring once per field, had a much slower control action than color burst, and could not possibly compensate for fast signal variations like airplane flutter.
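The PAL averaging effect mentioned above can be shown with a toy model (a deliberate simplification, not a full decoder): assume a static differential phase error of a few degrees on every line. An NTSC-style fixed-phase burst decodes every line with the same hue error, while in PAL, once the decoder undoes the line-by-line V-axis switch, the residual hue error alternates in sign and averages toward zero over successive lines.

```python
# Toy model of why PAL's line alternation averages out a static
# phase error, while an NTSC-style fixed burst does not.
# PHASE_ERROR_DEG is an assumed, illustrative transmission distortion.
PHASE_ERROR_DEG = 5.0

def ntsc_hue_errors(num_lines):
    # Every line decodes with the same hue error.
    return [PHASE_ERROR_DEG for _ in range(num_lines)]

def pal_hue_errors(num_lines):
    # After the V-axis switch is undone in the decoder, the residual
    # hue error alternates in sign from line to line.
    return [PHASE_ERROR_DEG * (1 if n % 2 == 0 else -1)
            for n in range(num_lines)]

def average(errors):
    return sum(errors) / len(errors)
```

Averaging ten lines of the PAL model gives zero residual hue error, while the NTSC model keeps the full 5-degree error, which is the essence of why PAL is tolerant of differential phase.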
#7
Quote:
Brilliant of General Electric to have a wider burst in the active line, and at a luminance level of a typical flesh tone! So, to me, the "problems" would have been easily avoidable with good broadcasting practice: for example, source material with VIR would first need to be recalibrated through a professional VIR processor that also regenerated burst and VIR (with exact phase match to the gated line 19 sample). Was there such an instrument?

Last edited by NewVista; 09-28-2016 at 11:59 PM.
#8
I remember quad videotapes (2 inch) coming in that had a sticker saying "Protected by VIRS- adjust proc amp to pass."
In reality, burst and sync got regenerated twice. After leaving the switcher, the signal went to a proc amp, where it would get black and white clip and AGC. (I thought the best AGC amp was the RCA TA19; you could always tell a station that had one of these on the air.) Sync and burst would also be regenerated there. Then it went to the VIRS inserter, where sync and burst would be regenerated again.

Now, all of this is ahead of a pre-corrector that was required to compensate for non-linearities in the modulation process of the transmitter. It had to look flat on the air. And don't forget the low-pass filter: video was band-limited to 4.2 MHz to avoid components bleeding over into the sound carrier at 4.5 MHz.
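The black and white clip stage mentioned above is, at its core, just a clamp on the active video level. Here is a minimal sketch in IRE units, assuming NTSC setup (7.5 IRE black) and 100 IRE peak white as the clip points; a real proc amp like the TA19 did far more (AGC, sync and burst regeneration), and this ignores the sync region entirely.

```python
# Toy sketch of a proc amp's black and white clippers on active
# video, in IRE units. Clip levels assume NTSC practice (7.5 IRE
# setup, 100 IRE peak white); sync is not modeled here.
BLACK_CLIP_IRE = 7.5
WHITE_CLIP_IRE = 100.0

def proc_amp_clip(samples):
    """Clamp active-video samples between the black and white clip
    levels, as a proc amp's clippers would."""
    return [min(max(s, BLACK_CLIP_IRE), WHITE_CLIP_IRE) for s in samples]
```

Anything driven below setup comes out at 7.5 IRE and anything above peak white comes out at 100 IRE, which is what kept over-modulated excursions off the air.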
Last edited by kf4rca; 09-29-2016 at 07:55 AM.
#9
Quote:
To eliminate any guesswork, just have color bars on another vert interval line!