
Videokarma.org TV - Video - Vintage Television & Radio Forums (http://www.videokarma.org/index.php)
-   Television Broadcast Theory (http://www.videokarma.org/forumdisplay.php?f=182)
-   -   Could NTSC VIR = PAL (http://www.videokarma.org/showthread.php?t=267784)

NewVista 09-24-2016 07:35 AM

Could NTSC VIR = PAL
 
Was wondering if VIR, implemented to its full potential, would have eliminated the Tint control?

Robert Grant 09-24-2016 12:18 PM

That (as well as saturation control) was the idea.

It was not at all the same system as PAL, in which the phase of the red-minus-luminance (R-Y) component switched with each horizontal scanning line. In NTSC with VIR, every line of the actual picture was transmitted exactly as it had been done for the coverage of the 1954 Tournament of Roses Parade. The difference was one line in the vertical retrace interval carrying a consistent reference signal that a TV set could use to automatically set the hue and saturation perfectly - the Vertical Interval Reference, or VIR.
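As a rough illustration of the idea (a minimal Python sketch with assumed numbers, not any real receiver's circuit): the set measures the amplitude and phase of the received VIR chrominance reference against its nominal value, and that single gain/phase correction is applied to the whole picture's chroma.

Code:

# Minimal sketch of VIR-style automatic hue/saturation correction.
# The nominal reference level and the test values are assumed for illustration only.
import cmath, math

NOMINAL_REF = cmath.rect(1.0, 0.0)          # assumed nominal VIR chroma reference phasor

def vir_correction(received_ref):
    """Gain/phase factor that restores the received VIR reference to nominal."""
    return NOMINAL_REF / received_ref

def correct_chroma(chroma_phasors, received_ref):
    """Apply the single VIR-derived correction to the picture's demodulated chroma."""
    k = vir_correction(received_ref)
    return [c * k for c in chroma_phasors]

# Pretend the transmission path lost 20% of the chroma and rotated hue by +15 degrees:
distortion = cmath.rect(0.8, math.radians(15))
received_ref = NOMINAL_REF * distortion
original = cmath.rect(0.4, math.radians(103))        # some arbitrary picture color
restored = correct_chroma([original * distortion], received_ref)[0]
print(round(abs(restored), 3), round(math.degrees(cmath.phase(restored)), 1))  # back to 0.4, 103.0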

I really don't know why it didn't catch on. I can only wonder: did its promoter want too much money in patent royalties? Was the inventor an outsider (see intermittent windshield wipers)? Did TV manufacturers believe their products produced fine color anyway? Was NTSC color reliable enough by the time VIR came around that setting the dials just once was enough to assure good color? Did the entertainment industry want any improvement in picture quality to be tied to selective availability and copy blocking?

old_tv_nut 09-24-2016 10:42 PM

Quote:

Originally Posted by Robert Grant (Post 3170564)
...

I really don't know why it didn't catch on. ...

The reason it didn't catch on is that the local stations started using it to close the loop on their transmitter adjustments by inserting it locally. Of course, when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either. So, set manufacturers decided not to use it, as it often made the color worse instead of better. It could have succeeded if GE had managed to get the FCC to mandate its correct use.

Edit: By the way, GE was the inventor and promoter.

NewVista 09-25-2016 11:19 AM

Quote:

Originally Posted by old_tv_nut (Post 3170590)
The reason it didn't catch on is that the local stations started using it to close the loop on their transmitter adjustments by inserting it locally. Of course, when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either. So, set manufacturers decided not to use it, as it often made the color worse instead of better. It could have succeeded if GE had managed to get the FCC to mandate its correct use.


Do you mean they (stations) used the actual VIR format for transmitter monitoring, or just snatched the assigned lines for similar test signals?

old_tv_nut 09-25-2016 12:27 PM

Quote:

Originally Posted by NewVista (Post 3170615)
Do you mean they (stations) used the actual VIR format for transmitter monitoring, or just snatched the assigned lines for similar test signals?

Yes, they used the actual VIR format, newly inserted. They had to, for material that didn't contain it originally, and then just did it for all material.

NewVista 09-25-2016 09:41 PM

Quote:

Originally Posted by old_tv_nut (Post 3170590)
...when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either.

Though I'm no expert on VIR, wouldn't a locally regenerated burst (on network-sourced material, for instance) have precise phase lock to the original and, by extension, to the (regenerated?) VIR bursts?

So I'm guessing from this the networks used VIR with their feeds?

old_tv_nut 09-25-2016 10:11 PM

Quote:

Originally Posted by NewVista (Post 3170651)
Though I'm no expert on VIR, wouldn't locally regenerated (network sourced for instance) burst have precise phase lock to the original and, by extension, the VIR (regenerated?) bursts.

So I'm guessing from this the networks used VIR with their feeds?

Not sure if I understand your question completely.

The local burst must be reinserted to standard amplitude and clean waveform per FCC rules. Therefore, it at least loses its relation to the chroma amplitude, which may have changed due to analog transmission distortion over the network. In practice, the reinserted phase is also adjustable and therefore can be misadjusted. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst. The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. Unlike NTSC, the PAL burst, by alternating phase sequence from line to line, should average out to the correct phase even when phase distortion is present.
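To put a number on that last point, here is a toy Python sketch of delay-line-style PAL averaging with an exaggerated, assumed phase error (not a model of any real decoder): because V is inverted on alternate lines and re-inverted at the receiver, a constant phase error shows up with opposite sign on successive lines and averages out to a small saturation loss instead of a hue shift, whereas in NTSC the same error lands directly on the hue.

Code:

# Toy model of PAL's line-to-line alternation vs. a fixed NTSC subcarrier phase.
# The 20-degree error is deliberately exaggerated and assumed for illustration.
import cmath, math

def transmit(chroma, phase_error_deg):
    """Rotate a chroma phasor by a constant differential-phase error."""
    return chroma * cmath.rect(1.0, math.radians(phase_error_deg))

original = cmath.rect(0.5, math.radians(60))   # arbitrary color: hue 60 deg, saturation 0.5
err = 20.0

# NTSC-like: one fixed phase, so the error lands directly on the displayed hue.
ntsc_received = transmit(original, err)

# PAL-like: V (the imaginary part here) is inverted on alternate lines before transmission,
# re-inverted in the receiver, and the two lines are averaged (delay-line PAL).
line_a = transmit(original, err)                    # normal line
line_b = transmit(original.conjugate(), err)        # V-inverted line
pal_averaged = (line_a + line_b.conjugate()) / 2    # re-invert V of line B, then average

print("NTSC hue error:", round(math.degrees(cmath.phase(ntsc_received)) - 60, 1), "deg")
print("PAL hue error: ", round(math.degrees(cmath.phase(pal_averaged)) - 60, 1), "deg")
print("PAL saturation:", round(abs(pal_averaged), 3), "(small cos-of-error loss, no hue shift)")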

The whole idea of VIR was to insert it at the originating studio and not replace it anywhere along the chain, not even at network central control; but not every program had it inserted.

I would note that VIR, occurring once per field, had a much slower control action than color burst, and could not possibly compensate for fast signal variations like airplane flutter.

old_tv_nut 09-25-2016 10:26 PM

Another note regarding the mismatch of re-inserted burst to chroma amplitude distorted by network transmission:

The two major US TV makers, Zenith and RCA, had different philosophies in this regard.
Zenith used the burst as automatic color level reference, as this took out the final transmission variations due to ghosts, airplane flutter, etc. However, this ignored the network distortions. RCA's auto color mode worked on the average chroma level of the picture. They felt this was an overall improvement, although it meant the color level could be affected adversely by overcompensating for scenes either with large areas of bright colors or with no saturated colors. The general public seemed to be accepting of these "subject errors."

They bothered me, though, and I always preferred to run the RCA sets with auto color off for this reason. Unfortunately, this meant also losing RCA's superior auto hue ("tint") correction, which actually had been invented by a colleague of mine at Motorola and was licensed to RCA. It worked only on hues near flesh tone, and didn't destroy greens and purples the way other makers' auto tint did. Motorola, by the way, never used their own invention.
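A toy Python comparison of those two philosophies (the numbers and behavior are assumed, not either maker's actual circuit): a burst-referenced loop corrects to whatever level the burst arrived at, while an average-chroma loop corrects to what a "typical" scene is expected to contain, and so can be fooled by unusually colorful or colorless subjects.

Code:

# Toy comparison of the two auto color level philosophies; the nominal burst level,
# the target average chroma, and the scene below are all assumed for illustration.

NOMINAL_BURST = 0.286        # assumed nominal burst amplitude, arbitrary units
TARGET_AVG_CHROMA = 0.15     # assumed "typical scene" average chroma the RCA-style loop aims for

def zenith_style_gain(received_burst_amplitude):
    """Restore the burst to nominal: tracks ghosts/flutter, but ignores network distortions."""
    return NOMINAL_BURST / received_burst_amplitude

def rca_style_gain(picture_chroma_amplitudes):
    """Normalize the scene's average chroma: tracks the whole chain, but can be fooled by the subject."""
    avg = sum(picture_chroma_amplitudes) / len(picture_chroma_amplitudes)
    return TARGET_AVG_CHROMA / avg

vivid_scene = [0.35] * 100                       # a scene full of saturated color, e.g. a flower field
print(round(rca_style_gain(vivid_scene), 2))     # < 1: washes out a legitimately colorful scene
print(round(zenith_style_gain(0.25), 2))         # > 1: boosts chroma because the burst arrived low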

kf4rca 09-26-2016 05:25 PM

It was originally required by the FCC for remote-controlled transmitter stations. The studio had a Tektronix 147 inserter which stripped anything that was already on line 19 and inserted the VIR. The output fed the STL. At the transmitter there was a Tektronix 1440 which looked at the inserted VIR and controlled luma, color, and phase.
I remember there was a non-remote station that never had anything in the vertical interval. They even stripped the network VITS.
We later upgraded the studio to the Tektronix 1910, which had GCR (the ghost cancellation reference). We got calls from cable companies if the GCR was malfunctioning.
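For anyone curious what "controlled luma, color and phase" from the inserted VIR amounts to, here is a bare-bones Python sketch of such a closed loop (the error model, units, and half-step loop gain are assumed; this is not the Tektronix 1440's actual algorithm): the monitor compares the off-air VIR against nominal and keeps trimming the correction until they match.

Code:

# Bare-bones sketch of closing the loop on the transmitter with the inserted VIR.
# The plant error, units, and loop gain are assumed for illustration only.
import cmath, math

NOMINAL_VIR = cmath.rect(1.0, 0.0)                        # nominal line-19 chroma reference (assumed units)
transmitter_error = cmath.rect(0.85, math.radians(-8))    # pretend plant: 15% chroma loss, -8 deg phase

correction = complex(1.0, 0.0)
for _ in range(20):
    measured = NOMINAL_VIR * transmitter_error * correction   # VIR as seen coming off the transmitter
    error = NOMINAL_VIR / measured                            # how far off nominal it is
    correction *= error ** 0.5                                # trim by half the indicated step each pass

result = transmitter_error * correction
print(round(abs(result), 4), round(math.degrees(cmath.phase(result)), 2))   # -> 1.0 gain, 0.0 deg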

NewVista 09-27-2016 11:53 AM

Quote:

Originally Posted by old_tv_nut (Post 3170655)
..RCA's superior auto hue ("tint") correction, ...worked only on hues near flesh tones.

How on earth did this work?

old_tv_nut 09-27-2016 12:20 PM

Quote:

Originally Posted by NewVista (Post 3170742)
How on earth did this work?

The usual auto tint worked by changing the angles, and possibly the gains, of the color demodulators. The result was a reduction in saturation of greens and magentas, plus colors that were near orange or cyan had any green or magenta component reduced. If you picture a circular color wheel (or a gated-rainbow test pattern), the circle got compressed into an ellipse.
Thus, all colors were distorted to be closer to flesh tone or cyan, or at least a weaker green or purple.

The RCA circuit worked on the phase of the oscillator output going to the demodulators. The nearer the chroma phase was to flesh tone, the more strongly it was pulled towards flesh phase (hue). If it was already a flesh hue, this made no difference. If it was close, it was pulled strongly towards flesh. If it was further away, it was pulled less. So, yellows were made a bit more orange, but greens weren't significantly changed. Similarly, reds were pulled toward orange-red, magentas were pulled a bit towards red, but purples were changed even less if at all. Since this change was phase only, it didn't affect the saturation of greens, purples, or any color, for that matter.
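A numerical illustration of the difference (the flesh-tone angle, maximum pull, and falloff in this Python sketch are assumed numbers, not RCA's values): the correction acts only on phase, and the pull fades out quickly as the hue moves away from the flesh axis, so greens and purples are left essentially untouched.

Code:

# Phase-only "pull toward flesh" illustration. The flesh-axis angle, maximum pull,
# and falloff are assumed numbers, not RCA's values.
import math

FLESH_DEG = 123.0        # assumed flesh-tone phase relative to burst
MAX_PULL = 0.6           # pull 60% of the way to flesh when the hue is right next to it (assumed)
FALLOFF_DEG = 40.0       # pull strength dies off over roughly 40 degrees (assumed)

def rca_style_tint(hue_deg):
    """Pull a hue toward the flesh axis, more strongly the nearer it already is; saturation untouched."""
    diff = (hue_deg - FLESH_DEG + 180.0) % 360.0 - 180.0            # signed distance, -180..180 deg
    strength = MAX_PULL * math.exp(-(diff / FALLOFF_DEG) ** 2)      # strong near flesh, ~zero far away
    return hue_deg - strength * diff

for name, hue in [("flesh", 123.0), ("yellow", 167.0), ("green", 241.0), ("magenta", 61.0)]:
    print(f"{name:8s} {hue:6.1f} -> {rca_style_tint(hue):6.1f} deg")
# Yellow moves a bit toward orange, magenta a bit toward red; green is essentially unchanged.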

NewVista 09-27-2016 04:58 PM

That is interesting how it worked.
Instead of all that effort, shouldn't they have - like you said - lobbied to mandate VIR for all transmissions, since a VIR decoder would be no more complex - and much better - than auto color?

What year did Auto Color appear?
What year did networks abandon VIR?

kf4rca 09-28-2016 07:57 AM

VIR was required till the end of analog transmission. The ONLY set I ever remember that utilized it was a 19 inch GE that was installed at a station I worked at in '77 in SC.
And I remember seeing a Sylvania that used the GCR in the 90's.

Electronic M 09-28-2016 08:13 AM

I know someone with a VIR set that has a gassy CRT.

kf4rca 09-28-2016 08:57 AM

The lack of set-manufacturer adoption led many broadcasters to think these signals had another purpose. Putting test signals in the vertical interval allowed the FCC to evaluate your transmitter's performance in a way they couldn't with just program video.
While most VHFs kept the "house" clean, many UHFs were barely staying on the air.
I had heard that some stations were fined for excessive group delay and others received warnings.

