Videokarma.org TV - Video - Vintage Television & Radio Forums (http://www.videokarma.org/index.php)
-   Television Broadcast Theory (http://www.videokarma.org/forumdisplay.php?f=182)
-   -   Could NTSC VIR = PAL (http://www.videokarma.org/showthread.php?t=267784)

NewVista 09-24-2016 07:35 AM

Could NTSC VIR = PAL
 
Was wondering if VIR, implemented to its full potential, would have eliminated the Tint control?

Robert Grant 09-24-2016 12:18 PM

That (as well as saturation control) was the idea.

It was not at all the same system as PAL, in which the phase of the red-minus-luminance (R-Y) component switched with each horizontal scanning line. In NTSC with VIR, every line of the actual picture was transmitted exactly as had been done since the coverage of the 1954 Tournament of Roses Parade. The difference was one line in the vertical retrace interval carrying a consistent reference signal that a TV set could use to automatically set the hue and saturation perfectly - the Vertical Interval Reference (VIR).

I really don't know why it didn't catch on. I can only wonder: did its promoter want too much money in patent royalties? Was the inventor an outsider (see intermittent windshield wipers)? Did TV manufacturers believe their products produced fine color anyway? Was NTSC color reliable enough by the time VIR came around that setting the dials just once was enough to assure good color? Did the entertainment industry want any improvement in picture quality to be tied to selective availability and copy blocking?

old_tv_nut 09-24-2016 10:42 PM

Quote:

Originally Posted by Robert Grant (Post 3170564)
...

I really don't know why it didn't catch on. ...

The reason it didn't catch on is that the local stations started using it to close the loop on their transmitter adjustments by inserting it locally. Of course, when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either. So, set manufacturers decided not to use it, as it often made the color worse instead of better. It could have succeeded if GE had managed to get the FCC to mandate its correct use.

Edit: By the way, GE was the inventor and promoter.

NewVista 09-25-2016 11:19 AM

Quote:

Originally Posted by old_tv_nut (Post 3170590)
The reason it didn't catch on is that the local stations started using it to close the loop on their transmitter adjustments by inserting it locally. Of course, when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either. So, set manufacturers decided not to use it, as it often made the color worse instead of better. It could have succeeded if GE had managed to get the FCC to mandate its correct use.


Do you mean they (stations) used the actual VIR format for transmitter monitoring, or just snatched the assigned lines for similar test signals?

old_tv_nut 09-25-2016 12:27 PM

Quote:

Originally Posted by NewVista (Post 3170615)
Do you mean they (stations) used the actual VIR format for transmitter monitoring, or just snatched the assigned lines for similar test signals?

Yes, they used the actual VIR format, newly inserted. They had to, for material that didn't contain it originally, and then just did it for all material.

NewVista 09-25-2016 09:41 PM

Quote:

Originally Posted by old_tv_nut (Post 3170590)
...when they did that, it no longer bore any fixed relation to the original video, just as the locally re-inserted color burst didn't either.

Though I'm no expert on VIR, wouldn't a locally regenerated burst (network-sourced, for instance) have precise phase lock to the original and, by extension, to the (regenerated?) VIR burst?

So I'm guessing from this the networks used VIR with their feeds?

old_tv_nut 09-25-2016 10:11 PM

Quote:

Originally Posted by NewVista (Post 3170651)
Though I'm no expert on VIR, wouldn't a locally regenerated burst (network-sourced, for instance) have precise phase lock to the original and, by extension, to the (regenerated?) VIR burst?

So I'm guessing from this the networks used VIR with their feeds?

Not sure if I understand your question completely.

The local burst must be reinserted to standard amplitude and clean waveform per FCC rules. Therefore, it at least loses its relation to the chroma amplitude, which may have changed due to analog transmission distortion over the network. In practice, the reinserted phase is also adjustable and therefore can be misadjusted. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst. The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. Unlike NTSC, the PAL burst, by alternating phase sequence from line to line, should average out to the correct phase even when phase distortion is present.

The whole idea of VIR was to insert it at the originating studio and not replace it anywhere along the chain, not even at network central control; but not every program had it inserted.

I would note that VIR, occurring once per field, had a much slower control action than color burst, and could not possibly compensate for fast signal variations like airplane flutter.
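
One standard way to see that PAL point (my gloss in phasor notation, not from the original post): a fixed channel phase error $\varepsilon$ rotates every chroma vector by $+\varepsilon$, but because PAL inverts the R-Y axis on alternate lines, successive decoded vectors land at $\phi+\varepsilon$ and $\phi-\varepsilon$. Averaging them (in the delay-line decoder, or by eye on a simple set) gives

$$\tfrac{1}{2}\left[ e^{j(\phi+\varepsilon)} + e^{j(\phi-\varepsilon)} \right] = \cos\varepsilon \, e^{j\phi}$$

so the hue $\phi$ survives exactly and only the saturation falls by a factor $\cos\varepsilon$.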

old_tv_nut 09-25-2016 10:26 PM

Another note regarding the mismatch of re-inserted burst to chroma amplitude distorted by network transmission:

The two major US TV makers, Zenith and RCA, had different philosophies in this regard.
Zenith used the burst as the automatic color level reference, as this took out the final transmission variations due to ghosts, airplane flutter, etc. However, this ignored the network distortions. RCA's auto color mode worked on the average chroma level of the picture. They felt this was an overall improvement, although it meant the color level could be affected adversely by overcompensating for scenes either with large areas of bright colors or with no saturated colors. The general public seemed to be accepting of these "subject errors."

They bothered me, though, and I always preferred to run the RCA sets with auto color off for this reason. Unfortunately, this meant also losing RCA's superior auto hue ("tint") correction, which actually had been invented by a colleague of mine at Motorola and was licensed to RCA. It worked only on hues near flesh tone, and didn't destroy greens and purples the way other makers' auto tint did. Motorola, by the way, never used their own invention.
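
A schematic contrast of the two references, with made-up numbers (this is a sketch of the philosophies described above, not either maker's actual circuit):

Code:

# Two chroma-AGC philosophies; amplitudes in IRE, values illustrative.
NOMINAL_BURST_IRE = 40.0  # standard burst amplitude

def zenith_gain(received_burst_ire):
    """Burst-referenced ACC: restores burst to nominal, so it tracks
    final-link fades (ghosts, airplane flutter) but also inherits any
    network-induced burst/chroma mismatch."""
    return NOMINAL_BURST_IRE / received_burst_ire

def rca_gain(avg_scene_chroma_ire, target_ire=20.0):
    """Average-chroma auto color: drives the picture's mean chroma to a
    target, ignoring burst mismatch but overcorrecting scenes that are
    unusually vivid or unusually drab (the "subject errors" above)."""
    return target_ire / avg_scene_chroma_ire

print(zenith_gain(30.0))  # faded burst -> boost chroma ~1.33x
print(rca_gain(8.0))      # drab scene  -> 2.5x, pastels go vivid
print(rca_gain(35.0))     # vivid scene -> ~0.57x, colors wash out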

kf4rca 09-26-2016 05:25 PM

It was originally required by the FCC for remote-controlled transmitter stations. The studio had a Tektronix 147 inserter which stripped anything that was already on line 19 and inserted the VIR. The output fed the STL. At the transmitter there was a Tektronix 1440 which looked at the inserted VIR and controlled luma, color, and phase.
I remember there was a non-remote station that never had anything in the vertical interval. They even stripped the network VITS.
We later upgraded the studio to the Tektronix 1910, which had GCR. We got calls from cable companies if the GCR was malfunctioning.

NewVista 09-27-2016 11:53 AM

Quote:

Originally Posted by old_tv_nut (Post 3170655)
..RCA's superior auto hue ("tint") correction, ...worked only on hues near flesh tones.

How on earth did this work?

old_tv_nut 09-27-2016 12:20 PM

Quote:

Originally Posted by NewVista (Post 3170742)
How on earth did this work?

The usual auto tint worked by changing the angles, and possibly the gains, of the color demodulators. The result was a reduction in saturation of greens and magentas, plus colors that were near orange or cyan had any green or magenta component reduced. If you picture a circular color wheel (or a gated-rainbow test pattern), the circle got compressed into an ellipse.
Thus, all colors were distorted to be closer to flesh tone or cyan, or at least a weaker green or purple.

The RCA circuit worked on the phase of the oscillator output going to the demodulators. The nearer the chroma phase was to flesh tone, the more strongly it was pulled towards flesh phase (hue). If it was already a flesh hue, this made no difference. If it was close, it was pulled strongly towards flesh. If it was further away, it was pulled less. So, yellows were made a bit more orange, but greens weren't significantly changed. Similarly, reds were pulled toward orange-red, magentas were pulled a bit towards red, but purples were changed even less if at all. Since this change was phase only, it didn't affect the saturation of greens, purples, or any color, for that matter.
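
A minimal numeric sketch of that phase-only pull, assuming flesh tone sits near the +I axis at roughly 123 degrees from burst, and guessing a Gaussian falloff for the pull strength (the actual circuit's curve isn't described here):

Code:

import math

FLESH = math.radians(123.0)  # assumed flesh-tone phase (+I axis) re: burst

def rca_style_tint(phase, width=math.radians(35.0)):
    """Pull a chroma phase toward the flesh axis: strongly when already
    near it, hardly at all when far away. Phase-only, so saturation of
    greens, purples - everything - is untouched."""
    # signed angular distance to the flesh axis, wrapped to [-pi, pi]
    err = math.atan2(math.sin(FLESH - phase), math.cos(FLESH - phase))
    weight = math.exp(-(err / width) ** 2)  # ~1 near flesh, ~0 far away
    return phase + weight * err

for name, deg in [("flesh", 123), ("yellow", 167), ("green", 241), ("magenta", 61)]:
    out = math.degrees(rca_style_tint(math.radians(deg)))
    print(f"{name:8s} {deg:3d} deg -> {out:6.1f} deg")
# flesh stays put, yellow is pulled toward orange, green and magenta barely move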

NewVista 09-27-2016 04:58 PM

That is interesting how it worked.
Instead of all that effort, they should have - like you said - lobbied to mandate VIR for all transmissions, as a VIR decoder would be no more complex - and much better - than Auto Color?

What year did Auto Color appear?
What year did networks abandon VIR?

kf4rca 09-28-2016 07:57 AM

VIR was required till the end of analog transmission. The ONLY set I ever remember that utilized it was a 19 inch GE that was installed at a station I worked at in '77 in SC.
And I remember seeing a Sylvania that used the GCR in the 90's.

Electronic M 09-28-2016 08:13 AM

I know someone with a VIR set that has a gassy CRT.

kf4rca 09-28-2016 08:57 AM

The lack of set manufacturer adoption led many broadcasters to think these signals had another purpose. Putting test signals in the vertical interval allowed the FCC to evaluate your transmitter's performance in ways they couldn't with just program video.
While most VHFs kept the "house" clean, many UHFs were barely staying on the air.
I had heard that some stations were fined for excessive group delay and others received warnings.

lnx64 09-28-2016 09:05 AM

Speaking of UHF, I got in an auction one of the RF modulators used at one of the UHF stations in the area. Along with a weird coaxial baseband video input, it also has the emergency broadcast inputs. And when I got it, it was still set to the frequency the station aired on; it was never messed with.

Kamakiri 09-28-2016 09:26 AM

Quote:

Originally Posted by Electronic M (Post 3170805)
I know someone with a VIR set that has a gassy CRT.

My parents bought a GE 25" console with VIR new in '77. Lasted until '83. Been looking for a VIR equipped set for many years and never found one.

Electronic M 09-28-2016 10:03 AM

Quote:

Originally Posted by Kamakiri (Post 3170814)
My parents bought a GE 25" console with VIR new in '77. Lasted until '83. Been looking for a VIR equipped set for many years and never found one.

His is a 19" GE...I don't know if he would sell it, but I could check.

lnx64 09-28-2016 10:05 AM

So I'm guessing VIR has nothing to do with the auto color button on my TV?

NewVista 09-28-2016 11:15 AM

Quote:

Originally Posted by kf4rca (Post 3170803)
VIR was required till the end of analog transmission. The ONLY set I ever remember that utilized it was a 19 inch GE that was installed at a station I worked at in '77 in SC.
And I remember seeing a Sylvania that used the GCR in the 90's.

That's what I thought; I've only really seen it on the GE 19" with that red VIR light on the control panel. What a pity, as VIR was a brilliant innovation.

NewVista 09-28-2016 11:55 PM

Quote:

Originally Posted by old_tv_nut (Post 3170654)
The local burst must be reinserted to standard amplitude and clean waveform per FCC rules. Therefore, it at least loses its relation to the chroma amplitude, which may have changed due to analog transmission distortion over the network. In practice, the reinserted phase is also adjustable and therefore can be misadjusted. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst. The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. ..
The whole idea of VIR was to insert it at the originating studio and not replace it anywhere along the chain, not even at network central control; but not every program had it inserted...

Some great info there!
Brilliant of General Electric to have a wider burst in the active line - and at a luminance level of a typical flesh tone!

So, to me, the "problems" would have been easily avoidable with good broadcasting practice:
for example, source material with VIR would first need to be recalibrated through a professional VIR processor
that also regenerated burst & VIR (with exact phase similitude to the gated line 19 sample).
Was there such an instrument?

kf4rca 09-29-2016 07:51 AM

I remember quad videotapes (2 inch) coming in that had a sticker saying "Protected by VIRS - adjust proc amp to pass."
In reality, burst and sync got "regenerated" 2 times. After leaving the switcher it went to a proc amp where it would get black and white clip and AGC. (I thought the best AGC amp was the RCA TA19. You could always tell a station that had one of these on the air.) Sync and burst would also be regenerated. Then it went to the VIRS inserter, where sync and burst would be regenerated again.
NOW, all of this is ahead of a pre-corrector that was required to compensate for non-linearities in the modulation process of the transmitter. It had to look flat on the air.
AND don't forget the low pass filter. Video was band-limited to 4.2 MHz to avoid components bleeding over into the sound at 4.5 MHz.

NewVista 09-29-2016 09:51 PM

Quote:

Originally Posted by old_tv_nut (Post 3170654)
.. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst. The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. .

Didn't know of this problem of burst phase distortion. Nevertheless, the burst must be the reference for regeneration, I would think, and not the higher-level VIR chroma, which may have a phase shift wrt the burst?

To eliminate any guesswork, just have color bars on another vertical interval line!

kf4rca 09-30-2016 08:43 AM

They DO run a line of color bars. But one of the biggest problems of NTSC is that it was compromised from the beginning. It had to fit in the same space as the existing B&W system.
As a result the bandwidth of the chrominance was limited. The I component was limited to 1.5 MHz and Q to 0.5 MHz. Not much detail in the color. AND it's not even symmetrical. How screwy is that???
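
The asymmetry falls straight out of the channel arithmetic, for what it's worth (the subcarrier and band-edge figures below are the standard NTSC numbers):

Code:

# The 3.58 MHz subcarrier sits near the top of the 4.2 MHz video band,
# leaving only ~0.6 MHz of upper-sideband room. I is therefore sent
# vestigial (full 1.5 MHz lower sideband, ~0.6 MHz upper), while Q at
# 0.5 MHz fits double-sideband under the limit.
F_SC = 3.579545   # color subcarrier, MHz
F_TOP = 4.2       # video band edge, MHz
print(f"upper-sideband room: {F_TOP - F_SC:.2f} MHz")  # ~0.62
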
NOW, in Japan they had an analog HD system called MUSE, which I've heard took a 12 MHz channel. So when color (and HD) came along, they should have allowed MORE bandwidth than what the old B&W system started out with.
SO, even today we are stuck with the constraints of an old system.

NewVista 09-30-2016 07:58 PM

Quote:

Originally Posted by kf4rca (Post 3170929)
They DO run a line of color bars. ...

I've sometimes seen them at the top of an underscanned picture.
So alternatively, this is as good as VIR, inasmuch as there should be no guesswork for the tech adding new VIR before transmission: all he has to do is line up the color bars on the vectorscope, avoiding subjectivity.

And since, as of 1975, NTSC still had 40 more years to go (counting cable), LSI chips could have been produced for consumer TVs to derive a reference from either V.I. color bars or VIR, making this the default mode of the receivers - keeping consumers' fingers away from the contrast and tint - if not also saturation - knobs!

dishdude 09-30-2016 10:51 PM

Cool early 80's GE projection with VIR on eBay.

http://www.ebay.com/itm/Vintage-Anti...sAAOSwHoFXu0Kd

NewVista 09-30-2016 11:49 PM

Quote:

Originally Posted by dishdude (Post 3170975)
Cool early 80's GE projection with VIR on eBay.

That set has 'VIR II' - wonder what improvements were made by then (1982)?

Magazine article on VIR

Boobtubeman 10-01-2016 06:18 PM

Had a Mot WARDS 19" with the VIR back in the late 70s; we paid $444 for it brand new. Great set till the fly shorted. Neighbor had a similar WARDS set they tossed with a dead vertical circuit. Swapped the flys and it was up and running till I sold it years later; the vertical died 2 yrs after that and it was tossed.

SR

kf4rca 10-02-2016 01:25 PM

Not sure what VIR II is. Never heard of that.
Even before the HD switchover, stations were going digital in NTSC days.
I remember spots and syndicated programs coming in as a file from a digital satellite.
The files were fed directly to the playout server (a RAID 5 computer), and then to air.
Sometimes the programs had low video or high chroma. Since the signal never went to analog at the station, there was no way to adjust it.
This was eventually corrected as stations raised hell with the syndicators. Turns out the spots and programs had been uplinked by a secretary who had no knowledge of video parameters.

NewVista 10-05-2016 08:42 AM

Quote:

Originally Posted by Boobtubeman (Post 3171031)
Had a Mot WARDS 19" with the VIR back in late 70s .. Great set...

SR

Did it make much difference when you ran it with VIR switched 'on'?

kf4rca 10-06-2016 07:30 AM

As I recall, the VIR switch had its own set of controls. So, you could make it look like anything you wanted.
Those VIR generators are all over eBay if you want to play with one. Except for the VITS 100, they also have a full-field test generator with a boatload of test signals. The one I liked was the Tektronix 1910.
I've seen them as low as $30.
But I think the VITS deleter and inserter section could be used to eradicate the copyguard that's in the vertical interval.

Boobtubeman 10-16-2016 10:41 PM

As I recall, it did have internal color and tint adjustments for VIR, and manual adjustments for non-VIR mode. If I remember correctly, it performed well most of the time, but there were times when we had to disable it because it didn't agree with all programs...

SR

kf4rca 09-28-2017 10:04 AM

1 Attachment(s)
Here is the only 19" GE VIR set I've ever seen. I think most people bought the stripped-down model despite the massive advertising campaign.

Electronic M 09-28-2017 10:19 AM

A friend of mine owns a VIR set with a gassy CRT.

Jon A. 10-01-2017 03:42 PM

I got a VIR-equipped Electrohome console from 1978 just yesterday. It starts but I'm sure it'll need work. Good thing I got the manual, and saw this thread, otherwise I still wouldn't know what it is.

NewVista 02-20-2018 02:58 PM

I think I've solved the problem for RCA's original NTSC spec with a better implementation:
Just have colorburst for almost a whole line period in the last serrated segment of the field sync pulse,
at ~50% luminance level (like GE-VIR) (the picture level of light complexions).

This gives a nice clean, long-duration subcarrier reference every sixtieth of a second (like VIR); no burst every line needed (who gives a flip about airplane flutter?).

Then 1950s-era receivers simply run separated vertical sync through an HPF to obtain a subcarrier phase reference (no countdown-circuit chips required, unlike 1970s VIR).

old_tv_nut 02-20-2018 03:20 PM

Quote:

Originally Posted by NewVista (Post 3196490)
I think I've solved the problem for RCA's original NTSC spec with a better implementation:
Just have colorburst for almost a whole line period in the last serrated segment of the field sync pulse,
at ~50% luminance level (like GE-VIR) (the picture level of light complexions).

This gives a nice clean, long-duration subcarrier reference every sixtieth of a second (like VIR); no burst every line needed (who gives a flip about airplane flutter?).

Then 1950s-era receivers simply run separated vertical sync through an HPF to obtain a subcarrier phase reference (no countdown-circuit chips required, unlike 1970s VIR).

Do you mean a different version of NTSC with subcarrier reference only in the vertical segment but not in each line?

1) Does not solve the problem of having it re-inserted in order to meet FCC signal specs. The original problem was in good part due to the locally reconstituted sync and burst having the wrong phase and amplitude compared to what had happened to the chroma coming over the network. Simply passing the degraded burst to the transmitter was a non-starter, as it could cause problems with color killers in receivers.
2) Not possible to use it with affordable crystal oscillator / phase-locked loop technology, since the oscillator in the receiver would need to have a free-running frequency within 30 Hz of correct, or side-lock could occur. With burst on every line, the free-running crystal needs only to be within half the horizontal rate, or 7867 Hz. To ensure quick lockup, the oscillator actually should be off no more than 10% or so of the nearest sideband spacing, and the loop bandwidth should be just wide enough to pass the beat frequency. Remember, horizontal-rate sidelock is what was used in gated-rainbow color bar generators. This proposal would allow accidental vertical-rate sidelock, and would require a crystal oven for stability and/or a customer color lock control to prevent it (numbers sketched below).
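
To put numbers on the sidelock margins (standard NTSC rates; only the framing is mine):

Code:

# A burst repeated at rate f_rep puts spectral lines every f_rep around
# the subcarrier, so a PLL must free-run within f_rep/2 of correct (in
# practice ~10% of f_rep) or it can capture the wrong sideband.
F_SC = 3_579_545.0  # NTSC color subcarrier, Hz
for name, f_rep in [("burst every line", 15_734.26),
                    ("burst once per field", 59.94)]:
    margin = f_rep / 2
    print(f"{name}: +/-{margin:.0f} Hz = {margin / F_SC * 1e6:.1f} ppm")
# every line:     +/-7867 Hz = ~2198 ppm (easy for a cheap crystal)
# once per field: +/-30 Hz   = ~8.4 ppm  (needs an oven or a user control)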

A very thorough theoretical study was done of the color burst to verify that once per horizontal was the right way.
http://ieeexplore.ieee.org/document/4051510/

NewVista 02-24-2018 01:25 AM

"Not possible to use it with affordable crystal oscillator"

What if, after transition of H sync to black, the back porch was raised to 50%?

Seems the GE VIR locked nicely to the 60 Hz interval reference with a cheap osc.
I must look up a schematic of a VIR board; haven't searched yet.

As for that paper from IEEE, they want $33 to read it, good luck with that.

old_tv_nut 02-24-2018 09:57 AM

The oscillator in the VIR sets locked to the regular burst as usual. The VIR made a secondary adjustment of phase. The goal of the VIR was to provide a secondary signal that suffered all the same phase and amplitude perturbations as the video chrominance, all the way from the original encoding at the studio. VIR and the burst would always have the same frequency, since the local proc amp sync and burst reinsertion was locked to the incoming signal from the network. VIR was never intended to be a frequency reference.
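
In other words, two nested corrections (my paraphrase as a control sketch; the loop constants are invented for illustration):

Code:

class ColorReference:
    """Fast burst PLL plus slow VIR trim, as described above."""

    def __init__(self):
        self.osc_phase = 0.0  # burst-loop estimate, radians
        self.vir_trim = 0.0   # slow VIR phase trim, radians

    def on_burst(self, burst_phase_err):
        # every line (~15.7 kHz): fast lock of the 3.58 MHz oscillator
        self.osc_phase += 0.5 * burst_phase_err

    def on_vir_line(self, vir_phase_err):
        # once per field (~60 Hz): nudge demod phase toward the VIR
        # reference - far too slow to track airplane flutter
        self.vir_trim += 0.02 * vir_phase_err

    def demod_phase(self):
        return self.osc_phase + self.vir_trim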

Later, when the use of frame synchronizers became common, even the local frequency could be different from the incoming. VIR could no longer be carried all the way through the chain.
[Edit - maybe it could be - have to think about that]

For another thing, some stations began using local VIR insertion to adjust transmitters in an automatic closed-loop system, and this broke the chain as well.

Before frame synchronizers, the major networks had established atomic clock references to which everything was synced. These were so stable that you could sync a receiver to one network while it displayed the picture from another (which I actually did for an experiment when I worked at Zenith). The National Bureau of Standards published a paper on using the network color burst as a better frequency reference than WWV.

As far as putting the burst on a pedestal:
1) The reason for the pedestal in VIR is so that any differential phase incurred along the chain would be compensated most precisely near skin-tone luminance.
2) The reason for not putting the burst on a pedestal in the original NTSC/FCC specs was to prevent making it visible during retrace in sets that didn't have power retrace blanking. There was a proposal to put it on a pedestal to avoid possibly clipping it in studio amplifiers or transmitters, but this was rejected. Adding the pedestal after the standard was established would have required expensive surveying of the existing receiver population to make sure retrace was no longer a problem.

NewVista 02-24-2018 11:39 PM

Mr WB, given the alleged advantages of PAL (as well as its technical drawbacks, and the inconvenience of broadcast & consumer media playback incompatibility), do you think it was wise for those few South American countries to adopt PAL-M (525/60) & PAL-N?

