Videokarma.org: Television Broadcast Theory
#1 - 06-19-2020, 10:57 AM - old_tv_nut
The graphics in my paper are adjusted to be correct for an sRGB monitor. The colors on the Macbeth ColorChecker chart are within sRGB, except for the cyan, which is only slightly outside. The Macbeth green is nowhere near NTSC green. This illustrates how the NTSC primaries do a very good job of covering the gamut of real surface colors, and sRGB does an adequate job.
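A quick way to check gamut statements like these numerically: a chromaticity lies inside a display's gamut triangle exactly when its linear RGB coordinates come out non-negative. Below is a minimal Python sketch (assuming numpy, the published XYZ-to-sRGB matrix, and the standard 1953 NTSC chromaticities; a rough check, not a color-management tool):

Code:
import numpy as np

# XYZ -> linear sRGB (IEC 61966-2-1: Rec. 709 primaries, D65 white)
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def inside_srgb(x, y):
    """True if chromaticity (x, y) falls inside the sRGB gamut triangle."""
    xyz = np.array([x / y, 1.0, (1 - x - y) / y])  # scale to Y = 1
    # Only the sign matters for the in/out test; magnitudes are brightness.
    return bool(np.all(XYZ_TO_SRGB @ xyz >= 0))

print(inside_srgb(0.21, 0.71))      # 1953 NTSC green primary -> False (outside)
print(inside_srgb(0.3101, 0.3162))  # Illuminant C white point -> True (inside)

Feeding in measured ColorChecker patch chromaticities the same way would let you reproduce the statement above about the cyan patch.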

When I was on the NTSC monitor committee at SMPTE, NBC/RCA set up a test with some brightly colored objects, which were displayed on two identical monitors, except that the CRTs had been made by Sylvania with one being essentially SMPTE C and the other having NTSC green. There was a skein of kelly-green yarn that had been found that was specifically outside the modern phosphor range. I do not recall any cyan object that was outside the range. We then compared the two renditions side by side, including turning the monitor matrix adjustment on and off. This work resulted in the adoption at the time of a standard for NTSC monitors using modern phosphors with a switchable matrix.

By the way, the modern phosphors with corrective matrix in the receiver showed the correct flesh tones and red-yellow-green hue range, but produced the expected brightening of reds and darkening of cyans. The work of the committee included selecting the best compromise values for the corrective matrix, to get acceptable hues with "acceptable" brightness changes. There had been a paper published on matrices for color receivers to minimize the squared error for some selection of colors, although I seem to recall it looked only at hue and saturation and ignored brightness.
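For readers wondering where such a corrective matrix comes from: the purely colorimetric starting point, before any of the committee's subjective compromises and ignoring CRT nonlinearity, is to chain the RGB-to-XYZ matrices of the two phosphor sets. A Python sketch with numpy, using the usual published primaries; the printed matrix is the textbook linear-light answer, not the values the committee adopted:

Code:
import numpy as np

def rgb_to_xyz(primaries, white):
    """RGB -> XYZ matrix from (x, y) primaries and white point (white at Y = 1)."""
    p = np.array(primaries, dtype=float)
    m = np.array([p[:, 0] / p[:, 1],                   # X of each primary
                  np.ones(3),                          # Y of each primary
                  (1 - p[:, 0] - p[:, 1]) / p[:, 1]])  # Z of each primary
    wx, wy = white
    w = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    return m * np.linalg.solve(m, w)  # scale so R = G = B = 1 gives white

NTSC_1953 = rgb_to_xyz([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)],
                       (0.3101, 0.3162))   # Illuminant C white
SMPTE_C = rgb_to_xyz([(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)],
                     (0.3127, 0.3290))     # D65 white

# Linear drives for SMPTE C phosphors that reproduce the XYZ an NTSC-primary
# display would have produced; out-of-gamut colors show up as negative drives.
CORRECTION = np.linalg.solve(SMPTE_C, NTSC_1953)
print(np.round(CORRECTION, 3))

A real receiver has to apply its version of this after gamma correction, on a nonlinear CRT that cannot produce negative drives, which is exactly why compromise values had to be chosen by eye.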

The precise phosphors adopted were called "SMPTE C." The "C" stood for one of three formulations that were proposed, and also happened to be those used in Conrac monitors. This batch of phosphors was kept aside and used by whoever supplied CRTs to Conrac and maybe some others. Asian manufacturers were on their own to formulate phosphors that matched SMPTE C.

Later, SMPTE C, EBU and other slight variants were reconciled to the HDTV/sRGB values.
#2 - 06-20-2020, 02:09 PM - etype2
Quote:
Originally Posted by old_tv_nut View Post
When I was on the NTSC monitor committee at SMPTE, NBC/RCA set up a test with some brightly colored objects, which were displayed on two identical monitors, except that the CRTs had been made by Sylvania with one being essentially SMPTE C and the other having NTSC green. There was a skein of kelly-green yarn that had been found that was specifically outside the modern phosphor range. I do not recall any cyan object that was outside the range. We then compared the two renditions side by side, including turning the monitor matrix adjustment on and off.

So you saw NTSC green, and presumably other NTSC colors, on the Sylvania CRT. And yet no professional camera could capture NTSC colors? (Now I mean still cameras.) Is it possible for you to reproduce, as closely as you can, a 1953 NTSC green color sample based on your knowledge?

I know what you're going to say: how can I see it on my monitor? I have DCI-P3, which has a 26% wider gamut than sRGB. Not as wide as 1953 NTSC, though.

Edit: You said “If you put color bars on your old set, take a picture with a digital camera set to Adobe RGB or prophoto RGB, and display it on a wide-gamut monitor (or the old TV), then you should see the NTSC primaries and secondaries reproduced, sort of.” Would that be reduced luminance/saturation?
#3 - 06-20-2020, 05:26 PM - old_tv_nut
Quote:
Originally Posted by etype2 View Post
So you saw NTSC green, and presumably other NTSC colors, on the Sylvania CRT. And yet no professional camera could capture NTSC colors? (Now I mean still cameras.) Is it possible for you to reproduce, as closely as you can, a 1953 NTSC green color sample based on your knowledge?

I know what you're going to say: how can I see it on my monitor? I have DCI-P3, which has a 26% wider gamut than sRGB. Not as wide as 1953 NTSC, though.

Edit: You said “If you put color bars on your old set, take a picture with a digital camera set to Adobe RGB or prophoto RGB, and display it on a wide-gamut monitor (or the old TV), then you should see the NTSC primaries and secondaries reproduced, sort of.” Would that be reduced luminance/saturation?
Current digital cameras can represent NTSC green in the raw file, or in a jpg file when the camera is set to the AdobeRGB color space. Warning: really, you should only use raw files for this, as a jpg file with AdobeRGB is VERY likely to be misinterpreted as sRGB by practically all software out there. Although the camera can REPRESENT NTSC green, whether it will do so exactly if you take a picture of your NTSC CRT is iffy, for several reasons (see the quick check after the list):
1) The spectral responses of the sensor are not linear combinations of the eye cone responses, so saturated colors are distorted to some extent. The good news is that the distortion tends to saturate very saturated colors more than reality, and move them towards the primaries. So, there is a good possibility the green bar will be recorded as fully saturated NTSC green. Secondary colors (yellow, cyan, magenta) are more likely to give hue shifts.
2) Photographic cameras are not linear photometric/colorimetric devices. The programs that process raw files do not do linear photometric/colorimetric transforms of the raw data. They are trying to make a pretty picture of reality, so when you take a picture off your NTSC tube, you end up with a picture of a picture, rather than a strict duplicate of what was on the screen.
3) If you have a monitor that reaches NTSC green, and it is properly profiled, Adobe products should do a decent job of showing the result of the camera response and the photographic processing. This should show the more saturated, less yellow green, but whether it will do so precisely is harder to say. For example, in Lightroom, you can choose multiple "camera profiles." None of these are labeled as colorimetric. From the sound of the titles, the closest might be Adobe Neutral, or for a Canon camera, Camera Matching "Faithful." But even these two are different from each other. Each of these profiles affects the hue, saturation, and luminance of the primary and secondary colors differently, not to mention image contrast, highlight and shadow compression, and on and on.
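One note on the "can represent" point above: AdobeRGB (1998) happens to use the same (0.21, 0.71) green chromaticity as 1953 NTSC, so NTSC green sits exactly on the AdobeRGB gamut boundary rather than inside it. A quick numeric confirmation, assuming numpy and the published XYZ-to-AdobeRGB matrix:

Code:
import numpy as np

# XYZ -> linear AdobeRGB (1998), D65 white (published matrix)
XYZ_TO_ADOBE = np.array([
    [ 2.0413690, -0.5649464, -0.3446944],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0134474, -0.1183897,  1.0154096],
])

def to_adobe_linear(x, y):
    """Linear AdobeRGB coordinates of chromaticity (x, y), scaled to Y = 1."""
    return XYZ_TO_ADOBE @ np.array([x / y, 1.0, (1 - x - y) / y])

# 1953 NTSC green: R and B come out ~0, i.e. it IS the AdobeRGB green primary.
# (G > 1 only because Y = 1 is brighter than the primary itself at that scale.)
print(np.round(to_adobe_linear(0.21, 0.71), 3))
# Rec. 709/sRGB green (0.30, 0.60) lands well inside: all components positive.
print(np.round(to_adobe_linear(0.30, 0.60), 3))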

In the end, you have to ask yourself why you would go to all this effort. You can do it for the gratification of viewing something on your wide-gamut monitor. You could perhaps share an image (in some format other than jpg) with someone else who has color-managed software and a calibrated and profiled wide-gamut monitor. But if your goal is to show pictures on the web to other TV restorers, the majority won't have all that, so the most reliable thing is still jpg and sRGB. If the viewer only has an sRGB monitor, that's the end of it.

So, yes, you can definitely go in the right direction to take a picture of, and then display, greens closer to NTSC (if your monitor has the capability), but you shouldn't expect precision.
#4 - 09-05-2020, 11:39 AM - pidade
Quote:
Originally Posted by old_tv_nut View Post
The graphics in my paper are adjusted to be correct for an sRGB monitor. ...
I went back and read some of your older posts on this complicated subject of NTSC/SMPTE-C matrix variations at the transmitter/camera and receiver, as you seem to be the only source of information on this subject on the internet, ha. Since you've mentioned it here, I thought I'd ask a couple of questions, as I'm still pretty confused.

I understand that, because of the transition in the '60s to dimmer, less saturated phosphors in monitors and TVs than those specified by the NTSC in 1953, receivers had to adjust Y'IQ decoding (or did they alter R'G'B' values after decoding?) to produce a more palatable image, and that cameras made similar adjustments to the linear RGB values before Y'IQ encoding.

In today's world, where we're obviously not receiving NTSC transmissions directly from a TK-41, but at best, watching a VHS or LaserDisc that was probably seen/mastered on a P-22/SMPTE-C phosphor monitor from at least the '80s, is this still relevant? Was this ever an issue with home/consumer video or just with live broadcasts, where encoding for NTSC phosphors was required by the FCC? How can one tell whether their monitor or TV is altering NTSC signals with its own, subjective matrix, or is there even such a thing as unaltered, accurate NTSC in practice? It all just sounds so chaotic, haha.

#5 - 09-05-2020, 02:31 PM - old_tv_nut
Quote:
Originally Posted by pidade View Post
...
I understand that, because of the transition in the '60s to dimmer, less saturated phosphors in monitors and TVs than those specified by the NTSC in 1953, receivers had to adjust Y'IQ decoding (or did they alter R'G'B' values after decoding?) to produce a more palatable image, and that cameras made similar adjustments to the linear RGB values before Y'IQ encoding.

In today's world, where we're obviously not receiving NTSC transmissions directly from a TK-41, but at best, watching a VHS or LaserDisc that was probably seen/mastered on a P-22/SMPTE-C phosphor monitor from at least the '80s, is this still relevant? Was this ever an issue with home/consumer video or just with live broadcasts, where encoding for NTSC phosphors was required by the FCC? How can one tell whether their monitor or TV is altering NTSC signals with its own, subjective matrix, or is there even such a thing as unaltered, accurate NTSC in practice? It all just sounds so chaotic, haha.
1) "dimmer, less saturated" is not a good blanket description. Green became yellower. Blue became more violet and saturated. Red became brighter but more orange and less saturated for a few years when sulfide red was used, then was restored to close to NTSC with rare earth reds.

2) Receivers adjusted the R-Y, B-Y, G-Y decoding, but not right away when the phosphors first changed. You can see the difference in successive RCA chassis. Electrical coding of the chroma was always per FCC, even for PAL transmission, which matrixed the R,G,B linear signals differently before gamma correction to R',G',B'.

3) The CT-100 with the 15GP22 is the only set guaranteed to have both NTSC decoding and NTSC phosphors. The CTC-5 and some successive RCA chassis have NTSC decoding, but the phosphors may differ. 21AXP22 and 21CYP22 CRTs may have NTSC phosphors, but that needs to be measured to verify, because the introduction of sulfide blue is not clearly documented. Sets I saw at the Museum of Science and Industry in the late '50s had suspiciously violet blues and greenish yellows, which you would expect from a more-violet blue phosphor combined with NTSC decoding.

4) Yes, it was very chaotic.

5) I suspect that re-issued LIVE programs shot with TK-41 image orthicon cameras, such as the Dean Martin show, may not have been rematrixed in any way, and only had the chroma amplitude and phase adjusted.
Assuming no rematrixing:
The adjustments may have been made looking at a later non-NTSC monitor, but if there was no re-matrixing, receiver controls on a CT-100 could easily reverse any error. Later sets, of the same vintage as the program or earlier, will accurately reflect what those sets would have shown at the time.

6) Re-issued film programs, like Bonanza, that have been rescanned on later gear, are completely suspect as to whether they show the same color as the original broadcasts on the sets of the time. Subjectively, the color is probably much better because the newer scanners are much improved over the old Vidicon chains in terms of shading and color distortions caused by the Vidicon's nonlinearity.
#6 - 09-05-2020, 03:17 PM - pidade
Quote:
Originally Posted by old_tv_nut View Post
1) "dimmer, less saturated" is not a good blanket description. ...
Thanks for the reply.

I think you may have answered this in your third point, but did receivers *always* adjust decoding of NTSC signals for newer phosphors after the '60s, even into the '90s or '00s, or was it ever designed out of the standard, i.e., encoding directly for SMPTE-C and receivers decoding without assuming 1953 NTSC? It seems nonsensical that they would continue encoding for what was really a dead standard decades after the last full NTSC sets were produced, but then again, if they did alter the target, that would probably have wreaked havoc with receivers that adjusted their decoding.

Also, is "R-Y, B-Y, G-Y" just the decoded R'G'B' in the receiver?

Last edited by pidade; 09-05-2020 at 03:20 PM.
Reply With Quote
#7 - 09-05-2020, 04:01 PM - old_tv_nut
Quote:
Originally Posted by pidade View Post
... did receivers *always* adjust decoding of NTSC signals for newer phosphors after the '60s, even into the '90s or '00s, or was it ever designed out of the standard, i.e., encoding directly for SMPTE-C and receivers decoding without assuming 1953 NTSC? ...

Also, is "R-Y, B-Y, G-Y" just the decoded R'G'B' in the receiver?
Analog NTSC receivers continued to have modified decoding until the end.
The major decoding adjustment is increased R-Y gain to compensate for the excess of effective red content in the yellower green phosphor. Because the yellowish green was like having extra red whenever the green was turned on, it reduced the hue shift between red and green. The R-Y signal controls the balance of red in a given color, so increasing R-Y means that the difference in red as the transmitted hue changes is emphasized. This only works up to a point with the non-linear CRT and new phosphors, because 1) it can't really change the hue of the pure yellow-green phosphor when pure green is called for, and 2) it adds too much red on bright red colors.
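As a toy illustration of that R-Y gain trick: the sketch below uses a made-up gain of 1.2 (an illustrative number, not any particular chassis value) on NTSC-encoded 100% color bars. Real chassis also differed in where the boost was applied relative to the G-Y matrix; this sketch boosts only the red drive.

Code:
def crt_drives(y, r_y, g_y, b_y, r_y_gain=1.0):
    """Add the color-difference signals back to Y' to get the R', G', B'
    drives (primes omitted, as in the text)."""
    return (y + r_y_gain * r_y, y + g_y, y + b_y)

# 100% green bar: Y' = 0.587, R'-Y' = -0.587, G'-Y' = 0.413, B'-Y' = -0.587
print(crt_drives(0.587, -0.587, 0.413, -0.587, r_y_gain=1.2))
# -> red drive about -0.117: red is pulled out of greens, offsetting the
#    extra effective red of the yellowish green phosphor.

# 100% red bar: Y' = 0.299, R'-Y' = 0.701, G'-Y' = -0.299, B'-Y' = -0.299
print(crt_drives(0.299, 0.701, -0.299, -0.299, r_y_gain=1.2))
# -> red drive about 1.14: too much red on bright reds, as noted above.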

R-Y, G-Y, and B-Y should really be all primed, e.g. R'-Y', because they are derived from R', G', and B' in the encoder, but the primes are usually omitted, just like they are for I and Q. They are the color difference signals that are derived from the chroma signal in the receiver, and can be obtained either by wideband IQ demodulation and matrixing or by equiband direct demodulation on the appropriate three different axes. These three color difference signals are then added to Y' in the receiver to get R', G', and B' drives for the picture tube. The CT-100 did the adding in external matrix circuits and then drove the 15GP22 grids. In many tube receivers that followed, the final addition was done in the picture tube by applying Y' to all three cathodes and R-Y, G-Y or B-Y to the appropriate grid (G1). Later tube designs that did not have separate grids required adding the Y' and color difference signals in the circuits before driving the picture tube cathodes.

R-Y, G-Y and B-Y are not independent, and any one can be derived from the other two, just as any one can be derived from I and Q. So some tube sets had R-Y and B-Y demodulators with a matrix for G-Y, a few had a different choice of the two axes and matrix, and a few had three separate demodulators and needed no matrix.
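The dependence follows directly from the Y' weights: since Y' = 0.299R' + 0.587G' + 0.114B', the three differences always satisfy 0.299(R'-Y') + 0.587(G'-Y') + 0.114(B'-Y') = 0, which is where the familiar G'-Y' = -0.509(R'-Y') - 0.194(B'-Y') matrix comes from. A quick Python check of that identity:

Code:
import random

def color_diffs(r, g, b):
    """Color-difference signals from gamma-corrected R', G', B' (primes omitted)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return r - y, g - y, b - y

for _ in range(5):
    r_y, g_y, b_y = color_diffs(random.random(), random.random(), random.random())
    # Reconstruct G-Y from the other two differences alone.
    derived = -(0.299 / 0.587) * r_y - (0.114 / 0.587) * b_y
    assert abs(derived - g_y) < 1e-12
print("G-Y always follows from R-Y and B-Y")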

RCA chassis from CTC-7 onward for several years used a clever matrix that included DC restoration for the signals, but had some cross coupling between R-Y and B-Y outputs, so the demodulator axes were adjusted to compensate, and were called X and Z axes. The signals to drive the CRT grids then came out to be R-Y, G-Y and B-Y, but were adjusted further in later chassis to get approximate compensation for the newer phosphors, as discussed in previous posts.
#8 - 09-05-2020, 04:25 PM - pidade
Quote:
Originally Posted by old_tv_nut View Post
Analog NTSC receivers continued to have modified decoding until the end. ...
Some fascinating reading in all of the engineering involved in analogue color TV; thanks for the insight. I literally couldn't find any information about modified NTSC decoding outside of a vague mention here or there in a couple of ITU or SMPTE papers, though it seems kind of significant.

Was there ever a standard set for this modified decoding, considering the phosphors used were generally fairly consistent across different tubes from the '70s onwards? I also wonder how white balance affected it (the FCC specified CIE Illuminant C, SMPTE specified D65, and TV manufacturers generally went for ~D93). I found this 1969 patent, though it could just be one of many kinds of decoders (or unrelated).

#9 - 09-05-2020, 07:03 PM - old_tv_nut
Quote:
Originally Posted by pidade View Post
... I literally couldn't find any information about modified NTSC decoding outside of a vague mention here or there in a couple of ITU or SMPTE papers, though it seems kind of significant.

Was there ever a standard set for this modified decoding, considering the phosphors used were generally fairly consistent across different tubes from the '70s onwards? I also wonder how white balance affected it (the FCC specified CIE Illuminant C, SMPTE specified D65, and TV manufacturers generally went for ~D93).
Some papers on modified chroma decoding were published in the IEEE Transactions on Broadcast and Television Receivers.

The SMPTE-recommended switchable matrix for monitors was the only standard one I know of. TV makers did their own thing, based on their own subjective views. One of the things that affected the TV makers' decisions was the decidedly cyan white balance of receivers for many years. The subjective effect of white balance is dependent on surround conditions, which vary greatly in the home. The effects are much smaller in a dark theater environment with a screen occupying much of your view.

Referring to it as D93 is incorrect, as the D series of daylight colors had not been established. It was labeled as 9300K + 27 MPCD. This means it corresponds to a black body color at 9300K plus an adjustment perpendicular to the black body locus towards blue-green by 27 minimum perceptible color differences. It really was Illuminant C with the red reduced, so it was off the daylight locus toward cyan. This was strictly a measure to reduce the ratio of red gun current to the other guns, as obtaining daylight color ran the risk of spot blooming in the red highlights. Professional monitors could get away with actual daylight white (and unequal beam currents) because they weren't intended to be searchlight-bright in a store showroom. Not all manufacturers used 9300K. One of Zenith's secrets was its specified white point, which was less blue than others. Some manufacturers (Mitsubishi large-screen rear projectors in particular) maintained the extreme cyan white balance forever, even when most others were offering a customer choice of a cool or warm (at least not so blue) setting.