#13, 09-05-2020, 11:39 AM
pidade
VideoKarma Member
 
Join Date: Sep 2020
Posts: 6
Quote:
Originally Posted by old_tv_nut
The graphics in my paper are adjusted to be correct for an sRGB monitor. The colors on the Macbeth ColorChecker chart are within sRGB, except for the cyan, which is only slightly outside. The Macbeth green is nowhere near NTSC green. This illustrates how the NTSC primaries do a very good job of covering the gamut of real surface colors, and sRGB does an adequate job.

When I was on the NTSC monitor committee at SMPTE, NBC/RCA set up a test with some brightly colored objects, which were displayed on two identical monitors except for the CRTs, which had been made by Sylvania: one was essentially SMPTE C and the other had NTSC green. A skein of kelly-green yarn had been found that was specifically outside the modern phosphor range. I do not recall any cyan object that was outside the range. We then compared the two renditions side by side, including turning the monitor matrix adjustment on and off. This work resulted in the adoption at the time of a standard for NTSC monitors using modern phosphors with a switchable matrix. By the way, the modern phosphors with a corrective matrix in the receiver showed the correct flesh tones and red-yellow-green hue range, but produced the expected brightening of reds and darkening of cyans. The work of the committee included selecting the best compromise values for the corrective matrix, to get acceptable hues with "acceptable" brightness changes. There had been a paper published on matrices for color receivers to minimize the squared error for some selection of colors, although I seem to recall it looked only at hue and saturation and ignored brightness.

The precise phosphors adopted were called "SMPTE C." The "C" stood for one of three formulations that were proposed, and also happened to be those used in Conrac monitors. This batch of phosphors was kept aside and used by whoever supplied CRTs to Conrac and maybe some others. Asian manufacturers were on their own to formulate phosphors that matched SMPTE C.

Later, SMPTE C, EBU and other slight variants were reconciled to the HDTV/sRGB values.
I went back and read some of your older posts on this complicated subject of NTSC/SMPTE-C matrix variations at the transmitter/camera and at the receiver; you seem to be the only source of information on this subject on the internet, ha. Since you've mentioned it here, I thought I'd ask a couple of questions, as I'm still pretty confused.

I understand that, because of the transition in the '60s to phosphors in monitors and TVs that were brighter but less saturated than those specified by the NTSC in 1953, receivers had to adjust their Y'IQ decoding (or did they alter the R'G'B' values after decoding?) to produce a more palatable image, and that cameras made similar adjustments to the linear RGB values before Y'IQ encoding.
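
If I've got the linear-algebra side of that right, the "textbook" version of such a re-matrixing would just be the product of the two RGB-to-XYZ matrices built from the published chromaticities. Here's a rough numpy sketch of my own (not anything from the standard, and the helper function is mine); as you describe, the matrix the committee actually adopted was a compromise rather than this exact conversion, and real receivers applied it to gamma-corrected signals rather than linear RGB:

[CODE]
# Sketch: derive a 3x3 matrix that re-maps linear RGB encoded for the
# 1953 NTSC primaries (Illuminant C white) onto SMPTE C primaries (D65).
# Illustrative only; the actual SMPTE receiver matrix was a compromise.
import numpy as np

def rgb_to_xyz_matrix(r, g, b, white):
    """Build an RGB->XYZ matrix from xy chromaticities of the primaries
    and the white point (white normalized to Y = 1)."""
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    P = np.column_stack([xyz(r), xyz(g), xyz(b)])   # unscaled primary columns
    S = np.linalg.solve(P, xyz(white))              # scale so white maps to white
    return P * S

# Published chromaticities: 1953 NTSC primaries / Illuminant C,
# SMPTE C primaries / D65.
M_ntsc  = rgb_to_xyz_matrix((0.67, 0.33), (0.21, 0.71), (0.14, 0.08),
                            (0.3101, 0.3162))
M_smpte = rgb_to_xyz_matrix((0.630, 0.340), (0.310, 0.595), (0.155, 0.070),
                            (0.3127, 0.3290))

# Corrective matrix: NTSC-referred linear RGB -> XYZ -> SMPTE-C-referred linear RGB.
M_correct = np.linalg.inv(M_smpte) @ M_ntsc
print(np.round(M_correct, 3))
[/CODE]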

In today's world, where we're obviously not receiving NTSC transmissions directly from a TK-41 but at best watching a VHS or LaserDisc that was probably mastered and reviewed on a P-22/SMPTE-C phosphor monitor from the '80s or later, is this still relevant? Was this ever an issue with home/consumer video, or just with live broadcasts, where encoding for NTSC phosphors was required by the FCC? How can one tell whether a monitor or TV is altering NTSC signals with its own subjective matrix, or is there even such a thing as unaltered, accurate NTSC in practice? It all just sounds so chaotic, haha.

Last edited by pidade; 09-05-2020 at 01:13 PM.