#1
Quote:
If you put color bars on your old set, take a picture with a digital camera set to Adobe RGB or ProPhoto RGB, and display it on a wide-gamut monitor (or the old TV), then you should see the NTSC primaries and secondaries reproduced, sort of.
The color bars today would be SMPTE C, right? It's interesting that after 67 years, the original NTSC color gamut still has not been exceeded. RCA had lofty goals in the design of the CT-100, then dumbed down later receivers; the consumer was never going to notice the difference, and "brighter was better." The 1953 NTSC color gamut was effectively a format that was never used. Excellent paper.

Edit: May I take the color charts in your paper, especially the green, as accurate 1953 NTSC colors? Like this one? https://visions4netjournal.com/wp-co...24396ECF8.jpeg
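As a rough sanity check on the "never exceeded" point, here is a minimal Python sketch (my illustration, not from the paper) comparing the area of each gamut triangle in the CIE 1931 xy diagram. The chromaticity coordinates are the published xy primaries for each standard as I understand them; note that xy area is only a crude proxy for gamut size, and a perceptually uniform diagram such as CIE 1976 u'v' would be a fairer comparison.

Code:
# Compare gamut-triangle areas in the CIE 1931 xy chromaticity diagram.
# Primary coordinates are the commonly published xy values for each standard.

def triangle_area(points):
    """Area of the triangle spanned by three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

gamuts = {
    "NTSC 1953": [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)],
    "SMPTE C":   [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)],
    "sRGB/709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB": [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
}

ntsc = triangle_area(gamuts["NTSC 1953"])
for name, primaries in gamuts.items():
    area = triangle_area(primaries)
    print(f"{name:10s} area = {area:.4f}  ({100 * area / ntsc:5.1f}% of NTSC 1953)")

Run as-is, this puts SMPTE C at roughly two-thirds of the 1953 triangle and Adobe RGB just under it (about 95%), which matches the observation that the dumbed-down receiver primaries gave up a lot of green and that common wide-gamut spaces only approach, not exceed, the 1953 gamut.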
#2
Quote:
This results in some differences in the brightness of the compatible black-and-white picture for saturated colors from a camera, but not from color bars, because the matrix for PAL primaries sits at a linear signal point just after the pickup devices, before gamma correction and the encoder. The resulting errors on a black-and-white receiver were considered too small to worry about compared to those caused by using gamma-corrected R', G', B' to form Y'.

The above is true for analog signals, and for properly encoded digital NTSC and PAL sources. When the color bars come from a proper digital HDTV signal (as Y', Cr, Cb), the luminance signals differ from NTSC and PAL because the encoding from R'G'B' to Y', Cr, Cb is adjusted for the relative brightness of the HD primaries. Yes, it's confusing, and it provides employment for those who design hardware and software to convert between the standards and make sure the resulting signals are "legal."
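To make that last point concrete, here is a small Python sketch (my illustration, not from the quoted text) that computes the luma Y' of 75% color bars with the SD (BT.601) and HD (BT.709) coefficient sets. The coefficients are the standard published ones; the bar values and layout are assumed for the example.

Code:
# Luma of 75% color bars under SD (BT.601) vs HD (BT.709) matrices.
# Both operate on gamma-corrected R'G'B', so neither gives true luminance;
# the point is that the same bars encode to different Y' values.

BT601 = (0.299, 0.587, 0.114)      # SD / digital NTSC and PAL luma coefficients
BT709 = (0.2126, 0.7152, 0.0722)   # HD luma coefficients

BARS = {
    "white":   (0.75, 0.75, 0.75),
    "yellow":  (0.75, 0.75, 0.00),
    "cyan":    (0.00, 0.75, 0.75),
    "green":   (0.00, 0.75, 0.00),
    "magenta": (0.75, 0.00, 0.75),
    "red":     (0.75, 0.00, 0.00),
    "blue":    (0.00, 0.00, 0.75),
}

def luma(rgb, coeffs):
    """Weighted sum of gamma-corrected R'G'B' components."""
    return sum(c * v for c, v in zip(coeffs, rgb))

print(f"{'bar':8s} {'Y 601':>7s} {'Y 709':>7s}")
for name, rgb in BARS.items():
    print(f"{name:8s} {luma(rgb, BT601):7.3f} {luma(rgb, BT709):7.3f}")

The green bar, for example, comes out at about 0.440 with the 601 matrix but 0.536 with the 709 matrix, which is exactly why a converter that passes Y'CbCr between SD and HD without re-matrixing shifts the brightness of saturated colors.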