Quote:
Originally Posted by basil lambri
I had read about that recommendation by the SMPTE on a reputable page on the Internet but I didn't copy it then so I do not know the specific details.
I think that they use 10 bits per channel when they produce HDTV programs in ATSC, but when they broadcast the programs they use 8 bits per channel (24 bits total).
I know that they also use 8 bits per channel when broadcasting in DVB-T and DVB-T2 where you live in England. Actually, they do the same thing on Freeview: they broadcast the standard-definition PAL programs at a much lower bit depth than the HD programs.
I agree with ppppenguin - you must have confused the SDI interface specs (which carry 10 bits per sample) with the number of bits per pixel used for coding. Broadcast coding uses 8 bits per component pretty universally.
I would note, however, that 8 bits per component can produce visible artifacts (banding) on some picture material when viewed on a display capable of high contrast in a dimly lit room. If you want to produce shows that use the kind of dynamic range the movies have long used to contrast indoor/outdoor and day/night scenes, the extra bits help.
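To see why the extra bits matter, here is a small illustrative sketch (my own, not from the post): quantizing a smooth luminance ramp at 8 vs 10 bits per component. The function names are hypothetical; the point is just that 10-bit coding gives four times as many code values, so each step is a quarter the size and far less likely to show as a visible band on a high-contrast display.

```python
def quantize(value, bits):
    """Round a normalized value in [0, 1] to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1          # 255 top code for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

def distinct_codes(bits, steps=10000):
    """Count distinct output codes produced across a smooth 0..1 ramp."""
    return len({quantize(i / steps, bits) for i in range(steps + 1)})

print(distinct_codes(8))    # 256 codes: coarser steps, banding can be visible
print(distinct_codes(10))   # 1024 codes: 4x finer steps
```

(Real broadcast systems reserve some codes at the range extremes, so the usable counts are slightly smaller, but the 4:1 ratio between the two depths holds.)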
Viewers have gradually been trained, over a period of decades, to expect higher quality in terms of noise level and contrast range. The signal-to-noise targets that the TASO study set for analog TV coverage in the 1950s would probably score one grade lower (on its 5-step scale) today.