  #21  
09-28-2016, 11:55 PM
NewVista
VideoKarma Member
 
Join Date: Jan 2010
Location: Milw, WI
Posts: 724
Quote:
Originally Posted by old_tv_nut
The local burst must be reinserted to standard amplitude and clean waveform per FCC rules. Therefore, it at least loses its relation to the chroma amplitude, which may have changed due to analog transmission distortion over the network. In practice, the reinserted phase is also adjustable and therefore can be misadjusted. If there is significant phase distortion of the chroma over the network, the phase of the incoming burst may also vary over its width, making it difficult to know where to set the phase of the re-inserted burst. The VIR reference had a much wider burst of chroma during the active line, which was supposed to fix this by ignoring edge distortions. ..
The whole idea of VIR was to insert it at the originating studio and not replace it anywhere along the chain, not even at network central control; but not every program had it inserted...
Some great info there!
Brilliant of General Electric to have a wider burst in the active line - and at a luminance level of a typical flesh tone!
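
Here's a toy numeric model of the problem old_tv_nut describes (the numbers and names are my own, purely illustrative, not any real decoder): the chroma picks up a gain/phase error over the network path, but the re-inserted burst is clean at standard amplitude and nominal phase, so a receiver locked to that burst sees the full error, while a reference that rode the same path (like VIR) lets it be cancelled.

Code:
import numpy as np

# Assumed path distortion: 0.8x chroma gain, +15 degree phase error,
# applied to the active-line chroma but NOT to the re-inserted burst.
path = 0.8 * np.exp(1j * np.deg2rad(15.0))

chroma_tx = 1.0 + 0.0j           # transmitted chroma phasor (some reference hue)
chroma_rx = chroma_tx * path     # what arrives after the network path

burst_reinserted = 1.0 + 0.0j    # clean burst, standard amplitude, nominal phase
vir_reference_rx = 1.0 * path    # a reference that travelled the same path

# Referenced to the clean re-inserted burst, the error survives untouched:
vs_burst = chroma_rx / burst_reinserted
# Referenced to the path-following VIR chroma reference, it cancels:
vs_vir = chroma_rx / vir_reference_rx

print("vs. re-inserted burst:", abs(vs_burst), np.rad2deg(np.angle(vs_burst)))
print("vs. VIR reference:    ", abs(vs_vir), np.rad2deg(np.angle(vs_vir)))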

So, to me, the "problems" would have been easily avoidable with good broadcasting practice:
For example, source material with VIR would first need to be recalibrated through a professional VIR processor
that also regenerated burst & VIR with exact phase similitude to the gated line 19 sample.
Was there such an instrument?
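
To make concrete what I have in mind for such a processor, here is a rough sketch (my own assumptions: line 19 already gated out, sampled at 4x subcarrier, chroma reference nominally at burst phase; the function names are made up, not any real product's design). It averages a quadrature product detector over the whole gated reference - which is why the wide burst shrugs off edge distortions - then regenerates burst at standard amplitude but locked to the measured phase.

Code:
import numpy as np

FSC = 3.579545e6      # NTSC color subcarrier, Hz
FS = 4 * FSC          # assumed sample rate (4x subcarrier)

def chroma_phasor(samples, fs=FS, fsc=FSC):
    """Average quadrature (product) detection over a gated chroma segment.
    Returns (amplitude, phase in radians); averaging over the flat portion
    is what makes a wide reference insensitive to edge distortion."""
    t = np.arange(len(samples)) / fs
    i = 2.0 * np.mean(samples * np.cos(2 * np.pi * fsc * t))
    q = -2.0 * np.mean(samples * np.sin(2 * np.pi * fsc * t))
    return np.hypot(i, q), np.arctan2(q, i)

def regenerate_burst(vir_segment, n_cycles=9, fs=FS, fsc=FSC, nominal_amp=1.0):
    """Regenerate a burst of standard amplitude, phase-locked to the gated
    line-19 chroma reference (the 'exact phase similitude' idea)."""
    _, phase = chroma_phasor(vir_segment, fs, fsc)
    n = int(round(n_cycles * fs / fsc))   # 9 cycles assumed as nominal burst length
    t = np.arange(n) / fs
    return nominal_amp * np.cos(2 * np.pi * fsc * t + phase)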
