#1
Standardisation of video levels and impedances
I have placed a similar thread on the UK VRAT forum: http://vintagetvandradio.myfreeforum...p=56930#p56930 but I'd like to get the USA point of view too.
As I understand it, video in the USA has been carried as 140 units peak-to-peak on a 75R coaxial circuit. This greatly simplifies connecting different equipment. The size of the unit has changed over time, from 10mV (giving 1.4Vp-p) down to about 7.2mV currently. I'm not sure how many units have been allocated to pedestal/setup, or whether setup has always been present on 525-line signals. These values are a little different from those used in Europe, but not sufficiently different to cause much difficulty.

Going back to 1936 and the BBC TV station at Alexandra Palace, the Black Book shows that a variety of much higher levels were used. Much of the kit was directly connected without going to the trouble of matching to coax. Leaving aside the fact that 1V to 1.4V p-p into 75R is not a terribly convenient level for valve kit, when and how did this standard emerge? I have a copy of Fink's 1952 Television Engineering but I haven't yet found a reference to baseband video levels, despite extensive discussion of the transmitted video waveform.

Last edited by ppppenguin; 06-10-2013 at 02:38 AM.
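A quick back-of-envelope check of the unit sizes quoted above (a sketch in Python; the 140-unit scale and the two peak-to-peak swings come from the figures in the post, everything else is plain arithmetic):

```python
# 140 units peak-to-peak from sync tip to peak white, as quoted above.
units_pp = 140

# Older 1.4 Vp-p and current 1.0 Vp-p swings on the 75-ohm circuit.
mv_per_unit = {vpp: vpp * 1000 / units_pp for vpp in (1.4, 1.0)}

for vpp, mv in mv_per_unit.items():
    print(f"{vpp} Vp-p -> {mv:.2f} mV per unit")
# 1.4 Vp-p gives exactly 10 mV/unit; 1.0 Vp-p gives about 7.14 mV/unit,
# close to the "about 7.2 mV" figure above.
```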
#2
Thanks for the link, definitely worth a Lurk (I mean Lark).
I just came across bound annual collections of PRACTICAL TELEVISION from the late 1950s - intriguing. They cost 1/-3p (one shilling-threepence) (13¢) a copy in '58.

Last edited by NewVista; 06-21-2013 at 01:20 AM.
#3
The video voltage units are called IRE units because they were standardized by the IRE (Now IEEE). The original setup spec for black and white was looser, and tended toward zero, but the NTSC color spec set it at 7.5 IRE. However, it was not stated in IRE units in 1953, but rather as percent of RF modulation, so the IRE scale may have come later but some time before the IRE and AIEE merged to become the IEEE (1963).
I think setup was ill-advised because it resulted in objectionable variations in black level when it was not carefully maintained. Specifying zero setup, as Europe did, would have served the industry much better, IMO. [Edit - oh yes, thanks for the link!]

Last edited by old_tv_nut; 06-20-2013 at 10:30 PM.
#4
To modern eyes setup looks ill-conceived. It serves no useful purpose in the studio and just wastes transmitter power. AFAIK its only purpose was to minimise flyback-time artifacts on the screens of receivers.
I haven't trawled the documents, but I think that 405-line (System A) varied in its use of setup, finally settling on not using it.
#5
Like the protracted abandonment of £ - s - d
#6
and the even more protracted US abandonment of degrees Fahrenheit, feet, inches and a rather undersized gallon.
I was looking at the 1954 IRE papers on NTSC, in particular the fateful decision to tweak the line and frame rates rather than move the sound carrier. Nobody knew it at the time, but the hassle that would cause for timecode was huge and continues to this day. It's a problem we don't have this side of the pond :-)
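For anyone unfamiliar with the timecode hassle: because 29.97 fps doesn't divide evenly into real seconds, SMPTE drop-frame timecode skips frame numbers 00 and 01 at the start of every minute except each tenth minute. A minimal sketch of the standard frame-count-to-timecode conversion (the function name is my own; the constants follow from the nominal 30 fps):

```python
def frame_to_dropframe_tc(frame):
    """Convert a 29.97 fps frame count to drop-frame timecode (HH:MM:SS;FF)."""
    drop = 2                          # frame numbers skipped per minute
    fpm = 60 * 30 - drop              # 1798 frames in a "dropped" minute
    fp10m = 10 * 60 * 30 - 9 * drop   # 17982 frames per ten minutes
    d, m = divmod(frame, fp10m)
    # Add back the skipped numbers so nominal 30 fps arithmetic works.
    frame += 9 * drop * d + drop * (max(m - drop, 0) // fpm)
    ff = frame % 30
    ss = (frame // 30) % 60
    mm = (frame // 1800) % 60
    hh = (frame // 108000) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(frame_to_dropframe_tc(1800))  # 00:01:00;02 - frames ;00 and ;01 skipped
```

None of this would exist if the sound carrier had been moved instead.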
#7
By how much? Would existing TVs still work without modification?
#8
The required change is one tenth of one percent. This works out to 4500 Hz, which could affect different sets in different ways:
1) how sharp the audio traps are;
2) whether the set has split sound (in which case the user could retune to centre the audio IF frequency) or intercarrier sound (factory tuned to 4.5000... MHz).
4500 Hz is a significant portion of the full sound carrier deviation (+/- 25 kHz), so intercarrier sets could experience increased distortion. The tight tolerance on the broadcast signal was meant to allow all the variations in tuning to be in the receiver, for reasons of economy.
#9
They didn't want to tamper with compatibility.
As Yves Faroudja (1987 SMPTE David Sarnoff Gold Medal Award) once said: "I learned the lesson (after failed product) not to do anything that isn't backward compatible".
#10
I can understand them being shy of shifting the sound carrier after the whole CBS sequential colour business. There was an easy fix available, and nobody could reasonably have foreseen the trouble it would cause later.
Subsequently, at least in the UK, there have been changes that required action from all viewers. I'm not talking about digital switchover, which we've all had, but the start of UK national Channel 5. This was carried on UHF channels that were generally used for connecting VCRs etc to TVs. There was a lot of scope for interference, so Channel 5 had to fund a national programme of retuning, which potentially involved visiting every household in the Channel 5 service area - in other words, much of the country.
#11
It was easier to shift the vertical and horizontal scan frequencies, as all consumer TV sets had user adjustments for locking onto these. In contrast, the FM intercarrier was set at 4.5 MHz with no consumer adjustment possible. Aunt Tilly isn't going to get out a diddle stick and retune the sound IF transformers in her TV set... Most TV sets didn't even make the consumer adjust the vertical and horizontal holds anyway.
#12
Pages 131-133 attached.
#13
Thank you for posting that excerpt.
__________________
Chris Quote from another forum: "(Antique TV collecting) always seemed to me to be a fringe hobby that only weirdos did." |
#14
#15
That's a great reference and a really good explanation. Thanks.