FrankieKat
09-13-2017, 09:35 AM
Quote: Originally Posted by Electronic M
There is a slight chance the video detector diode is going bad (I've seen it in some TVs from time to time) and reducing video amplitude.
I pulled the old diode and tested it: it showed a 0.7 V drop one way and a 0.1 V drop the other, so it was definitely suspect, since a healthy diode should read open in reverse rather than conducting. I replaced it with a new 1N60, and it seems to have made a difference in how the signal degrades as the set warms up: contrast and sound are both more stable. No change to the grid or plate voltages on the V7B tube, though.
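
For anyone following along, here's roughly how I'm reading those meter numbers; a quick sketch in Python, with ballpark thresholds I'm assuming for a germanium detector diode like the 1N60 (they're my guesses, not datasheet limits):

Code:
# Rough health check from DMM diode-test readings (volts).
# Pass float('inf') for an "OL" (open) reverse reading.
# Thresholds are ballpark assumptions for a germanium
# detector diode (1N60-class), not datasheet limits.
def judge_diode(forward_drop_v, reverse_drop_v):
    if reverse_drop_v < 1.0:
        # A good diode reads open in reverse; a low reverse
        # reading means it conducts both ways, i.e. leaky.
        return "leaky: conducts in reverse"
    if forward_drop_v > 0.45:
        return "forward drop high for a germanium type"
    return "plausible for a germanium detector"

# The readings above: 0.7 V one way, 0.1 V the other.
print(judge_diode(0.7, 0.1))  # -> leaky: conducts in reverse

Whichever direction you call forward, the 0.1 V reading is the giveaway that it was conducting both ways.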

What I do see is a slow brightness increase over the first 5 minutes or so from a cold start. Having set the brightness/contrast the night before, it starts out very dim, almost not visible, and then comes up. Could this just be the age of the CRT and its poor showing on the life test?

I can set all of the controls so that the voltages on the CRT are exactly on spec, and this is how the picture appears: https://youtu.be/jzV57jRqrcI. I can adjust for a better picture, but doing so moves the cathode and G1 voltages 50-60 V away from the schematic values. Does it make sense to keep the controls set to produce the schematic voltages on the CRT, on the theory that that is always its optimum operating point? In other words, that should be the best picture, and something else, still to be found, is causing it to be washed out like this.
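
To make the question concrete, my understanding is that what sets the beam current (and so the brightness) is the G1-to-cathode difference, not either voltage on its own. A back-of-envelope sketch, where the 80 V / 120 V / 170 V figures are made-up illustrations and not the schematic values:

Code:
# Beam current tracks the G1-to-cathode bias; more negative
# bias = dimmer raster. Voltages below are illustrative
# assumptions, not the schematic figures.
def g1_bias(v_g1, v_cathode):
    return v_g1 - v_cathode

print(g1_bias(80.0, 120.0))  # -40 V, a schematic-style setting
print(g1_bias(80.0, 170.0))  # -90 V after moving the cathode 50 V

# If the set only looks right 50-60 V away from spec, that extra
# bias is compensating for something (weak CRT emission, or a
# video-drive problem upstream), not finding a new optimum.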

FK
