08-12-2016, 04:45 PM
lnx64
VideoKarma Member
 
Join Date: Jan 2012
Location: New Port Richey, FL
Posts: 1,787
Pixelation on subchannels

I'm having a really weird issue, and I hope this is the right spot to ask. I built my own homemade UHF antenna and mounted it in the attic. It feeds a 10 dB amplifier, and the amp is only 1 ft from the antenna. The coax came with the house and runs straight from the amp, down the wall, and to the TV; I'd say maybe 40 ft of RG-59/U. My father won't let me change it to RG-6.

To troubleshoot, I fed the cable jack straight into my SDR, and there is literally no multipath distortion on the signal; it's strong, too. But on the TV it's plugged into, one with a built-in digital tuner, the main channels all work with only minor distortion and occasional pixelation, while the subchannels pixelate far worse, even when the main channel they ride on is perfect.

What's also weird is that the TV will show 3/4 to 4/4 signal bars for a channel even while it's pixelating. One channel, oddly, will drop from 4/4 straight to 1/4 (or 0/4). I can understand a minor swing in quality, but the major swings puzzle me. The major swings come from one channel, and when they hit, the main channel and all of its subchannels go out together; the minor swings happen across the board.

I have another TV going through a splitter that feeds two bedrooms, also off the amp (so +10 dB of gain into a -3.5 dB splitter loss, not counting coax loss), and get this: the Digital Stream DTA on it hardly ever pixelates. It works perfectly fine!

I would think the TV plugged straight into the amp would be the one getting the best signal.
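For anyone following the numbers, here's a quick link-budget sketch of the two runs, using Python purely as a calculator. The ~7 dB per 100 ft figure for RG-59 attenuation at UHF and the 60 ft bedroom run length are my assumptions for illustration, not measured values:

```python
# Rough link budget from antenna terminals to tuner input.
# ASSUMPTIONS: ~7 dB/100 ft RG-59 attenuation at UHF (varies with
# channel frequency and cable condition); bedroom run length is a guess.

AMP_GAIN_DB = 10.0          # preamp gain, from the post
RG59_LOSS_PER_100FT = 7.0   # assumed UHF attenuation for RG-59
SPLITTER_LOSS_DB = 3.5      # two-way splitter loss, from the post

def net_gain_db(amp_db, coax_loss_per_100ft, length_ft, splitter_db=0.0):
    """Net gain (dB) after the amp, coax run, and any splitter."""
    return amp_db - coax_loss_per_100ft * length_ft / 100.0 - splitter_db

direct = net_gain_db(AMP_GAIN_DB, RG59_LOSS_PER_100FT, 40.0)
bedroom = net_gain_db(AMP_GAIN_DB, RG59_LOSS_PER_100FT, 60.0,
                      splitter_db=SPLITTER_LOSS_DB)

print(f"direct TV:  {direct:+.1f} dB net")
print(f"bedroom TV: {bedroom:+.1f} dB net")
```

On these assumed figures the directly connected TV does come out several dB ahead, which is exactly why the symptom seems backwards.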