Interlacing destroys color
I know this is going to sound weird, but I've noticed that interlaced programming seems to have worse color quality than a non-interlaced signal (like 240p, for example).
I know the signal itself is the same. On an LCD or plasma, the two look identical in color.
But on a CRT, the color quality of an interlaced signal just seems different. All the CRTs I have used or seen (including my friend's roundie) exhibit this same thing.
Unless of course it's just me.
(EDIT: Whooops! Wrong board! Sorry.)