#17 | 02-17-2024, 02:12 PM
DVtyro | VideoKarma Member | Join Date: Sep 2022 | Posts: 137
Quote: Originally Posted by ARC Tech-109
Digital also loses quality with each generation of compression and jitter; not as substantial as with the consumer analog formats, but there is a loss, and that's why it's called "lossy compression".
I specifically said I am talking about copying without editing or recompression: you can make a zillion exact digital copies, whereas every analog copy is a new generation.

Visually lossless intermediate codecs have been used with great success for the last two decades, for everything from home videos to big-screen movies. Compare that to analog compositing: FX scenes in the original Blade Runner have seven exposures on the same strip of film, and you can search for BTS comments about the resulting loss of detail. Digital allows dozens of recompressions without visible loss, and with modern storage one can use fully lossless codecs so the areas of the image that have not been changed lose no data at all.

In short, analog has nothing to show for it.
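If you want to check the "exact copies" claim yourself, here is a minimal Python sketch: copy a file and hash both sides. The file name master.dv is a made-up placeholder, not a reference to anything specific.

Code:
import hashlib
import shutil

def sha256(path: str) -> str:
    """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# "master.dv" is a hypothetical capture file; any file behaves the same way.
shutil.copyfile("master.dv", "copy_gen_100.dv")

# A straight digital copy is bit-for-bit identical, so the hashes match
# no matter how many generations of copying you stack up.
assert sha256("master.dv") == sha256("copy_gen_100.dv")
print("Copy is bit-exact: no generation loss.")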

Quote: Originally Posted by ARC Tech-109
There are times when SD looks far better than HD, and again this comes down to the overall detail; just because it has 1080 progressive lines doesn't mean each line is unique, in the sense that it wasn't sourced from a lower format and upconverted.
Why would I watch upconverted HD? I watch native HD.

Quote: Originally Posted by ARC Tech-109
Then we can get into things like 4:1:1 subsampling, bit error rates and the all-important dithering from the low sampling, be it 8-bit or the more common 10-bit.
60 Hz DV used 4:1:1 subsampling; it is a thirty-year-old CONSUMER format, yet it is comparable to PROFESSIONAL analog Betacam SP. Notice that the evaluation was done using the FIRST generation off a Rec. 601 source; every new analog copy, even without editing, makes it worse. Broadcast HD has used 4:2:2 at 50 Mbit/s interframe from day one as the lowest acceptable format. Of course, nowadays the numbers have gone up in every metric: resolution, bit depth, bit rate, full color, alpha channel, etc. Again, analog has nothing; it has been left in the dust.
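For a sense of what those subsampling numbers mean in raw data rate, here is a rough Python sketch. The 720x480, 29.97 fps, 8-bit figures below are illustrative assumptions, not the spec of any particular tape or file format.

Code:
# Rough uncompressed data-rate arithmetic for common chroma subsampling
# schemes; purely illustrative, not tied to any specific format.
def uncompressed_mbps(width, height, fps, bit_depth, scheme):
    # Samples per group of 4 pixels: 4 luma plus the chroma samples.
    samples_per_4px = {"4:4:4": 12, "4:2:2": 8, "4:1:1": 6, "4:2:0": 6}[scheme]
    bits_per_second = width * height * fps * bit_depth * samples_per_4px / 4
    return bits_per_second / 1e6

for scheme in ("4:4:4", "4:2:2", "4:1:1"):
    rate = uncompressed_mbps(720, 480, 29.97, 8, scheme)
    print(f"{scheme}: ~{rate:.0f} Mbit/s uncompressed")
# Roughly 249, 166 and 124 Mbit/s respectively, before any compression.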

Quote: Originally Posted by ARC Tech-109
How about the compression artifacts like mosquito noise and dynamic pixelation? Modern displays are really good at hiding these things with their 3D filters and other DSP functions, but when they're disabled the truth comes out; you can only make a pile of pixels look so good.
Yes, these artifacts are caused by insufficient bitrate. Blame OTA TV, which has ruined television for everyone, yet it still looks better than analog NTSC. Analog HD needed 30 MHz to look good; the Japanese tried to squeeze it into 15, 12, 9 and even 6 MHz, but predictably it looked worse and worse, and analog HD in 6 MHz was not watchable. H.266, on the other hand, is roughly 8 times more efficient than H.262, meaning you can have 8 great-looking HD channels in the spectrum one analog NTSC channel occupied.
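The "eight HD channels where one analog program used to fit" claim is simple arithmetic. Here is a sketch with assumed numbers: roughly 19 Mbit/s of payload in a digitized 6 MHz channel, roughly 18 Mbit/s for a decent H.262/MPEG-2 HD encode, and the ~8x efficiency figure for H.266.

Code:
# Back-of-the-envelope channel arithmetic; all bit rates are assumptions.
channel_payload_mbps = 19.4   # approx. payload of one digitized 6 MHz channel
h262_hd_mbps = 18.0           # assumed rate for a decent H.262/MPEG-2 HD encode
h266_gain = 8                 # H.266 assumed roughly 8x as efficient as H.262

h266_hd_mbps = h262_hd_mbps / h266_gain
print("H.262 HD programs per 6 MHz:", int(channel_payload_mbps // h262_hd_mbps))
print("H.266 HD programs per 6 MHz:", int(channel_payload_mbps // h266_hd_mbps))
# Roughly 1 vs 8 HD programs in the space of one analog NTSC channel.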

Quote: Originally Posted by ARC Tech-109
Yes, 34 inches of 16:9 CRT that covers native unprocessed analog plus HDMI up to 1080i, due to the CRT itself.
The only use of an interlaced CRT TV set is watching interlaced programs. Sadly, the Japanese dumped their 1125-line equipment onto the U.S. thirty years ago, but only because the U.S. had stopped manufacturing television broadcast equipment by that point.

Quote: Originally Posted by ARC Tech-109
It will show you every flaw in that pretty digital image without shame. I have a screen cap from the first Star Wars from the original LaserDisc (where Han shoots first), but the videokarma server won't accept it due to the size. What it does show is the real quality of the image without the dithering or other digital compromises needed to keep the costs low for the consumer world.
Star Wars is 24 fps; when shown on an interlaced TV set it is telecined. You don't see full vertical resolution: you see interline twitter, and you see flicker from the low refresh rate (50 or 60 Hz). Have you ever tried using a consumer-grade TV set as a computer monitor? The difference is striking, which is why computer monitors have used progressive scan since at least the 1980s. There is LESS detail in the interlaced scan.

To see the full movie resolution you need to remove the pulldown, converting the signal back to 24p, and display it progressively. This is what modern TV sets do, and what good TV sets have done since the late 1980s, with a built-in deinterlacer, a 100 Hz or 120 Hz refresh rate and a 16:9 screen. Such sets became popular in Europe, much less so in the U.S., because NTSC stuck with 4:3 interlace. Since your TV set is 1080i only, you cannot enjoy the full resolution of BD movies. OTOH, it can do 480p, so provided it does IVTC correctly, you can enjoy DVDs as intended.
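For anyone curious what "removing the pulldown" means mechanically, here is a minimal Python sketch of the 3:2 cadence and its inverse. The frame labels are placeholders; a real IVTC filter works on the actual field data, this only shows the cadence bookkeeping.

Code:
# Minimal sketch of 3:2 pulldown (24p film -> 60i video) and inverse telecine.
# Four film frames become ten fields (five interlaced frames); an IVTC-aware
# display drops the repeats and rebuilds the original progressive frames.

def pulldown_3_2(frames):
    """Map 4 progressive frames onto 10 fields in the classic 3:2 cadence."""
    a, b, c, d = frames
    # (frame, field) pairs: 't' = top field, 'b' = bottom field.
    return [(a, 't'), (a, 'b'), (a, 't'),
            (b, 'b'), (b, 't'),
            (c, 'b'), (c, 't'), (c, 'b'),
            (d, 't'), (d, 'b')]

def inverse_telecine(fields):
    """Recover the original frame sequence by discarding repeated fields."""
    recovered = []
    for frame, _ in fields:
        if frame not in recovered:
            recovered.append(frame)
    return recovered

fields = pulldown_3_2(["A", "B", "C", "D"])
assert inverse_telecine(fields) == ["A", "B", "C", "D"]
print(len(fields), "fields carry 4 film frames;",
      "IVTC gives back", inverse_telecine(fields))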