Videokarma.org TV - Video - Vintage Television & Radio Forums > Recorded Video
  #16  
Old 02-16-2024, 03:49 PM
ARC Tech-109 is offline
Retired Batwings Tech
 
Join Date: Jun 2020
Location: Planet Earth
Posts: 336
Digital also loses quality with each generation of compression and jitter, not as substantial as with the consumer analog formats, but there is a loss; that's why it's called "lossy compression". There are times when SD looks far better than HD, and again this comes down to the overall detail: just because it has 1080 progressive lines doesn't mean each line is unique, in the sense that it wasn't sourced from a lower format and upconverted. Then we can get into things like 4:1:1 sub-sampling, bit error rates, and the all-important dithering from low sampling depth, be it 8-bit or the more common 10-bit. How about the compression artifacts like mosquito noise and dynamic pixelation? Modern displays are really good at hiding these things with their 3D filters and other DSP functions, but when they're disabled the truth comes out: you can only make a pile of pixels look so good.
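The storage cost of the sub-sampling schemes mentioned here can be sketched in a few lines of Python. This is an illustrative helper, not tied to any particular tape or file format; the frame size and bit depth are example figures:

```python
# Relative storage cost of chroma subsampling schemes (illustrative sketch).
def bytes_per_frame(width, height, scheme, bits=8):
    """Luma is always full resolution; chroma resolution depends on the scheme."""
    samples_per_pixel = {
        "4:4:4": 3.0,   # full-resolution Cb and Cr
        "4:2:2": 2.0,   # chroma halved horizontally (broadcast)
        "4:1:1": 1.5,   # chroma quartered horizontally (60 Hz DV)
        "4:2:0": 1.5,   # chroma halved both ways (DVD, BD)
    }[scheme]
    return int(width * height * samples_per_pixel * bits / 8)

print(bytes_per_frame(720, 480, "4:4:4"))  # 1036800
print(bytes_per_frame(720, 480, "4:1:1"))  # 518400: half the data of 4:4:4
```

The point being that 4:1:1 and 4:2:0 carry the same amount of chroma data, just distributed differently across the frame.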

Yes, 34 inches of 16:9 CRT that covers native unprocessed analog plus HDMI up to 1080i, due to the CRT itself. It will show you every flaw in that pretty digital image without shame. I have a screen cap from the first Star Wars from the original Laserdisc (where Han shoots first), but the Videokarma server won't accept it due to the size. What it does show is the real quality of the image without the dithering or other digital compromises needed to keep costs low for the consumer world. The image is very bright and dynamic, with no highlight crushing or dithering, and without the blockiness of pixelation during the fast action scenes. Once again, analog wins this round.
  #17  
Old 02-17-2024, 02:12 PM
DVtyro is offline
VideoKarma Member
 
Join Date: Sep 2022
Posts: 137
Quote:
Originally Posted by ARC Tech-109 View Post
Digital also loses quality with each generation of compression and jitter, not as substantial as with the consumer analog formats, but there is a loss; that's why it's called "lossy compression".
I specifically mentioned that I am talking about copying without editing or recompression: you can make a zillion exact digital copies, whereas every analog copy is a new generation.
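The distinction being made here can be shown with a toy simulation: each analog dub adds noise on top of the previous generation's noise, while a digital copy is byte-for-byte identical. The signal values and noise level are purely illustrative:

```python
import random

# Toy model: analog dubbing accumulates noise; digital copying is exact.
def analog_dub(signal, noise=0.02):
    return [s + random.gauss(0, noise) for s in signal]

def digital_copy(data):
    return list(data)  # byte-for-byte identical

master = [0.5] * 1000
analog = master
digital = master
for _ in range(5):            # five generations of each
    analog = analog_dub(analog)
    digital = digital_copy(digital)

print(digital == master)      # True: bit-exact after any number of copies
print(analog == master)       # False: noise has accumulated
```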

Visually lossless intermediate codecs have been used for the last two decades with great success for everything from home videos to big-screen movies. FX scenes in the original Blade Runner have seven exposures on the same strip of film, you can search for BTS comments about the loss of detail. Digital allows dozens of recompressions without visual loss. With modern storage solutions one can use lossless codecs to prevent data loss in the areas of the image that have not been changed.

In short, analog has nothing to show for it.

Quote:
Originally Posted by ARC Tech-109 View Post
There are times when SD looks far better than HD, and again this comes down to the overall detail: just because it has 1080 progressive lines doesn't mean each line is unique, in the sense that it wasn't sourced from a lower format and upconverted.
Why would I watch upconverted HD? I watch native HD.

Quote:
Originally Posted by ARC Tech-109 View Post
Then we can get into things like 4:1:1 sub-sampling, bit error rates, and the all-important dithering from low sampling depth, be it 8-bit or the more common 10-bit.
60 Hz DV used 4:1:1 subsampling; it is a thirty-year-old CONSUMER format, yet it is comparable to PROFESSIONAL analog Betacam SP. Notice that the evaluation was done using the FIRST generation off a Rec. 601 source. Every new analog copy, even without editing, will make it worse. Broadcast HD has been using 4:2:2 at 50 Mbit/s interframe from day one as the lowest acceptable format. Of course, nowadays the numbers have gone up in every metric: resolution, bit depth, bit rate, full color, alpha channel, etc. Again, analog has nothing; it has been left behind in the dust.

Quote:
Originally Posted by ARC Tech-109 View Post
How about the compression artifacts like mosquito noise and dynamic pixelation? Modern displays are really good at hiding these things with their 3D filters and other DSP functions, but when they're disabled the truth comes out: you can only make a pile of pixels look so good.
Yes, these artifacts are caused by insufficient bitrate. Blame OTA TV, which has ruined television for everyone; it still looks better than analog NTSC. Analog HD needed 30 MHz to look good. The Japanese tried to squeeze it into 15, 12, 9 and even 6 MHz, but predictably it looked worse and worse. Analog HD in 6 MHz was not watchable, whereas H.266 is roughly 8 times more efficient than H.262, meaning you can have 8 great-looking HD channels instead of one analog NTSC channel.
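The "8 channels instead of one" arithmetic can be laid out explicitly. The ATSC payload and MPEG-2 HD bitrate below are typical round figures rather than exact values, and the 8x efficiency gain is the claim made above:

```python
# Back-of-the-envelope version of the "8 HD channels" claim; figures approximate.
atsc_payload_mbps = 19.39        # usable payload of one 6 MHz ATSC channel
mpeg2_hd_mbps = 18.0             # a typical single H.262 (MPEG-2) HD program
h266_gain = 8                    # claimed efficiency of H.266 over H.262
h266_hd_mbps = mpeg2_hd_mbps / h266_gain       # ~2.25 Mbit/s per HD program
channels = int(atsc_payload_mbps // h266_hd_mbps)
print(channels)                  # 8
```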

Quote:
Originally Posted by ARC Tech-109 View Post
Yes, 34 inches of 16:9 CRT that covers native unprocessed analog plus HDMI up to 1080i, due to the CRT itself.
The only use of an interlaced CRT TV set is watching interlaced programs. Sadly, the Japanese dumped their 1125-line equipment onto the U.S. thirty years ago, but only because the U.S. had stopped manufacturing television broadcast equipment by that point.

Quote:
Originally Posted by ARC Tech-109 View Post
It will show you every flaw in that pretty digital image without shame. I have a screen cap from the first Star Wars from the original Laserdisc (where Han shoots first), but the Videokarma server won't accept it due to the size. What it does show is the real quality of the image without the dithering or other digital compromises needed to keep costs low for the consumer world.
Star Wars is 24 fps. When shown on an interlaced TV set it is telecined. You don't see full vertical resolution; you see interline twitter, and you see shimmer from an insufficient 50 Hz refresh rate. Have you tried using a consumer-grade TV set as a computer monitor? The difference is striking, which is why computer monitors have used progscan since at least the 1980s. There is LESS detail in interlaced scan.

To see full movie resolution you need to remove the pulldown, converting it back into 24p, and display it as progressive. This is what modern TV sets do. This is what good TV sets have done since the late 1980s, with a built-in deinterlacer, 100 Hz or 120 Hz refresh rate, and a 16:9 screen. Such TV sets became popular in Europe, much less so in the U.S., because NTSC stuck with 4:3 interlace. Since your TV set is 1080i only, you cannot enjoy the full resolution of BD movies. OTOH, it can do 480p, so provided that it can do IVTC correctly, you can enjoy DVDs as intended.
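The pulldown and its removal described above can be sketched in miniature. This is highly simplified: fields are treated as whole labeled frames here, not interleaved half-pictures, but the 2-3 cadence and its inversion are the real pattern:

```python
# 3:2 pulldown maps 4 film frames onto 10 fields (24 fps -> 60 fields/s);
# IVTC drops the repeats to recover the original 24p frames.
def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)  # alternate 2 and 3 fields
    return fields

def ivtc(fields):
    recovered = []
    for f in fields:
        if not recovered or recovered[-1] != f:       # skip repeated fields
            recovered.append(f)
    return recovered

fields = pulldown_32(["A", "B", "C", "D"])
print(fields)        # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(ivtc(fields))  # ['A', 'B', 'C', 'D']
```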
  #18  
Old 02-18-2024, 04:33 AM
ARC Tech-109 is offline
Retired Batwings Tech
 
Join Date: Jun 2020
Location: Planet Earth
Posts: 336
There is book smart, and then there is the reality of experience. Sadly, I'm seeing a lot of "digital is always better" rhetoric here based more on opinion than fact. Academics are no match for the reality of experience, and I have 40 years of it. The reality is that what you see on your fancy high-priced screen is little more than a DSP-enhanced figment. Movies like Star Wars, when released in 1977 (I was there at the theater for this), were ALL shot at 24 fps, until the later episodes were done in CineAlta at a 24 fps rate; this is factually documented by George Lucas himself. So to say that because my TV set is 1080i only I can't enjoy the full resolution of a BD doesn't hold any water; it is the number of scan lines, NOT the interlacing, that defines the resolution. The movies were transferred using a flying spot scanner, BTW.
S-VGA was both interlaced and progressive depending on the scan rate, due to the horizontal sweep limitations and the limitations of the RAMDACs of the day. There is just as much detail in interlace as there is in a full sweep, this being a function of the bandwidth itself.
"The only use of an interlaced CRT TV set is watching interlaced programs." Once again, false opinion. You just contradicted yourself with your later statement regarding the computer monitor. I can't speak for the remainder of the statement regarding the 1125-line format, as Japan was doing their own thing and it ultimately failed.

"because NTSC has stuck with 4:3 interlace": what the??? 4:3 is the aspect ratio of the screen itself, four units wide by three high; interlace is an odd/even 2:1 field structure with an equal number of scan lines in each field. Because my CRT monitor is 1080i, it CAN display the SAME bandwidth and detail as your whiz-bang LCD or plasma with its DSP running in the background, and look just as good, and I CAN prove this without a doubt. The problem here is that once again we are splitting hairs of opinion with the belief that one is "better" than the other. Digital has its own set of issues and compromises just like analog has; the one difference is that digital has the technological advances to hide these flaws, effectively making ice cream out of horse manure. In the real world we never see the real picture; everything is a compromise, be it compression for transmission, storage space, raw costs, or the buying public that for the most part judges by the cost on the bottom line.

Last edited by ARC Tech-109; 02-18-2024 at 05:10 AM. Reason: setting some facts straight
  #19  
Old 02-18-2024, 10:51 AM
old_tv_nut is online now
See yourself on Color TV!
 
Join Date: Jul 2004
Location: Rancho Sahuarita
Posts: 7,207
Can't believe I'm seeing the interlace/progressive, analog/digital, original film vs. what's on your screen religious wars being rehashed. Both of you have correct and incorrect points.

The original digital TV tests showed plainly (but note, with tube pickup cameras and CRT displays) that 1080i had more visual resolution for still scenes (with the vertical resolution toned down to reduce twitter), while 720p was better for sports with high motion. You cannot say definitively that one is always better than the other. The comparison is further affected by the introduction of solid-state camera sensors (no smear) and different display technologies. But it never comes down to strictly "old good, new bad" or vice versa. It's a question of the whole system, from glass to glass.
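The trade-off described here shows up directly in the raw pixel arithmetic of the two ATSC HD formats; the figures below are the nominal frame sizes and rates:

```python
# Raw pixel throughput of the two ATSC HD formats discussed above.
i1080 = 1920 * 1080 * 30    # 1080i: 30 full frames/s delivered as 60 fields
p720 = 1280 * 720 * 60      # 720p: 60 full progressive frames/s
print(i1080)                # 62208000
print(p720)                 # 55296000
# Similar totals: 1080i spends them on static detail, 720p on motion.
```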

@ARC Tech-109, DVtyro obviously meant 2:1 interlace, come off it!
__________________
www.bretl.com
Old TV literature, New York World's Fair, and other miscellany

Last edited by old_tv_nut; 02-18-2024 at 10:54 AM.
  #20  
Old 02-18-2024, 10:57 AM
old_tv_nut is online now
See yourself on Color TV!
 
Join Date: Jul 2004
Location: Rancho Sahuarita
Posts: 7,207
Notwithstanding all the above, yes, it is always interesting to see what is recovered nearest to the source, without any (possible) noise reduction, introduction of artifacts, or whatever.
__________________
www.bretl.com
Old TV literature, New York World's Fair, and other miscellany
  #21  
Old 02-18-2024, 12:13 PM
etype2 is offline
VideoKarma Member
 
Join Date: Mar 2010
Location: Valley of the Sun, formerly Silicon Valley, formerly Packer Land.
Posts: 1,489
As a movie fan and consumer of A/V entertainment, one has to invest in the best possible delivery system, and you don't have to be wealthy to do so. Currently OLED, soon to be surpassed by Micro LED and self-emissive nano OLED, is the best visual medium.

Bit rate is key. BD and Apple TV currently have the highest bit rates. Sony has a proprietary system with a high bit rate. OTA is slightly better than satellite for live video transmission.
__________________
Personal website dedicated to Vintage Television https://visions4netjournal.com
  #22  
Old 02-18-2024, 01:24 PM
DVtyro is offline
VideoKarma Member
 
Join Date: Sep 2022
Posts: 137
Quote:
Originally Posted by old_tv_nut View Post
The original digital TV tests showed plainly (but note, with tube pickup cameras and CRT displays) that 1080I had more visual resolution for still scenes (with the vertical resolution toned down to reduce twitter), while 720P was better for sports with high motion. You cannot say definitively one is always better than the other.
720p vs 1080i is a wash, although 720p does not have interline twitter. I was talking about 1080p24 BD movie, which loses vertical resolution when watched on a 1080i monitor. Here, this is about DVDs, but the same is true regarding HD.

Quote:
Originally Posted by ARC Tech-109 View Post
to say that because my TV set is 1080i only I can't enjoy the full resolution of a BD doesn't hold any water
Of course it does, and of course you can't.

Quote:
Originally Posted by ARC Tech-109 View Post
"The only use of an interlaced CRT TV set is watching interlaced programs." Once again false opinion. You just contradicted yourself with the later statement regarding the computer monitor.
No, I didn't. After the designers of computer consoles realized that interlacing sucks, they promptly switched to progscan. If you've tried using a computer with both interlaced and progscan CRT monitors, you know the difference.

Quote:
Originally Posted by ARC Tech-109 View Post
"because NTSC has stuck with 4:3 interlace" what the???
The U.S. stuck with NTSC until HDTV came along, at least for OTA. Europe tried other formats, in particular PALplus, which supported 25p and 16:9. As early as the late 1980s, brands like Nokia, Siemens and others started offering TV sets with 16:9, progscan, 100 Hz refresh rate, and a built-in deinterlacer. Compared to old-school PAL, this felt almost like HD. Fun fact: in the early 2000s Australia adopted 625p50 as an HD format.

Quote:
Originally Posted by ARC Tech-109 View Post
The problem here is once again we are splitting hairs of opinion with the belief that one is "better" than the other.
Digital IS better than analog. First-generation analog can look great, given enough bandwidth, like the original MUSE HD, which needed 30 MHz. But even if you could afford a TV set or a VTR with this bandwidth, there was no hope for broadcast.

Digital has better quality, requires less bandwidth to transmit (or bitrate to store, which is just the other side of the same coin), the devices are smaller, cheaper, and all-around more democratic, and copies do not lose quality.

Digital vs analog is sort of like video vs film: until the 1970s film cameras were smaller, more dependable, more portable, provided better image quality, etc. But as video developed, it moved farther and farther from film, which just could not miniaturize further, because the size of the film roll was a given. Betacam spelt the end of the Auricons and Eclairs. Similarly, DV spelt the end of analog Betacam, and there is no going back, as digital keeps moving forward. The latest 4K and 8K CMOS sensors with global shutter fix the most glaring defect of modern digital cinematography, so film is finally dead.
  #23  
Old 02-20-2024, 09:31 AM
ARC Tech-109 is offline
Retired Batwings Tech
 
Join Date: Jun 2020
Location: Planet Earth
Posts: 336
I'm going to stick with my legacy formats, which include DigiBeta, DVCPRO HD and HDCAM, along with Type C and Betacam SP. If you want to push your agenda, go for it; I'm done arguing with the inexperienced and/or misinformed.

Last edited by ARC Tech-109; 02-20-2024 at 09:44 AM.
  #24  
Old 02-20-2024, 03:21 PM
Phil is offline
VideoKarma Member
 
Join Date: Aug 2016
Posts: 134
"so film is finally dead."

Funny, I just shot 6 rolls of 120 Velvia last week.
  #25  
Old 02-20-2024, 06:32 PM
ARC Tech-109 is offline
Retired Batwings Tech
 
Join Date: Jun 2020
Location: Planet Earth
Posts: 336
I give Phil two thumbs up
  #26  
Old 02-20-2024, 08:30 PM
nasadowsk is offline
Damn does run fast…
 
Join Date: Jul 2004
Location: Catawissa, PA
Posts: 948
I still shoot a Hasselblad and an Arri S16.

Speaking of which, any of you going to this event? I’m on the fence right now :/
  #27  
Old 02-21-2024, 03:20 AM
DVtyro is offline
VideoKarma Member
 
Join Date: Sep 2022
Posts: 137
Quote:
Originally Posted by Phil View Post
"so film is finally dead."

Funny, I just shot 6 rolls of 120 Velvia last week.
Shooting film is fun, but a few outliers do not reverse the global trend. I shoot VHS and DV myself, fully realizing that I am a freak.

Last edited by DVtyro; 02-21-2024 at 03:33 AM.
  #28  
Old 02-21-2024, 05:07 AM
Alex KL-1 is offline
VideoKarma Member
 
Join Date: Jan 2021
Location: Brazil (Paraná)
Posts: 221
I believe that, in general, the original format shown on the original (high-quality) hardware is superior to ANY conversion to another format, be it digital or analog. An especially good case in point is games: some older games were made taking into account the defects and virtues of CRTs, for example.

For sure, bad conversions to new digital formats are made everywhere, including some documentaries shown on famous streaming platforms (some are good, some are so-so).

But an old analog tape converted to a very different animal (e.g. shown on a 4K OLED TV) will never be equal to the same content displayed on an old CRT TV, with all its very different characteristics: gamma/linearity differs; motion perception differs due to the impulse-type operation of a CRT versus the sample-and-hold operation of an OLED; and perceived object brightness follows from both of those characteristics, plus any video response aberrations. And the CRT natively reproduces interlaced content, without any "strange" conversion for natively progressive fixed-pixel panels. And the old content *maybe* was produced on equipment that took into account all the difficulties of old CRT displays, which is different on newer display techs.

Maybe the keyword here is "natively". In the end, the image will ALWAYS differ.

And we need to take into account the various conversions needed to display on, e.g., a 4K panel: de-interlacing, upscaling, etc. The 4K panel will show all these errors (plus the low original resolution) without any mercy if the magic filters are off.
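Alex's upscaling point can be put in rough numbers: only a small fraction of a 4K panel's pixels can carry data from an SD source at all, and the rest must be interpolated by the scaler. Purely an illustration:

```python
# Fraction of a 4K panel's pixels that an SD frame can actually supply.
sd_pixels = 720 * 480           # one SD frame (after deinterlacing)
uhd_pixels = 3840 * 2160        # one 4K panel frame
share = round(sd_pixels / uhd_pixels * 100, 1)
print(share)                    # 4.2 (percent of panel pixels with source data)
```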
__________________
So many projects, so little time...
  #29  
Old 02-21-2024, 10:46 AM
ARC Tech-109 is offline
Retired Batwings Tech
 
Join Date: Jun 2020
Location: Planet Earth
Posts: 336
And in the end, it's all in the eyes of the beholder.
  #30  
Old 02-21-2024, 12:23 PM
DVtyro is offline
VideoKarma Member
 
Join Date: Sep 2022
Posts: 137
Quote:
Originally Posted by Alex KL-1 View Post
Maybe the keyword here will be "natively". In the end, the image ALWAYS will differs.
Thanks for the measured response, Alex. Hence my original question: how are they going to show it? On the largest CRT they could find? On a bunch of small CRTs, mimicking a 1960s living room? Via a projector? There were interlaced projection TVs back then, but I wonder whether there were any that could fill a large movie theater screen.

I guess using a bunch of 17-20 inch TVs would be the best option to replicate how it looked back then.