#16
Digital also loses quality with each generation of compression, plus jitter; not as substantially as the consumer analog formats, but there is a loss, and that's why it's called "lossy compression". There are times when an SD picture looks far better than HD, and again this comes down to the overall detail: just because an image has 1080 progressive lines doesn't mean each line is unique, in the sense that it wasn't sourced from a lower format and upconverted. Then we can get into things like 4:1:1 chroma sub-sampling, bit error rates, and the all-important dithering from the low sample depth, be it 8-bit or the more common 10-bit. How about compression artifacts like mosquito noise and dynamic pixelation? Modern displays are really good at hiding these things with their 3D filters and other DSP functions, but when they're disabled the truth comes out: you can only make a pile of pixels look so good.
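The 4:1:1 sub-sampling point is easy to demonstrate. Here is a minimal Python sketch, a toy model only (not any real codec; the 8-bit chroma values are made up), showing why one chroma sample per four luma samples smears sharp color edges:

```python
# Toy model of 4:1:1 chroma sub-sampling: one chroma sample is kept
# for every four luma samples, so sharp color transitions smear.
# The values are illustrative 8-bit chroma levels, not from any codec.

def subsample_411(line):
    # Keep every 4th chroma sample (4:1:1 horizontal decimation).
    return line[::4]

def upsample_nearest(samples, factor=4):
    # Rebuild full width by repeating each kept sample.
    return [s for s in samples for _ in range(factor)]

# A hard color edge that does NOT land on a 4-pixel boundary.
chroma = [16] * 6 + [240] * 10
restored = upsample_nearest(subsample_411(chroma))
errors = [abs(a - b) for a, b in zip(chroma, restored)]
print(max(errors))  # prints 224: a huge error right at the edge
```

Real codecs filter the chroma instead of using nearest-neighbor, but the information thrown away at the edge is gone either way.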
Yes, 34 inches of 16:9 CRT that covers native unprocessed analog plus HDMI up to 1080i, due to the CRT itself. It will show you every flaw in that pretty digital image without shame. I have a screen cap from the first Star Wars from the original Laserdisc (where Han shoots first) but the videokarma server won't accept it due to the size. What it does show is the real quality of the image without the dithering or other digital compromises needed to keep costs low for the consumer world. The image is very bright and dynamic, no highlight crushing or dithering, and without the blockiness of pixelation during the fast action scenes. Once again analog wins this round.
#17
Quote:
Visually lossless intermediate codecs have been used for the last two decades with great success, for everything from home videos to big-screen movies. FX scenes in the original Blade Runner have seven exposures on the same strip of film; you can search for BTS comments about the loss of detail. Digital allows dozens of recompressions without visual loss. With modern storage solutions one can use lossless codecs to prevent any data loss in the areas of the image that have not been changed. In short, analog has nothing to show for it.
Quote:
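The "dozens of recompressions without loss" claim is trivially checkable for a truly lossless codec. A quick sketch using Python's zlib as a stand-in for a lossless video codec (the "frame" bytes are made up for the demo):

```python
import zlib

# A made-up "frame" of bytes: a lossless codec must return it
# bit-exact no matter how many compress/decompress generations run.
frame = bytes(range(256)) * 64

data = frame
for generation in range(50):
    data = zlib.decompress(zlib.compress(data))

print(data == frame)  # True: 50 generations, zero loss
```

This is exactly what analog dubbing cannot do: every analog generation adds noise, while a lossless digital round-trip is an identity function.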
To see full movie resolution you need to remove the pulldown, converting it back into 24p, and display it as progressive. This is what modern TV sets do. This is what good TV sets have done since the late 1980s, with a built-in deinterlacer, 100 Hz or 120 Hz refresh rate, and a 16:9 screen. Such TV sets became popular in Europe, much less so in the U.S., because NTSC stuck with 4:3 interlace. Since your TV set is 1080i only, you cannot enjoy the full resolution of BD movies. OTOH, it can do 480p, so provided that it can do IVTC correctly, you can enjoy DVDs as intended.
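The pulldown-removal step described above can be sketched in a few lines. This is a deliberately simplified model (whole frame labels stand in for interlaced field pairs; a real IVTC filter has to match actual fields and handle cadence breaks):

```python
def pulldown_32(frames):
    # 3:2 pulldown: 24p -> 60i by emitting frames as 3,2,3,2,... fields.
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (3 if i % 2 == 0 else 2)
    return fields

def ivtc(fields):
    # Idealized inverse telecine: collapse runs of repeated fields
    # back into the original film frames.
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

fields = pulldown_32(list("ABCD"))   # 4 film frames -> 10 video fields
print(len(fields))                   # prints 10 (the 24 -> 60 ratio)
print(ivtc(fields) == list("ABCD"))  # True: 24p cadence recovered
```

The 4-frames-to-10-fields ratio is exactly why 24 fps film fits into 60 fields/s video, and why the cadence can be undone losslessly when it is intact.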
#18
There is book smart, and then there is the reality of experience. Sadly I'm seeing a lot of "digital is always better" rhetoric here based more on opinion than fact. Academics are no match for the reality of experience, and of that I have 40 years. The reality is that what you see on your fancy high-priced screen is little more than a DSP-enhanced figment. Movies like Star Wars, when released in 1977 (I was there at the theater for this), were ALL shot at 24 fps, until the later episodes were done in CineAlta at a 24 fps rate; this is factually documented by George Lucas himself. So to say that because my TV set is 1080i only I can't enjoy the full resolution of a BD doesn't hold any water: it is the number of scan lines, NOT the interlacing, that defines the resolution. The movies were transferred using a flying-spot scanner, BTW.
S-VGA was both interlaced and progressive depending on the scan rate, and this was due to the horizontal sweep limitations and the limitations of the RAMDACs of the day. There is just as much detail in interlace as there is in a full sweep, as this is a function of bandwidth itself. "The only use of an interlaced CRT TV set is watching interlaced programs." Once again, false opinion; you just contradicted yourself with the later statement regarding the computer monitor. I can't speak for the remainder of the statement regarding the 1125-line format, as Japan was doing their own thing and it ultimately failed.

"Because NTSC has stuck with 4:3 interlace": what the??? 4:3 is the aspect ratio of the screen itself, the ratio of its width to its height; interlace is an odd/even 2:1 field structure with an equal number of scan lines in each field, one woven over the other. Because my CRT monitor is 1080i, it CAN display the SAME bandwidth and detail as your whiz-bang LCD or plasma with its DSP running in the background, and look just as good, and I CAN prove this without a doubt. The problem here is that once again we are splitting hairs of opinion with the belief that one is "better" than the other. Digital has its own set of issues and compromises just like analog has; the one difference is that digital has the technological advances to hide these flaws, effectively making ice cream out of horse manure. In the real world we never see the real picture; everything is a compromise, be it compression for transmission or storage space, raw costs, or the buying public that for the most part judges by the cost on the bottom line.
Last edited by ARC Tech-109; 02-18-2024 at 05:10 AM. Reason: setting some facts straight
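The "just as much detail in interlace" claim is at least true of raw pixel rate. Assuming the common broadcast timings, 1080i at 60 fields/s and 1080p at 30 frames/s move exactly the same number of pixels per second:

```python
# Pixel-rate arithmetic for the interlace-vs-progressive bandwidth point.
# 1080i/60: 60 fields per second, each field carrying half the lines.
rate_1080i60 = 1920 * (1080 // 2) * 60
# 1080p/30: 30 full frames per second.
rate_1080p30 = 1920 * 1080 * 30

print(rate_1080i60 == rate_1080p30)  # True: identical pixel rate
print(rate_1080i60)                  # prints 62208000 pixels/s
```

The difference between the two is temporal, not raw bandwidth: interlace samples motion 60 times a second at half vertical resolution per sample.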
#19
Can't believe I'm seeing the interlace/progressive, analog/digital, original film vs. what's on your screen religious wars being rehashed. Both of you have correct and incorrect points.
The original digital TV tests showed plainly (but note, with tube pickup cameras and CRT displays) that 1080i had more visual resolution for still scenes (with the vertical resolution toned down to reduce twitter), while 720p was better for sports with high motion. You cannot say definitively that one is always better than the other. The comparison is further affected by the introduction of solid-state camera sensors (no smear) and different display technologies. But it never comes down to strictly "old good, new bad" or vice versa. It's a question of the whole system, from glass to glass. @ARC Tech-109, DVtyro obviously meant 2:1 interlace, come off it!
Last edited by old_tv_nut; 02-18-2024 at 10:54 AM.
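That stills-vs-motion trade-off is visible in the raw numbers too. A back-of-envelope comparison, assuming the usual broadcast rates (60 fields/s for 1080i, 60 frames/s for 720p):

```python
# Spatial vs temporal sampling: 1080i spends its bandwidth on pixels
# per frame, 720p spends it on full frames per second.
pixels_per_frame_1080 = 1920 * 1080   # 2,073,600 per woven frame
pixels_per_frame_720 = 1280 * 720     # 921,600 per frame

full_frames_per_sec_1080i = 30        # 60 fields weave into 30 frames
frames_per_sec_720p = 60

print(pixels_per_frame_1080 / pixels_per_frame_720)      # 2.25x detail
print(frames_per_sec_720p / full_frames_per_sec_1080i)   # 2.0x motion
```

So each format wins on the axis it optimizes, which is exactly what the test viewers reported.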
#20
Notwithstanding all the above, yes, it is always interesting to see what can be recovered nearest to the source, without any (possible) noise reduction, introduced artifacts, or whatever.
Audiokarma |
#21
As a movie fan and consumer of A/V entertainment, one has to invest in the best possible delivery system, and you don't have to be wealthy to do so. Currently OLED, soon to be surpassed by MicroLED and self-emissive nano OLED, is the best visual medium.
Bit rate is key. BD and Apple TV currently have the highest bit rates. Sony has a proprietary system with a high bit rate. OTA is slightly better than satellite for live video transmission.
#22
Quote:
The U.S. stuck with NTSC until HDTV came along, at least for OTA. Europe tried other formats, in particular PALplus, which supported 25p and 16:9. As early as the late 1980s, brands like Nokia, Siemens and others started offering TV sets with 16:9, progressive scan, 100 Hz refresh rate, and a built-in deinterlacer. Compared to old-school PAL, this felt almost like HD. Fun fact: in the early 2000s Australia adopted 625p50 as an HD format.
Quote:
Digital has better quality, requires less bandwidth to transmit (or bitrate to store, which is just the other side of the same coin), the devices are smaller, cheaper, and all-around more democratic, and copies do not lose quality. Digital vs. analog is sort of like video vs. film: until the 1970s film cameras were smaller, more dependable, more portable, provided better image quality, etc. But as video developed, it moved farther and farther from film, which just could not miniaturize further, because the size of the film roll was a given. Betacam spelt the end of Auricons and Eclairs. Similarly, DV spelt the end of analog Betacam, and there is no going back, as digital keeps moving forward. The latest 4K and 8K CMOS sensors with global shutter fix the most glaring defect of modern digital cinematography, so film is finally dead.
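The bandwidth claim is concrete: the same 6 MHz channel that carried exactly one analog NTSC program delivers about 19.39 Mbit/s of payload under ATSC 1.0. A rough sketch of what that buys (the per-program SD bitrate is a typical figure, an assumption, not a spec value):

```python
# Rough capacity math for one 6 MHz US broadcast channel.
atsc_payload_mbps = 19.39   # ATSC 1.0 payload in a 6 MHz channel
sd_program_mbps = 4.0       # typical MPEG-2 SD program (assumption)

programs = int(atsc_payload_mbps // sd_program_mbps)
print(programs)  # prints 4: several SD programs where NTSC fit one
```

The same channel can instead carry a single HD program, which is the trade broadcasters actually make.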
#23
I'm going to stick with my legacy formats, which do include DigiBeta, DVCPRO HD and HDCAM, along with Type C and Betacam SP. If you want to push your agenda, go for it; I'm done arguing with the inexperienced and/or misinformed.
Last edited by ARC Tech-109; 02-20-2024 at 09:44 AM.
#24
"so film is finally dead."
Funny, I just shot 6 rolls of 120 Velvia last week.
#25
I give Phil two thumbs up
#26
I still shoot a Hasselblad and an Arri S16.
Speaking of which, any of you going to this event? I’m on the fence right now :/
#27
Shooting film is fun, but a few outliers do not reverse the global trend. I shoot VHS and DV myself, fully realizing that I am a freak.
Last edited by DVtyro; 02-21-2024 at 03:33 AM.
#28
I believe that, in general, the original format being shown on the original (high-quality) hardware is superior to ANY conversion to another format, be it digital or analog. A case especially in point with games: some older games were made taking into account the defects and virtues of CRTs, for example.
For sure, bad conversions to new digital formats are made everywhere, including some documentaries shown on famous streaming platforms (some are good, some are so-so). But showing an old analog tape converted to a different animal (e.g. on a 4K OLED TV) will never be equal to the same content displayed on an old CRT TV, with all its very different characteristics: gamma/linearity differs, motion perception differs due to the impulsive operation mode of a CRT vs. the sample-and-hold operation of an OLED, and perceptual object brightness follows from both of those characteristics, plus video response aberrations or the lack of them. And the CRT natively reproduces interlaced content, without any "strange" conversion for natively fixed progressive panels. And the old content *maybe* was produced on equipment that took into account all the difficulties of old CRT reproducers, which is different on newer display techs. Maybe the keyword here is "natively". In the end, the image will ALWAYS differ. And we need to take into account the various conversions required to display it on, e.g., a 4K panel: de-interlacing, upscaling, etc. The 4K panel will show all these errors (plus the low original resolution) without any mercy if the magic filters are off.
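The upscaling point can be made concrete: SD-to-UHD is a non-integer vertical scale (2160 / 480 = 4.5), so a naive nearest-neighbor scaler must repeat some source lines 4 times and others 5, never uniformly. A toy sketch (nearest-neighbor only; real scalers interpolate, but the non-integer ratio is the same):

```python
# Nearest-neighbor vertical upscale 480 -> 2160 (factor 4.5):
# count how many output lines each source line produces.
src_lines, dst_lines = 480, 2160

repeats = [0] * src_lines
for y in range(dst_lines):
    repeats[y * src_lines // dst_lines] += 1

print(sorted(set(repeats)))  # prints [4, 5]: uneven line duplication
```

That alternating 5/4 pattern is one reason unfiltered SD on a 4K panel looks uneven, and why every panel ships with the "magic filters" mentioned above.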
__________________
So many projects, so little time...
#29
And in the end, it's in the eyes of the beholder.
#30
Quote:
I guess using a bunch of 17-20 inch TVs would be the best option to replicate how it looked back then.