  #7  
02-27-2018, 11:49 AM
etype2
VideoKarma Member
 
Join Date: Mar 2010
Location: Valley of the Sun, formerly Silicon Valley, formerly Packer Land.
Posts: 1,487
Quote:
Originally Posted by dtvmcdonald
Again I ask ... what about the TEST done on the way an actual show will
be seen on cable TV ... NOT BLU-RAY, NOT in a movie theater, BUT,
I insist on a real signal, one verified to have a bitrate **no higher than** 5 megabits/sec, the way broadcasters and cable companies will send it.

I don't care what it looks like with adequate bitrate. Except in a movie
theater, I'll never see that.
I answered your question. Did you look at the test results on my Calman ISF charts? I provided a link to see them. A smaller panel with high light output will produce better HDR results than my projector.

5 Mbps won't cut it. I think what you are talking about is the bit rate of an average HD program. Netflix is streaming at about 10 Mbps; it varies during the program and goes up to around 12 Mbps. DirecTV is about 30 Mbps for 4K. These are the only two I'm familiar with; I'll check on Apple TV. If you're going to invest in 4K, you want to make sure the chipset in the TV can handle at least 13 Mbps, and 18 is better. You also need an internet speed of at least 25 Mbps. If you're going to invest in 4K, why wouldn't you want the best quality, 4K Blu-ray? Some televisions display the metadata of the program being watched; my Sony has this feature. Having said that, your eyes are the final test.
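Just to put those stream rates in perspective, here's a rough back-of-the-envelope sketch of what they mean for data use and connection speed. The 2x headroom factor is my own illustrative assumption, not anything the services publish:

[CODE]
# Rough sketch: convert a stream's bitrate (Mbps) into data used per hour,
# and check it against a connection speed with some margin to spare.

def gigabytes_per_hour(mbps: float) -> float:
    """Approximate GB transferred per hour of playback at a given Mbps."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> MB/s -> MB per hour -> GB per hour

def connection_ok(peak_mbps: float, line_mbps: float, headroom: float = 2.0) -> bool:
    """True if the connection leaves margin above the stream's peak rate."""
    return line_mbps >= peak_mbps * headroom

for service, avg, peak in [("Netflix 4K", 10, 12), ("DirecTV 4K", 30, 30)]:
    print(f"{service}: ~{gigabytes_per_hour(avg):.1f} GB/hour, "
          f"25 Mbps line has headroom: {connection_ok(peak, 25)}")
[/CODE]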


Update: I looked at the variable bit rates of Spider-Man: Homecoming on both 4K Blu-ray and Apple TV 4K. The Apple TV stream averages 13 Mbps and maxes out at 25 Mbps; the 4K disc averages 70 Mbps and maxes out at 100 Mbps.
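A quick calculation over the film's runtime (roughly 133 minutes; treat that figure and the totals as approximate) shows how far apart those averages really are:

[CODE]
# Rough total-data comparison for the average bitrates quoted above.
RUNTIME_MIN = 133  # Spider-Man: Homecoming, approximate runtime in minutes

def total_gigabytes(avg_mbps: float, minutes: float = RUNTIME_MIN) -> float:
    """Approximate total GB for a program at a given average bitrate."""
    return avg_mbps * minutes * 60 / 8 / 1000

for source, avg in [("5 Mbps cable/broadcast", 5),
                    ("Apple TV 4K stream", 13),
                    ("4K Blu-ray disc", 70)]:
    print(f"{source}: ~{total_gigabytes(avg):.0f} GB")
[/CODE]

At those averages the disc moves roughly five times the data of the stream over the same runtime, and the 5 Mbps signal about a fifth of the stream.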
__________________
Personal website dedicated to Vintage Television https://visions4netjournal.com
