Saturday, February 29th, 2020
Do you have a 4K TV? Are you contemplating an 8K TV? Maybe…take a breath.
I say this as a guy who has a 40-inch HD set (that’s 1920×1080 pixels) in a smallish room, and given the distances, the available content, and our aging eyes…that’s plenty good enough.
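The "given the distances" part can be made concrete with the standard visual-acuity back-of-the-envelope calculation (my sketch, not from any of the studies mentioned here): a viewer with 20/20 vision resolves roughly one arcminute of detail, so past the distance where a single pixel subtends less than that, more pixels buy you nothing. The screen size is the author's 40-inch set; the acuity figure is the usual textbook value.

```python
import math

def pixel_blend_distance_inches(diagonal_in, h_pixels, aspect=16 / 9):
    """Distance beyond which adjacent pixels can't be told apart (20/20 vision)."""
    # Horizontal screen width from the diagonal and aspect ratio.
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch = width / h_pixels            # physical size of one pixel
    one_arcmin = math.radians(1 / 60)   # rough 20/20 acuity limit
    return pitch / math.tan(one_arcmin)

# A 40-inch 1080p set, like the author's:
d = pixel_blend_distance_inches(40, 1920)
print(f"{d / 12:.1f} feet")   # roughly 5 feet
```

By this estimate, from an ordinary couch distance the 1080p set is already at the eye's limit, and doubling the horizontal pixel count simply halves that distance.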
An article in TechHive sums it up in a headline: 8K vs 4K TVs: Double-blind study by Warner Bros. et al reveals most consumers can’t tell the difference.
In collaboration with Pixar, Amazon Prime Video, LG, and the American Society of Cinematographers (ASC), Warner Bros. recently addressed this question in a well-designed, double-blind study to see if people could discern a difference between 4K and 8K with a variety of content.
For the purposes of this article, “4K” refers to a resolution of 3840 x 2160, and “8K” refers to a resolution of 7680 x 4320. As you might already know, these labels are something of a misnomer; to be technically accurate, “4K” really means 4096 x 2160 and “8K” means 8192 x 4320.
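For a sense of scale, "8K" doubles "4K" in each dimension, which quadruples the pixel count. A quick check of the numbers above:

```python
# Pixel counts for the resolutions defined above.
uhd_4k = 3840 * 2160   # consumer "4K" (UHD)
uhd_8k = 7680 * 4320   # consumer "8K"
dci_4k = 4096 * 2160   # the "technically accurate" cinema 4K

print(f"4K: {uhd_4k:,} pixels")   # 8,294,400
print(f"8K: {uhd_8k:,} pixels")   # 33,177,600
print(uhd_8k // uhd_4k)           # 4 -- four times the pixels
```

Four times the pixels to transmit, store, and render, for a difference most viewers in the study couldn't see.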
Here’s a PDF from a SMPTE conference in 2012 that dives deep (maybe too deep) into the concepts of simple acuity, hyperacuity, and Snellen acuity. This is an extremely technical paper and I’m certainly not urging that you plow through it. It’s filled with stuff like this:
Oh, that Snellen! The guy who came up with the standard eye chart!
Bottom line, there’s a basic and yet quite satisfying viewing experience for which immersion is not the point, and when you live in a small apartment, immersion might not even be possible—the angles and distances don’t work out. You just want to see the numbers on the election returns, or the soccer ball, or the hockey puck.
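The "angles don't work out" point can also be put in numbers. A commonly cited guideline (SMPTE's recommendation is often quoted as about a 30-degree horizontal viewing angle) is what makes a picture feel immersive. The 8-foot couch distance below is my assumption for illustration:

```python
import math

def viewing_angle_deg(diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal field of view a screen fills at a given distance."""
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

angle = viewing_angle_deg(40, 8 * 12)   # 40-inch set, 8 feet away
print(f"{angle:.0f} degrees")           # well short of 30
```

At around 20 degrees, the set delivers the "basic and yet quite satisfying" experience: you see the score and the puck clearly, and immersion was never on the table anyway.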
And perhaps more importantly, much of the content out there is barely ‘1K’—HD, which, by the way, is a very nice image to look at from across the room. Upsampling it to a higher resolution is a procedure researchers keep improving, but you still can’t manufacture pixels out of thin air. Real resolution is the resolution you start out with.
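That last point can be illustrated with the crudest upscaler, nearest-neighbor (my stand-in here; real TVs use far fancier algorithms): the output has more pixels, but every one of them is derived from the source, so no new information appears.

```python
# Nearest-neighbor upscaling of a tiny grayscale "image" (a list of
# rows of brightness values). Smarter upsamplers interpolate or
# hallucinate detail, but the information still comes from the source.

def upscale(image, factor):
    return [
        [pixel for pixel in row for _ in range(factor)]
        for row in image
        for _ in range(factor)
    ]

small = [[10, 200],
         [60, 120]]
big = upscale(small, 2)     # now 4x4 -- four times the pixels...
distinct = {p for row in big for p in row}
print(len(distinct))        # ...but still only 4 distinct values
```

However many pixels the upscaler emits, the picture can only be as detailed as what the camera captured, which is the author's point: real resolution is the resolution you start out with.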