The electronics world has been abuzz about 1080p delivered over the equally hyped High-Definition Multimedia Interface (better known as HDMI). Beginning with the release of the Pioneer PDP-5000EX in the summer of 2006, there has been a steady stream of 1080p televisions, DVD players, video game systems, A/V receivers, and (of course) content supporting this new higher resolution video format.
So here's the million-dollar question:
Does 1080p even make a difference?
More specifically, can a person perceive the difference between 1080p and 1080i video? The answer, as you might have guessed, isn't so straightforward, but I'll boil it down: YES, if you've got a good eye and a good video source.
To give you some scientific background, there are two major perceptual concepts at work when a person watches video: 1) visual acuity, the spatial threshold below which a person perceives two distinct "dots" as one single dot, and 2) persistence of vision, the temporal threshold above which a person perceives a sequence of distinct "frames" as motion (think cartoons and animation). Persistence of vision is intertwined with a related concept called the flicker fusion threshold. These two concepts translate into the specs of your TV as resolution (visual acuity) and frame rate (persistence of vision). The average person has a visual acuity of about 1/30th of a degree (two dots closer together than 0.03 degrees blur into one) and a flicker fusion threshold of about 16 Hz (animation comes to life at 16 frames per second). Now that you're asleep, we can continue...
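If you'd rather see those two thresholds as code, here's a minimal sketch (the 0.027" pixel pitch is my own rough figure for a 60" 1080p panel; the threshold numbers are the ones above):

```python
import math

ACUITY_DEG = 1.0 / 30.0   # two dots merge below this angular separation
FLICKER_HZ = 16.0         # distinct frames fuse into motion around this rate

def dots_merge(separation_in, distance_in):
    """True if two dots `separation_in` inches apart read as a single dot
    from `distance_in` inches away (small-angle approximation)."""
    angle_deg = math.degrees(separation_in / distance_in)
    return angle_deg < ACUITY_DEG

def looks_like_motion(fps):
    """True if a frame rate clears the flicker-fusion threshold."""
    return fps >= FLICKER_HZ

# One pixel on a 60" 1080p panel is about 0.027" tall; from 4 feet away
# it sits right at the edge of the acuity threshold.
print(dots_merge(0.027, 48))   # True -- pixels blur together
print(looks_like_motion(24))   # True -- film's 24 fps reads as motion
```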
I will skip over a whole bunch of steps and in-depth analysis; there are a couple of good articles you can read if you want to learn more: Audioholics has one examining the acuity of vision and how it relates to 1080p, and Carlton Bale has another that does a good job of analyzing the relationships between screen size, viewing distance, and resolution.
I'll assume you are a videophile with excellent eyes who can tell the difference between 30 fps and 60 fps. I'm also going to assume that the video source is a true 60 frames per second. While some video games meet this criterion, movies on Blu-ray or HD DVD (which are encoded at 24 frames per second) don't.
Basically, the benefit of 1080p over 1080i under these assumptions (and they are some pretty big assumptions) is that you get twice the effective spatial resolution: 1080i delivers 540-line fields, while 1080p delivers full 1080-line frames. In other words, visual artifacts won't be detectable until you are twice as close to the screen. Why is this good?
IT MEANS YOU CAN SIT TWICE AS CLOSE TO YOUR TV! Another way to think about it: you can buy a TV twice as big for the same space.
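The factor of two falls straight out of small-angle geometry: the angle a scan line subtends is roughly its height divided by your distance, so halving the effective line count (540 lines per interlaced field versus 1080 progressive lines) doubles the distance at which that detail hits the acuity limit. Here's a minimal sketch of the arithmetic, reusing the 1/30-degree figure from above:

```python
import math

ACUITY_RAD = math.radians(1.0 / 30.0)  # the 1/30-degree acuity threshold

def min_viewing_distance(screen_height_in, effective_lines):
    """Closest distance (inches) at which one scan line still subtends
    no more than the acuity threshold (small-angle approximation)."""
    line_height = screen_height_in / effective_lines
    return line_height / ACUITY_RAD

height = 29.4  # picture height of a 60" 16:9 screen, in inches
print(min_viewing_distance(height, 1080))  # ~46.8" -- full 1080p detail
print(min_viewing_distance(height, 540))   # ~93.6" -- interlaced fields
# The ratio is exactly 2: double the resolution, sit half as far away.
```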
Based on my calculations (putting one pixel right at the 1/30-degree acuity limit), here are the minimum viewing distances for some standard plasma sizes that can deliver 1080p:

| Screen size | How close can I sit? |
| ----------- | -------------------- |
| 50"         | ~3.3 ft              |
| 58"         | ~3.8 ft              |
| 60"         | ~3.9 ft              |
| 65"         | ~4.2 ft              |
At these super-close distances, the image will fill a whopping 58 degrees of your field of view. Have you ever sat four feet away from a 60" plasma? It's pretty big.
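For the curious, here's a rough sketch of the math behind that table: park yourself just far enough away that one pixel subtends the 1/30-degree acuity limit, and the horizontal field of view comes out to about 58 degrees no matter the screen size (the 16:9 geometry and the small-angle shortcut are my simplifications):

```python
import math

ACUITY_RAD = math.radians(1.0 / 30.0)  # same 1/30-degree threshold as above

def viewing_numbers(diagonal_in):
    """Min 1080p viewing distance for a 16:9 set of the given diagonal,
    plus the horizontal field of view the picture fills at that distance."""
    height = diagonal_in * 9 / math.hypot(16, 9)
    width = diagonal_in * 16 / math.hypot(16, 9)
    distance = (height / 1080) / ACUITY_RAD      # one pixel at the limit
    fov = math.degrees(2 * math.atan(width / 2 / distance))
    return distance, fov

for size in (50, 58, 60, 65):
    d, fov = viewing_numbers(size)
    print(f'{size}": sit {d / 12:.1f} ft away; picture fills {fov:.0f} deg')
```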
Some final thoughts to consider:
- staring at a fixed focal distance can strain your eyes (take a break from the TV once in a while, for crap's sake)
- you may get motion sick (not because you're moving, but because you're not)
- there is almost NO content that will make use of full 1080p at 60 frames per second
- even though you might think you've got great vision, there's a good chance you won't be able to see the difference, and the odds only get worse as you age
- however, 1080p TVs are often built with better components that deliver a better picture (color, brightness, black levels, contrast, viewing angle) for reasons that have nothing to do with the signal format
Given all of these caveats, why all the fuss? It has less to do with technology and science and a lot more to do with sales and marketing.