Does 1080p matter? or: How close can I sit to my TV?

by Erik · Apr 5, 2007 · Filed under Televisions

The electronics world has been abuzz about 1080p delivered over the equally hyped High-Definition Multimedia Interface (better known as HDMI). Beginning with the release of the Pioneer PDP5000EX in the summer of 2006, there has been a steady stream of 1080p televisions, DVD players, video game systems, A/V receivers, and (of course) content supporting this new higher resolution video format.

So here's the million-dollar question:

[Pictured: Pioneer Elite Pro 1140HD]

Does 1080p even make a difference?

More specifically, can a person perceive the difference between 1080p and 1080i video? The answer, as you might have guessed, isn't so straightforward. I'll try to simplify it anyway: the answer is YES, if you've got a good eye and a good video source.

To give you some scientific background, there are two major perceptual concepts at work when a person watches video: 1) visual acuity, the spatial threshold at which a person perceives two distinct "dots" as one single dot, and 2) persistence of vision, the temporal threshold at which a person perceives a series of distinct "frames" as motion (think cartoons and animation). Persistence of vision is intertwined with a related concept called the flicker fusion threshold. These two concepts translate into the specs of your TV as resolution (visual acuity) and frame rate (persistence of vision). The average person has a visual acuity of 1/30th of a degree (they can see two distinct points when separated by 0.03 degrees) and a flicker fusion threshold of 16 Hz (animation comes to life at 16 frames per second). Now that you're asleep, we can continue...

I will skip over a whole bunch of steps and in-depth analysis; there are a couple of good articles you can read if you want to learn more. Audioholics has an article examining the acuity of human vision and how it relates to 1080p, and another by Carlton Bale does a good job of analyzing the relationships between screen size, viewing distance, and resolution.

I'll assume you are a videophile with excellent eyes who can tell the difference between 30 fps and 60 fps. I'm also going to assume that the video source is a true 60 frames per second. While some video games meet this criterion, movies on Blu-ray or HD DVD don't (film is shot at 24 frames per second).

Basically, the benefit of 1080p over 1080i under these assumptions (and they are some pretty big assumptions) is that you get twice the spatial resolution. In other words, visual artifacts won't become detectable until you sit twice as close to the screen. Why is this good?

IT MEANS YOU CAN SIT TWICE AS CLOSE TO YOUR TV! Another way to think about it: you can buy a TV twice as big for the same room.

Based on my calculations, here are the minimum viewing distances for some standard plasma TVs that can deliver 1080p:

Screen size    How close can I sit?
42"            33"
50"            39"
60"            47"

At these super-close distances, the image will fill a whopping 58 degrees of your horizontal field of view. Have you ever sat four feet away from a 60" plasma? It's pretty big.
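If you want to sanity-check the table, here's a quick back-of-the-envelope sketch of where those numbers come from (a hypothetical Python snippet, assuming a 16:9 screen and the 1/30-degree acuity figure from above):

```python
import math

# The article's acuity figure: two points blur together when separated
# by less than 1/30 of a degree.
ACUITY_DEG = 1.0 / 30.0

def min_viewing_distance(diagonal_in, lines=1080, aspect=(16, 9)):
    """Distance (inches) at which one scanline subtends the acuity angle.

    Any closer and individual pixels become visible; any farther and
    extra resolution is wasted on your eyes.
    """
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)  # screen height from diagonal
    pixel_pitch = height_in / lines                 # height of one scanline
    return pixel_pitch / math.tan(math.radians(ACUITY_DEG))

for size in (42, 50, 60):
    print(f'{size} in. screen: sit no closer than {min_viewing_distance(size):.0f} in.')
# prints 33, 39, and 47 inches -- matching the table above
```

The small-angle geometry also confirms the field-of-view claim: at 33" from a 42" screen (about 36.6" wide), the picture spans roughly 58 degrees horizontally.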

Some final thoughts to consider:

  • staring at a fixed focal distance can hurt your eyes (take a break from TV for crap's sakes)
  • you may get motion sick (not because you're moving, but because you're not)
  • there is almost NO content that will make use of the full 1080p
  • even though you might think you've got great vision, there's a good chance you won't be able to see the difference. As you get older, it only becomes more likely.
  • however, 1080p TVs are often built with better components delivering a better picture (colors, brights, darks, contrast, viewing angle) for reasons other than the signal specification

Given all of these conditions, why all the fuss? It has less to do with technology and science and a lot more to do with sales and marketing.


1.  avatar KD1964 said:

great article
May 07, 2007 12:00pm
2.  avatar bob1029 said:

I think the difference between 1080p and 1080i is being overstated. The biggest deal with 1080p is that you don't get deinterlacing artifacts, and each frame is a complete, unaltered image. 1080i is popular in most transmission formats because it uses much less bandwidth: the image height is taken from 1080 lines to 540 lines, and the vertical rate is doubled. The difference between the two formats makes itself painfully clear when a 1080p set is hooked up to a computer and set to 1080i and then 1080p for comparison. Even with a really good deinterlacer, scrolling text and other rapid vertical movements involving fine detail are completely distorted and unreadable at 1080i.
May 16, 2007 10:39am
3.  avatar Erik said:

Great points, bob1029, and that's definitely a good way to test the difference. Hi-def, like hi-fi, is often more of a marketing exercise. Very few people can notice differences at the bleeding edge of audio/video improvements. Think of people as they get older and their eyesight isn't what it used to be. You'd be amazed to find that some people can't even tell the difference between SD and HD, let alone between HD and full HD!
May 17, 2007 11:51am
4.  avatar mdrejhon said:

For 1080i sports footage, there are actually 60 distinct images per second. Basically, it's 540 scanlines of one image (the odd scanlines), followed 1/60th of a second later by another 540 scanlines of the next image (the even scanlines). These images are called 'fields'. Interlaced video at 30 frames per second has the same temporal resolution as 60 frames per second (assuming 2 fields per frame: two images interleaved into one frame and displayed 1/60th of a second apart, which preserves the original temporal resolution of 60 per second).

... In simple terms, there's no difference in temporal resolution between 1080i/30fps (60i) and 1080p/60fps (60p) when watching full framerate video sources like sports. Therefore, the "30fps versus 60fps" argument doesn't apply here.  Granted, images can look better because they don't need to be deinterlaced, but 1080p/60 doesn't have the temporal resolution advantage that some people think, because they misunderstand the relationship between temporal resolution and framerate when it comes to interlaced displays -- the framerate is actually 60, but we call it the 'fieldrate' since a frame needs to be full resolution.  Two distinct images at half vertical resolution constitute one frame, and so, 2 separate images (taken 1/60th of a second apart by the videocamera), are woven ('interleaved') into one frame.  When broadcast/played/etc from a video-based source (not film-based), you get the original temporal resolution (at half vertical resolution per image) of 60 images per second.  Therefore, 1080p/60fps ('60p') has no temporal resolution advantage over 1080i/30fps ('60i').  Just spatial resolution advantage, and no need to deinterlace.
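In code, the weave described above looks something like this (a hypothetical Python sketch, with fields represented as lists of scanlines):

```python
# Hypothetical sketch: two 540-line fields, captured 1/60 s apart by the
# camera, are woven ('interleaved') into a single 1080-line frame.
def weave(odd_field, even_field):
    """Interleave two half-height fields into one full-height frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # scanlines 1, 3, 5, ... (odd field)
        frame.append(even_line)  # scanlines 2, 4, 6, ... (even field)
    return frame

odd = [f'odd-{i}' for i in range(540)]    # image captured at time t
even = [f'even-{i}' for i in range(540)]  # image captured at t + 1/60 s
frame = weave(odd, even)
assert len(frame) == 1080  # one full frame carrying two distinct moments
```

The point of the sketch: one 1080i "frame" carries two temporally distinct images, which is why 60i and 60p have the same temporal resolution.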

Some video cameras shoot 30fps and some shoot 60fps -- you can tell 30fps from 60fps converted to 1080i, and you can tell 30fps from 60fps broadcast at 1080p, simply because the temporal resolution of both 1080i and 1080p is 60 (60 fields per second for 1080i, and 60 frames per second for 1080p).

P.S. - I have worked in the video industry, and also have designed some scaler/deinterlacer algorithms.  So as an engineer, I know what I am speaking about.  (Terminology can vary though - you have seen some sites confuse 1080i/30 with 1080i/60 -- without saying whether that's the framerate (30) or fieldrate (60) .... You get what I mean.)

Aug 29, 2007 1:08pm
5.  avatar V-Dawg said:

Wow - thanks for the explanation, mdrejhon... you really explained it well. The concept seems so simple after reading your write-up. Very much appreciated.
Aug 31, 2007 11:49am