I’ve read various articles debating the importance of 1080p. I want to set the record straight once and for all: if you are serious about properly setting up your viewing room, you will definitely benefit from 1080p (and even 1440p). Why? Because 1080p is the first resolution to deliver enough detail to your eyeball when you are seated at the proper distance from the screen. But don’t just take my word for it: read on for the proof.
There are a few obvious factors in being able to detect resolution differences: the resolution of the screen, the size of the screen, and the viewing distance. To be able to detect differences between resolutions, the screen must be large enough and you must sit close enough. So the question becomes “How do I know if I need a higher resolution or not?” Here is your answer.
Based on the resolving ability of the human eye, it is possible to estimate when the differences between resolutions will become apparent. A person with 20/20 vision can resolve 60 pixels per degree, which corresponds to recognizing the letter “E” on the 20/20 line of a Snellen eye chart from 20 feet away. Using the Home Theater Calculator spreadsheet as a base, I created a chart showing, for any given screen size, how close you need to sit to be able to detect some or all of the benefits of a higher resolution screen. (Click the picture below for a larger version.)
What the chart shows is that, for a 50-inch screen, the benefits of 720p vs. 480p start to become apparent at viewing distances closer than 14.6 feet and become fully apparent at 9.8 feet. For the same screen size, the benefits of 1080p vs. 720p start to become apparent when closer than 9.8 feet and become fully apparent at 6.5 feet. In my opinion, 6.5 feet is closer than most people will sit to their 50″ plasma TV (even though the THX recommended viewing distance for a 50″ screen is 5.6 ft). So, most consumers will not be able to see the full benefit of their 1080p TV.
However, front projectors and rear-projection displays are a different story. They make it very easy to obtain large screen sizes. Plus, LCD and plasma displays are constantly getting larger and less expensive. In my home, for example, I have a 123-inch screen and a projector with a 1280×720 resolution. For a 123-inch screen, the benefits of 720p vs. 480p start to become apparent at viewing distances closer than 36 feet (14 feet behind my back wall) and become fully apparent at 24 feet (2 feet behind my back wall). For the same screen size, the benefits of 1080p vs. 720p start to become apparent when closer than 24 feet and become fully apparent at 16 feet (just between the first and second row of seating in my theater). This means that people in the back row of my home theater would see some improvement if I purchased a 1080p projector and that people in the front row would notice a drastic improvement. (Note: the THX recommended max viewing distance for a 123″ screen is 13.7 feet.)
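If you want to check these figures yourself, here is a minimal sketch of the math behind the chart, assuming a 16:9 screen, one arc minute per pixel, and 854 horizontal pixels for widescreen 480p (those constants are my assumptions; the spreadsheet may use slightly different values). It reproduces the 50-inch and 123-inch distances quoted above; the distance at which the next resolution “starts to become apparent” is simply the full-benefit distance of the resolution below it.

```python
import math

ARC_MINUTE = math.radians(1 / 60)  # 20/20 vision resolves roughly 1 arc minute per pixel

def full_benefit_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance at which one pixel subtends 1 arc minute on a 16:9 screen.
    Sit at this distance (or closer) and your eye can resolve every pixel."""
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    pixel_in = width_in / horizontal_pixels
    return pixel_in / math.tan(ARC_MINUTE) / 12   # inches -> feet

for diagonal in (50, 123):
    for name, pixels in (("480p", 854), ("720p", 1280), ("1080p", 1920)):
        print(f'{diagonal}" {name}: {full_benefit_distance_ft(diagonal, pixels):.1f} ft')
# 50":  480p 14.6 ft, 720p 9.8 ft, 1080p 6.5 ft
# 123": 480p 36.0 ft, 720p 24.0 ft, 1080p 16.0 ft
```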
So, how close should you be sitting to your TV? Obviously, you need to look at your room and see what makes sense for how you will be using it. If you have a dedicated viewing room and can place seating anywhere you want, you can use this chart as a guideline. It’s based on THX and SMPTE specifications for movie theaters; the details are available in the Home Theater Calculator spreadsheet.
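As a side note, the THX distances quoted above (5.6 ft for a 50″ screen, 13.7 ft for a 123″ screen) are consistent with a horizontal viewing angle of roughly 36 degrees. That angle is my own back-calculation rather than a number taken from the spreadsheet, but a quick sketch reproduces both figures:

```python
import math

def thx_style_distance_ft(diagonal_in, viewing_angle_deg=36, aspect=(16, 9)):
    """Seating distance at which a 16:9 screen spans the given horizontal angle."""
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    return (width_in / 2) / math.tan(math.radians(viewing_angle_deg / 2)) / 12

print(round(thx_style_distance_ft(50), 1))   # ~5.6 ft
print(round(thx_style_distance_ft(123), 1))  # ~13.7 ft
```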
Looking at this chart, it is apparent that 1080p is the lowest resolution to fall within the recommended seating distance range. Any resolution less than 1080p is not detailed enough if you are sitting the proper distance from the screen. For me and many other people with large projection screens, 1080p is the minimum resolution worth having.
In fact, you could probably even benefit from 1440p. If you haven’t heard of 1440p, you will. Here’s a link to some info on Audioholics.com. It is part of the HDMI 1.3 spec, along with 48-bit color depth, and will probably reach the public in 2009 or so. You’ll be able to see part of the benefit of 1440p at the THX max recommended viewing distance, and the resolution benefits will be fully apparent if you sit just a little closer. I’ve read of plans for resolutions reaching 2160p, but I don’t see any benefit; you’d have to sit too darn close to the screen to notice any improvement. If you sit too close, you can’t see the far edges of the screen.
In conclusion
If you are a videophile with a properly setup viewing room, you should definitely be able to notice the resolution enhancement that 1080p brings. However, if you are an average consumer with a flat panel on the far wall of your family room, you are not likely to be close enough to notice any advantage. Check the chart above and use that to make your decision.
ISF states that the most important aspects of picture quality are (in order): 1) contrast ratio, 2) color saturation, 3) color accuracy, 4) resolution. Resolution is fourth on the list, and plasma is generally superior to LCD in all of the other areas (but much more prone to reflections/glare). So pick your display size, measure your seating distance, and then use the charts above to figure out whether you would benefit from the higher resolution. And whatever you buy, be sure to calibrate your screen! I recommend the following for calibration.
Recommended Calibration Tools
- Disney WOW: World of Wonder Blu-ray
- Disney WOW: World of Wonder DVD
- Alternative options:
- DVD: Digital Video Essentials (the original calibration disc dating back to the 1990s)
- Blu-ray: Spears & Munsil High-Def Benchmark Disc (my favorite but hard to find)
- Blu-ray: Digital Video Essentials: HD Basics (an update to the original, but I don’t like it as well)
- Automatic Hardware Calibrator: Datacolor Spyder 3
“I don’t like reading charts – just tell me what resolution I need”
If you don’t like reading charts and are looking for a quick answer, enter your screen size below to see how close you’ll need to sit to fully appreciate various screen resolutions.
Note about “or closer” viewing distances calculated above: if you sit closer than the distances shown above, you will be able to see some (but not all) of the detail offered by the next higher resolution.
Testing it on my laptop, with a display that is the equivalent of a 1080p screen, I can spot a single pixel from roughly twice the distance suggested in this chart. And my sight is pretty bad. Movie images usually are of a different nature, but the article is talking about the ability of the human eye, and that seems to be much higher to me.
I don’t get it. How can a 480p picture be better from further away than 1080i or 720p? Can it be that the figures are inadvertently inverted? In my logic, 1080i/p should be better from further away than 480i/p, no? And the closer one gets to the screen, the better the higher-rez picture gets, no? The graph suggests the exact opposite… at least to me and my logic…
1) contrast ratio
2) color saturation
3) color accuracy
4) resolution
That right there is why CRT still blows away all other TV tech.
Here’s hoping for SED and FED in the coming years.
Great work Carlton. Much appreciated.
Georg – I believe the basis here is not the ability to see a single pixel, but the ability to resolve two.
2160p? Maybe nothing more than a gee-whiz for video viewing… But, how about multi-player gaming? Although it would simply be a luxury, it would make 4-player split screen gaming a joy. And would open up 6 and 8 way splits on large screens where everyone ends up within 6-8 feet anyways. Think Wii Sports Extreme! (Nintendo got it wrong on this little part–HD output would have really opened up the “party console” appeal for the middle age/middle income market!) Playing games in split screen modes would be gorgeous. And megapixel displays open up a whole new world of computer user interfaces and interaction. Imagine your computer monitor being the wall in your office…literally grab and drag items on the screen a la Minority Report… All the naysayers need to open their eyes to the real reasons and applications for such high resolution displays.
This is backwards: a higher resolution and screen size demand a longer viewing distance to be “ideal”! Sorry, 480p is bad, but sitting 25-30 feet away does not help.
Cool that this misinformation got so highly ranked on Digg though.
Topher5
Topher, sorry, but I think you need to re-read the article. If you sit too far away, you will not be able to see any of the benefits of higher resolutions.
You are correct, 480p is “bad” and sitting further away does not help. But if you sit too far away the benefits of higher resolution are not apparent at all either. 480p=1080p at long viewing distances. At long viewing distances, all of the resolutions look the same because your eye can’t detect the differences. You have to sit close enough to the screen to detect the advantages of higher resolutions. This article gives an idea of about how close you’d have to sit for each resolution.
If this article was ranked highly on Digg, it’s probably because the readers understood the content.
The idea is not that 480p is better than 1080p at great viewing distances, but rather that the two are equivalent. When you’re sitting really far away, you are unable to detect the extra resolution of 1080p.
While this is an informative article, most people do not realize that 1080p content is not available (besides PS3/Blu-ray discs) at this time. So if you buy a 1080p TV, there is not much out there that will take advantage of it… so is it worth it? Maybe, but you might not get to utilize the progressive scanning for a couple of years.
I would assume that this is for “natural” images, not for non-antialiased, CG-rendered images from video game consoles. Any idea where those lines would be for gaming?
This was not really intended to be a comparison between scan rates (1080i vs. 1080p) but rather a comparison of resolutions. Since virtually no display capable of 1920×1080 is analog, the image they produce is always progressive 1080p even if the source is not. The question is whether or not the deinterlacer in the video processor is capable of converting 1080i to the display-native 1080p without negative side-effects. The one exception is film-based material presented at 1080i: it can be perfectly converted to 1080p if the video processor is capable of 2:3 pulldown detection (because film-based material is 24 frames per second). So, if the video processor is capable, the display can project a near-perfect 1080p signal from a 1080i source for film-based material. Obviously, the latest generation of gaming consoles, computers, and HD-DVD players can produce 1080p perfectly as well.
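For anyone curious why film-based 1080i survives the trip intact, here is a toy illustration of the 2:3 cadence (labels only, not a real deinterlacer): every film frame contributes both a top and a bottom field, so the original 24 progressive frames can be reassembled without guessing.

```python
# Toy illustration of 2:3 pulldown and its reversal. Frames are just labels here;
# a real deinterlacer works on actual field data and must detect the cadence.

film_frames = ["A", "B", "C", "D"]          # 4 film frames -> 10 interlaced fields

def telecine_23(frames):
    """Split 24 fps frames into 60i fields using a 2:3 cadence (2, 3, 2, 3, ...)."""
    fields = []
    for i, frame in enumerate(frames):
        for _ in range(2 if i % 2 == 0 else 3):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Recover the original frames: both fields of each film frame are present."""
    frames = []
    for frame, _ in fields:
        if frame not in frames:
            frames.append(frame)
    return frames

fields = telecine_23(film_frames)
print(len(fields))               # 10 fields (2 + 3 + 2 + 3)
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D'] -- nothing is lost
```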
This is a great question, but I unfortunately don’t have a great answer. The resolution-based viewing distances are rough guidelines rather than exact values (due to person-to-person variation, differences between display technologies, etc.), so it’s impossible for me to give an exact difference between a natural, anti-aliased image and a non-anti-aliased one. But I’m positive that you are correct in assuming that display resolution deficiencies would become apparent sooner on non-anti-aliased material; the jagged edges on lines would make individual pixels much more apparent (at a distance where they would otherwise be undetectable).
There may be limited 1080p video sources today, but I use my 65″ 1080p HDTV to view my high-quality 6-megapixel pictures (2 MP on screen) at a 5-8 foot viewing distance. Everyone seems to enjoy them.
This piece of information is incredibly handy. I know there is an optimal distance-to-resolution ratio, but I’ve never seen any ‘viewing tests’ done. Now there are, and I appreciate it a lot.
Very useful to know if 1080 res is useless or not given a particular room size.
A good chart, because it provides a range rather than a single value.
However, projector resolution is irrelevant. It is the source resolution which is relevant. A 480p DVD projected by a 1080p projector is no better than projected by a 720p projector. In either case, the 480p image has to be scaled. In other words, if you project a 480p DVD out of a 1080p projector, you cannot sit closer to it than if you projected it out of a 720p projector.
The result is that your argument is not really an argument for 1080p projectors, but an excellent argument for Blu-ray and HD-DVD players (and more importantly, Blu-ray disc and HD-DVD sources) for home theaters.
The visual acuity distance is useful because it provides a rough “no closer than” distance. Some people mistakenly believe the visual acuity distance is a maximum distance. But if you sit closer, you will start to lose the picture in the pixels. If you sit further, you will lose some detail, but still see the picture. Sitting inside of the visual acuity distance can be a big deal for some people watching rear-projection LCD systems because of the screen-door effect.
The visual acuity distance varies tremendously between individuals, so the number should only be used as a reference. If you only have a single row of seats, it is better for that row to be a little further back than the visual acuity value.
So those reading this data should take it as guidance, not gospel. Too many people are buying 60″ DLP sets based on data like recommended THX distance, HD visual acuity distance, or maximum recommended SMPTE viewing distance, only to realize 90% of the content they watch looks bad on it because they are too close. This is because they are sitting well inside the visual acuity distance for SD television or 480p DVD content for a screen of this size.
This, in my opinion, is the most overlooked issue for an HD purchase: the minimum viewing distance of a 4:3 SD image presented on the HD display. And if you zoom or stretch that 4:3 image to fit a 16:9 screen, you further reduce the resolution and increase the visual acuity distance.
Because of this, source content (480p DVDs vs. HD-DVDs or Blu-ray discs) must be considered. As your chart demonstrates, the resolution of current 480p DVD content does not adequately support THX and SMPTE configured home theaters. But Blu-Ray and HD-DVD content does.
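One way to fold Mark’s point into the earlier math is to use the lower of the source and display resolutions when computing the viewing distance. A rough sketch (the 854-pixel width for widescreen 480p is my assumption):

```python
import math

ARC_MINUTE = math.radians(1 / 60)

def acuity_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance at which one pixel subtends 1 arc minute (16:9 assumed)."""
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    return (width_in / horizontal_pixels) / math.tan(ARC_MINUTE) / 12

display_px, source_px = 1920, 854            # 1080p panel showing an upscaled 480p DVD
effective_px = min(display_px, source_px)    # detail is limited by the weaker link
print(round(acuity_distance_ft(60, effective_px), 1))  # ~17.5 ft for a 60" screen
```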
Mark, I agree with you regarding source material resolution. Both the source material and the display need to be at the same high resolution to fully appreciate the benefits of either.
The resolution of the display is of little benefit if the source is low resolution. A high-quality HD video processor can make some improvements to SD content shown on an HD screen, but it will never look HD; there is no substitute. Most of my time is spent watching HD content recorded to an HD TiVo over-the-air using the ATSC tuner (it’s hooked to a giant antenna in my attic). After making the switch, watching SD television shows now seems like a chore.
Your basis for the eye’s ability to resolve detail to “1 arc minute” and the corresponding graphs seem sound enough. It seems to be well documented, but it also seems to be quoted as a figure for resolving detail in a static image…
So how does a moving image affect that?
Tim, you are correct that the figure relates to a static image. I’m not sure how to apply it to a dynamic image. Based on the fact that a high-motion scene can look crisp at full speed but extremely blurred when played frame-by-frame, I’m positive motion would tend to make resolution deficiencies more difficult to detect. On the other hand, motion artifacts seem to bring out weaknesses in video processors and displays and, if this is the case, might exaggerate resolution deficiencies (due to pixelation, ghosting from slow pixel refreshes, etc.) I wish I could give a more definitive, numeric answer, but this is an imprecise measurement to begin with, so the best answer I have is that high motion makes resolution deficiencies less apparent except when there are motion artifacts and/or refresh-rate problems.
I hope the 1440p standard you talk about will be 50 or 60 frames per second; 1080p30 is ridiculous on fast-moving scenes, and that is a limiting factor.
I know 1080p60 exists, but nobody will use it…
The easiest way to look at the problem is: the larger the screen, the further away you have to view it for the brain not to see any imperfections. Alternatively, if you do not want to move away from a larger screen, then you need a higher resolution display; otherwise any imperfections will become visible.
Hi! Your mathematical calculations are interesting – but not the whole story! I have just bought a Samsung 40″ 1080p LCD after much viewing of many models. I can clearly see the difference between 1080p and 720p at over 10 feet.
Ian: The calculations are based on the resolution of the display and do not include any compensation for source material (nor the quality of the video processor, vision of the viewer, etc.) The differences you see are probably due to differences in the quality of the source material and/or the method used to process it before it is presented on the display, or maybe some other form of variation not accounted for in the calculations. The charts are not absolute cut-offs, but rather a general guideline, so results will vary by person and for each equipment combination. Thanks for the feedback!
I am contemplating purchasing a Samsung 46″ LCD, looking at 720p vs. 1080p, and I appreciate the information in this article. I currently have a 36″ CRT that I really like. I use it for Xbox 360 at about 5 feet and regular TV viewing at about 8-10 feet. I’m really concerned, given all that I’ve read, about the quality of fast motion lacking in LCD and plasma flat panels, as most of the content I favor is sports and action-movie related. Should I be concerned? If so, what is my best choice to minimize the effect (LCD/Plasma/DLP)?
A note on the first reply-post. I got curious about distances; the poster mentioned he could spot a pixel at twice the distance that the chart suggested. I use a Dell 2407 that has 1920×1200 (16:10), i.e. 1080p, and decided to check how far away I could spot a pixel. According to chart 1, the point where one gets the full payoff for 1080p with a 24″ screen is approx. 3 feet, and the point where one starts to benefit at all compared to 720p is approx. 5 feet. I constructed a very rudimentary test screen: the left half of the screen white with 9 single black pixels spread over approx. 4 square inches, and the right half black with similar white pixels. Starting to back away, I found the black pixels starting to fade out at 6 feet, pretty close to the prediction, but the white pixels on black were clearly visible. Moving further away, up to the back wall and going into the kitchen, they were still perfectly visible at some 25 feet, where I could get no further away in line of sight of the screen.
The note here is that the required resolution for transparency likely varies depending on whether it’s a mostly light image with some dark or vice versa.
I sold my big screen and now I’m using a 5″ Black and White. I don’t know what the resolution is, but you have to sit pretty close to see anything. (Network TV) all looks the same (garbage), which is why I never watch it anyway. But, the next time there is a hurricane, and the power is out, and I have nothing to do, I can watch TV (since the 5″ B&W is battery operated). I can watch my roof blow off on TV!!
Brian: It sounds like you need a screen with a high refresh rate. Plasma is as fast as a CRT, so no worries there. DLP is also very fast; the color wheels are all now at least 5x (meaning 300 times per second = 3.3 milliseconds). LCD is the only technology with significant problems with screen refreshes. Older models were in the 30-millisecond range. Anything below 15 milliseconds is going to result in minimal viewing problems; some new LCDs are as fast as 5 milliseconds (which will result in no visible artifacts from slow refresh).
Andreas: thanks for the additional follow-up. I think it reinforces the previous comments regarding anti-aliased images versus non-anti-aliased images. Obviously, there are many factors to this.
I created a new post where anyone can test the Visual Acuity Viewing Distance of their LCD monitor (and their eyeballs). Check it out: https://carltonbale.com/blog/2006/12/visual-acuity-viewing-distance-test-it-for-yourself/
The point at which you stop seeing a black and white checkerboard and start seeing a gray box is the visual acuity viewing distance for your monitor/eyeball combination.
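If you would rather generate your own test pattern than use the one on that page, here is a rough sketch using Pillow (my choice of library, not part of the original test). View the image at 100% zoom so one image pixel maps to one screen pixel, then back away until the checkerboard turns gray.

```python
from PIL import Image

size = 512
img = Image.new("L", (size, size))
# Alternate single black and white pixels in a checkerboard pattern.
img.putdata([255 * ((x + y) % 2) for y in range(size) for x in range(size)])
img.save("checkerboard.png")
```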
Andreas (and Carlton): The test with light pixels on a black background has absolutely nothing to do with resolution. In a completely dark room a human eye can detect a flash as faint as five photons. These could (at least theoretically) be originating from a source as small as a single atom that is repeatedly tweaked into sending out visible light.
Probably the best example of this phenomenon is the stars. If you watch Sirius A, the visually brightest star in the sky, and divide its diameter by its distance, you will get a figure that is equivalent to watching a single pixel of a sharp computer screen (0.24 mm) from almost 10 km away. The question here is all about light output, nothing about pixel size or resolution.
(What you will not see is that Sirius has a small companion star, or that Alpha Centauri (another close and bright star) actually consists of two stars of similar size, just as you of course will not be able to see your computer screen at all from 10 km, unless you put an extremely bright light source into it and watch it in the dark.)
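Mikael’s star example checks out as a back-of-the-envelope calculation, using approximate figures for Sirius A (a diameter of about 2.4 million km and a distance of about 8.6 light-years; both numbers are my assumptions):

```python
sirius_diameter_km = 2.4e6                            # approximate
sirius_distance_km = 8.6 * 9.461e12                   # 8.6 light-years in km
angle_rad = sirius_diameter_km / sirius_distance_km   # ~3e-8 radians

pixel_m = 0.24e-3                                     # 0.24 mm pixel pitch
print(round(pixel_m / angle_rad / 1000, 1), "km")     # ~8 km, i.e. "almost 10 km"
```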
I know that you get a fair amount of degradation if you lower the resolution on an LCD monitor below the native resolution. Wouldn’t you get the same effect on a 1080p TV, since almost all HD content is less than 1080p?
Andrew, there is really not very much degradation from displaying a lower resolution (720p) video on a higher resolution (1080p) screen as long as it is video content. Pictures and video scale very well. However, text, smaller fonts, single-pixel lines, etc., do not scale well. So, as you mentioned, you would want to avoid scaling a PC desktop (for example).
Another factor is that most higher-end HDTVs include a video processor designed to scale lower resolutions up to native panel resolution, which results in generally equivalent (or perhaps slightly enhanced) video quality.
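As a rough sketch of what that scaling step does, here is the basic operation using Pillow (the file names are hypothetical): resample a 720p frame onto the panel’s native 1080p grid. Good video processors use better filters plus edge and detail enhancement, but the principle is the same; no new detail is created.

```python
from PIL import Image

frame_720p = Image.open("frame_720p.png")    # hypothetical 1280x720 source frame
frame_1080p = frame_720p.resize((1920, 1080), Image.LANCZOS)
frame_1080p.save("frame_1080p.png")
```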
It gets hard to interpret the charts at close distances. I’m considering a 32″ LCD for viewing from 6′. Should I wait for 1080p to get a superior picture, or will the current crop of 32″ LCDs at 720p be sufficient for most purposes? I’m guessing that the new models will be out in February with 1080p available at 32″. I’ve noticed that some of the best 32″ LCDs have just been reduced in retail price and a new crop of 40″ 1080p sets are now being released at higher prices than the old models.
My screen size and viewing distance are fixed, but I don’t want to waste money on overkill I won’t notice. I doubt that my eyes will get better with age, although my high-end audio system has spoiled my listening on poorer systems.
Pete: For a 32″ panel, 720p is fine for distances of 6.2 feet or farther. Between 6.2 and 4.2 feet, the advantage of 1080p becomes apparent. At 4.2 feet, 1080p would be fully apparent. So you are probably OK with a 720p panel. Keep in mind that this is a generalized distance and that it varies somewhat from person to person. But unless you plan to connect the TV to a desktop computer, you can probably save some cash and skip the extra pixels.
Thank you, Carlton, for fabulous work. However, the 12 Dec response from Mark was a bombshell. You replied that you make use of DirecTV, but that’s only a medium carrying the original source material. I was planning on purchasing a 1080p 60-inch LCD later this year to be viewed at a distance of 7 to 10 feet. But it will be most often used for general TV viewing, and most of that is PBS. It now occurs to me that I might be shooting myself in the foot with such a screen. This is hardly a small point – could you go into that a little more?
Leo, at 7-10 feet for a 60-inch screen, you could definitely benefit from a 1080p screen resolution. But, as was briefly mentioned earlier, 1080p has a relatively minor benefit if the source material is standard definition NTSC video.
First of all, if you don’t have the capability for HD content, you should get it as soon as you purchase your new TV. I use a TiVo that records non-network HD (and SD) from the DirecTV satellite and broadcast HD from an antenna in my attic. You can probably get HD from your local cable company as well. It doesn’t matter how you get it, just get it. In fact, HD down-converted to play on an SD display looks much better than standard-definition cable, satellite, or over-the-air.
PBS is broadcasting several of their shows in HD; they were about the first to start HD broadcasting. In the not-too-distant future, SD broadcasting will be shut off and all of the networks will be HD-only. I’ve found that about 90% of the shows I regularly watch are in HD, so the content is available.
Now, to get to your question: if you get a 1080p display and have only 480p content, is it a complete waste? No, it’s not a complete waste of resolution. If you sit close enough to a display, you can see imperfections regardless of the source material. For example, I can see the gap between pixels on my projector when the screen is white (123″ screen viewed at 10′). The pixels are smaller on a 1080p display than on a 720p display, so this defect would be less noticeable (all else held constant). However, this is a minor point and generally not a factor for the vast majority.
It comes down to this: on a 60-inch display, SD content is going to look very soft (i.e. blurry) regardless of the panel resolution. The extra pixels of an HD display don’t have any information telling them how to improve the picture, so they basically revert to an SD-equivalent display. To see the benefit of an HD panel, each pixel needs to act independently, rather than basically acting just like the pixel next to it (as with SD). There is no video processor/scaler in the world that can take a soft, low-def signal and make it look anywhere close to true HD. So get HD if you can; if you can’t, you can save a lot of money on your 60″ panel and skip 1080p.
Mikael: Thanks so much for sharing this information; it is extremely useful to this discussion on screen resolution. Being able to see a single white pixel on a black background did not seem like a valid resolution test to me, but I didn’t have any technical knowledge to explain why it was not.
The intent of this article was really to show when a group of individual pixels becomes indiscernible and starts to look like a smooth pattern, not when you can identify a single, highly contrasted pixel. Thanks for explaining the difference.
Carlton, thanks for your analysis of screen size and viewing distance vs. resolution. We are now planning to purchase a 720p HDTV and HD cable soon, to view from 6′ to 12′. Reading your conclusion, resolution is not as important as contrast and color quality. The contrast rating is available, but how does one compare color saturation and accuracy, since each viewing area is different? And do you have a similar analysis for contrast ratios? Is there a relationship between contrast ratio and distance?
According to this chart, if your viewing distance is 10 feet (~3.2 m), then you should have something around a 70″ screen for optimal viewing. I was thinking of getting a 37″ screen, but I’m asking myself if I should even get a TV at all and not a beamer (projector), although that is way too expensive. What’s your opinion?
Resolution is higher on the list if you hook up your display to a computer on occasion. There are plenty of uses for your TV beyond watching movies. If you want to display other information, particularly text, a 1080p LCD screen is the only way to go right now.
Carlton, thanks for the great info.
You say that contrast ratio is the no. 1 aspect of picture quality, but isn’t there a limit beyond which a higher ratio makes no difference? For example, can one notice the contrast ratio difference between a 6000:1 LCD (such as the Samsung LN-S5296D) and an “up to 10,000:1” (whatever the “up to” means) Panasonic plasma TV?
Hi, a note about contrast: a CRT typically has a 100,000:1 contrast ratio, hence its black can be really black, whilst I have yet to see a plasma or LCD capable of displaying black.
No matter; what I’m really interested in is colourspace. I wonder if anyone else thinks that TV/DVD/HD colourspace is a joke that could really use some upgrading. Most of the encoding schemes are based on a 4:1:1 colourspace, where there is colour information for only every fourth pixel; this is the primary reason for the big squares of the same colour that you can see in even high-quality transfers. I produce graphics and have learned the hard way that there are certain colours I never use and whole ranges of colour I totally avoid (red) if I can. I was hoping for salvation in HD (when rez goes up, so does colour-rez) and bought a 1080i camera for capture, thinking that the superior rez would improve colours. However, it captures in MPEG-2, which is far worse than DV as far as colour is concerned, and I have gained no extra pixels in colour-rez even though the rez is four times that of my DV camera. In fact, sometimes the low bitrate is so obvious that it produces 8×8-pixel squares of exactly the same colour if what I’m filming is “wrong” enough for the codec, and that reduces usable rez to half that of DV…
anyways, great blog.
b
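To visualize the chroma subsampling described in the colourspace comment above, here is a rough sketch (numpy and Pillow assumed; the input file name is hypothetical, and real codecs filter chroma more carefully than this): luma is kept at full resolution while colour is stored at one quarter of the horizontal resolution and then stretched back.

```python
import numpy as np
from PIL import Image

img = Image.open("test.png").convert("YCbCr")            # hypothetical input image
y, cb, cr = [np.asarray(c) for c in img.split()]

def subsample_4x(chan):
    """Keep every 4th column of chroma, then stretch it back (nearest-neighbor)."""
    coarse = np.repeat(chan[:, ::4], 4, axis=1)[:, : chan.shape[1]]
    return np.ascontiguousarray(coarse)

degraded = Image.merge(
    "YCbCr",
    [Image.fromarray(c) for c in (y, subsample_4x(cb), subsample_4x(cr))],
)
degraded.convert("RGB").save("test_411.png")             # colour blocking shows up on saturated reds
```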