4k resolution televisions are now widely available and potential buyers are wondering if the extra resolution is worth it. In some cases it is, but in most, it’s not. The details below can help you decide.
4K (and 8K) Resolution Defined
The older 1080p HDTV standard has a resolution of 1920×1080 (2.1 million) pixels. The UHD resolutions are multiples of this base 1080p resolution.
4k resolution is named for the approximately 4,000 (4k) pixels that make up the horizontal resolution across the image. More specifically, the resolution is 3840×2160, which gives 8.3 million total pixels – 4 times that of 1080p. (4k is sometimes called 2160p, and is also known as QFHD – Quad Full High Definition.)
8k resolution has about 8,000 horizontal pixels. The resolution is 7680×4320 (33.2 million) pixels, which is 16 times that of 1080p. 8k is also called 4320p.
The ITU and the Consumer Electronics Association have officially dubbed both 4k and 8k resolutions as “Ultra High-Definition”, but to complicate things, these resolutions are also commonly called Ultra HD, UHD, UHDTV, and even Super Hi-Vision.
HDMI 2.0 (or later) is required to fully support the 4k specification. (The older HDMI 1.4 spec has partial 4k support, but is limited to a frame rate of 30 frames per second. But most components with HDMI 1.4 don’t contain the electronics to support 4k resolution, even though the HDMI interface does.)
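As a rough sanity check on why the older spec tops out at 30 frames per second, here's some back-of-the-envelope arithmetic. The link rates below are approximate published figures, and the math ignores blanking intervals, audio, and deeper color depths, so treat it as an illustration rather than a spec-level calculation.

```python
# Rough arithmetic: uncompressed 4k video data rate vs. approximate HDMI capacity.
# Link figures are ballpark usable video data rates (after 8b/10b coding overhead);
# blanking intervals, audio, and >8-bit color are ignored, so real needs are higher.
WIDTH, HEIGHT, BITS_PER_PIXEL = 3840, 2160, 24     # 4k UHD, 8-bit RGB
HDMI_1_4_GBPS = 8.16                               # approx. usable (10.2 Gbit/s link)
HDMI_2_0_GBPS = 14.4                               # approx. usable (18 Gbit/s link)

for fps in (30, 60):
    gbps = WIDTH * HEIGHT * BITS_PER_PIXEL * fps / 1e9
    if gbps <= HDMI_1_4_GBPS:
        verdict = "fits within HDMI 1.4"
    elif gbps <= HDMI_2_0_GBPS:
        verdict = "requires HDMI 2.0"
    else:
        verdict = "exceeds even HDMI 2.0"
    print(f"4k @ {fps} fps: ~{gbps:.1f} Gbit/s uncompressed ({verdict})")
```

At 30 fps the raw pixel rate squeezes into the older link; at 60 fps it roughly doubles, and only HDMI 2.0 has the headroom.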
How to Tell if You Will Notice the Additional Resolution
To be able to detect the additional resolution of 4k (or 8k), the screen must be quite large and you must sit fairly close. So how do you know if your particular setup would benefit? Here’s your answer…
Based on the resolving ability of the human eye, it is possible to estimate when 4k resolution will become apparent. A person with 20/20 vision can resolve 60 pixels per degree, which corresponds to recognizing the letter “E” on the 20/20 line of a Snellen eye chart from 20 feet away. Using the Home Theater Calculator spreadsheet as a base, I created a chart showing, for any given screen size, how close you need to sit to be able to detect some or all of the benefits of a higher resolution screen. (Click the picture below for a larger version.)
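For the curious, here is a minimal Python sketch of the geometry behind the chart. It assumes a 16:9 screen and the 60-pixels-per-degree (one arc-minute per pixel) figure above; the function name and structure are just for illustration, and the spreadsheet itself handles additional cases such as other aspect ratios and visual acuities.

```python
import math

def full_benefit_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9), acuity=1.0):
    """Farthest viewing distance (feet) at which one pixel still subtends one
    arc-minute, i.e. the viewer can resolve the full resolution of the screen.
    acuity=1.0 is 20/20 vision (~60 pixels per degree); 2.0 approximates 20/10."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)  # diagonal -> screen width
    pixel_pitch_in = screen_width_in / horizontal_px       # width of a single pixel
    one_pixel_angle = math.radians(1.0 / 60.0) / acuity    # angle a pixel must fill
    return pixel_pitch_in / math.tan(one_pixel_angle) / 12.0

for size in (55, 65, 84):
    print(size, round(full_benefit_distance_ft(size, 3840), 1),   # 4k
                round(full_benefit_distance_ft(size, 1920), 1))   # 1080p
```

Running this for 55-, 65-, and 84-inch screens reproduces the distances quoted in this article: roughly 3.6, 4.2, and 5.5 feet for 4k, and about double those distances for 1080p.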
(If you're not used to reading charts, just jump to the calculator below.)
What the chart shows is that, for an 84-inch screen, 4k resolution isn't fully apparent until you are 5.5 feet or closer to the screen. For a “tiny” 55-inch screen, you'll need to be 3.6 feet or closer. Needless to say, most consumers aren't going to sit close enough to see any of the extra resolution 4k offers, much less 8k.
It's important to note that research by Bernard Lechner (former VP of RCA Laboratories) found the average viewing distance of American TV viewers is 9 feet. This is substantially farther than even the 5.5-foot distance required to fully resolve an 84-inch 4k screen, let alone the shorter distances required for more typically sized screens. I don't imagine people rearranging their living rooms to take advantage of the otherwise unnoticeable UHD resolution benefits.
Verification of Calculations by Sony and THX
Sony lists identical required viewing distances in the Frequently Asked Questions section of their product description. Check out the Amazon.com product description FAQ for the Sony 65X900A 4k Ultra HDTV. It shows the same distances I have calculated (i.e. 3.6 feet for a 55″ screen and 4.2 feet for a 65″ screen.) If you don't believe my numbers, confirmation from Sony should help convince you.
Quote from Sony FAQ: “How close to the TV must I sit to appreciate 4K? The short answer is that between 5 and 6 ft. is the ideal viewing distance for a 55″ or 65″ Sony 4K Ultra HD TV. However, on a 55″, you can now sit as close as 3.6 ft and enjoy a visibly smoother and more detailed picture (e.g. you won't see the individual pixels). On a 65″ TV, you can sit as close as 4.2 ft. to appreciate 4K.”
On a 50-inch 1080p HD display, most consumers can begin to distinguish individual pixels only when standing within six feet of the screen. Therefore, if your viewing distance is 10 feet or greater, an Ultra HD 50-inch display will likely have little perceived benefit in terms of image clarity and sharpness. [source]
Availability of 4k and 8k Content
If you are among the rare few who have a giant screen and sit close enough to it to benefit from 4k resolution, you still need UHD content. Here's a summary of your options:
Highest Quality Options (less compression, highest bitrate):
- Ultra-HD Blu-ray players and discs are available starting in 2016. This will be the highest-quality offering, with bitrates of up to 128 Mbps, giving the highest quality audio and video possible. Though discs don't offer the convenience of streaming, they will be the best source of 4k video in 2016 and beyond. The quality of Ultra-HD Blu-ray will likely remain ahead of online streaming options for years to come.
- Video download boxes such as the Sony FMP-X1 4K Ultra HD Media Player and the FMP-X10 4k Ultra HD Media Player support 4k. These devices download a limited set of movies from Sony Pictures in 4k resolution to an internal hard drive. Due to the limited amount of content, high price, and low adoption rate, this would seem to have only marginal impact on availability of UHD content.
- Kaleidescape Strato Players download full bitrate 4k movies from the Kaleidescape online movie store. These are identical in quality to Ultra-HD Blu-ray. The company has had some recent financial issues, but appears to be up and running again. The hardware is expensive, but the quality is excellent.
Moderate quality options (more compression, lower bitrate):
- The built-in Netflix and/or Amazon Prime Video apps on most 4k smart TVs will play 4k for the few titles they stream in that format. The bit rate is only about 16 Mbps, compared to 48 Mbps for 1080p Blu-ray (see the data-per-hour sketch after this list). What this means is that picture and sound quality are sacrificed in other ways (color depth, contrast ratio, frame rate) to achieve the 4k resolution, so don't expect perfection.
- The Microsoft Xbox One and Sony PlayStation 4 (and later versions) have hardware capable of 4k resolutions. Streaming video apps such as Netflix will be able to play 4K on these platforms. However, most games can't be rendered in full 4k.
- The Sony PlayStation 3 can display static 4k pictures (not moving video) using the HDMI 1.4 connection at 24 or 30 Hz refresh rate. This may be worthwhile for photographers, but probably not for anyone else.
- Cable and Satellite: Cable and satellite companies are offering some 4k content on their new boxes. The quality is better than their 1080p channels, but it's still highly compressed compared even to 1080p Blu-ray, substantially lower than Ultra HD Blu-ray, and generally comparable in quality to streaming services.
- Amazon 4k Fire TV: A good option for Amazon Prime subscribers who watch a lot of Amazon Prime Video.
- Roku and nVidia Shield both offer versions with 4k outputs and apps that support 4k streaming.
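To put those bitrates in perspective, here's a quick conversion to data per hour of video; the numbers are illustrative only, since actual titles vary with codec and content.

```python
# Convert the bitrates mentioned above into approximate data per hour of video.
for label, mbps in [("4k streaming", 16), ("1080p Blu-ray", 48), ("Ultra-HD Blu-ray", 128)]:
    gb_per_hour = mbps * 1e6 * 3600 / 8 / 1e9   # megabits/sec -> gigabytes/hour
    print(f"{label}: {mbps} Mbps is roughly {gb_per_hour:.0f} GB per hour")
```

A 4k stream at about 16 Mbps works out to roughly 7 GB per hour, versus roughly 22 GB for 1080p Blu-ray and nearly 58 GB for Ultra-HD Blu-ray, which is why a streamed 4k title can't match disc quality despite the higher pixel count.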
Dubious quality options (upscaling of lower resolution content)
- Most 4k UHD TVs advertise the ability to “upscale content to 4k”. The highest-end, stand-alone video processors offer only moderate improvements in quality. The video processors inside HDTVs are generally low-end, offering very little improvement in quality, and can make some up-converted content look worse. Don't count on video processor upscaling to deliver any significant picture quality improvement.
Conclusion
The benefits of 4k and 8k are marginal. You have to sit unrealistically close to see the full detail and you need 4k source material, which is not readily available. If you use a 4k display as a computer monitor to view high resolution source material, you could benefit. Other than that, save your cash and purchase 1080p instead.
My recommendation for achieving the best picture quality for the lowest price is to focus on contrast ratio and look for these features:
- Look for the HDR (High Dynamic Range) feature: HDR adds a much more perceivable picture quality improvement than does higher resolution. HDR increases the contrast ratio between the brightest and darkest regions of the screen, which is the most beneficial thing you can do for image quality. Keep in mind that HDR source material is required for this to work, but I expect this to be much more broadly available because it can be “backwards applied” to existing 1080p content.
- Look for OLED instead of LED/LCD: the near infinite contrast ratio of OLED will offer a superior quality image. A 1080p OLED TV will have an overall better picture than a 4k LED/LCD. OLED is more expensive, but the prices are starting to come down.
ISF states that the most important aspects of picture quality are (in order): 1) contrast ratio, 2) color saturation, 3) color accuracy, 4) resolution. Resolution is 4th on the list, so look at other factors first. Also, be sure to calibrate your display! I recommend the following calibration tools.
Recommended Calibration Tools
- Blu-ray: Spears & Munsil High-Def Benchmark Disc 2nd Edition (my favorite)
- Free Burn-your-own Blu-ray: AVS HD 709 – Blu-ray & MP4 Calibration
- Blu-ray: Disney WOW World of Wonder (most popular)
- Automatic Calibrator: Datacolor Spyder
“Just tell me what resolution HD TV to get”
If you don’t like reading charts and are looking for a quick answer, enter your screen size below to see how close you’ll need to sit to fully appreciate various screen resolutions.
Note about “or closer” viewing distances calculated above: if you sit closer than the distances shown above, you will be able to see some (but not all) of the detail offered by the next higher resolution.
the same folks who now say 4k is of no value were among the folks who said 1080p and other forms of high def were of no value over standard definition.
now, i freely admit 3-D was a gimmick.
but that does not mean any advancement in resolution is a gimmick….and 3-d was not even an advancement.
odd situation that some socalled videophiles are now knocking resolution advancements.
You wrote: “The same folks who now say 4k is of no value were among the folks who said 1080p and other forms of high def were of no value over standard definition.”
You are incorrect. I say that 4K has no significant value but I was a very early adopter of 1080P because I recognized what a massive improvement it offered over SD.
You wrote: “odd situation that some socalled videophiles are now knocking resolution advancements.”
It's not odd that intelligent videophiles want to see their equipment investments yield visibly superior results. If 4K is not visibly better, why not spend your money on something that is?
In ten years, if you have pixels the size of bacteria, how much will you be willing to pay to get pixels that are half that size?
jerabaub: I wrote a highly ranked article entitled “1080p Does Matter”, so I'm not one of the people who said it wasn't important.
If I were choosing between the resolution of a 4K LCD panel and the near infinite contrast ratio of a 1080p OLED panel, I’d pick the latter every time. OLED would offer higher overall picture quality despite the lower resolution.
OMG you are killing me. (No, not really.) Here am I with hard-earned dollars ready to buy a nice, new monitor, and I’m only getting more and more confused the more I read and research.
4k computer monitor you sit close to = good
4k UHDTV you sit far away from = unnecessary
🙂
A chain is only as strong as its weakest link. A higher resolution TV can only display the quality of picture it gets from the source. A TV can't display a 1080p resolution picture from a 720p source. No channels have a resolution higher than 1080p now. Most don't deliver the 1080p that TVs can display now. Paying extra for a higher resolution TV won't give you a better picture until there is a higher resolution source available. You would still see the same resolution picture as before until higher resolution channels are available.
There are limits on the bandwidth an analog channel or satellite transponder can carry. Five or more digital channels can be put on one analog channel, depending on the resolution. The resolution can get so high that it uses the entire bandwidth for one channel. Most of our programming comes from satellites, which cost $200 an hour for one transponder about 20 years ago. I don't know what it costs now, but I bet it isn't less. If the resolution increase doubles the number of transponders necessary to carry the same number of channels, the cost of satellite transponders would double, and that would be passed on to the consumer. This brings up another question: if the cost doubled for satellite usage to get higher resolution, would anyone be willing to pay for it?
“As of this writing, the only readily available content source for 4k is the Sony PlayStation 3, ”
That's incorrect. A PC can output 4k resolution. Even if you wrote this 5 or more years ago, this was still true.
You are correct that content is hard to find, but it is around. You almost certainly won’t find what you are looking for, but you might find some specially designed movie, similar to the type of movies shown at Imax cinemas.
But this is how high resolution works, it starts on PC, as it always has, then as the technology develops further it becomes available to the rest of the market. There are even guys out there right now running 11520*2160 resolution, for work, video/image editing and even for gaming. It’s actually pretty interesting that smartphones and portable devices are what is currently really pushing all the recent advances in resolution and screen technology in general.
4k right now is for playing PC games in incredible detail, having a huge amount of stuff open on your screen and watching amazing porn.
Just the same as when PCs were using 1080p before Blu-ray was around, we will need to wait for the content, but it will come, and it will be amazing.
I just received a 4k Samsung gaming monitor, 28″. I just went from 1440p, which was already noticeably better to the human eye than 1080p while gaming on my computer. As of today, I now game at 4k resolution on max settings at 60fps constant. THE DIFFERENCE IS HUGE! There is 0 denying this. You can use any argument you would like, but as they say, the proof is in the pudding. My neighbors and my gf are all like WOW, that looks like we are looking out a window or something, when we look at 4k landscape wallpapers. The gaming experience is so finely detailed I get lost in just looking at the landscape and details of the game now. So…is it worth it? A BIG YES. Couldn't be happier, never going back.
A 4K computer monitor that view from 2 feet away is very different than an HDTV that you view from across a room. A 4K monitor on a computer is wonderful, but that has no bearing on whether 4K offers an advantage for home theater.
@fbmaxwell: Well said!
Thanks. It would have been even better had I not left out the word “you” in “that [you] view.”
If you're getting a constant 60 @ 4k on max settings, I suggest some newer games.
I game at a bit higher than 4k but am using 4 Titans and can't manage a constant 60 in many of today's games. Largely CPU bound, but I'm using a Devil's Canyon @ 4.8, so there isn't much to be done there either with current tech.
Basically I’m pulling your card. No offense.
More on topic – I'm in the market for a new TV, and for the love of God, do I go 1080p OLED and have a superior picture now, or do I go 4k and be future-proofed for later? I do a lot of gaming, and I'd venture it will be 2 more generations before consoles are rendering games at 4k, which is roughly 12 more years at best. I simply don't know.
I sure hope some new TVs emerge before tax time that make the decision easier!
What about the Oculus Rift? I think it could really benefit from 4k and 8k resolutions.
I'm really confused as to why we need all these fancy charts and techno-babble to tell us what our eyes can see. Just trust your eyes. If you can see the difference, then it's worth it. You don't need a chart telling you what your own eyes are seeing.
Yes, you do need a chart to tell you what your own eyes are seeing. Because if I show you two televisions, one labeled “4k” and one labeled “1080P,” you will probably convince yourself that the “4K” television looks better — even if they are both identical and showing 1080P content.
Robin Goldstein conducted over 6000 wine tastings involving over 500 people, many of whom were experts working in the wine industry. His tastings revealed that, when people know the prices of the wines, they strongly prefer more expensive wines. But when the same wines are served without any labeling, the votes reverse and they express a marked preference for the least expensive wines.
That’s how expectation bias works. While you might want to dismiss science as “fancy charts and techno-babble,” that Dark Ages attitude toward science has, thankfully, fallen out of favor.
Just a side note: 4K could be a bit more beneficial for gaming, to combat aliasing. And I'm not just talking about PC gaming at a desk with a big screen close to you. Even on a “normal”-sized 1080p TV in a standard living room, you can often see jaggies in games as a distinct “crawling” along the edges when you move around. 4K would lessen this issue. This is an issue normal video viewing doesn't have, though.
ROFL I love the calculator you included. My parents always said I would hurt my eyesight sitting so close to the TV when I was younger. If I go by what your calculator says for 4K and how close I would need to be for the full benefit, I think they'd start telling me that all over again when they see me watching a 4K TV and trying to get the full 4K effect. ROFL 🙂 At least for the size of 4K TVs I would have thought about getting. What's funny is that when I go to the store and see a 4K TV displayed, I always seem to stand right up on it when looking at it. So you may be on to something here. LOL
What you are talking about is the micro effect. We should also look at the macro, or the effect as a whole. The bottom line is that 4k makes things look a lot more “lifelike”…just look at one yourself side by side and decide. Sales people used to tell us the same thing for 720p on a 32″, which is a bunch of bs!
This sounds a lot like the video equivalent to the audiophile BS that I’ve been dealing with for years.
You want to claim that there's some “macro” effect that makes it look more “lifelike”? Then do the hard work of proving you can see it in a double-blind test. Same thing for 1080P vs. 720P on a 32″ screen.
And don’t tire me with snooty, self-promoting crap about how you’re an astute viewer with hawk-like vision and the finest source components that money can buy. Similarly, don’t waste my time with deprecating remarks about how I must somehow be inferior, or have inferior equipment to yours, if I don’t immediately agree with your proclamations about “lifelike” video “macro” effects.
You are absolutely right. After my first copy of Stereophile, 20 years ago, I cancelled my subscription because of the endless and nonsensical articles about “bolting down” record and CD players to extract better sound through less vibration, getting separate amps per speaker, gold plating everything… etc. The HDTV market must be reaching a sales plateau, and that's why they are pushing this ultra HD “snake oil” onto us.
And where is your double-blind testing to prove the contrary?
Where’s your double-blind testing to prove that I cannot hear 100KHz, see infrared, and feel peas under my mattress? You don’t have any? Then I must be able to do all of those things if I claim that I can, right?
Learn some basics about science. You don’t just accept, as fact, any random claim that comes your way when there is no evidence provided. It’s up to the person making the claim to substantiate it.
You're right, dude, my VHS tapes are still awesome and I have no need to upgrade. All these idiots wasting money on big TVs when my TV is from 1986 and still works good as new.
Joe, why don’t you respond to what I actually wrote? Oh, that’s right: Because what I wrote is logically flawless and there is no valid argument against it.
Stop being a dick. We are talking about equal-sized large screens viewed from a normal distance. If a 65″ 4K screen does not look visibly superior to my 65″ Panasonic plasma, then it would be stupid to waste money on the 4K display. I would be much better served by making an upgrade that is visible or audible.
” … feet or closer for full benefit” is wrong.
” … feet or remoter for full benefit” is right.
No, what is written in the article is correct. For example, if what you stated were true, you could stand 1,000 feet away from a 42-inch display and see the full detail of 4k. Obviously, this is not possible.
Depending on one's visual acuity, one has to be a specific distance from the screen; moving (much) closer will not make the image sharper or more cohesive. There is a +/- distance that needs to be taken into consideration.
The eye resolves X number of pixels (image data) per degree; moving closer reduces the number of pixels per degree, thus making the image less sharp and reducing image quality. We need to be specific about that distance: greater or closer to a point, say x%, no more or less.
Saying “or closer” gives the impression one can be right up to the screen and still get the benefit of 1080/2160, which is incorrect and misleading.
Absolutely. I have 20/10 vision and can't even stand watching a 75 inch or larger 1080p screen at the store from under 12 feet; they look like massive, horrendous macro-blocking affairs. Bring on native 4k content and screens and be done with all the whining already. We should certainly acknowledge that anyone with vision above 20/20 could notice a drastic improvement from a much greater distance.
That’s probably because almost all stores pump a 720p, or even 480i, signal to every TV in the store and then blow out all of the details by setting the TV to “Vivid” or “Dynamic” with the contrast and brightness set at 100%.
I’ve dealt with gifted listeners and viewers — just ask them. I’ve known audiophiles who say that their hearing and listening skills are so good that 192KHz/24 bit audio sounds far better than 96KHz/20-bit audio. They claim to get headaches if they even hear a CD. I’ve talked to videophiles who claim that they have such visual acuity that they can calibrate a monitor without a test disc or measuring equipment. I’ve talked to others who say that they can definitely see an improvement when they switch to gold-plated, $100 HDMI cables (must be the rounder 0s and sharper 1s).
The problem with such claims is that most are not backed up with double-blind tests. If we take a 4K monitor and position it at a normal viewing distance, can you reliably identify whether there's a 1920 x 1080 or a 3840 x 2160 signal being displayed on it? I'm not asking if you believe you can — I'm asking if you know you can because you've shown an ability to do so in such a test.
It’s all about expectation bias. All humans are subject to it. When people expect a difference, they identify one. If you serve them wine in two glasses, most will express a strong preference for the one you said was more expensive. If you play the same audio recording twice, once identifying it as a “studio master” and the second time identifying it as an MP3 file, most people will be sure that the MP3 file sounds far worse. It’s no different in the world of video.
Hi Carlton – I want to thank you for this article – It has been very very helpful in shaping my understanding of the relationship between pixels and viewing distance.
4k and 8k are completely pointless for the home. This is probably yet another money grab by the psycho dogs who rip babies of their candies in their sleep. The corporate whore bags of america. You’re telling us that a resolution that is used in the fucken movie theater where the image is blown up 100x is good for the home where the average screen size is 40″ or so. Suck a tail pipe already.
Then don’t buy the screen dickhead.
1080p is old now, I’ve been using it for over 10 years (same number of pixels ~2 million, different aspect ratio, 4:3). I’m going to buy it, and it’ll be awesome.
You calling someone a “dickhead” is like Donald Sterling calling someone a racist.
You don’t have a 1080P TV. You have an old computer monitor that you sit 2 feet away from. So that has nothing to do with 1080P television.
You wrote: “1080p is old now”
20KHz is old now. You need speakers that go to 40KHz, right? Because you have magic ears to go along with your magic eyes, right?
You wrote: “I’m going to buy it, and it’ll be awesome.”
You will buy it. And, lacking enough money, you’ll buy a crappy 4K television for the same price you could have bought a good 1080P plasma or OLED television. But it will look better than an old computer monitor, so you’ll be convinced that it’s better than the 1080P television that you should have bought.
I’ve seen your type before. I went to someone’s house where they were telling me how much better their new 1080P television was than the 720P television that it replaced. They showed me football from their HD cable box, telling me to look at the incredible detail. I took the remote from them, popped into the menus. Resolution still set to 720P. You’re that guy.
I don’t have the greatest vision, but I can easily see individual pixels on my 63″ screen from 10′.
The whole point is you want the pixels so small you CAN’T see them… You are measuring the other way around.
Furthermore — take a look at a good ‘hd’ tablet (one that fits into your “fully benefited” pixels-per-arc-minute calculation) and then look at a 1600×2560 Nexus (which is considerably higher resolution than you say we can benefit from). The Nexus looks considerably sharper.
Prop-up said tablets on a table side by side, stand 10 feet away and tell me which one has a better resolution mmmkay?
The point is in the DISTANCE from the screen.
Obviously if I hold an iPad 2 with 1024×768 resolution and an iPad 4 with a retina display (I forget the pixel density but it's something in the 2k realm) about a foot from my face, the iPad 4 is definitely going to look much better, because it's within the distance at which it will matter. But from 3, 5, or 10 feet away, the difference will be marginal, and again, that's the point. You don't sit even 3 feet away from a typical 40″ or 50″ TV; as the article states, the average is about 9′, and in my case it's more like 13′. Not gonna waste money on something I can't use to its fullest extent.
Great site! What formula is used to find the min distance you need to sit from the screen to enjoy the effects of 4K, i.e. not seeing jaggies? Thanks
Thanks. Great information, and I appreciate it.
Hi, very interesting article indeed. I have one question if you don't mind: you're claiming that the PS3 is able to display still images at 4K, but don't you need two HDMI cables to do that?
Eddy
routerunner, HDMI 1.4 supports 4k (both 3840 x 2160 and 4096 x 2160) but is limited to a frame rate of 30 frames per second. This is fine for viewing movies filmed at 24 frames per second, but not for 3D content or HFR.
The PS3 is only HDMI 1.3. It uses a firmware upgrade to be able to play 3D Blu-ray.
The PS3 HDMI 1.3 output implements a subset of the HDMI 1.4 specification, namely the higher bandwidth required for 4k resolutions and 3D. It doesn't meet the full HDMI 1.4 spec because it isn't capable of HDMI Ethernet channel or audio return channel.
It’s interesting to note that Sony lists the same recommended viewing distances in their 65X900A 4k UHD TV product description as I have calculated above (i.e. 3.6 feet for a 55″ screen and 4.2 feet for a 65″ screen.) If you don’t believe my numbers, confirmation from Sony should help convince you.
Screen width would yield marginally better results for your calculator than diagonal for those with different screen ratios (mine is 2.34:1). Also, I find it odd that for my 166″ screen (16:9 equivalent dot pitch; actual diag. 158″ 2.34:1), 4K distance is 10′ but 1080p is 22′.
I wonder if, being boundary-sensing, our eyes can detect the “screen door effect” better than they distinguish line pairs? I've seen a demo of a Sony 65″ 4K TV at Fry's, and even at 5-6 feet, the lack of pixel structure is obvious compared to a 1080p set at the same distance or farther.
David, you’re correct on both points. The chart does assume square pixels. My Home Theater Calculator spreadsheet does allow for non-square pixels and various aspect ratios.
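If you want to do the conversion yourself, here's a minimal sketch; the function and the example values are illustrative, using David's 158-inch 2.34:1 screen and a 166-inch 16:9 screen of roughly the same width.

```python
import math

def screen_width_height(diagonal_in, aspect_w, aspect_h):
    """Screen width and height (inches) from diagonal size and aspect ratio."""
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

print(screen_width_height(158, 2.34, 1))   # about 145" wide scope screen
print(screen_width_height(166, 16, 9))     # also about 145" wide, hence the similar dot pitch
```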
The black gaps between pixels are definitely a more noticeable issue, especially for brighter, mostly white background scenes. Any time there is a high contrast difference, it will be noticeable. That’s one of the reasons high contrast ratio is more important than high resolution. In general, the screen door effect isn’t nearly as noticeable as it used to be several years ago.
Your calculations are shockingly off. In real life, for even a 46” tv at 4K resolution, the optimal viewing distance is somewhere around 11-13 feet. This is the distance where any closer you will see individual pixels, and any farther the image looks perfectly smooth. In contrast, for a 46” tv at 1080p, the optimal viewing distance is somewhere around 22-26 feet.
The difference gets even more pronounced at larger screen sizes.
Post the math you used to calculate the chart. Otherwise it is YOU who are selling snake oil.
^^ Don’t take my word for it. See with your own eyes. The image at this link is a 1080p image with no anti-aliasing. Throw the image up on your computer monitor (mine is 23 inches), then walk backwards until the image looks perfectly smooth and you can no longer see jaggies (i.e., individual pixels). Now imagine a 4K screen with the same pixel density and twice the diagonal size. Wallah. You have found the optimal viewing distance for a 46 inch 4K screen. When I do this test, I measure 13 feet. Whereas your chart suggests I should measure less than 2 feet.
http://www.freeimagehosting.net/newuploads/rvlwr.png
^^ And for good measure, repeat the test with the image at the following link, which is at 540p, representing one quadrant of a 1080p image. Walk backwards until the jaggies disappear. You have found the optimal viewing distance of a 1080p screen with twice the diagonal size of your computer monitor. When I do this test with my 23 inch monitor, representing a 46 inch 1080p screen, I get an optimal viewing distance of around 25-26 feet. In contrast, your chart suggests it should be around 6 feet.
http://www.freeimagehosting.net/newuploads/d2l8v.png
Chris, I can assure you I’m not selling snake oil nor anything else; everything is freely available.
As stated in the article, everything (including the calculations) is available in my Home Theater Calculator Spreadsheet: https://carltonbale.com/home-theater/home-theater-calculator/
I do agree that most people will see the jaggies for the images you posted at distances larger than those calculated in the article. That's because those images in no way represent real-world video viewing material. It's a pure white / pure black image, with no gray-scale / no anti-aliasing. The majority of movie scenes are composed of gradients, and the high-contrast, sharp-line content that is there is anti-aliased. Astronomers will correctly point out that stars subtending far smaller angles can be seen because of their high contrast with the surrounding space. As in the other case, this doesn't apply to real-world movie viewing.
I have no incentive to post inaccurate information. The numbers Sony has posted pretty much exactly match mine, and they have a disincentive to do so. The numbers posted by THX also generally align with mine: https://carltonbale.com/home-theater/home-theater-calculator/ .
You wrote: “Chris, I can assure you I’m not selling snake oil nor anything else; everything is freely available, including the calculations. As stated in the article, everything is available in my Home Theater Calculator Spreadsheet”
That’s some rather circular reasoning; ‘you can trust what I say because I’ve produced a spreadsheet that concurs with what I say.’
Citing Sony and THX goes a long way towards supporting your numbers, but you might want to just show a few examples in your reply just to make it crystal-clear.
Please note that I believe that your numbers are in the ballpark (at the very least).
Fred M.: I edited my comment to clarify. I’m making 2 points there: 1) I’m not *selling* anything and 2) that I had in fact posted the math requested by Chris via the spreadsheet linked in the article.
The article provides the links back to the referenced Sony and THX material. I’d love to post some more examples and illustrations, but the post is already bordering on information overload as it is. 🙂
Carlton – You wrote: “I do agree that most people will see the jaggies for the images you posted at distances larger than those calculated in the article.”
By your own admission, your entire chart is bunk. Your chart is based on the premise that it reflects the maximum viewing distances at which the human eye can resolve an individual pixel based on arc-minutes. In admitting that most people will see jaggies at larger distances than those calculated in the article, you are admitting that the human eye can resolve individual pixels at distances greater than those you have calculated. Full stop.
Your reasoning that the images I posted don’t matter because they have no anti-aliasing is a red herring. That is precisely the point. Anti-aliasing is a visual trick used to HIDE the fact that the human eye can resolve individual pixels. It makes the image blurry, or less sharp. It also requires additional processing power for digital graphics, such as video games and digitally rendered animation.
So yes, anti-aliasing may make it so that the human eye cannot quite point out an individual pixel. That doesn’t mean that a 46” screen at 4K won’t look noticeably SHARPER than a 46” screen at 1080p, when viewing at distances of even up to 13-26 feet.
Nice try. The images I posted are undeniable. Viewers of this site, see for yourself.
Perhaps your premise is incorrect — i.e., that with 20/20 vision it is possible to resolve a maximum of 1/60th of a degree of an arc.
Chris,
Your understanding of human visual perception is flawed. When presented with a detail that can be seen up-close, people will still believe that they see the detail from much further away than they actually can.
At some distance, I could replace your images with ones that were ten times the resolution (assuming such a monitor existed) and have the majority of viewers believe that they still saw the stair stepping.
If you think that the human eye can pick out individual pixels at much greater distances, then produce a sentence with characters that are 5×7 pixels in size and then have people walk FORWARDS until they can just read the sentence.
Fred. Did you even look at the images?
4k quadrant:
http://www.freeimagehosting.net/newuploads/rvlwr.png
1080p quadrant:
http://www.freeimagehosting.net/newuploads/d2l8v.png
This is not some figment of the imagination. The numbers in his article are not anywhere close to reality.
Look at the 1080p quadrant displayed full screen on a 23″ monitor (representing a quadrant of a 46″ 1080p image), and tell me honestly that you cannot see CLEAR jaggies at a distance of 9 feet, the average viewing distance quoted in the article. This proves definitively that 4K has a clear benefit at the average viewing distance, even for screen sizes as small as 46″.
In contrast, the calculator posted in the above article states that you would need to sit THREE FEET (3′) away from the screen to see a benefit of 4K over 1080p on a 46″ screen. The images I posted prove this to be clearly, demonstrably false.
Chris, you’re mixing together contrast ratio and resolution. You have to separate the two variables to understand the individual impact of each. If you were to look at the images you posted on a screen with a 10:1 contrast ratio, they would look smooth. As stated in the article, high contrast ratio is more important than high resolution for overall image quality.
The issue with your images is that they do not represent real-world content. Test images represent extremes that do not reflect the way TVs in the living room are used. Let me give some examples of how relying on test images leads to conclusions that are not meaningful for real-world scenarios. On one extreme is the astronomy example, with a single white pixel and a black background. If the contrast ratio is near infinite, in a completely dark room, a single bright pixel could be perceptible even on a 100-megapixel screen. At the other extreme, take a 100-megapixel test image of a pure white background. On a 100″ screen, it would look identical whether the screen resolution were 100 megapixels or 1 single pixel, indicating that resolution doesn't matter at all. Taking these two extremes, you could argue either that 100-megapixel screen resolution is essential, or that anything more than 1 pixel of resolution is a waste. Reality, obviously, lies between these two extremes. The context needs to be watching TV shows and movies in a living room, not test images.
You need to test resolution independently of contrast ratio. The goal of the test image should be to represent the average contrast between two adjacent pixels in the types of images most often viewed in the living room. Comparing a pure white to a pure black pixel does not accomplish this, and the pure-white-to-pure-black comparison is not the basis of the resolution chart posted here. The one arc-minute (1/60th of a degree) standard is derived from human perception of real-world images. It obviously varies person-to-person and image-to-image; on the whole, it accurately represents real-world conditions and identifies where, between the two extremes, the reference point should be established.
Carlton – You stated: “You need to test resolution independently of contrast ratio. The goal of the test image should be to represent the average contrast between two adjacent pixels that comprises the types of images most often viewed in the living room.”
You are now backpedaling and completely changing the rules of your own stated hypothesis, which is that you calculated the chart above based on the proposition that the human eye CANNOT SEE anything smaller than one arc-minute (1/60th of a degree). As you stated, “Based on the resolving ability of the human eye (with 20/20 vision it is possible to resolve 1/60th of a degree of an arc), it is possible to estimate when 4k resolution will become apparent for the average eyeball.” You did not differentiate contrast ratio. Nor should you. This is because the goal of a higher resolution screen is to get the sharpest image possible, meaning that the human eye cannot see an individual pixel at ANY contrast ratio, thus eliminating the need to blur the image with anti-aliasing.
You did not address my proposition that, at normal viewing distances of 9 feet, a 4K image on a 46” screen is likely to look SHARPER to the human eye than a 1080p image, even if anti-aliasing blurs the 1080p image to hide the aliasing.
Nor did you address the very real-world application of video games and computer generated animation, all of which suffer from aliasing.
In the end, you spout a lot of talk. But I have produced a verifiable experiment (the images posted above) that directly refutes your claims (e.g., that you need to be within 3 feet of a 46″ screen to observe the difference between 4K and 1080p). In contrast, you have produced no proof whatsoever to support your calculations, other than repeating the mantra that your one-arc-minute premise is an accurate measurement of human visual acuity.
Chris, yes, I looked at the images. I’ll look at them again if I’m ever in the market for a monitor on which to view black and white, single-bit-per-pixel still images.
But such images are meaningless as a test for the benefits of a 4K color television. The human brain processes color, contrast, and motion in a more complex way than is represented by B&W still images. That’s all part of how we perceive detail.
Look at this image, which is a cat’s face produced by mirroring the right side to the left.
http://www.anti-spam.org/cat_face.bmp
The left half is 960 x 540 resolution and the right half is 1920 x 1080. No interpolation tricks were employed — simply cut the resolution in half and then double it via pixel doubling (1 pixel becomes 2×2 pixels). The stair-stepping which you claimed to be so visible at 9 feet on a 23″ monitor isn’t really visible, is it?
Adding motion further discredits the notion of black and white still images as a test; the brain’s processing of motion completely masks any difference in screen resolution.
The only valid test of 4K vs. 2K is an ABX test conducted with actual program material (movies, for example). Show me that there is a statistically significant ability by the test subject group to pick out the 4K rendition at that distance and then you’ve got something.
That’s a great picture Fred. Thanks for posting it. I hadn’t seen that before.
You hadn’t seen the picture before because I just created it for this discussion. You’re free to use it as you see fit. Thank you for the web site and this discussion.
As an aside, you need to lose the nasty tone in your replies to Carlton Bale. He’s not “backpedaling” or “changing the rules.” He didn’t ‘admit’ that his “entire chart is bunk.” That kind of juvenile tone is making you sound like some angry kid who thinks that he’s smarter than all of the adults.
Carlton Bale has provided, at no charge to visitors, a website and series of tools that many find to be useful and informative. But since you doubt Mr. Bale’s knowledge, methodology, and integrity, let’s look elsewhere:
The cnet.com article “Why 4K TVs are stupid: There's all this buzz about 4K resolution. You don't need it, and probably never will.” included: “The human eye, for all its amazingness, has a finite resolution. This is why you can read your computer screen from where you're sitting, but not if you're on the other side of the room. Everyone is different, but the average person with 20/20 vision can resolve 1 arcminute. One arcminute is 1/60th a degree. If you assume your field of vision is 180 degrees (it's not, but go with me here), and you take 1 degree of that, you're able to resolve a 1/60th sliver of that degree.”
Gee, that matched the numbers Carlton Bale used, didn’t it? The article goes on:
Joel Silver, founder of the Imaging Science Foundation, which consults with the TV industry on manufacturing displays, said that the most important specification is not resolution; it’s dynamic range. The darkness of the blacks is what’s most important. The next most important specification he named was color saturation; are the reds really red? More specifically, are they as red as they should be and not too red, either.
Next on his list was color accuracy. Do skin tones look as they should?
Mr. Silver said, “The last thing we look at is resolution.”
If you’re going to argue that Carlton Bale, editors at cnet, writers at the New York Times, and consultants to the TV industry are all wrong, you better have something better than some static, black and white images.
Fred – Your cat image is merely an example of how anti-aliasing blurs an image to hide the fact that the human eye can resolve individual pixels. All natural images are effectively anti-aliased, as a result of sampling a continuous scene (reality). This doesn't mean that a higher resolution image won't look SHARPER, even if you cannot quite make out individual pixels in either image because of anti-aliasing.
Additionally, your image does not address the real-world application of video games and digitally rendered animation, all of which suffer from aliasing unless you employ processor heavy anti-aliasing algorithms, which blur the image and make it less sharp. Aliasing is made even more noticeable by motion, because it introduces flicker and shimmer to edges.
And again, you have provided no support for the article’s premise that the chart reflects the maximum distances from which the human eye can resolve individual pixels. You have again merely quoted the 1/60th degree standard as gospel, despite the fact that my test image proves it to be incorrect. If it were correct, you would not need anti-aliasing to hide stair stepping at the distances cited in the chart.
Chris, as I stated before, each “pixel” on the left is four physical pixels (2×2). Pull it up in a graphics editor and do some pixel peeping if you doubt me. Then step back 9 feet from that and tell me if you see a difference.
“This doesn’t mean that a higher resolution image won’t look SHARPER, even if you cannot quite make out individual pixels in either image because of anti-aliasing.”
Then step back 9 feet and tell me if one side looks sharper than the other.
“Additionally, your image does not address the real-world application of video games and digitally rendered animation,”
And your black and white, still-frame test pattern does? If you’re going to claim that you can see the difference between a 2K and a 4K monitor at 9 feet with a video game, then prove it, using a video game, rather than making unproven assertions that fly in the face of years of medical research.
“You have again merely quoted the 1/60th degree standard as gospel, ”
Sixty pixels per degree corresponds to recognizing the letter “E” on the 20/20 line of a Snellen eye chart at the prescribed distance of 20 feet. The Snellen Fraction of 20/20 represents standard visual acuity and corresponds to 60 pixels per degree and 30 line pairs per degree. Therefore, it is gospel. If you’re going to claim that the medical community is wrong, you better have something more convincing than your black and white image
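For anyone who wants to check the arithmetic, here's a small illustrative sketch of that standard optotype geometry (the 20/20 letter spans 5 arc-minutes, with 1-arc-minute strokes and gaps); the snippet is my own illustration, not a quote from the eye-chart standard.

```python
import math

# Standard Snellen geometry: a 20/20 optotype spans 5 arc-minutes at 20 feet,
# with strokes and gaps of 1 arc-minute each, the same angle as one "pixel"
# in the 60-pixels-per-degree figure used throughout this discussion.
distance_in = 20 * 12                                  # 20-foot test distance
stroke_in = distance_in * math.tan(math.radians(1 / 60))
print(f"1 arc-minute stroke at 20 ft: {stroke_in:.3f} in ({stroke_in * 25.4:.1f} mm)")
print(f"Full 20/20 letter height:     {5 * stroke_in:.3f} in ({5 * stroke_in * 25.4:.1f} mm)")
```

That works out to strokes about 1.8 mm wide and a letter about 8.9 mm tall, which is exactly the detail size the one-arc-minute standard says a 20/20 eye can just resolve at 20 feet.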
“despite the fact that my test image proves it to be incorrect.”
No, your image is a test of “vernier acuity,” the ability of people to pick out misalignment of high-contrast pixels. Humans are an order of magnitude more sensitive to that kind of misalignment (vernier acuity) than to ordinary resolution.
Chris… chill out, dude! I do partially agree with you; with your black & white, jagged, still image, I can see some minimal jaggies at 11+ ft from my 15.6″ retina MBP screen (image 14.1 x 7.9″, 16.2″ diag, equiv. to a 32.4″ 4K screen). But that image is VERY unlike anything I’ll be watching in my theater. The cat image is much more real-world, and at 24″ from my monitor, it is hard to tell the difference between 540p and 1080p except in the fine hairs (the photo could be sharper). Add in motion, and our ability to appreciate the extra resolution decreases dramatically.
Bottom line, I prefer to sit closer than the 36 degrees viewing angle/1.5 times picture width THX recommends, and with my 20/15 vision, I fully expect to see more clarity and a lot less pixel structure (8′ from the 158″ 2.34:1 screen). Will my theater guests notice the difference? Probably not, especially from the back of the theater (18′ from the screen), but I will certainly relish it from my front row seat.
PS, Carlton, please post a higher resolution background image of your theater room… what a jagged eyesore on my retina MBP at 24″ from the screen! LOL (Joking about it being an eyesore, not about the visible jaggies. 🙂
Fred – Per your request for a blind side-by-side test:
“Earlier this month, we set out to investigate if the extra resolution offered by 4K over 1080p is visible at normal viewing distance, as part of an Ultra HD and OLED television showcase event organised by British retailer Richer Sounds. A 55-inch 4K UHD (ultra high-definition) TV was lined up alongside a 1080p HDTV of the same size, each displaying content that’s 1:1 pixel-matched to its native screen resolution. Both TVs had their identities masked by custom-built cabinets which were spray-painted black. Standing 9 feet away (enforced using crowd control posts), attendees were then asked to pick out the 4K television after sampling the displayed material.
The results are now in, and an overwhelming majority of participants correctly identified the 4K TV, indicating that there exists a perceptible difference even from as far as 9 feet away on a 55in screen. Out of 49 attendees who submitted their pick to enter a prize draw, only one thought that the 1080p set was the 4K display.”
– Vincent Teoh, 4K Resolution Is Visible vs. 1080p on 55” TV from 9′ Viewing Distance, HDTVtest.co.uk (Dec. 15, 2013).
Chris,
Thanks for pointing me to that test. I did read the article and found it unfortunate that they used two different televisions models with different panels. I’d have preferred two identical 4K televisions, with one receiving 2K and the other receiving 4K content. That way, the panel differences that they noted in their article would not have been present.
I'd also be very interested to see whether a 46″ screen at nine feet would have been correctly identified, as you contend, since the 55″ screen is 20% larger than a 46″ screen.
The article also went on to say that “resolution is only one of the many attributes of picture quality, and not the most important one. Amongst the swarm of 4K televisions on exhibit, it was actually a full HD 1080p set – the LG 55EA980W OLED TV – that hogged the attention of those attending the event, largely due to its ability to render true 0 cd/m2 blacks, contributing to an unrivalled contrast performance (which most video enthusiasts agree is the principal determinant of image quality).”
Identical 4K panels displaying 4K vs. 2K material would have tested viewers’ ability to distinguish the resolution, but not the impact of the screen door effect. For me, greatly reduced visible pixel structure is a large part of what excites me about 4K.
I disagree that contrast is more important than resolution or lack of pixel structure, since digital home theater projectors exceeded the blacks of movie theaters (at least film projectors) years ago, yet people still love the huge screen experience of a theater. I am quite anxious to see 4K vs 2K PJ material in person to judge whether 4K PJs are worth 4X the price for my home theater.
Fred – I do not dispute that other aspects to image quality matter as much if not more than resolution. That is not the dispute here. The dispute is whether the human eye can visually see the added resolution. All real world test evidence points to yes.
I’m sure the 1080p OLED screen does look better overall, due to the higher contrast ratio, response time, and color depth. However, I’m also sure that a 4K OLED screen would look even better still.
Chris, you wrote: “The dispute is whether the human eye can visually see the added resolution. All real world test evidence points to yes.”
No, there are no scientifically valid tests which ‘point to yes.’ Show me any that were run using the same 4K monitor at the normal viewing distances with ABX switching of resolution.
Too many comments to read here, but I thought I would make you aware of an oversight in this post. I won't comment on the topic, as I agree with some but not all, but one thing I want to say is that the resolution calculator is set to state x amount of feet or CLOSER for full benefit. This should be x amount of feet away or FURTHER for full benefit.
I am sure I don't have to explain how sitting closer exposes more pixelation. So ideally, you should sit further away to avoid seeing pixelation. Recommend correcting this so as not to confuse or mislead people. The rest of your post is an opinion, but this component is shown as factual and can be, if corrected.
Boris, I get this feedback frequently and can see the reason for confusion. The page is correct in that you must sit at the specified distance or closer to get the full benefit, and let me explain why. You have to be at least close enough to a TV to experience the full resolution. For example, if you were 50 meters away from a 4K screen, it would look exactly like a 1080p screen and a 480p screen. As you get closer (much closer), you can start to tell the difference between the resolutions. Once you reach the distance calculated, you will be able to see the full resolution of the screen. Getting closer will yield no additional resolution benefit, but there is also no negative impact. But as you stated, as you get significantly closer, the pixel structure starts to break down and you can see individual pixels. This happens at a distance closer than the minimum distance calculated above.
The breakdown in pixel structure is highly dependent on display technology and is largely related to the fill ratio of each individual sub-pixel. On an ideal display, the fill ratio would be 100% and you wouldn't be able to discern pixels even if your eye were about to touch the screen. Because fill ratio varies so much between displays and display technologies, it's impossible to estimate a universal distance at which this happens. But for all modern displays, fill ratio is high enough that it happens at a distance closer than the “full benefit distance” calculated above.
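To put rough numbers on that, here's a small sketch using the same 60-pixels-per-degree assumption as the chart; the function is an illustration of the geometry, not something pulled directly from the spreadsheet.

```python
import math

def resolvable_horizontal_pixels(diagonal_in, distance_ft, acuity=1.0):
    """Approximate number of horizontal pixels a 16:9 screen can usefully have
    before a viewer at this distance can no longer distinguish adjacent pixels
    (acuity=1.0 is 20/20 vision, i.e. ~60 pixels per degree)."""
    screen_width_in = diagonal_in * 16 / math.hypot(16, 9)
    smallest_visible_in = distance_ft * 12 * math.tan(math.radians(1 / 60) / acuity)
    return int(screen_width_in / smallest_visible_in)

print(resolvable_horizontal_pixels(55, 9.0))   # about 1525: at 9 ft a 55" screen can't even show all of 1080p
print(resolvable_horizontal_pixels(55, 3.6))   # about 3814: at ~3.6 ft essentially all of 4k is resolvable
```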
The one thing missing from this is the advantage you get with higher resolutions in reduced aliasing, as well as the elimination of the screen door effect. There have been studies that indicate the brain still picks up the “static” from the spaces in between pixels and smooths it out via its own interpretation. This is why, when viewing the checkerboard pixel-by-pixel test, you can sometimes see purples or greens, as it is creating a moiré effect.
Hi Carlton, very interesting article. Can you comment on this: http://auzed.com/crap/screensize.html
It compensates for visual acuity. How should one think about moving to a higher resolution when, with 20/15 vision, a 40″ screen at a 7-foot distance works out to 1080p and at 8.5 feet to ~900p?
Hi Bruno. I recommend that you take a look at my Home Theater Calculator spreadsheet. It has a field to change vision from 20/20 to whatever, and you can enter various other inputs as well. Hope this helps!
So, I have 20/10 vision. Given that I sit approximately 3 feet away from 28″ monitors, I could reasonably say that I'd benefit from 4K (given the output values of the chart have to be doubled for 20/20 –> 20/10). Right?
As a long-time fan of audio/video equipment, I've become quite accustomed to the “snake oil” some of you mentioned. In fact, on occasion I've found it necessary to place myself as the subject in blind tests of my own careful design in order to make sure I wasn't going to become a victim of some trickery. Of course, being an A/V fan, I'm familiar with the THX recommendations and the charts, etc. the author provided. All of this, if anything, had made me (at the very least) skeptical about 4k in the living room. However, all that changed a few weeks ago. As I strolled through my local Best Buy, I decided to take a look at televisions while I waited for my loved one, who was picking out a CD. As I approached the newly set-up Samsung 4k TV display (and before I even had a chance to roll my eyes), my jaw dropped. Were the colors, brightness, etc. all cranked up to high? Of course they were; I'm used to that and aware of its effect; that wasn't it. What I saw was something that I couldn't believe. At any reasonable viewing distance, the difference wasn't just noticeable, it was amazing. Much like the time I first saw HDTV, it made me instantly want one and wonder about what I had been missing.
I would suggest that anyone participating in this debate who has not seen a quality 4k TV in action go check one out – then argue about its merits. I would also like to say that if you can't see the difference, then good on you; save some money. However, if you're one of the people here claiming some fancy A/V credentials and you say you can't see the difference… perhaps you're in the wrong business? Some of us can see the difference at reasonable distances, as Fred M.'s test seems to demonstrate.
One last thought: more and more, I need to be able to read some tiny text somewhere on my TV screen. As poorly written GUIs on our televisions become prolific, and convergence becomes a reality, can you 4k TV detractors really see no point in being able to read text on your TV (yes, sometimes it's even black on white)?
Love the discussion. Thanks.
In a way this debate over the merits of 4K resolution is a bit pointless because there’s no guarantee the new HD format has a viable future.
Think of all the thousands of film and TV titles that would still be in 480p SD without Sony's Blu-ray player. If 4K is to have a future, Sony or another company will need to release a 4K media player and a ‘world standard’ optical disc. If no player is released in the next five years or so, the format will die.
A few tv companies streaming 4K content won’t be enough to save the format. Blu-ray version#2 (if it happens) is the only way to secure 4K’s future.
I bought a SEIKI SE39UY04 39-inch 4k UHD TV to use as a computer monitor, because I do a lot of old photo touchup and restoration, and higher resolution on a very large screen would let me see more of a photo even at high magnification. (The unit can be found for $339 at Amazon and TigerDirect. No other 4k TV/monitor that size is available in anywhere near that price range.)
HERE’S WHAT *NO ONE* IS MENTIONING ABOUT USING A TV AS A MONITOR:
TVs assume you are NOT sitting 3 feet from the screen. The LIGHT OUTPUT is MUCH stronger than a monitor’s. Web pages with a lot of text, word-processing documents, many utility programs (e.g., Windows Explorer), etc., ALMOST UNIVERSALLY use a WHITE background. Sitting 3 feet from a TV as a MONITOR is like STARING at an office FLUORESCENT LIGHT FIXTURE from 3 FEET away.
I had to turn Brightness down to 20 (out of 100) to reduce eye strain. The problem with doing THAT is that it distorts colors. MEDIUM blue at 20% brightness is ALMOST BLACK. Normal colors from an outdoor scene (e.g., Microsoft’s Win 7 default themes) look like they were heavily Photoshopped to “pop”.
ONE OTHER PROBLEM: Imagine holding the EDGE of a book 3 inches from your nose and trying to read an ENTIRE line. It would be pretty difficult to read the text farthest away. The effect with a VERY large monitor is similar (though of course not as bad). If your head is lined up with the middle of the screen, the text at the edges is pretty badly off angle and a bit hard to read.
Right now I’m only using the unit at 1920 x 1080 VGA, because my video card doesn’t support 4k. One thing I DO like is that 300 x 300 pixel thumbnails are about 3 inches square. The same “oversizing” does make the unit particularly useful in image editing — which is the main reason I bought a unit with such a large screen.
(No! Don’t be ridiculous! OF COURSE it’s not “AS BIG AS A TV”. My TV is FORTY inches. This MONITOR is only 38 and a half!)
Your stats on visual acuity are wrong. The number you post is the point at which a person can still resolve two lines (roughly speaking). This is massively bigger than the finest detail a person with normal vision can perceive.
The finest detail a normally sighted person can discern (i.e. a single line on a uniform background) is actually 0.5 arc seconds, a whole 120 times smaller than your inaccurate 1 arc minute figure.
You are incorrect.
If that’s in response to me, no, I’m not.
Please look it up.
Yes, that was for you; and you are wrong. You can tell because I didn’t preface my statement with “I believe” or any other form of equivocation.
Specifically, you are wrong because you used the wrong unit of measurement. Resolution is the ability to discern individual objects (pixels, stars, dots, etc.) when they are side-by-side.
The star Altair subtends 0.005936 arc-seconds. You can see it. That doesn’t mean that your vision has a resolution of 0.005936″ or better.
Epsilon Boötis is a double star in the northern constellation of Boötes. It consists of two stars separated by 2.8 arcseconds. Even though the two individual stars have naked-eye-visible magnitudes, no human being is capable of seeing them as separate with their naked eyes.
From the Cornell University Department of Astronomy:
“[a circle is] divided into 360 degrees. A degree is divided into 60 arcminutes, and a minute is divided into 60 arcseconds. The arcsecond is the typical “working unit” of angular measure for astronomical images. It is the typical smallest unit of resolution that is delivered by a large telescope due to blurring by atmospheric “seeing” without active or adaptive telescope corrections. There are 3600 arcseconds in a degree, or 206265 arcseconds in a radian. The smallness of this dimension can be visualized by noting that an arcsecond is about the apparent size of the thickness of a human hair about 100 feet away. A human with excellent vision can see, unaided, angular separations down to about 2 arcminutes.”
So unless you can count human hairs at a distance of 200 feet, you don’t have 0.5 arcsecond resolution vision.
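For readers who want to sanity-check these units themselves, here is a quick back-of-the-envelope conversion. This is my own illustration, using only the 206265 arcseconds-per-radian figure from the Cornell quote above:

    import math

    ARCSEC_PER_RAD = 206265.0   # arcseconds per radian, per the Cornell quote

    def size_in(angle_arcsec, distance_ft):
        """Physical size (inches) that subtends a given angle at a given distance."""
        return math.tan(angle_arcsec / ARCSEC_PER_RAD) * distance_ft * 12.0

    print(size_in(1.0, 100))   # ~0.006 in: roughly a human hair at 100 ft
    print(size_in(0.5, 200))   # ~0.006 in: 0.5 arcsec is that same hair at 200 ft
    print(size_in(60.0, 9))    # ~0.031 in: 1 arcminute at the 9 ft average viewing distance

For comparison, a single pixel on a 55″ 4k panel is about 0.012 inches wide, so at 9 feet a 1-arcminute patch of a 20/20 viewer’s vision covers roughly two and a half of those pixels.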
And I didn’t say you did have resolution to .5 arc seconds. Read again.
My point was that detail beyond the point where two lines can be distinguished as two lines is still perceptible and meaningful.
A 4k screen will, actually, be better than a 1080p screen at distances well in excess of those claimed in the article.
You wrote: “My point was that detail beyond the point where two lines can be distinguished as two lines is still perceptible and meaningful.”
Based on what? I’ve spent decades as an amateur astronomer and telescope maker. I just retired from a job where I built and tested satellites, including those that carried tele-optical instruments. I’ve never heard or read a claim that the minimum angle subtended by a barely visible object was a measure of the ability of the eye to distinguish detail.
“A 4k screen will, actually, be better than a 1080p screen at distances well in excess of those claimed in the article.”
Can you cite some peer-reviewed studies or statements from experts in human vision that support that contention?
Does your calculation consider factors like motion blur which would further decrease the benefit of high resolution?
hdhead, no it does not consider motion blur. Depending on the type of flat panel (such as LCD), there will be a tremendous loss in resolution during high motion content.
It is most likely true that the human eye cannot see a difference in resolution the majority of the time. Most of you are far more intelligent than I am, and I don’t have data that supports either side. Interesting to me is that earlier in this debate it was mentioned that color and contrast are more important than resolution. This may be true as well, and it does make sense. It would also appear to me that 4 times the pixels would give both color and contrast more depth and consistency within the picture. If that is true, it could account for people believing that there is a true difference in the quality of the resolution. In the end, the level of enjoyment you get is what matters.
It is said people can’t see the shutters move on active 3D glasses, yet many people I know, and I myself, feel nausea and/or headaches after wearing them for movie-length periods of time. This brings me to my next point. Passive 3D reduces the resolution of the video you are watching, but it is also brighter in most cases, and the glasses themselves are more enjoyable to wear. The pixel density of 4k, and even more so of 8k, makes passive 3D very viable. For those of you who don’t know, passive 3D blocks every other horizontal line from each eye, which makes the resolution 1920×540 per eye on a 1080p set (with the full 1920 horizontal pixels visible to both eyes the entire time). With the level of detail of 4k and 8k, we could have images actually erupting from the screen in your living room. This is huge and cannot be overlooked.
Beyond that, physical discs for this content don’t even have to be available. Most people in my age group do not use hard copies anymore. As long as there is online and digital content, the format will last. No, the content isn’t there yet, but it will be, and it will be viable.
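As a quick illustration of the per-eye arithmetic for line-interleaved (passive) 3D described in the comment above, here is a short sketch (my own simplification: it just halves the vertical line count per eye):

    # Per-eye resolution with line-interleaved (passive) 3D: each eye sees
    # every other horizontal line, so vertical resolution is halved per eye.
    panels = {"1080p": (1920, 1080), "4k": (3840, 2160), "8k": (7680, 4320)}
    for name, (w, h) in panels.items():
        print(f"{name}: {w}x{h // 2} per eye")
    # 1080p: 1920x540 per eye; 4k: 3840x1080 per eye; 8k: 7680x2160 per eye

So a passive 3D 4k set would deliver a full 1080-line image to each eye, which is the point the comment is making.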