Every so often discussion of the ever-higher resolution of TV screens generates articles purporting to prove that you can't see the improvement unless you sit a few feet from the largest available screen. Most of these articles make the same three mistakes:
Fallacy 1: Normal vision is 20/20 vision
The term "20/20 vision" means only that you can see as well as a "normal" person. In practice this means that it is the lower threshold below which vision is considered to be in need of correction; most people can see better than this, with a few acheiving 20/10 (that is, twice the resolution of 20/20).
Fallacy 2: Pixel size = Resolution
If a screen has 200 pixels per inch then its resolution is, at best, only 100 lines per inch, because each resolvable line needs at least one pixel for the line and one for the gap beside it; otherwise you cannot distinguish between one thick line and two separate lines. For the more technically minded, this is the spatial version of the Nyquist limit. Wikipedia has a very technical article, but this picture demonstrates the problem:
The pixel pitch is close to the height of a brick, which produces the moiré pattern: in some areas a pixel lands on the middle of a brick, and in others it lands on the white mortar.
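Here is a rough back-of-the-envelope sketch of that halving in Python (the 200 pixels per inch is just the example figure from above):

```python
# Best-case resolution under the Nyquist limit: each resolvable line needs
# at least one pixel for the line and one for the gap next to it, so the
# line count is at most half the pixel count.
pixels_per_inch = 200                 # example density from the text above
lines_per_inch = pixels_per_inch / 2
print(f"{pixels_per_inch} px/inch resolves at best {lines_per_inch:.0f} lines/inch")
```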
So the resolution of the screen in the horizontal or vertical directions is half the pixel density. But it gets worse at any other angle, because the pixels are arranged in a grid. The diagonal neighbours of a pixel are about 1.4 times (√2) further apart than the horizontal and vertical ones, so the worst-case resolution is the pixel density divided by 2 × 1.4 = 2.8. Call it 3 for round numbers.
So the conclusion is that the actual resolution of the picture on your screen is about 1/3 of the pixel density.
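The same sketch extended to the worst case, assuming a square pixel grid:

```python
import math

# Worst-case (diagonal) resolution: diagonal neighbours are sqrt(2), about
# 1.4 times, further apart, so the Nyquist factor of 2 becomes 2 * sqrt(2),
# about 2.8 -- call it 3.
pixels_per_inch = 200                          # example density from the text
worst_case_factor = 2 * math.sqrt(2)           # ~2.83
print(f"Worst-case resolution: ~{pixels_per_inch / worst_case_factor:.0f} lines/inch "
      f"(roughly a third of the pixel density)")
```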
Fallacy 3: Resolution beyond visual acuity is a waste
The argument here seems to be that if HDTV resolution is better than my eyesight then getting HDTV is a complete waste and I would be better off sticking to my normal standard definition TV.
Clearly this is wrong: as long as my visual resolution outperforms my TV, I will get a better picture by switching to a higher-definition format.
So when does HDTV become worth it?
20/20 vision is generally considered to be a resolution of 1 arc-minute. If we use the naive approach with all three fallacies, then one pixel on a 40 inch HDTV screen subtends 1 arc-minute at a distance of 62 inches, so some articles on the subject have claimed that unless you sit closer than that you don't get any benefit.
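For anyone who wants to check that figure, here is a rough Python sketch of the naive calculation (assuming a 16:9, 1920×1080 panel and simple trigonometry):

```python
import math

# Naive "one pixel = one arc-minute" viewing distance for a 40 inch,
# 16:9, 1920x1080 screen.
diagonal_in = 40
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~34.9 inches wide
pixel_in = width_in / 1920                        # ~0.018 inches per pixel
one_arcmin = math.radians(1 / 60)
distance_in = pixel_in / math.tan(one_arcmin)
print(f"One HD pixel subtends 1 arc-minute at ~{distance_in:.0f} inches")  # ~62
```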
However, on that 40 inch screen a standard definition pixel will be roughly twice the size (depending on which standard and what you do about the 4:3 aspect ratio on the 16:9 screen), so it will subtend 1 arc-minute at around 124 inches (just over 10 feet). Applying the worst-case factor of 3 from Fallacy 2, with 20/20 vision you will be able to separate two diagonal lines one pixel apart out to a distance of about 30 feet, and with 20/10 vision that goes out to 60 feet. So if you sit less than 30 feet from a 40 inch screen then you will get a visibly better picture with HDTV than standard definition.
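Continuing the sketch with the same approximations (SD pixel roughly twice the HD pixel, worst-case factor rounded to 3):

```python
# Corrected SD-vs-HD threshold on a 40 inch screen, using the rough numbers
# above: the ~62 inch naive distance for an HD pixel, an SD pixel about twice
# that size, and a worst-case resolvable detail about 3 pixels wide.
naive_hd_in = 62          # naive 1 arc-minute distance for an HD pixel (inches)
sd_pixel_scale = 2        # SD pixel is roughly twice the HD pixel size
worst_case_factor = 3     # rounded from 2 * sqrt(2) ~ 2.8
threshold_ft = naive_hd_in * sd_pixel_scale * worst_case_factor / 12
print(f"HD visibly beats SD out to ~{threshold_ft:.0f} ft with 20/20 vision, "
      f"~{2 * threshold_ft:.0f} ft with 20/10")   # ~31 and ~62 ft (30/60 rounded)
```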
And what about Ultra HD?
With 20/20 vision you can just about distinguish two diagonal lines one pixel apart on a 40 inch HDTV screen from 15 feet away, or 30 feet if you have 20/10 vision. So if you sit closer to the screen than that, you will get a better picture with Ultra HD. And of course Ultra HD sets are often bigger than 40 inches: on a 60 inch set the difference is visible up to 23 feet away with 20/20 vision and 46 feet with 20/10.
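The same arithmetic one more time, for the distance beyond which an HD pixel stops being resolvable (and so Ultra HD stops adding visible detail):

```python
# Distance out to which Ultra HD improves on HD: the ~62 inch naive figure
# for a 40 inch HD screen, times the worst-case factor of 3, scaled up in
# proportion to screen size. 20/10 vision doubles the distance.
naive_hd_in_40 = 62
worst_case_factor = 3
for diagonal in (40, 60):
    threshold_ft = naive_hd_in_40 * worst_case_factor * (diagonal / 40) / 12
    print(f'{diagonal}" screen: UHD helps within ~{threshold_ft:.1f} ft (20/20), '
          f"~{2 * threshold_ft:.1f} ft (20/10)")   # ~15.5/31 ft and ~23/46 ft
```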
So higher resolutions are not just marketing hype.
Final point: Compression artifacts
Digital TV signals are compressed to fit into the available bandwidth, and this shows up as compression artifacts: if there is a lot of movement across the image you may see it become slightly blocky, and if you freeze the image you can often see a halo of ripples around sharp edges. Higher definition pictures are encoded with more data, so these artifacts are reduced. So even without the increased resolution you may still see an improved picture in a higher resolution format.
3 comments:
Another fallacy: resolution is the most important problem for TVs. I was in an electronics store the other day, and they had rows and rows of large, expensive TV sets, and they all looked like crap. Quantized colors, blurry images, to begin with. (I guess the garish colors can be adjusted). I find it almost funny that TVs which have HD resolution and accept an HD signal nevertheless scale and crop the signal so it no longer fits...
I find it amusing that people think that HDTV isn't worth it. They clearly haven't actually seen an HDTV signal or a Blu-ray on a decent screen. I only know one person who can't tell the difference between SD and HD versions of a simulcast channel, and she's 90 and her eyesight's gone beyond the point of being correctable to 20/20 anymore.
There's also something else, unrelated to visual resolution, but HD content usually comes with a noticeably superior audio signal - provided you have a decent set of speakers, anyway.
@Matt Walton: If being able to notice a difference makes HDTV 'worth it' then that's a bizarre notion of worth, since you've not taken the cost into consideration at all. If HDTV cost $1,000,000 then your argument still says they're 'worth it'.
Personally I think HDTV isn't worth the cost of supporting amoral practices like HDCP.
The only times I find myself caring about resolution are when online lectures are too low-quality to read the slides, but they're often available separately anyway.