Every so often discussion of the ever-higher resolution of TV screens generates articles purporting to prove that you can't see the improvement unless you sit a few feet from the largest available screen. Most of these articles make the same three mistakes:
Fallacy 1: Normal vision is 20/20 vision
The term "20/20 vision" means only that you can see as well as a "normal" person. In practice this means that it is the lower threshold below which vision is considered to be in need of correction; most people can see better than this, with a few acheiving 20/10 (that is, twice the resolution of 20/20).
Fallacy 2: Pixel size = Resolution
If a screen has 200 pixels per inch then its resolution is, at best, only 100 lines per inch, because otherwise you could not distinguish one thick line from two separate lines. For the more technically minded, this is the spatial version of the Nyquist limit. Wikipedia has a very technical article, but the classic demonstration is a low-resolution photograph of a brick wall:
The pixel pitch is close to the height of a brick, producing a moiré pattern: in some areas the pixels sample the middle of a brick, and in others they sample the white mortar.
So the resolution of the screen in the horizontal or vertical direction is half its pixel density. But it gets worse at any other angle, because the pixels are arranged in a square grid: the diagonal neighbours of a pixel are 1.4 times (√2) further apart than the horizontal and vertical ones, so the worst-case resolution is the pixel density divided by 2 × 1.4 = 2.8. Call it 3 for round numbers.
So the conclusion is that the actual resolution of the picture on your screen is about a third of its pixel density.
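If you want to check that arithmetic, here is a minimal Python sketch; the 200 pixels-per-inch figure is just the example used above, not any particular screen's specification:

    import math

    pixels_per_inch = 200   # example figure from above, not a real screen spec

    # Nyquist: one black line and one white line need two pixels, so the
    # best-case resolution is half the pixel density.
    best_case_lines_per_inch = pixels_per_inch / 2

    # Diagonal neighbours in a square grid are sqrt(2) (about 1.4) times further
    # apart, so the worst case divides by 2 * sqrt(2) instead of 2.
    worst_case_factor = 2 * math.sqrt(2)    # about 2.83, call it 3
    worst_case_lines_per_inch = pixels_per_inch / worst_case_factor

    print(f"Best case:  {best_case_lines_per_inch:.0f} lines per inch")   # 100
    print(f"Worst case: {worst_case_lines_per_inch:.0f} lines per inch")  # about 71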
Fallacy 3: Resolution beyond visual acuity is a waste
The argument here seems to be that if HDTV resolution is better than my eyesight then getting HDTV is a complete waste, and I would be better off sticking with my standard definition TV.
Clearly this is wrong: as long as my visual resolution outperforms my TV, I will get a better picture by switching to a higher definition format.
So when does HDTV become worth it?
20/20 vision is generally considered to be a resolution of 1 arc-minute. If we take the naive approach embodying all three fallacies, then one pixel on a 40 inch HDTV screen subtends 1 arc-minute at a distance of 62 inches, so some articles on the subject have claimed that unless you sit closer than that you don't get any benefit.
However, on that 40 inch screen a standard definition pixel will be roughly twice the size (depending on which standard and what you do about the 4:3 aspect ratio on the 16:9 screen), so it will subtend 1 arc-minute at around 124 inches (just over 10 feet). Applying the factor of 3 from Fallacy 2, the worst-case detail is about three pixels across, so with 20/20 vision you will be able to separate two diagonal lines separated by one pixel out to roughly 3 × 124 inches, about 30 feet, and with 20/10 vision that goes out to 60 feet. So if you sit less than 30 feet from a 40 inch screen you will get a visibly better picture with HDTV than with standard definition.
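Here is a short Python sketch of those viewing-distance figures. It assumes a 16:9 panel with 1920 horizontal pixels for HD, and simply treats a standard definition pixel as twice the width of an HD one, which matches the rough figures above:

    import math

    ARCMIN = math.radians(1 / 60)           # one arc-minute in radians

    def pixel_width(diagonal_in, horizontal_pixels):
        """Width in inches of one pixel on a 16:9 panel of the given diagonal."""
        screen_width = diagonal_in * 16 / math.hypot(16, 9)
        return screen_width / horizontal_pixels

    def max_useful_distance(pixel_in, acuity_arcmin=1.0, worst_case_factor=3.0):
        """Distance in inches out to which worst-case detail (about 3 pixels
        wide, per Fallacy 2) can still be resolved at the given acuity."""
        return pixel_in * worst_case_factor / math.tan(acuity_arcmin * ARCMIN)

    hd_pixel = pixel_width(40, 1920)        # about 0.018 inches
    sd_pixel = 2 * hd_pixel                 # crude "twice the size" assumption

    print(f"HD pixel subtends 1 arc-minute at {hd_pixel / math.tan(ARCMIN):.0f} inches")   # ~62
    print(f"SD pixel subtends 1 arc-minute at {sd_pixel / math.tan(ARCMIN):.0f} inches")   # ~125
    print(f"SD vs HD visible within {max_useful_distance(sd_pixel) / 12:.0f} ft (20/20)")  # ~31
    print(f"SD vs HD visible within {max_useful_distance(sd_pixel, 0.5) / 12:.0f} ft (20/10)")  # ~62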
And what about Ultra HD?
With 20/20 vision you can just about distinguish two diagonal lines one pixel apart on a 40 inch HDTV screen from 15 feet away, and 30 feet if you have 20/10 vision. So if you sit closer to the screen than that then you will get a better picture with Ultra HD. And of course Ultra HD sets are often bigger than 40 inches. If you have a 60 inch set then the difference is visible up to 23 feet away with 20/20 vision and 46 feet with 20/10.
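The same arithmetic, applied to the question of when Ultra HD beats HDTV (again assuming a 16:9 panel and 1920 horizontal pixels for HD), gives figures close to the ones above, give or take rounding:

    import math

    ARCMIN = math.radians(1 / 60)           # one arc-minute in radians

    def hd_limit_ft(diagonal_in, acuity_arcmin=1.0, worst_case_factor=3.0):
        """Distance in feet out to which a 1920-wide, 16:9 HD pixel grid still
        shows worst-case detail that a higher resolution would render better."""
        hd_pixel = diagonal_in * 16 / math.hypot(16, 9) / 1920
        return hd_pixel * worst_case_factor / math.tan(acuity_arcmin * ARCMIN) / 12

    for size in (40, 60):
        print(f'{size}" screen: Ultra HD visibly better within '
              f"{hd_limit_ft(size):.1f} ft (20/20) or {hd_limit_ft(size, 0.5):.1f} ft (20/10)")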
So higher resolutions are not just marketing hype.
Final point: Compression artifacts
Digital TV signals are compressed to fit into the available bandwidth. This shows up as compression artifacts: if there is a lot of movement across the image you may see it become slightly blocky, and if you freeze the image you can often see a kind of halo of ripples around sharp edges. Higher definition pictures are encoded with more data, which reduces these artifacts. So even setting aside the increased resolution, you may still see an improved picture in a higher definition format.