A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
Hmm, I suppose the quality of the TV might matter. Not to mention actually going through the settings and making sure it isn’t doing anything to process the signal. And also not streaming compressed crap to it. I do visit other people’s houses sometimes and definitely wouldn’t know they were using a 4k screen for what they were watching.
But I am assuming actually displaying 4k content to be part of the testing parameters.
Yeah well my comparisons are all with local files, no streaming compression
Also, usually when people use the term “perfect” vision, they mean 20/20. Is that the case for you too? Another term for that is average vision; people who see better than that have “better than average” vision.
Idk what 20/20 is, I guess you guys use a different scale. My last mandatory vision test at work was 12/10, with 6/7 on some color-recognition range I don’t remember, but I’m not sure about the latter 'cause it was ok last year and 6/7 the year before as well. IIRC the best score for visual acuity is 18/10, but I don’t think they test that far during work visits, I’d have to go to the ophthalmologist to know.
I would imagine it’s the same scale, just with a base of 10 instead of 20 feet. So on yours you would see at 24 feet what the average person sees at 20 feet, assuming the relation is linear and there’s no circumstantial drop-off.
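The rescaling above (12/10 becoming 24/20) can be sketched as a tiny function, assuming the two scales really are linearly proportional as the comment guesses; the function name is made up for illustration:

```python
def convert_acuity(numerator: float, denominator: float, new_base: float):
    """Rescale an acuity fraction so its denominator equals new_base.

    Assumes a purely linear relation between the scales, which is the
    assumption made in the discussion, not an optometric fact.
    """
    factor = new_base / denominator
    return (numerator * factor, new_base)

# 12/10 expressed on a 20-foot base:
print(convert_acuity(12, 10, 20))  # (24.0, 20)
```

So 12/10 would read as 24/20 on the 20-foot scale, under that linearity assumption.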
I doubt it’s feet, but if it’s a distance, I guess it doesn’t matter much