(Photo by Nicolas J Leclercq via Unsplash)

By Stephen Beech

Forking out on an ultra-HD TV may be a waste of money for many people, suggests new research.

The human eye has a "resolution limit" - meaning it can't tell the difference between a 4K and an 8K ultra-high-definition television, according to a new University of Cambridge study.

In other words, there are only so many pixels the eye can see. Above that limit, a screen is giving our eyes more information than they can detect.

To calculate the resolution limit, the research team measured participants' ability to detect specific features in color and greyscale images on a screen, whether they looked at the images straight on or through peripheral vision, and whether the screen was close to them or further away.

They explained that the precise resolution limit depends on several factors, including the size of the screen, the darkness of the room, and the distance between the viewer and the screen.


(Photo by cottonbro studio via Pexels)

However, for an average-size UK living room, with 2.5 meters (8.2ft) between the telly and the sofa, a 44-inch 4K or 8K TV would not provide any additional benefit over a lower resolution Quad HD (QHD) TV of the same size.

The research team has also developed a free online calculator where users can enter the size of their room and the dimensions and resolution of their TV to determine the most suitable screen for their home.

They say shoppers buying a new TV are bombarded with technical information from manufacturers seeking to persuade them that the display resolution of their screens - whether Full HD, 4K or 8K - offers the best viewing experience.

Display resolution is considered equally important for other screens we use, on our phones or computers, whether we’re using them to take pictures, watch films or play games.

Study first author Dr. Maliha Ashraf said: “As large engineering efforts go towards improving the resolution of mobile, AR and VR displays, it’s important to know the maximum resolution at which further improvements bring no noticeable benefit.

“But there have been no studies that actually measure what it is that the human eye can see, and what the limitations of its perception are.”

Co-author Professor Rafał Mantiuk, also from Cambridge’s Department of Computer Science and Technology, said: “If you have more pixels in your display, it's less efficient, it costs more and it requires more processing power to drive it.


(Photo by Atlantic Ambience via Pexels)

“So we wanted to know the point at which it makes no sense to further improve the resolution of the display.”

The research team created an experimental setup with a sliding display that allowed them to measure exactly what the human eye can see when looking at patterns on a screen.

Instead of measuring the specifications of a particular screen, they measured pixels per degree (PPD): a measurement of how many individual pixels can fit into a one-degree slice of your field of vision.

The researchers explained that measuring PPD helps answer a more useful question than "how high is the resolution of this screen?"

Instead, it answers the question "how does this screen look from where I’m sitting?"
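As an illustration, PPD can be approximated from a screen's horizontal pixel count, its physical width and the viewing distance. The sketch below is not the study's method, just the standard geometry, assuming a 16:9 screen; it applies the formula to the article's 44-inch-TV-at-2.5-metres example.

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Average PPD across the screen's horizontal field of view."""
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

def width_from_diagonal(diagonal_in, aspect=(16, 9)):
    """Horizontal width in metres of a screen, from its diagonal in inches."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    return width_in * 0.0254  # inches to metres

# The article's scenario: a 44-inch TV viewed from 2.5 m
width = width_from_diagonal(44)
for name, h_pixels in [("QHD", 2560), ("4K", 3840), ("8K", 7680)]:
    print(name, round(pixels_per_degree(h_pixels, width, 2.5)), "PPD")
```

On these assumptions a 44-inch QHD screen at 2.5 m already delivers roughly 116 PPD, which is why, per the study, 4K and 8K add nothing at that distance.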

The widely accepted "20/20 vision" standard, based on the Snellen optician's chart, suggests that the human eye can resolve detail at 60 pixels per degree.
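The 60 PPD figure follows from simple arithmetic: 20/20 acuity corresponds to resolving detail about one arcminute across, and one degree contains 60 arcminutes, so one pixel per arcminute gives 60 pixels per degree. A minimal check:

```python
# 20/20 (Snellen) acuity: smallest resolvable detail is about 1 arcminute.
arcmin_per_degree = 60
resolvable_detail_arcmin = 1
ppd_limit = arcmin_per_degree / resolvable_detail_arcmin
print(ppd_limit)  # 60.0
```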

Dr. Ashraf said: “This measurement has been widely accepted, but no one had actually sat down and measured it for modern displays, rather than a wall chart of letters that was first developed in the 19th Century."

Study participants looked at patterns with very fine gradations, in shades of grey and in color, and were asked whether they were able to see the lines in the image.

The screen was moved towards and away from the viewer to measure PPD at different distances.


(Photo by Karola G via Pexels)

PPD was also measured for central and peripheral vision.

The findings, published in the journal Nature Communications, showed that the eye’s resolution limit is higher than previously believed, but that there are important differences in resolution limits between color and black-and-white.

For greyscale images viewed straight on, the average was 94 PPD. For red and green patterns, the number was 89 PPD, and for yellow and violet, it was 53 PPD.

Mantiuk said: “Our brain doesn’t actually have the capacity to sense details in color very well, which is why we saw a big drop-off for color images, especially when viewed in peripheral vision.

“Our eyes are essentially sensors that aren’t all that great, but our brain processes that data into what it thinks we should be seeing.”

The research team modelled their results to calculate how the resolution limit varies across the population, which will help manufacturers make decisions that are relevant for the majority of users - for example, designing a display that has retinal resolution for 95% of people rather than for an average observer.

Based on the modelling, the research team developed their online calculator, which enables people to test their own screens or help inform future buying decisions.
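The calculator itself isn't reproduced here, but its core comparison can be sketched: work out the screen's PPD at the viewer's distance and check it against the study's measured limits. The function names below are illustrative assumptions, not the team's code, and the limits are the averages reported above.

```python
import math

# Average limits reported in the study (viewed straight on), in PPD
LIMITS_PPD = {"greyscale": 94, "red-green": 89, "yellow-violet": 53}

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Average PPD across the screen's horizontal field of view."""
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

def beyond_limit(h_pixels, screen_width_m, distance_m, limit_ppd=LIMITS_PPD["greyscale"]):
    """True if the screen already shows more detail than the eye can resolve."""
    return pixels_per_degree(h_pixels, screen_width_m, distance_m) >= limit_ppd

# A 44-inch (about 0.97 m wide) 4K TV at 2.5 m exceeds the greyscale limit...
print(beyond_limit(3840, 0.97, 2.5))
# ...but not when viewed from only half a metre away.
print(beyond_limit(3840, 0.97, 0.5))
```

In other words, extra resolution only pays off when you sit close enough, or the screen is large enough, for its PPD to fall back below the eye's limit.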

Study co-author Dr. Alex Chapiro, of Meta Reality Labs, added: “Our results set the north star for display development, with implications for future imaging, rendering and video coding technologies."

Originally published on talker.news, part of the BLOX Digital Content Exchange.
