On 7 June 2010, Steve Jobs launched the latest and most advanced smartphone on the market: the iPhone 4. Among the many features touted as the ‘best ever’, one boast particularly caught my eye (pun intended). Jobs introduced a new display called ‘Retina Display’ which, he said, was past the limit at which the human eye can make out individual pixels. The screen packed 326 pixels into every inch of the screen, a measure commonly known as ppi (pixels per inch). Jobs claimed that the human eye could only resolve 300ppi at a typical reading distance and that Apple had surpassed that.

Jobs got a lot of things right, but on this one he got it wrong.

The resolution limit of the human eye is a complicated subject, but I will oversimplify it somewhat. Whether the eye can discern individual pixels depends on the person’s visual acuity and their distance from the object. A person with 20/15 vision holding an object 25cm from their eyes can distinguish around 460 pixels per inch. With 20/20 vision this drops to around 340ppi. Still above the claim made by Jobs.
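To make the arithmetic concrete, here is a short Python sketch of the usual rule of thumb behind these figures: 20/20 vision resolves detail subtending about one arcminute, and 20/15 vision about 0.75 arcminutes. The function name is my own, and this simple model lands within a few ppi of the figures above rather than matching them exactly.

```python
import math

def max_discernible_ppi(distance_cm, acuity_arcmin=1.0):
    """Approximate highest pixel density (ppi) a viewer can resolve.

    Rule of thumb: 20/20 vision resolves detail subtending about
    one arcminute; 20/15 vision resolves about 0.75 arcminutes.
    """
    distance_in = distance_cm / 2.54
    # Smallest resolvable feature at this distance, in inches
    feature_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / feature_in

print(round(max_discernible_ppi(25, 1.0)))    # 20/20 at 25cm -> 349
print(round(max_discernible_ppi(25, 0.75)))   # 20/15 at 25cm -> 466
```

The same model, applied at four metres with 20/15 vision, gives the 29ppi figure quoted for televisions below.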

Fast forward eight years and we now have phones that have managed to crack the 800ppi barrier. Even someone with 20/15 vision holding the phone 15cm from their face would struggle to pick out individual pixels, so you could say we have pretty much reached the limit of the human eye.

What about televisions? Their screens are much larger and therefore need a much larger number of pixels to have the same pixel density. We were all excited when High Definition (full HD) televisions started appearing. These had a resolution of 1920 pixels by 1080 pixels and, on some of the first 42-inch plasma televisions, this resulted in a pixel density of 52ppi. That doesn’t sound impressive compared to mobile phones, but we typically view a television at a distance of four metres. At that distance, someone with 20/15 vision can only see 29ppi.

Jump forward to today and we now have televisions at 4K (3840 x 2160), and cricket broadcasts in Australia are now matching that resolution. A modern large-screen (75-inch) 4K television has a pixel density of 58ppi. Unless you sit closer than two metres to the screen, you won’t pick out each pixel. Surely that seems like enough.
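The pixel-density figures for these televisions fall straight out of the resolution and the diagonal screen size: count the pixels along the diagonal and divide by the diagonal length in inches. A quick Python sketch (the function name is my own):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: pixels along the diagonal divided by the
    diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 42)))   # Full HD on a 42-inch plasma -> 52
print(round(ppi(3840, 2160, 75)))   # 4K on a 75-inch screen -> 59
```

The 4K figure works out to 58.7ppi, quoted above rounded down to 58.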

This is technology – where too much of anything is never enough.

Japanese broadcaster NHK (the equivalent of our ABC) launched a new service at the beginning of this month: the first broadcast in the world at 8K. Fittingly, the program chosen was 2001: A Space Odyssey. This film was made back in 1968, but Warner Bros scanned the original negatives at 8K resolution for the broadcast. Broadcasting in 8K is a major breakthrough – except I doubt anyone actually watched it in 8K. There are only a handful of 8K televisions available, and their price tag is enough to give you a trip to the movies every night for several years. A couple of the major manufacturers have launched 8K televisions at 85-inch and 88-inch, but the price tag is the equivalent of a small car. When we do the maths on the pixel density, it becomes quite interesting. An 88-inch 8K television has a pixel density of a staggering 100ppi. Even at just over a metre from the screen, I would defy you to spot an individual pixel.
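Inverting the earlier acuity arithmetic gives the closest distance at which a given pixel density blurs into a smooth image. A Python sketch assuming 20/15 vision at the rule-of-thumb 0.75 arcminutes (the function name is my own):

```python
import math

def min_distance_cm(ppi, acuity_arcmin=0.75):
    """Closest viewing distance (cm) at which pixels of the given
    density blur together for a viewer resolving acuity_arcmin."""
    pixel_in = 1 / ppi                    # size of one pixel, in inches
    distance_in = pixel_in / math.tan(math.radians(acuity_arcmin / 60))
    return distance_in * 2.54

print(round(min_distance_cm(100)))   # 100ppi (88-inch 8K) -> 116 cm
```

Roughly 1.2 metres, consistent with the ‘just over a metre’ figure above.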

Once 8K televisions become common, the content will need to match. NHK may have had one film’s negatives rescanned for its launch, but I can’t see the studios rescanning every film ever made. That doesn’t mean we should give up – just don’t have high expectations on day one. Studios will start to produce content in 8K and manufacturers will slowly drop prices. It is just the next iteration in the constant process of technological improvement that humans are on.

Did I hear someone say 16K?

Mathew Dickerson
