A Brief History of High Resolution

With the consumer market still digesting 4K TVs, Sharp has already taken high-resolution technology to the next level by releasing the world’s first 8K TV this fall. Priced at over $133,000, Sharp’s 85-inch LV-85001 is aimed at a corporate market and mainly represents a brand positioning move for Sharp. By being the first to release an 8K TV, Sharp gets a jump on Samsung, which previewed its own 110-inch 8K TV this January at the Consumer Electronics Show before announcing in June that an 11K TV is being developed for the 2018 Pyeongchang Winter Olympics. LG also debuted a 98-inch 8K TV at this year’s CES, indicating how the competition is heating up in the emerging 8K market.

As technology continues to race ahead, consumers trying to keep up might take a moment to reflect on how high resolution got to where it is in order to better understand where it might be going.

The Birth of High Resolution

Today’s high-resolution revolution represents the flowering of seeds sown in the early 20th century at the dawn of television, a long history documented most fully by Philip J. Cianci in his HDTV Archive Project. For the first half-century of television, resolution was measured in lines per screen rather than pixels. Pioneering TV high-resolution efforts in the 1930s and 1940s had 240 to 819 lines per screen, improving upon previous resolutions that used as few as 12 lines. The new resolution used a display method known today as progressive scanning, where each line of an image is displayed in sequence, in contrast to interlaced scanning, the traditional analog broadcast method in which the odd-numbered lines and then the even-numbered lines are drawn in alternating passes.
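
The contrast between the two methods is easy to see in a short sketch. The snippet below is purely illustrative (it is not drawn from any broadcast specification); it simply lists the order in which each method draws the lines of a frame:

    # Illustrative only: the order in which each scanning method draws a frame's lines.

    def progressive_order(lines):
        """Progressive scanning: every line is drawn in sequence, top to bottom."""
        return list(range(1, lines + 1))

    def interlaced_order(lines):
        """Interlaced scanning: odd-numbered lines form one field, even-numbered lines a second field."""
        odd_field = list(range(1, lines + 1, 2))
        even_field = list(range(2, lines + 1, 2))
        return odd_field + even_field

    print(progressive_order(8))  # [1, 2, 3, 4, 5, 6, 7, 8]
    print(interlaced_order(8))   # [1, 3, 5, 7, 2, 4, 6, 8]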

In 1953, analog color TV debuted on U.S. markets with 525 lines, establishing the NTSC color standard. Europe followed up in the 1960s by introducing the 625-line SECAM and PAL standards. Japan began developing a 5:3 ratio HDTV system for commercial TV in 1979. A demonstration to President Ronald Reagan in 1981 spurred U.S. interest in developing HDTV. However, bandwidth barriers limited widespread adoption of analog HDTV, and digital would prove to be TV’s future direction.

The Coming of Pixels

By the 1980s, home computer users had begun to speak of screen resolution in terms of pixels. The term originated as an abbreviation of “picture element” in 1965 and was first used by image processing and video coding experts. During the late 1960s and early 1970s, TV specialists began using this terminology to refer to unit cells of TV image sensors, which capture the information that creates screen images. In the late 1970s the term “pixel” began appearing in textbooks, leading to more widespread usage.

Early home computers such as the TRS-80 and Commodore PET used cathode ray tubes to create monochrome displays. In 1977, the Apple II introduced color CRT display to home computers by adapting the NTSC color signal. The Apple II achieved a resolution of 280 pixels horizontally by 192 pixels vertically.

Computer screen resolution continued to advance, with IBM introducing the 640×480 VGA standard in 1987. Since then, demand for digital video and video games has driven resolution ever upwards. Desktop monitors have now reached a standard resolution of 2560×1600, while mobile devices range from 240×320 for the smallest devices to 1536×2048 for iPad Retina displays.

Digital Transforms TV

As the computer revolution has advanced, technological innovations have transformed TV displays. Throughout the 1980s and early 1990s, the Advanced Television Systems Committee (ATSC), working with the FCC, reviewed different proposals for future TV standards, as the University of Colorado Boulder summarizes. By mid-1991, a decision had been made to shift from analog to digital TV. In 1996, the FCC officially adopted the digital ATSC standard for future DTV/HDTV broadcasting. The ATSC’s new HDTV system publicly launched in October 1998 with a broadcast of astronaut John Glenn’s flight on the Space Shuttle Discovery.

HDTV uses a resolution of 1920×1080, known as 1080p, which works out to 2,073,600 pixels (about 2.07 megapixels) per frame. Today’s 4K Ultra HDTV bumps this up to 3840×2160, known as 2160p, which amounts to four times the number of pixels, or twice the resolution in each dimension. 8K would increase this again to 7680×4320.
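
The pixel arithmetic behind these labels is easy to check. The short sketch below is purely illustrative; the figures come from nothing more than multiplying the resolutions quoted above:

    # Illustrative check of the pixel counts behind 1080p, 4K, and 8K.
    RESOLUTIONS = {
        "1080p (HDTV)": (1920, 1080),
        "2160p (4K UHD)": (3840, 2160),
        "4320p (8K)": (7680, 4320),
    }

    hd_pixels = 1920 * 1080  # 2,073,600 pixels, about 2.07 megapixels

    for name, (width, height) in RESOLUTIONS.items():
        pixels = width * height
        print(f"{name}: {pixels:,} pixels ({pixels // hd_pixels}x the pixels of HDTV)")

    # 1080p (HDTV): 2,073,600 pixels (1x the pixels of HDTV)
    # 2160p (4K UHD): 8,294,400 pixels (4x the pixels of HDTV)
    # 4320p (8K): 33,177,600 pixels (16x the pixels of HDTV)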

This resolution increase has accompanied a shift from cathode ray tube displays to other display methods. In the 1990s, plasma TVs and liquid crystal display (LCD) TVs introduced a trend towards thinner and lighter sets. By 2006, LCDs had proven more popular due to lower prices. LCDs create colored images by selectively blocking and filtering a white backlight (typically LED in current sets) rather than emitting light directly. OLED improves on this by emitting colored light directly from each pixel, allowing for greater contrast. Sharp’s 8K TV retains LCD’s backlit approach but uses a blue backlight with nano-sized quantum dots to deliver brightness and contrast comparable to OLED at a lower cost.

3D and the Limits of the Eye

The purpose of all this resolution is ultimately to better simulate the 3D experience of actual vision. However, there may be a limit to how far resolution can usefully go. When the iPhone 4 was released, Steve Jobs claimed that the human eye cannot detect smartphone resolution beyond 300 pixels per inch (PPI). Sharp, however, believes the eye can actually detect 1,000 PPI, which helps explain its 8K venture. Only time will tell.
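
For context, pixel density is simply the screen’s diagonal length in pixels divided by its diagonal size in inches. The sketch below is an illustration of that formula (it is not taken from the article), using the iPhone 4’s 640×960 panel and its roughly 3.5-inch diagonal:

    import math

    def pixels_per_inch(width_px, height_px, diagonal_in):
        """Pixel density: diagonal length in pixels divided by diagonal size in inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    # The iPhone 4's 640x960 panel on a roughly 3.5-inch diagonal works out to
    # about 330 PPI, close to the 326 PPI figure Apple quoted at launch.
    print(round(pixels_per_inch(640, 960, 3.5)))  # 330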
