Television resolutions have become a moving target in the last ten years—every time a consumer decides to jump in with both feet and buy what appears to be the latest model, better screens seem to appear on the shelves within weeks. TVs took decades to go from standard to high-definition resolution. Only a few years later, “Ultra HD” is the new benchmark.
At CES 2012, a few companies showed “4K” displays, which pack four times the pixel count of a full HD display: 2160 lines of resolution (usually 3840×2160) versus 1080 lines (usually 1920×1080).
This year, all companies seem to have transitioned to showing “Ultra HD” displays instead of “4K” ones. Where did 4K go? Why are we back to describing displays in terms of HD?
Ultra HD, like vanilla HD, is a term defined by the International Telecommunication Union (ITU), a body that has existed since 1865 and, as an agency of the United Nations, allocates global radio spectrum. One of the ITU’s sectors sets standards in areas like networking, signaling protocols, and telecommunications (which includes television resolutions).
The terms “Ultra HD” and 4K have coexisted for some time. The first Ultra HD prototype was developed by NHK Science & Technology Research Laboratories in Japan (the same lab that developed HD) back in 2003, for which the lab had to build a special camera capable of capturing sufficiently detailed footage. But just as the term “HD” before it technically covers both 720p and 1080p screens, “Ultra HD” describes two resolutions: 4K, or 2160p, as well as 8K, or 4320p, which is detailed enough to rival IMAX.
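For readers who want the arithmetic spelled out, the short sketch below (plain Python, using the commonly cited pixel dimensions for each format rather than any official ITU table) computes the pixel count of each resolution and the multiple of full HD it represents.

```python
# Pixel-count arithmetic for the commonly cited display resolutions.
RESOLUTIONS = {
    "Full HD (1080p)":     (1920, 1080),
    "Ultra HD 4K (2160p)": (3840, 2160),
    "Ultra HD 8K (4320p)": (7680, 4320),
}

base = 1920 * 1080  # full HD pixel count, used as the reference

for name, (width, height) in RESOLUTIONS.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels // base}x Full HD)")

# Output:
# Full HD (1080p): 2,073,600 pixels (1x Full HD)
# Ultra HD 4K (2160p): 8,294,400 pixels (4x Full HD)
# Ultra HD 8K (4320p): 33,177,600 pixels (16x Full HD)
```

Doubling both dimensions quadruples the pixel count, which is why 4K is four times full HD and 8K is sixteen times.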
