Friday, January 18, 2013

What is 4K? Next-generation resolution explained

The 84-inch LG 84LM9600 is the largest LCD released on the market so far, and one of the first with 4K resolution.
(Credit: LG)

If you keep track of next-generation TV technology, you're going to start hearing a lot more about 4K or Ultra HD. Here's what it is and why it exists.


As if LED and 3D TV weren't confusing enough, in the last few months we've seen the arrival of a new HDTV technology called 4K or, to use its official name, Ultra HD. It's being heralded as the next high-def, and judging by the show floor at CES 2013, manufacturers are lining up to bring you a new array of products.
But just as was the case with 3D, it's the hardware chicken before the software egg: there's no consumer 4K content available. Still, if you listen to the industry, it'll tell you it's the last resolution you'll ever need. So what is 4K anyway, and what makes it different from high definition?

Digital resolutions: A primer

The latest in a line of broadcast and media resolutions, 4K/UHD is due to replace 1080i/p (1,920x1,080 pixels) as the highest-resolution signal available for movies and, perhaps, television.
Though there are several different standards, "4K" in general refers to a resolution of roughly 4,000 pixels wide and about 2,000 pixels high -- double the width and double the height of 1080p, or the equivalent of four full 1080p screens arranged in a two-by-two grid. Currently 4K is a catch-all term for a number of standards that are reasonably close to that resolution, and the TVs we'll see this year labeled 4K will actually be Ultra HD, which is defined below. But frankly, we think 4K is the catchier name.
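If you want to check the arithmetic yourself, a few lines of Python will do it (our own back-of-the-envelope illustration, using the Ultra HD figures defined below):

full_hd = 1920 * 1080       # 2,073,600 pixels in a full HD frame
ultra_hd = 3840 * 2160      # 8,294,400 pixels in an Ultra HD frame
print(ultra_hd // full_hd)  # 4 -- exactly four times the pixels of 1080p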
Meanwhile, high definition (HD) itself has been with us for about a decade and is used in Blu-ray movies and HD broadcasts. There are three versions of HD: 1080p (progressive, also known as full high definition), 1080i (interlaced), and 720p (progressive, sometimes called simply "high definition").
Most television programs and all DVDs are encoded in standard definition (480 lines). Standard definition is the oldest resolution still in use: it began life as analog NTSC broadcasts and went digital with the ATSC standard, a transition completed in the U.S. in 2009.

Four resolutions compared: standard definition; full high definition; Quad HD; and 4K/2K.
(Credit: CNET)

The beginnings of digital cinema

The roots of 4K are in the theater.
When George Lucas was preparing to make his long-promised prequels to the "Star Wars" movies in the late '90s, he was experimenting with new digital formats as a replacement for film. Film stock is incredibly expensive to produce, transport, and store. If movie houses could simply download a digital movie file and display it on a digital projector, they could save a lot of money. In a time when cinemas are under siege from on-demand cable services and streaming video, cost-cutting helps to keep them competitive.
After shooting "The Phantom Menace" partly in HD, Lucas shot "Attack of the Clones" fully digitally in 1080p. This was great for the future Blu-ray release, but the boffins soon found that 1080p wasn't a high enough resolution for giant theater screens. If you sit in the front rows of one of these theaters while it's displaying 1080p content, you may see a softer image or the lattice grid of the pixel structure, which can be quite distracting.
The industry needed a resolution that would hold up even if the audience sat at the optimum distance -- one-and-a-half times the screen height -- or closer, and found that this required something beyond 1080p. The Digital Cinema Initiatives (DCI) consortium was formed in 2002 with the goal of setting a digital standard. Based on these efforts, two new resolutions came about: a 2K specification (2,048x1,080 pixels), and later in 2005, the 4K format (4,096x2,160 pixels).
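Where does that "beyond 1080p" conclusion come from? Here's a rough Python sketch of the reasoning (our own illustration, assuming the common rule of thumb that the eye resolves about one arcminute per line; the DCI's actual testing was more involved):

import math

viewing_distance = 1.5  # measured in screen heights
# At this distance the picture spans about 37 degrees vertically.
angle_deg = 2 * math.degrees(math.atan(0.5 / viewing_distance))
resolvable_lines = angle_deg * 60  # one line per arcminute of acuity
print(round(resolvable_lines))     # ~2212 -- well beyond 1080p's 1,080 lines, and close to 4K's 2,160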
The first high-profile 4K cinema release was "Blade Runner: The Final Cut" in 2007, a new cut and print of the 1982 masterpiece. Unfortunately, at that time very few theaters were able to show it in its full resolution. It would take one of director Ridley Scott's contemporaries to truly drive 4K into your local cineplex.

The 4K 'standard'

"4K is at the point of diminishing returns." --Dr. Dave Lamb of 3M Laboratories
Despite the industry's best intentions, there is still no single 4K standard -- there are five or more different shooting resolutions available. In cinemas, you see projectors based on the DCI specification.
In August 2012, the Consumer Electronics Association attempted to clarify the situation for the home by introducing the term Ultra High Definition, defined as resolutions of "at least 3,840x2,160 pixels". However, the next day Sony muddied the waters by saying it would call the technology "4K Ultra High Definition".
The HDMI organization recently added two types of 4K support to its latest 1.4 specification: "Quad HD" (strictly 3,840x2,160 pixels) and 4K/2K, also called 4Kx2K (4,096x2,160 pixels). While only Quad HD conforms to the classic 16:9 ratio of modern television screens, both Quad HD and 4K/2K qualify as "Ultra High Definition".
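The aspect-ratio difference is easy to verify with a short Python sketch (ours, not anything from the HDMI spec):

from math import gcd

def aspect(width, height):
    divisor = gcd(width, height)
    return f"{width // divisor}:{height // divisor}"

print(aspect(3840, 2160))  # 16:9 -- the shape of today's TVs
print(aspect(4096, 2160))  # 256:135, roughly 1.90:1 -- the wider cinema shape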
Meanwhile, some industry experts have questioned the necessity of 4K as a home format, given the lack of content and the need for very large displays to appreciate the extra resolution.
"There was a huge, noticeable leap from standard definition to HD, but the difference between 1080p and 4K is not as marked," said researcher Dave Lamb of 3M Laboratories.
Lamb added that "4K is at the point of diminishing returns," but there could be some benefits for screens over 55 inches.

3D

Parts of "The Phantom Menace" were shot digitally, and the film enjoyed a new lease on life in early 2012 with a 3D cinema release.
(Credit: 20th Century Fox/Lucasfilm Ltd.)
Did you see James Cameron's "Avatar" in 3D in the theater? Then you've seen 4K in action. Cameron's movie about "giant blue dudes" helped drive high-resolution 4K Sony projectors into theaters around the world, and made a lot of money in the process. Movie studios keen to maintain that momentum have released a slew of 3D films -- mostly converted from 2D -- and continued the expansion of 3D cinemas.
However, this forward motion hasn't translated into success for 3D TV in the home.
"Manufacturers would have wanted 3D to be bigger than it was; they wanted it to be the next LED, but it didn't work out," Lamb said.
Given the so-far-mediocre response to 3D, and the expense and bulk of active glasses, manufacturers have begun searching for an alternative, and 4K offers a way to increase the quality of the 3D image with passive glasses, or to get rid of the glasses altogether.

In-home 4K, now and the future

4K TVs will be big and expensive for the next couple of years.
Most companies have committed to releasing 4K displays in 2013, and in the absence of 4K media to watch, the main benefit would seem to be the enhancement of 3D quality. The resolution disadvantages of LG's passive 3D system can, in theory, be overcome by doubling the number of horizontal and vertical pixels, allowing 4K passive displays like the LG 84LM9600 to deliver full 1080p to each eye.
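The line counting behind that claim is straightforward (a quick Python illustration of our own; passive 3D splits the panel's rows between your two eyes):

def lines_per_eye(panel_lines):
    # Passive 3D shows odd rows to one eye and even rows to the other,
    # so each eye sees half the panel's vertical resolution.
    return panel_lines // 2

print(lines_per_eye(1080))  # 540 -- the compromise on today's passive sets
print(lines_per_eye(2160))  # 1080 -- a 4K panel restores full HD per eye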
The first consumer-grade 4K panel to hit the U.S. market was the LG 84LM9600, which features a UHD resolution of 3,840x2,160 pixels and currently goes for $17,000. Meanwhile, Sony's 84-inch XBR-84X900 TV will set you back $25,000. More TVs will be coming your way in 2013.
Sony announced its 4K home-theater projector, the VPL-VW1000ES, in September 2011, but it doesn't sell the product through its Web site or stores, instead offering it directly through custom installers. Meanwhile, JVC announced four projectors in 2011 that upscale 1080p content to 4K but are currently unable to display native 4K content.
In the absence of 4K content, players and displays will need to upscale 1080p or even standard-definition content. To this end, Sony has a Blu-ray player, the BDP-S790, that will upscale to 4K. Sony has also announced it will bundle a movie server that has 4K films stored on it with its X900 TV.
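For a feel for what upscaling actually does, here's a minimal Python sketch using the Pillow imaging library (the filenames are hypothetical, and real-world upscalers in TVs and players use far more sophisticated, motion-aware processing):

from PIL import Image

frame = Image.open("frame_1080p.png")  # a hypothetical 1,920x1,080 source frame
upscaled = frame.resize((3840, 2160), Image.LANCZOS)  # interpolate up to Ultra HD
upscaled.save("frame_uhd.png")
# Interpolation invents no new detail; it only makes a smoother guess at the
# pixels in between -- which is why native 4K content still matters.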
Looking to the future, Sony is reportedly keen to have the forthcoming "Spider-Man" reboot become one of the first 4K Blu-ray movies, and is apparently in talks with the Blu-ray Disc Association to finalize the specification.
Tim Alessi, director of home electronics development at LG, said he believed that such a development was not only inevitable but also potentially valuable.
"I do expect that at some point [4K] will be added [to the Blu-ray specification]. Having that content in the home is what the average consumer will want," Alessi said.
Just when we thought we had it all covered, 4K may not even be the final word in resolution. Japanese broadcaster NHK was the first to demonstrate 8K in 2008, and at CES 2012 there were industry murmurings -- and at least one prototype -- devoted to higher-than-4K resolution.

Conclusion

Will the extra resolution offered by 4K make movies better? You could argue that it depends on the format of the original film. For example, "The Blair Witch Project" and "28 Days Later" were both shot with standard-definition camcorders, and there would arguably be little extra benefit to buying either movie in a 4K native format over a DVD -- depending on the quality of the scaler in your brand-new 4K screen, of course.
Even with reference-quality native 4K material, however, a 4K-resolution TV or projector won't provide nearly the visible improvement over a standard 1080p model that going from standard-def to high-def did. To appreciate it, you'll have to sit quite close to a large screen -- sort of like being in the front few rows of a movie theater.
But whether it's 4K or 8K, you can bet that manufacturers haven't run out of cards when it comes to trying out the next "must-have" feature in the coming crops of televisions.
http://reviews.cnet.com/8301-33199_7-57364224-221/what-is-4k-next-generation-resolution-explained/?ttag=fbwp
