8-Bit Vs. 12-Bit Color in HDMI


High Definition Multimedia Interface, or HDMI, can carry millions of colors. Because the human eye is very sensitive to color gradations, it is important for the HDMI standard to accommodate them. As the standard evolves from 8-bit to 12-bit color, the number of colors and visible shades of gray that the cabling can pass increases dramatically.

8-Bit

  • The original HDMI specification called for 8-bit color per channel. The limitation of this standard is that it allows only 256 shades per color channel, which produces visible transitions not only between colors but between shades of gray. The result is banding, especially noticeable in shadowy scenes that transition from black to lighter shades.
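The arithmetic behind the 8-bit limit can be sketched in a few lines; the figures below follow directly from 8 bits per channel across the three RGB channels:

```python
# 8 bits per channel: each of red, green, and blue gets 2^8 levels.
shades_per_channel = 2 ** 8            # 256 shades per channel
total_colors = shades_per_channel ** 3 # all RGB combinations

print(shades_per_channel)  # 256
print(total_colors)        # 16777216 (~16.7 million colors)

# A black-to-white gray ramp has only 256 distinct steps, so each
# step jumps ~0.4% of full brightness -- enough to be seen as banding
# in smooth, dark gradients.
step_size = 1 / (shades_per_channel - 1)
print(round(step_size * 100, 2))  # 0.39 (percent of full brightness)
```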

Improving the Standard

  • HDMI version 1.3, introduced in 2006, adds new processing options. Known as Deep Color and xvYCC, these options expand the range of colors a signal can call for in an image. An 8-bit signal is limited to roughly 16.7 million colors, leaving out thousands of intermediate shades of each hue. Deep Color expands the palette, increasing how faithfully the complex shading and hues of the real world can be represented.

12-Bit Color

  • A major improvement over 8-bit color, 12-bit color offers up to 69 billion colors on screen at any given time. At this bit depth, the color banding that plagues 8-bit processing is gone, dramatically increasing the realism of the image. The capability is limited only by the resolution of the image, as provided by the source equipment and display devices. Only a few million picture elements, or pixels, are on screen at any given moment, so the processing in the source and display devices must correctly choose the right color for each one, hence the introduction of Deep Color. As resolutions increase, the capacity of HDMI cables must increase to keep pace, and 12-bit color depth serves in practice as a guarantee that color accuracy is maintained along the way.
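A quick calculation shows where the 69 billion figure comes from, and how the palette grows at each Deep Color depth:

```python
# Palette size at the per-channel bit depths HDMI Deep Color supports.
for bits in (8, 10, 12, 16):
    shades = 2 ** bits   # shades per channel
    total = shades ** 3  # all RGB combinations
    print(f"{bits}-bit: {shades:,} shades/channel, {total:,} colors")

# 12-bit gives 4,096 shades per channel and
# 4,096^3 = 68,719,476,736 colors -- the "69 billion" cited above.
```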

Why Bother?

  • Modern high-definition televisions are not limited in any practical sense in the colors they can display. Seeking incremental improvement for 8-bit sources like DVD, certain manufacturers have brought sets to market that add a fourth color to the pixel array, much as digital camera companies did starting around 2003. This allows more color combinations, increasing the palette available to the set's processor. At 12 bits per channel and higher, the limiting factor becomes the incoming signal. By increasing HDMI color depth to 10, 12, or 16 bits per channel (Deep Color's 30-, 36-, and 48-bit modes), color limitations become a non-factor: Deep Color-equipped hardware has a vast palette from which to select the right color. Since the human eye can distinguish around 10 million colors, this expanded palette helps ensure that the processor in the source or display device "chooses" the right hue at any given moment. Remember that this is all content-dependent; the movie or game disc must have this degree of color data encoded, be played on the right hardware, and pass through compatible cabling for it all to work.
