Display Bit Depth Demystified


Another seldom-considered aspect of newer displays is bit depth. Most HDTVs were (and are) 8-bit, which means they can display 256 gradations, or shades, of a given color. From black to white, for instance, there are 256 shades: black, white, and 254 grays in between. Breaking it down further, each pixel of a display consists of a red, a green, and a blue subpixel, and each of these three colors has 256 gradations. Since the three combine to produce whatever color the content calls for in that pixel, each pixel of an 8-bit panel/display can reproduce 256 x 256 x 256, or 16,777,216, different colors.

Newer UHD/HDR displays are capable of 10-bit color, which expands the number of shades/gradations per channel from 256 to 1,024. With 10-bit color you get 1,024 x 1,024 x 1,024, or 1,073,741,824, different color possibilities. That is quite a difference! So what does this mean? It means that the gradation/blending of colors in scenes is smoother on a 10-bit panel. A good graphical representation of this is shown below:

[Image: comparison of 8-bit and 10-bit color gradients]
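The arithmetic above follows one simple rule: shades per channel is 2 raised to the bit depth, and total colors is that number cubed. A quick Python sketch of the rule:

```python
# Shades per channel and total colors at a given bit depth.
# Each pixel has three channels (R, G, B); every combination of
# one shade per channel is a distinct displayable color.
def shades(bits):
    return 2 ** bits          # gradations per channel

def colors(bits):
    return shades(bits) ** 3  # total R x G x B combinations

for bits in (8, 10, 12, 14):
    print(f"{bits}-bit: {shades(bits)} shades/channel, {colors(bits):,} colors")
```

Running it reproduces the figures quoted in this article: 16,777,216 colors at 8-bit, 1,073,741,824 at 10-bit, just under 69 billion at 12-bit, and just under 4.4 trillion at 14-bit.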


What does the future hold? We are already in phase one of the UHD roll-out, which includes HDR, 10-bit depth, the Rec.709 color space, etc. The next phase will add 12-bit displays, the BT.2020 color space, 4:4:4 chroma sampling, etc. 12-bit displays will be capable of 4,096 shades each of red, green, and blue, for just under 69 billion colors. The final phase will add 14-bit displays capable of 16,384 shades each of red, green, and blue, for just under 4.4 trillion colors.

What does this mean to you? It means a smoother, more natural-looking image on your display. The artifact that bit-depth limitations normally produce is called banding, or color banding. The transition from darker to lighter colors on better-performing displays (10-bit and up) is much smoother than on an 8-bit display. It should also be noted that 10-bit color depth is currently only available with HDR content. Everything else - cable, satellite, Blu-ray, game consoles, PCs, etc. - is 8-bit.
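Banding comes straight from the shade counts: a smooth brightness ramp has to be snapped to the nearest representable shade, and with fewer shades the steps between adjacent bands are larger and easier to see. This toy sketch (illustrative only; real displays also apply gamma and other processing) quantizes the same gray ramp at both bit depths:

```python
# Quantize a smooth 0.0-1.0 gray ramp to a given bit depth and
# return the list of code values. Fewer bits -> fewer, coarser bands.
def quantize_ramp(samples, bits):
    levels = 2 ** bits - 1  # highest code value (255 for 8-bit, 1023 for 10-bit)
    return [round(i / (samples - 1) * levels) for i in range(samples)]

ramp_8 = quantize_ramp(4096, 8)
ramp_10 = quantize_ramp(4096, 10)
print(len(set(ramp_8)))   # 256 distinct bands from black to white
print(len(set(ramp_10)))  # 1024 distinct bands over the same range
```

Same ramp, same endpoints, but the 10-bit version splits it into four times as many steps, so each step is a quarter the size and far harder to spot.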

If you have Netflix you can view a test pattern here. Look at the gray ramp at the top of the pattern; what you are looking for is banding in the grayscale ramp.
 

Matthew J Poes

This is a great simple explanation of color bit depth, I love it!

So let me ask the first question: why do I sometimes see obvious visible banding with Blu-rays, which should be 8-bit? I understand that it gets better with the newer formats capable of 10-bit and more. What I'm asking is: if 8-bit can cause visible banding, why don't we see it more often? Why do just some scenes on some discs have such visible banding?

I've often wondered if somehow in the rendering of the scene they didn't actually render it at 8-bit, or some other artifact is causing it, rather than it being a limitation of Blu-ray. I've never understood this.

Second question: as we move through these phases, will current 12-bit displays be able to make use of future 12-bit content? For example, the current JVC projectors are said to be 12-bit end to end. Will they be able to make use of the future content, or will changes be necessary to support it that would render such devices obsolete (for 12-bit color purposes)?
 

mechman

This is a great simple explanation of color bit depth, I love it!

So let me ask the first question: why do I sometimes see obvious visible banding with Blu-rays, which should be 8-bit? I understand that it gets better with the newer formats capable of 10-bit and more. What I'm asking is: if 8-bit can cause visible banding, why don't we see it more often? Why do just some scenes on some discs have such visible banding?

I've often wondered if somehow in the rendering of the scene they didn't actually render it at 8-bit, or some other artifact is causing it, rather than it being a limitation of Blu-ray. I've never understood this.

Some 8-bit displays use a technique called dithering to mimic a higher bit depth. How well this works depends on the display's processing. My old Sony panel did a fairly good job right up until there was a dark sunset in the background; for some reason, that always threw the panel off and I would see banding. So if you have a higher-end 8-bit panel, it more than likely has excellent processing, which results in minimal banding.
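To make the dithering idea concrete, here is a toy sketch (my own illustration, not any vendor's actual algorithm; the function name and noise range are invented for the example). When a 10-bit value has no exact 8-bit equivalent, adding a tiny random offset before rounding makes nearby pixels land on either of the two adjacent 8-bit shades, so averaged over an area the eye sees an in-between shade instead of a hard band edge:

```python
import random

def dither_10_to_8(value_10bit, rng=random):
    """Reduce a 10-bit code value (0-1023) to 8-bit (0-255) with dithering."""
    exact = value_10bit * 255 / 1023          # fractional 8-bit value
    noisy = exact + rng.uniform(-0.5, 0.5)    # dither noise before rounding
    return max(0, min(255, round(noisy)))

# A mid-gray 10-bit value with no exact 8-bit equivalent:
random.seed(0)
samples = [dither_10_to_8(514) for _ in range(10000)]
# Individual pixels come out as 128 or 129, but the spatial average
# approximates the true shade (514 * 255 / 1023, about 128.1):
print(sum(samples) / len(samples))
```

The same trick is why a well-processed 8-bit panel can look nearly band-free: the quantization error is traded for fine, hard-to-see noise.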

Second question: as we move through these phases, will current 12-bit displays be able to make use of future 12-bit content? For example, the current JVC projectors are said to be 12-bit end to end. Will they be able to make use of the future content, or will changes be necessary to support it that would render such devices obsolete (for 12-bit color purposes)?

The simple answer is yes. There are no competing formats or anything like that; 12-bit is 12-bit. I think there is still a ways to go before we see anything in 12-bit, though. Todd would probably know the answer to that better than I would. To be honest, I didn't think there were any 12-bit consumer displays out there yet. :scratchhead:

Did you go look at that Netflix pattern yet?
 

Matthew J Poes

Some 8-bit displays use a technique called dithering to mimic a higher bit depth. How well this works depends on the display's processing. My old Sony panel did a fairly good job right up until there was a dark sunset in the background; for some reason, that always threw the panel off and I would see banding. So if you have a higher-end 8-bit panel, it more than likely has excellent processing, which results in minimal banding.



The simple answer is yes. There are no competing formats or anything like that; 12-bit is 12-bit. I think there is still a ways to go before we see anything in 12-bit, though. Todd would probably know the answer to that better than I would. To be honest, I didn't think there were any 12-bit consumer displays out there yet. :scratchhead:

Did you go look at that Netflix pattern yet?
JVC claims 12-bit end to end. Or at least, they claim 12-bit panels and 12-bit processing throughput. I guess I am not absolutely certain the HDMI input will pass 12-bit, but that was the impression I was given when this was discussed on some forums. I talked with Kris Deering about this, and he gave me the impression that the old JVCs could pass 10-bit and the new ones 12-bit. I certainly could have misunderstood. If I understood him correctly, even my projector will pass 10-bit color; however, I'm not sure how to send 10-bit color to it to even test that.
 

mechman

It will pass it, but you need material. I don't think there's any 12-bit material out there.
 