By mechman on Mar 10, 2018 at 4:23 PM
  1. mechman

    mechman Senior Admin
    Staff Member
    Thread Starter

    Joined:
    Apr 4, 2017
    Messages:
    659
    Likes Received:
    126
    Location:
    Empire Township, MN
    Display Bit Depth Demystified


    Another seldom-considered aspect of newer displays is bit depth. Most HDTVs were/are 8-bit, which means they can display 256 gradations, or shades, of a particular color. For instance, from white to black there are 256 shades: 254 shades of gray plus white and black. Breaking it down further, each pixel of a display consists of a red, green, and blue portion, and there are 256 gradations of each of these three colors. Since each of these is responsible for producing whatever color the content calls for in that pixel, each pixel of an 8-bit panel/display is capable of reproducing 256 x 256 x 256 colors, or 16,777,216 different colors.
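    The arithmetic is easy to check yourself. A minimal sketch in plain Python (nothing display-specific, just the counting argument above):

```python
# Shades per channel and total colors for a given panel bit depth.
def shades_per_channel(bits):
    """Each of R, G, and B can show 2^bits gradations."""
    return 2 ** bits

def total_colors(bits):
    """One factor of 2^bits for each of the three channels."""
    return shades_per_channel(bits) ** 3

print(shades_per_channel(8))   # 256
print(total_colors(8))         # 16777216
```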

    Newer UHD/HDR displays are now capable of 10-bit color, which expands the number of shades/gradations per color from 256 to 1,024. With 10-bit color you get 1024 x 1024 x 1024 colors, or 1,073,741,824 different color possibilities. That is quite a difference! So, what does this mean? It means that the gradation/blending of colors in scenes is smoother on a 10-bit panel. A good graphical representation of this is shown below:

    [image: gradient comparison showing banding on an 8-bit panel vs. smooth blending on a 10-bit panel]

    What does the future hold? We are already at phase one of the UHD roll-out, which includes HDR, 10-bit depth, the Rec.709 color space, etc. The next phase will add 12-bit displays, the BT.2020 color space, 4:4:4 chroma sampling, and more. 12-bit displays will be capable of 4,096 shades each of red, green, and blue, for just under 69 billion colors. The final phase will add 14-bit displays capable of 16,384 shades each of red, green, and blue, for just under 4.4 trillion colors.
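    All four of those figures fall out of the same 2^bits formula. A quick sketch that tabulates every bit depth mentioned above:

```python
# Shades per channel (2^bits) and total colors ((2^bits)^3)
# for each bit depth in the roll-out phases.
for bits in (8, 10, 12, 14):
    shades = 2 ** bits
    print(f"{bits:>2}-bit: {shades:>6,} shades/channel, {shades ** 3:>17,} colors")

#  8-bit:    256 shades/channel,        16,777,216 colors
# 10-bit:  1,024 shades/channel,     1,073,741,824 colors
# 12-bit:  4,096 shades/channel,    68,719,476,736 colors
# 14-bit: 16,384 shades/channel, 4,398,046,511,104 colors
```

That last line is where the "just under 4.4 trillion" figure comes from.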

    What does this mean to you? It means you will have a smoother, more natural-looking image on your display. Bit depth problems normally show up as what is called banding, or color banding. The transition from darker to lighter colors on better-performing displays (10-bit and up) is much smoother than on an 8-bit display. It should also be noted that 10-bit color depth is currently only available with HDR content. Everything else - cable, satellite, Blu-ray, game consoles, PCs, etc. - is 8-bit.
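    You can see why banding happens with a toy model: take a perfectly smooth gray ramp, quantize it to a given bit depth, and count how many distinct steps survive. (This is just the rounding math, not a real display pipeline.)

```python
# Toy model of banding: sample a smooth 0-to-1 gray ramp, quantize it
# to n-bit code values, and count the distinct steps that remain.
def quantize(value, bits):
    levels = 2 ** bits - 1          # highest code value (255 for 8-bit)
    return round(value * levels) / levels

samples = [i / 9999 for i in range(10000)]   # a smooth ramp
for bits in (8, 10):
    steps = len({quantize(v, bits) for v in samples})
    print(f"{bits}-bit ramp collapses to {steps} steps")

# 8-bit ramp collapses to 256 steps
# 10-bit ramp collapses to 1024 steps
```

Fewer, wider steps across the same brightness range is exactly what your eye picks up as banding.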

    If you have Netflix you can view a test pattern here. Look at the grayscale ramp at the top of the pattern. What you are looking for is banding in that ramp.
     
    #1 mechman, Mar 10, 2018
    Last edited by a moderator: Oct 9, 2018
    tesseract likes this.

Comments

Discussion in 'Tech Talk' started by mechman, Mar 10, 2018.

    1. Matthew J Poes

      Matthew J Poes Staff Writer
      Staff Member

      Joined:
      Oct 18, 2017
      Messages:
      1,443
      Likes Received:
      245
      This is a great simple explanation of color bit depth, I love it!

      So let me ask the first question: why do I sometimes see obvious visible banding with Blu-rays, which should be 8-bit? I understand that it gets better with the newer formats capable of 10-bit and more. What I'm really asking is: if 8-bit can cause visible banding, why don't we see it more often? Why do just some scenes on some discs have such visible banding?

      I've often wondered if somehow in the rendering of the scene they didn't actually render it at 8-bit, or if some other artifact is causing it, rather than it being a limitation of Blu-ray. I've never understood this.

      Second question: as we move through these phases, will current 12-bit displays be able to make use of future 12-bit content? For example, the current JVC projectors are said to be 12-bit end to end. Will they be able to make use of the future content, or will changes be necessary to support it that would render such devices obsolete (for 12-bit color purposes)?
       
    2. mechman

      mechman Senior Admin
      Staff Member
      Thread Starter

      Joined:
      Apr 4, 2017
      Messages:
      659
      Likes Received:
      126
      Location:
      Empire Township, MN
      Some 8-bit displays use something called dithering to mimic a higher bit depth. How well this works depends on the display's processor. My old Sony panel did a fairly good job right up until there was a dark sunset in the background; for some reason that always threw the panel off and I would see banding. So if you have a higher-end 8-bit panel, more than likely it has excellent processing, which results in minimal banding.
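      The idea behind dithering is easy to show with a toy sketch (real displays do this spatially/temporally in hardware; this just demonstrates the averaging principle). A shade that falls between two 8-bit code values always snaps to the same code when quantized plainly, but with a little noise added before rounding, the two neighboring codes get chosen in the right proportions and the average lands on the true in-between shade:

```python
import random

def plain_code(value, bits):
    """Quantize a 0..1 shade to the nearest n-bit code value."""
    return round(value * (2 ** bits - 1))

def dithered_code(value, bits, rng):
    """Add up to +/- half a code value of noise before rounding, so the
    two neighboring codes are picked in proportion to how close the
    true shade is to each."""
    return round(value * (2 ** bits - 1) + rng.random() - 0.5)

rng = random.Random(0)
target = 100.4 / 255                  # a shade between two 8-bit codes
print(plain_code(target, 8))          # always 100 -> a visible band
avg = sum(dithered_code(target, 8, rng) for _ in range(100_000)) / 100_000
print(round(avg, 1))                  # close to 100.4 on average
```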

      The simple answer is yes. There are no competing formats or anything like that; 12-bit is 12-bit. I think there is still a ways to go before we see anything in 12-bit, though. Todd would probably know the answer to that better than I would. To be honest, I didn't think there were any 12-bit consumer displays out there yet. :scratchhead:

      Did you go look at that Netflix pattern yet?
       
      #3 mechman, Mar 11, 2018
      Last edited: Mar 12, 2018
    3. Matthew J Poes

      Matthew J Poes Staff Writer
      Staff Member

      Joined:
      Oct 18, 2017
      Messages:
      1,443
      Likes Received:
      245
      JVC claims 12-bit end to end - or at least, they claim 12-bit panels and 12-bit processing throughput. I guess I am not absolutely certain the HDMI input will pass 12-bit, but that was the impression I was given when this was discussed on some forums. I talked with Kris Deering about this and he gave me the impression that the old JVCs could pass 10-bit and the new ones 12-bit. I certainly could have misunderstood. If I understood him correctly, even my projector will pass 10-bit color; however, I'm not sure how to send 10-bit color to it to even test that.
       
    4. mechman

      mechman Senior Admin
      Staff Member
      Thread Starter

      Joined:
      Apr 4, 2017
      Messages:
      659
      Likes Received:
      126
      Location:
      Empire Township, MN
      It will pass it, but you need material. I don't think there's any 12-bit material out there.
       
