naikrovek 2 days ago

The switch from 24-bit color to 30-bit color is very similar to the move from 15-bit color on old computers to 16-bit color.

You didn’t need new displays to make use of it. It wasn’t suddenly brighter or darker.

The change from 15- to 16-bit color was at least visible, because 16-bit color has far less precision than 30-bit color, so you could actually see the color banding improve. But it wasn't some new world of color, the way HDR is sold.
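
For reference, 15-bit "high color" was typically packed 5-5-5 and 16-bit was 5-6-5, so the only real change was one extra bit of green precision. A minimal sketch of the packing (Python, purely for illustration):

    # 15-bit "high color": 5 bits per channel, one bit unused
    def pack_rgb555(r, g, b):  # r, g, b are 8-bit values
        return ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3)

    # 16-bit "high color": the spare bit goes to green (5-6-5),
    # giving slightly finer green steps and nothing else
    def pack_rgb565(r, g, b):
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)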

Manufacturers want to keep the sales boom that large cheap TVs brought when we moved away from CRTs. That was probably a “golden age” for screen makers.

So they went from failing to sell 3D screens to semi-successfully getting everyone to replace their SDR screen with an HDR screen, even though almost no one can see the difference between those color depths when everything else is held equal.

What really cheeses me on things like this is that TV and monitor manufacturers seem to gate the “blacker blacks” and “whiter whites” behind HDR modes and disable those features for SDR content. That is indefensible.

jakkos 2 days ago

> Manufacturers want to keep the sales boom that large cheap TVs brought when we moved away from CRTs. That was probably a “golden age” for screen makers.

IMO the difference between LCD and OLED is massive and "worth buying a new tv" over.

I've never tried doing an 8-bit vs 10-bit-per-color "blind" test, but I think I'd be able to see it?

> What really cheeses me on things like this is that TV and monitor manufacturers seem to gate the “blacker blacks” and “whiter whites” behind HDR modes and disable those features for SDR content. That is indefensible.

This 100%. The hackery I have to regularly perform just to get my "HDR" TV to show an 8-bit-per-color "SDR" signal with its full range of brightness is maddening.

  • o11c a day ago

    > I've never tried doing an 8-bit vs 10-bit-per-color "blind" test, but I think I'd be able to see it?

    In my tests with assorted 24-bit sRGB monitors, a difference of 1 in a single channel is almost always indistinguishable (and this might be a matter of monitor tuning); even a difference of 1 simultaneously in all three channels is only visible in a few places along the lerps. (Contrast all those common shitty 18-bit monitors. On those, even with temporal dithering, the contrast between adjacent colors is always glaringly distracting.)

    (If testing yourself, note that there are 8 corners of the color cube, so 8×7÷2 = 28 unique pairs, and each lerp has 256 steps. You should use blocks of pixels, not single pixels - 16×16 blocks are nice even though they require scrolling or wrapping on most monitors, since 16×256 = 4096 pixels; blocks 7 pixels wide will fit on a 1920-pixel-wide screen naturally, since 7×256 = 1792.)
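
    A minimal sketch of such a test image, assuming Python 3 and a binary PPM output (the format and strip height are my choices; the 7-pixel blocks and 28 pairs follow the numbers above):

        # Generates one 256-step lerp strip per pair of color-cube corners and
        # stacks them into a single PPM. 7 px per block keeps each strip at
        # 7*256 = 1792 px wide, which fits a 1920-pixel-wide screen.
        from itertools import combinations

        CORNERS = [(r, g, b) for r in (0, 255) for g in (0, 255) for b in (0, 255)]
        PAIRS = list(combinations(CORNERS, 2))  # 8*7/2 = 28 unique pairs

        def strip_rows(c0, c1, steps=256, block_w=7, block_h=64):
            for _ in range(block_h):
                row = bytearray()
                for i in range(steps):
                    px = bytes(round(a + (b - a) * i / (steps - 1)) for a, b in zip(c0, c1))
                    row += px * block_w
                yield bytes(row)

        with open("banding_test.ppm", "wb") as f:
            f.write(b"P6\n%d %d\n255\n" % (7 * 256, 64 * len(PAIRS)))
            for c0, c1 in PAIRS:
                for row in strip_rows(c0, c1):
                    f.write(row)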

    So HDR is only a win if it adds to the "top". But frankly, most people's monitors are too bright and cause strain to their eyes anyway, so maybe not even then.

    More likely the majority of the gain has nothing to do with 10-bit color channels, and much more to do with improving the quality of the monitor in general ("blacker blacks", as you said). But anybody who is selling something must necessarily be dishonest, so they will never help you get what you actually want.

    (For editing of course, using 16-bit color channels is a good idea to prevent repeated loss of precision. If also using separate alpha per channel, that gives you a total of 96 bits per pixel.)

    • erincandescent a day ago

      > In my tests with assorted 24-bit sRGB monitors, a difference of 1 in a single channel is almost always indistinguishable (and this might be a matter of monitor tuning); even a difference of 1 simultaneously in all three channels is only visible in a few places along the lerps. (Contrast all those common shitty 18-bit monitors. On those, even with temporal dithering, the contrast between adjacent colors is always glaringly distracting.)

      Now swap the sRGB primaries for the Rec.2020 primaries. This gives you redder reds, greener greens, and slightly bluer blues (sRGB blue is already pretty good).

      This is why Rec.2020 specifies a minimum of 10-bit per channel colour. It stretches out the chromaticity space and so you need additional precision.

      This is "just" Wide Colour Gamut, not HDR. But even retaining the sRGB gamma curve, mapping sRGB/Rec.709 content into Rec.2020 without loss of precision requires 10-bit precision.

      Swap out the gamma curve for PQ or HLG and then you have extended range at the top. Now you can go super bright without "bleeding" the intensity into the other colour channels. In other words: you can have really bright things without them turning white.

      Defining things in terms of absolute brightness was a bit of a weird decision (probably influenced by how e.g. movie audio is mixed assuming the 0 dBFS = 105 dB(SPL) reference level that theaters are supposed to be calibrated to). But pushing additional range above the SDR reference levels is reasonable, especially if you expect that range to be used judiciously and/or you do not expect displays to be able to hit their maximum values across the whole screen continuously.
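
      For reference, a minimal sketch of the PQ (SMPTE ST 2084) curve that this absolute mapping comes from (constants as published in the spec; the function names are mine):

          # PQ (SMPTE ST 2084): maps a normalized code value in [0, 1] to an
          # absolute luminance in nits (cd/m^2), up to 10,000 nits.
          M1 = 2610 / 16384          # 0.1593017578125
          M2 = 2523 / 4096 * 128     # 78.84375
          C1 = 3424 / 4096           # 0.8359375
          C2 = 2413 / 4096 * 32      # 18.8515625
          C3 = 2392 / 4096 * 32      # 18.6875

          def pq_to_nits(code):
              e = code ** (1 / M2)
              return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

          def nits_to_pq(nits):
              y = (nits / 10000) ** M1
              return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

          # nits_to_pq(100) ~= 0.51 -- SDR reference white sits at roughly half
          # of the PQ code range; everything above that is the HDR headroom.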

    • leguminous a day ago

      On my 8-bit-per-channel monitor, I can easily see banding, though it is most obvious in darker areas in a darkened room. Where this commonly manifests itself is "bloom" from a light object on a dark background.

      I can no longer see banding if I add dither, though, and the extra noise is imperceptible when done well, especially at 4k and with a temporal component.
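
      A minimal sketch of that idea in Python (plain random dither for brevity; a real pipeline would use something like blue noise or error diffusion, plus the temporal component mentioned above):

          import random

          def quantize(value, bits=8, dither=False):
              """value in [0.0, 1.0] -> integer code in [0, 2**bits - 1]."""
              levels = (1 << bits) - 1
              noise = random.random() - 0.5 if dither else 0.0  # +/- half an LSB
              return min(levels, max(0, round(value * levels + noise)))

          # Without dither, a dark gradient collapses into a few flat bands;
          # with dither, adjacent codes are mixed so the average still tracks
          # the true value and the bands disappear, at the cost of faint noise.
          dark_ramp = [i / 4095 * 0.02 for i in range(4096)]     # 0..2% brightness
          banded    = [quantize(v) for v in dark_ramp]           # ~6 distinct codes
          dithered  = [quantize(v, dither=True) for v in dark_ramp]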

  • jiggawatts 2 days ago

    > I've never tried doing an 8-bit vs 10-bit-per-color "blind" test, but I think I'd be able to see it?

    It's only really visible on subtle gradients on certain colours, especially sky blue, where 8 bits isn't sufficient and would result in visible "banding".

    In older SDR footage this is hidden using film grain, which is essentially a type of spatial & temporal dithering.

    HDR allows smooth gradients without needing film grain.

ttoinou 2 days ago

As long as the right content was displayed, I instantly saw the upgrade to HDR screens (the first one I saw was on a smartphone, less than 10 years ago I believe); I knew something was new.

The same way I could instantly tell when I saw a screen showing footage at more than 40 fps. And I constantly see footage on YouTube that was wrongly converted from 24 fps to 25 fps: one frame jumps / is duplicated every second.
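
For what it's worth, that duplicated frame falls straight out of the naive rate conversion; a minimal sketch (Python, nearest-frame mapping):

    # Naive 24 -> 25 fps conversion: pick the nearest source frame for each
    # output frame. Once per second the same source frame gets used twice,
    # which is the visible "jump" every second.
    def source_frame(output_frame, src_fps=24, dst_fps=25):
        return round(output_frame * src_fps / dst_fps)

    first_second = [source_frame(i) for i in range(25)]
    # -> [0, 1, 2, ..., 11, 12, 12, 13, ..., 23]: source frame 12 appears twice.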