o11c a day ago

> I've never tried doing an 8-bit vs 10-bit-per-color "blind" test, but I think I'd be able to see it?

In my tests with assorted 24-bit sRGB monitors, a difference of 1 in a single channel is almost always indistinguishable (and this might be a matter of monitor tuning); even a difference of 1 simultaneously in all three channels is only visible in a few places along the lerps. (Contrast all those common shitty 18-bit monitors. On those, even with temporal dithering, the contrast between adjacent colors is always glaringly distracting.)

(If testing yourself, note that the color cube has 8 corners, so there are 8×7÷2 = 28 unique pairs to lerp between. Use blocks of pixels, not single pixels: 16×16 blocks are nice even though the strip then needs scrolling or wrapping on most monitors, since 16×256 = 4096 pixels wide; blocks 7 pixels wide (7×256 = 1792) fit on a 1920-pixel-wide screen naturally. A rough sketch is below.)
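Something like this (rough sketch, assuming numpy and Pillow; the black→white pair is just one of the 28, swap in any other corners):

    # Rough sketch: lerp between two corners of the sRGB cube in 256 steps,
    # drawn as 16-pixel-tall, 7-pixel-wide blocks so the strip fits in 1920px.
    import numpy as np
    from PIL import Image

    def lerp_strip(c0, c1, steps=256, block_w=7, block_h=16):
        c0, c1 = np.asarray(c0, float), np.asarray(c1, float)
        t = np.linspace(0.0, 1.0, steps)[:, None]               # (steps, 1)
        colors = np.rint(c0 + t * (c1 - c0)).astype(np.uint8)   # (steps, 3)
        strip = np.repeat(colors[None, :, :], block_h, axis=0)  # repeat rows
        return np.repeat(strip, block_w, axis=1)                # repeat columns

    Image.fromarray(lerp_strip((0, 0, 0), (255, 255, 255))).save("lerp.png")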

So HDR is only a win if it adds to the "top". But frankly, most people's monitors are too bright and cause strain to their eyes anyway, so maybe not even then.

More likely the majority of the gain has nothing to do with 10-bit color channels, and much more to do with improving the quality of the monitor in general ("blacker blacks", as you said). But anybody who is selling something must necessarily be dishonest, so they'll never help you get what you actually want.

(For editing, of course, 16-bit color channels are a good idea to prevent repeated loss of precision. If you also keep a separate alpha per channel, that's six 16-bit channels, for a total of 96 bits per pixel.)
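As a toy illustration of why the extra working precision matters (rough sketch, assuming sRGB): even a single round trip through a linear-light working buffer collapses dark codes at 8 bits, and repeated edits only compound it.

    # Count how many of the 256 sRGB code values survive a round trip
    # through a linear-light working buffer at 8-bit vs 16-bit precision.
    import numpy as np

    def srgb_to_linear(u):
        return np.where(u <= 0.04045, u / 12.92, ((u + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(v):
        return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

    lin = srgb_to_linear(np.arange(256) / 255.0)
    for bits in (8, 16):
        scale = (1 << bits) - 1
        stored = np.round(lin * scale) / scale            # quantized linear buffer
        back = np.round(linear_to_srgb(stored) * 255)     # back to 8-bit sRGB
        print(bits, "bit buffer:", len(np.unique(back)), "of 256 codes survive")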

erincandescent a day ago

> In my tests with assorted 24-bit sRGB monitors, a difference of 1 in a single channel is almost always indistinguishable (and this might be a matter of monitor tuning); even a difference of 1 simultaneously in all three channels is only visible in a few places along the lerps. (Contrast all those common shitty 18-bit monitors. On those, even with temporal dithering, the contrast between adjacent colors is always glaringly distracting.)

Now swap the sRGB primaries for the Rec.2020 primaries. This gives you redder reds, greener greens, and slightly bluer blues (sRGB's blue is already pretty good).

This is why Rec.2020 specifies a minimum of 10 bits per channel: it stretches out the chromaticity space, so you need additional precision.

This is "just" Wide Colour Gamut, not HDR. But even retaining the sRGB gamma curve, mapping sRGB/Rec.709 content into Rec.2020 without loss of precision requires 10-bit precision.

Swap out the gamma curve for PQ or HLG and then you have extended range at the top. Now you can go super bright without "bleeding" the intensity into the other colour channels. In other words: you can have really bright things without them turning white.

Defining things in terms of absolute brightness was a bit of a weird decision (probably influenced by how e.g. movie audio is mixed against the 0dBFS = 105dB(SPL) reference level that theaters are supposed to be calibrated to). But pushing additional range above the SDR reference levels is reasonable, especially if you expect that range to be used judiciously and/or don't expect displays to be able to sustain their maximum values across the whole screen continuously.
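For reference, the PQ (SMPTE ST 2084) inverse EOTF that bakes in that absolute-brightness behaviour; rough sketch using the constants from the standard, with luminance normalised to the 10 000 cd/m² peak:

    # SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m^2 -> [0, 1] signal.
    import numpy as np

    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_encode(nits):
        y = np.clip(np.asarray(nits, dtype=float) / 10000.0, 0.0, 1.0)
        ym = y ** M1
        return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

    # SDR reference white (~100 nits) already sits near the middle of the PQ
    # range; everything above it is the extended highlight range.
    print(pq_encode(100.0))     # ~0.508
    print(pq_encode(10000.0))   # 1.0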

leguminous a day ago

On my 8-bit-per-channel monitor I can easily see banding, though it's mostly obvious in darker areas in a darkened room. It commonly shows up as "bloom" around a bright object on a dark background.

I can no longer see banding if I add dither, though, and the extra noise is imperceptible when done well, especially at 4k and with a temporal component.
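Something like this rough sketch shows the effect (plain random dither before quantization; a real pipeline would typically use blue noise or error diffusion and vary the pattern per frame for the temporal part):

    # Quantize a slow, dark ramp to 8-bit codes with and without +/-0.5 LSB of
    # noise added before rounding.  Averaging down the columns shows the dithered
    # version tracks the true ramp instead of stepping.
    import numpy as np

    rng = np.random.default_rng(0)
    ramp = np.tile(np.linspace(0.0, 12.0, 1920), (256, 1))   # values in 8-bit code units

    plain    = np.round(ramp)
    dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.shape))

    print(np.abs(plain.mean(axis=0)    - ramp[0]).max())   # ~0.5 code: visible steps
    print(np.abs(dithered.mean(axis=0) - ramp[0]).max())   # far smaller: steps average out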