Comment by naikrovek
The switch from 24-bit color to 30-bit color is very similar to the move from 15-bit color on old computers to 16-bit color.
You didn’t need new displays to make use of it. It wasn’t suddenly brighter or darker.
The change from 15- to 16-bit color was at least visible because those depths have so few levels per channel compared to 30-bit color that the improvement in color banding was easy to see, but it wasn’t some new world of color, the way HDR is sold.
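Rough numbers make the gap concrete. A quick Python sketch (assuming 15-bit color means 5 bits per channel, 16-bit the usual 5-6-5 split, 24-bit is 8 per channel, and 30-bit is 10):

```python
# Levels per channel and step size at each bit depth.
for label, bits in [("15-bit (5/channel)", 5), ("16-bit green (6)", 6),
                    ("24-bit (8/channel)", 8), ("30-bit (10/channel)", 10)]:
    levels = 2 ** bits
    step_pct = 100 / (levels - 1)  # each quantization step as % of full range
    print(f"{label:20s} {levels:5d} levels, ~{step_pct:.2f}% per step")
```

Going from 32 to 64 levels halves steps that are over 3% of the range each, which is easy to spot as banding; going from 256 to 1024 shrinks steps that are already down around 0.4%, near the threshold of perception.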
Manufacturers want to keep the sales boom that large cheap TVs brought when we moved away from CRTs. That was probably a “golden age” for screen makers.
So they went from failing to sell 3D screens to semi-successfully getting everyone to replace their SDR screen with an HDR one, even though almost no one can see the difference between those color depths when everything else is held equal.
What really cheeses me on things like this is that TV and monitor manufacturers seem to gate the “blacker blacks” and “whiter whites” behind HDR modes and disable those features for SDR content. That is indefensible.
> Manufacturers want to keep the sales boom that large cheap TVs brought when we moved away from CRTs. That was probably a “golden age” for screen makers.
IMO the difference between LCD and OLED is massive and "worth buying a new tv" over.
I've never tried doing an 8-bit vs 10-bit-per-color "blind" test, but I think I'd be able to see it?
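One way to see what such a blind test would actually be measuring is how wide the bands in a full-screen gradient get at each depth. A minimal NumPy sketch (it assumes a plain grayscale ramp with no dithering, and a pipeline that really preserves 10 bits end to end):

```python
import numpy as np

width = 3840                        # a 4K-wide grayscale ramp
ramp = np.linspace(0.0, 1.0, width)

for bits in (8, 10):
    levels = 2 ** bits
    quantized = np.round(ramp * (levels - 1))
    distinct = len(np.unique(quantized))
    band_px = width / distinct      # average width of each visible band
    print(f"{bits}-bit: {distinct} steps, bands ~{band_px:.1f} px wide")
```

At 4K the 8-bit bands come out around 15 px wide, which is noticeable on a smooth gradient, while the 10-bit bands are under 4 px and much harder to pick out.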
> What really cheeses me on things like this is that TV and monitor manufacturers seem to gate the “blacker blacks” and “whiter whites” behind HDR modes and disable those features for SDR content. That is indefensible.
This, 100%. The hackery I regularly have to perform just to get my "HDR" TV to show an 8-bit-per-color "SDR" signal with its full range of brightness is maddening.