lucyjojo 2 days ago

well technically there's a bunch of stuff that happens after the sensor captures raw data. (that's also setting aside the fact that normal sensors do not capture light phase.)

demosaicing is a first point of data loss (the sensor is a tiling of small monochrome photosites behind a color filter array; full color at each pixel is reconstructed from little neighborhoods with various algorithms)
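
a minimal sketch of the idea, assuming an RGGB Bayer layout and plain bilinear interpolation (real raw converters use edge-aware algorithms like AHD or VNG, but interpolation, i.e. invented data, enters either way):

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw):
        """Bilinear demosaic of an RGGB Bayer mosaic (2D float array)."""
        h, w = raw.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask
        # at a photosite of the matching color the kernel returns the value
        # itself; elsewhere it averages the 2 or 4 nearest samples
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        return np.dstack([convolve(raw * r_mask, k_rb),
                          convolve(raw * g_mask, k_g),
                          convolve(raw * b_mask, k_rb)])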

there is also a mapping to a color space of your choosing (probably mentioned in the op video; apologies, i have not watched it yet...). the sensor's color space does not need to match the rendered color space...
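
in practice this mapping is usually a 3x3 matrix multiply; a sketch with a made-up matrix (real coefficients come from per-model calibration, e.g. the ColorMatrix tags in a DNG file), showing where out-of-gamut colors get clipped away:

    import numpy as np

    # made-up camera-RGB -> linear sRGB matrix; rows sum to 1 so that
    # white maps to white (real values are per-camera calibration data)
    CAM_TO_SRGB = np.array([
        [ 1.7, -0.5, -0.2],
        [-0.3,  1.6, -0.3],
        [ 0.0, -0.6,  1.6],
    ])

    def to_srgb(cam_rgb):
        srgb = cam_rgb @ CAM_TO_SRGB.T  # cam_rgb: (..., 3) float array
        # colors the destination space cannot represent are clipped away
        return np.clip(srgb, 0.0, 1.0)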

a note of interest: sensors actually capture some infrared light (modulo the physical filters that remove it). so yeah, if you count that as color, it gets removed. (infrared photography is super cool!)

then there is denoising, sharpening, etc. that further mess with your image.
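
both are lossy in their own way; a toy version of that step, assuming a 2D float image in [0, 1] (real converters use far fancier filters, but the principle is the same):

    import numpy as np
    from scipy.ndimage import gaussian_filter, median_filter

    def denoise_and_sharpen(img, sigma=1.0, amount=0.6):
        # the median filter eats noise but also eats fine detail
        denoised = median_filter(img, size=3)
        # unsharp mask: push pixels away from their local average
        blurred = gaussian_filter(denoised, sigma=sigma)
        return np.clip(denoised + amount * (denoised - blurred), 0.0, 1.0)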

there might be more stuff i am not aware of too. i have very limited knowledge of the domain...

  • ttoinou 2 days ago

    But even before the sensor data stage, we go from ~100 bits of photon data down to ~42 bits counted by the photosites. Hm, well, maybe my calculations are too rough.
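
    one way to make that kind of estimate concrete (the constants here are illustrative, not ttoinou's): shot noise limits a photosite's usable precision to roughly the square root of its full-well electron count, regardless of ADC depth:

        import math

        full_well = 50_000  # illustrative full-well capacity in electrons
        # shot noise ~ sqrt(signal), so ~sqrt(full_well) distinguishable levels
        levels = math.sqrt(full_well)
        print(f"~{math.log2(levels):.1f} bits of real information per photosite")
        # -> ~7.8 bits, even if the ADC quantizes to 12 or 14 bits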

strogonoff 2 days ago

The amount of captured sensor data thrown out when editing depends heavily on the scene and shooting settings, but as I wrote, it is probably almost always 90%+ even with the worst cameras and the widest-dynamic-range display technology available today.

In a typical scene shot outdoors in existing light, it is probably 98%+.
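
A crude way to sanity-check that figure, counting quantization levels rather than perceptual information (the 14-bit and 8-bit depths are typical values, not ones from the thread):

    # collapsing a 14-bit-per-photosite raw capture onto an
    # 8-bit-per-channel display range keeps at most 2**8 of the
    # 2**14 captured levels
    raw_bits, display_bits = 14, 8
    kept = 2 ** display_bits / 2 ** raw_bits
    print(f"levels kept: {kept:.1%}, thrown out: {1 - kept:.1%}")
    # -> levels kept: 1.6%, thrown out: 98.4%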