Comment by lucyjojo 2 days ago

Well, technically there's a bunch of stuff that happens after the sensor captures raw data (and that's excluding the fact that normal sensors do not capture light phase).

Demosaicing is a first point of data loss (the sensor is a tiling of small monochrome photosites behind a color filter array; you reconstruct full color from little neighborhoods with various algorithms).
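
For intuition, here is a minimal bilinear-style demosaic sketch in Python/numpy for an RGGB Bayer layout. It is a toy illustration, not what real image signal processors do (they use edge-aware or learned methods precisely because naive averaging blurs detail and causes color fringing):

    import numpy as np
    from scipy.signal import convolve2d

    def demosaic_bilinear(raw):
        """Toy demosaic of an RGGB Bayer mosaic.

        raw: 2D float array with even dimensions, layout:
            R G
            G B
        Returns an HxWx3 RGB image.
        """
        h, w = raw.shape
        rgb = np.zeros((h, w, 3))
        # Scatter the known samples into their color planes.
        rgb[0::2, 0::2, 0] = raw[0::2, 0::2]  # R
        rgb[0::2, 1::2, 1] = raw[0::2, 1::2]  # G
        rgb[1::2, 0::2, 1] = raw[1::2, 0::2]  # G
        rgb[1::2, 1::2, 2] = raw[1::2, 1::2]  # B
        # Fill holes in each plane by averaging known 3x3 neighbors.
        for c in range(3):
            plane = rgb[:, :, c]
            known = (plane > 0).astype(float)  # crude mask; assumes raw > 0
            k = np.ones((3, 3))
            num = convolve2d(plane, k, mode="same")
            den = convolve2d(known, k, mode="same")
            rgb[:, :, c] = np.where(known > 0, plane,
                                    num / np.maximum(den, 1))
        return rgb

The loss is visible right in the structure: two thirds of each color channel is interpolated, never measured.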

There is also a mapping to a color space of your choosing (probably mentioned in the OP video; I apologize, I have not watched it yet...). The sensor's color space does not need to match the rendered color space...
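
Concretely, that mapping is often just a 3x3 matrix applied per pixel. A hedged sketch; the matrix values below are made up for illustration, since real ones come from sensor characterization and the chosen white point:

    import numpy as np

    # Hypothetical camera-native RGB -> linear sRGB matrix.
    # Rows sum to 1 so white maps to white; values are invented.
    CAM_TO_SRGB = np.array([
        [ 1.6, -0.4, -0.2],
        [-0.3,  1.5, -0.2],
        [ 0.0, -0.5,  1.5],
    ])

    def camera_to_srgb(rgb_linear):
        """Apply the 3x3 color matrix to an HxWx3 linear image.
        Out-of-gamut values get clipped -- another lossy step."""
        out = rgb_linear @ CAM_TO_SRGB.T
        return np.clip(out, 0.0, 1.0)

The clipping at the end is where colors the sensor can distinguish but the target space cannot represent get thrown away.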

A note of interest: sensors actually capture some infrared light (modulo physical filters that remove it). So if you count that as color, it gets removed. (Infrared photography is super cool!)

Then there is denoising, sharpening, etc., which further alter your image.
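
A toy sketch of those two stages, using a Gaussian blur as a stand-in denoiser and a classic unsharp mask for sharpening (assumed parameters, not any particular camera's pipeline):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def denoise_and_sharpen(img, sigma=1.0, amount=0.5):
        """img: 2D grayscale array in [0, 1].

        Both steps are lossy: the blur discards fine detail, and
        clipping drops whatever the sharpening pushes out of range.
        """
        denoised = gaussian_filter(img, sigma=sigma)
        high_freq = denoised - gaussian_filter(denoised, sigma=sigma)
        return np.clip(denoised + amount * high_freq, 0.0, 1.0)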

There might be more steps I am not aware of too; I have very limited knowledge of the domain...

ttoinou 2 days ago

But even before the sensor data, we go from ~100 bits of photon data down to ~42 bits counted by the photosites. Hmm, well, maybe my calculations are too rough.