Comment by ttoinou
Ok, so similar to the other commenter then, thanks. According to that metric it's much more than 90% we're throwing out (:
The amount of captured sensor data thrown out when editing depends heavily on the scene and shooting settings, but as I wrote, it is probably almost always 90%+, even with the worst cameras and the widest-dynamic-range display technology available today.
In a typical outdoor scene shot with existing light, it is probably 98%+.
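One crude way to see where a number like 98% comes from is to compare how many distinguishable raw code values the sensor records against how many survive into an 8-bit output. This is only a back-of-the-envelope sketch: the bit depths below are assumed typical values, not measurements of any particular camera, and real tone mapping is nonlinear, so counting code values is a proxy at best.

```python
# Rough back-of-the-envelope: fraction of raw code-value resolution
# that survives into an 8-bit output. Bit depths are illustrative
# assumptions, not measurements of any specific camera.
RAW_BITS = 14      # assumed raw file bit depth per photosite
OUTPUT_BITS = 8    # typical SDR JPEG/display bit depth per channel

raw_levels = 2 ** RAW_BITS        # 16384 distinguishable raw levels
out_levels = 2 ** OUTPUT_BITS     # 256 output levels

kept = out_levels / raw_levels
print(f"kept: {kept:.2%}, discarded: {1 - kept:.2%}")
# kept: 1.56%, discarded: 98.44%
```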
Well, technically there's a bunch of stuff that happens after the sensor captures raw data (also setting aside the fact that normal sensors do not capture light phase).
Demosaicing is a first point of data loss: the sensor is a tiling of small monochrome photosites behind color filters, and full color is reconstructed from little neighborhoods of them with various algorithms, as in the sketch below.
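Here is a minimal sketch of the simplest such algorithm, bilinear interpolation over an assumed RGGB Bayer layout. Real raw converters use far smarter methods (AHD, DCB, ...), and the function name and layout choice here are my own illustration, not any particular converter's code:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw, pattern="RGGB"):
    """Naive bilinear demosaic of a single-channel Bayer mosaic.
    Illustrative only; real converters use much smarter algorithms.
    This just shows where interpolation loss enters the pipeline."""
    assert pattern == "RGGB"  # other layouts differ only in the offsets below
    h, w = raw.shape
    # Binary masks marking which photosites carry which color filter.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    # At a site of the right color these kernels pass the value through;
    # elsewhere they average the nearest same-color neighbors.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    out = np.zeros((h, w, 3))
    for ch, (mask, k) in enumerate([(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]):
        plane = np.where(mask, raw, 0.0)  # zero out the other colors
        out[..., ch] = convolve(plane, k, mode="mirror")
    return out
```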
There is also a mapping to a color space of your choosing (probably mentioned in the OP video; I apologize, I have not watched it yet...). The sensor's color space does not need to match that rendered color space...
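In practice that mapping is often a per-pixel 3x3 matrix multiply on linear values. The matrix below is made up for illustration (rows sum to 1 so white is preserved); real values come from profiling the specific sensor, e.g. the matrices shipped in DNG metadata:

```python
import numpy as np

# Illustrative only: an invented 3x3 "camera RGB to linear sRGB" matrix.
# Real matrices are measured per sensor model; sensor primaries rarely
# match any display color space, so some colors inevitably get crushed.
CAM_TO_SRGB = np.array([
    [ 1.80, -0.70, -0.10],
    [-0.25,  1.50, -0.25],
    [ 0.05, -0.55,  1.50],
])

def camera_to_srgb_linear(rgb):
    """Map linear camera RGB (shape (..., 3)) to linear sRGB.
    Out-of-gamut values are simply clipped here, which is itself
    a point where captured color information is thrown away."""
    out = rgb @ CAM_TO_SRGB.T
    return np.clip(out, 0.0, 1.0)
```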
A note of interest: sensors actually capture some infrared light (modulo physical filters that remove it), so if you count that as color, it gets removed too. (Infrared photography is super cool!)
Then there is denoising/sharpening etc. that mess with your image; see the sharpening sketch below.
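As a concrete example of the "messing", here is classic unsharp masking, one common sharpening technique. The defaults are arbitrary illustrative values, not anything a real raw converter uses:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=1.5, amount=0.6):
    """Classic unsharp masking on a single-channel float image:
    boost the difference between the image and a blurred copy.
    Like denoising, this deliberately rewrites the captured pixel
    values; sigma/amount here are arbitrary illustrative defaults."""
    blurred = gaussian_filter(img, sigma=sigma)
    return img + amount * (img - blurred)
```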
There might be more stuff I am not aware of too; I have very limited knowledge of the domain...