Abh1Works 8 hours ago

Why is the native picture (fig 1) in grayscale? Or, more generally, why is black and white the default in signal processing? Is it just because black and white are two opposites that can be easily discerned?

loki_ikol 6 hours ago

It's not really grayscale. The output of an image sensor integrated circuit is a series of voltages read one after another (say, from −0.3 to +18 volts) in an order specific to the sensor's arrangement of red, green and blue "pixels". The native picture (fig 1) is the result of mapping each output voltage to a value on a scale from black (−0.3 volts in this example) to white (+18 volts), while ignoring whether it came from a red, a green or a blue sensor "pixel".

The various "raw" camera image formats work roughly like this: they contain the voltages converted to some numerical range, plus metadata describing what each "pixel" represents for that specific camera's sensor layout.
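The voltage-to-value mapping described above can be sketched in a few lines. This is a toy illustration, not any real camera's pipeline: the sensor size, the −0.3 V to +18 V range (taken from the example above) and the 8-bit output scale are all assumptions.

```python
import numpy as np

# Hypothetical raw readout: one voltage per photosite. The range is
# the example figure from the comment above; real sensors differ.
V_MIN, V_MAX = -0.3, 18.0

rng = np.random.default_rng(0)
raw_volts = rng.uniform(V_MIN, V_MAX, size=(4, 6))  # tiny 4x6 "sensor"

# Linear map: V_MIN -> 0 (black), V_MAX -> 255 (white). The red/green/
# blue filter positions are ignored, so every photosite is treated the
# same and the result is a single-channel image that renders as gray.
gray = np.round((raw_volts - V_MIN) / (V_MAX - V_MIN) * 255).astype(np.uint8)
```

A real raw converter would instead use the filter-layout metadata to demosaic these values into three color channels.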

seba_dos1 6 hours ago

It's just a common default choice for representing spatial data that lacks any context about how to interpret the values chromatically. You could just as well use a heatmap-like color scheme instead.
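The point that grayscale is an arbitrary rendering choice can be made concrete: the same single-channel values can drive either a gray ramp or a color ramp. A minimal sketch, assuming normalized values in [0, 1] and a hand-rolled blue-to-red ramp (not any standard colormap):

```python
import numpy as np

# Toy 2D field of normalized intensities in [0, 1]: single-channel
# data with no inherent chromatic meaning.
vals = np.linspace(0.0, 1.0, 8).reshape(2, 4)

# Grayscale rendering: the same value drives R, G and B equally.
gray_rgb = np.stack([vals, vals, vals], axis=-1)

# Heatmap-like rendering (hypothetical blue-to-red ramp): low values
# map to blue, high values to red, green stays at zero.
heat_rgb = np.stack([vals, np.zeros_like(vals), 1.0 - vals], axis=-1)
```

Both arrays carry exactly the same information; only the mapping from value to color differs.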