By helmsm

Single Sensor Cameras and the Bayer Filter

Published 2/15/2022

Since television began, multi-camera and live events have been captured with cameras that use separate Red, Green and Blue image sensors. Because a single sensor camera is potentially less complicated and less expensive to produce, single sensor cameras have been tried for live events. But there are drawbacks to using a single sensor for live production.

This introduction to the Bayer filter overlay explains how color is created in a single sensor camera. On its own, a sensor can only record luminance information, so color has to come from a microfilter overlay placed on top of the photosites. The Bayer filter is the most common such overlay, used in nearly all single sensor digital cameras.

With the Bayer filter design, each photosite on the sensor records just one of the red, green or blue components. The filter arranges the photosites in a repeating mosaic of two green, one red and one blue per 2x2 block, reflecting the eye's greater sensitivity to green. The missing components at each photosite can then be estimated by averaging the nearest samples of the same color. Once the image is recorded, digital de-mosaic algorithms reconstruct a full color image from the incomplete color samples of a Bayer filter.
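As a rough illustration, here is a minimal sketch in Python with NumPy and SciPy, written for this article rather than taken from any camera's actual processing. It builds an RGGB Bayer mosaic from a full color image and then reconstructs it by simple averaging of the nearest same-color samples.

```python
import numpy as np
from scipy.signal import convolve2d

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer sensor: keep one color sample per photosite."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites on red rows
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites on blue rows
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_bilinear(mosaic):
    """Rebuild full color by averaging the nearest samples of each color."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w))
    r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w))
    b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask

    def interp(mask):
        kernel = np.ones((3, 3))
        sums = convolve2d(mosaic * mask, kernel, mode="same")
        counts = convolve2d(mask, kernel, mode="same")
        estimate = sums / np.maximum(counts, 1)
        # Keep the measured value wherever this color was actually sampled.
        return np.where(mask == 1, mosaic, estimate)

    return np.dstack([interp(r_mask), interp(g_mask), interp(b_mask)])
```

Running this on a test image and comparing the result with the original makes the loss of fine detail visible, particularly around sharp edges.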

Many different mosaic filter overlays have been tried. Each approach makes its own trade-offs among sharpness, artifact suppression, noise and color fidelity. Basic de-mosaic algorithms don't do any of these things very well.

The Bayer filter design gives an acceptable-looking result on scenes that don't contain too many high frequency edges. But on sharper images, those high frequencies interact with the pattern of the overlay filters on the sensor, producing artifacts known as false coloring or color bleeding.
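To see why, consider detail whose spacing approaches the photosite pitch. A neutral black-and-white stripe pattern one photosite wide lands alternately on differently colored sites, so after de-mosaicing the stripes come back tinted. The short sketch below, which reuses the hypothetical bayer_mosaic and demosaic_bilinear helpers from the earlier example, makes the effect measurable.

```python
import numpy as np
# Assumes bayer_mosaic and demosaic_bilinear from the previous sketch are in scope.

# A neutral pattern of one-pixel-wide vertical stripes: the finest detail
# the photosite grid can express, and a worst case for the Bayer mosaic.
h, w = 64, 64
row = np.tile(np.array([0.0, 1.0]), w // 2)         # columns alternate 0, 1, 0, 1...
gray = np.repeat(row[np.newaxis, :], h, axis=0)
rgb = np.dstack([gray, gray, gray])                 # neutral: R == G == B everywhere

reconstructed = demosaic_bilinear(bayer_mosaic(rgb))

# In the original every pixel is neutral; after reconstruction the channels
# disagree, which shows up on screen as colored fringing (false color).
deviation = np.abs(reconstructed - reconstructed.mean(axis=2, keepdims=True))
print("max deviation from neutral:", deviation.max())
```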

These artifacts can be reduced by using less saturated color filters, but the result is a less saturated picture. Normal color can be recovered by increasing the saturation in processing, but that amplifies noise and can create issues with color accuracy. Manufacturers can tune the balance between sharpness, colorimetry and noise, but there is no ideal solution.
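One way to picture that trade-off: restoring saturation after soft filters amounts to applying a color correction matrix with large diagonal gains and negative off-diagonal terms, and that matrix amplifies whatever noise the sensor produced. The numbers in the sketch below are invented purely for illustration, not taken from any real camera.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical desaturating filter response: each channel leaks heavily
# into the others (values made up for illustration).
desat = np.array([[0.6, 0.2, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.2, 0.2, 0.6]])

# The correction that restores full saturation is the inverse matrix.
correction = np.linalg.inv(desat)

# Simulate many noisy neutral-gray samples as seen through the soft filters.
true_rgb = np.full((100_000, 3), 0.5)
noise_std = 0.01
measured = true_rgb @ desat.T + rng.normal(0, noise_std, true_rgb.shape)

restored = measured @ correction.T

print("noise before correction:", measured.std(axis=0))
print("noise after correction: ", restored.std(axis=0))
# The corrected channels carry roughly twice the noise of the raw samples,
# even though the mean colors return to their original values.
```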

The other reason manufacturers sometimes reduce the saturation of the color filters is sensitivity. Red, green and blue primary color filters can absorb a significant amount of light, reducing the camera's sensitivity by more than an f-stop.

All these problems affect different filter array designs to varying degrees. Some designs have better sensitivity, others better sharpness or better color reproduction, and the results also depend heavily on the de-mosaic algorithm. Each design represents an engineering compromise between sensitivity, color accuracy, sharpness, noise and aliasing.

Finally, using standard television lenses with single sensor cameras brings its own issues. The widely available, industry standard B4 lens mount was designed for three sensor video cameras, not single sensor cameras. Adapters exist that allow B4 lenses to be used on single sensor cameras, but the tradeoff can be a significant loss of sensitivity and added picture aberrations.