
How Many Megapixels are Too Many?
Image credit: Maria Geller, Pexels

By Russell Bailey, October 2019

Introduction

Pro-Lite provides a solution for almost any application in measuring the colour and brightness of light sources and displays. From a simple, inexpensive lux meter to the world’s most advanced imaging photometers, we have a light meter that you can rely upon to give you accurate, repeatable data. Imaging photometers and colorimeters are a relatively recent innovation: a type of high-performance digital still camera that captures images in which we know the absolute brightness (luminance) and colour of the object or light source under test.

Figure 1: Westboro Photonics WP6 Series

Imaging photometers or colorimeters such as the WP6 Series from Westboro Photonics (see Figure 1) have been deployed extensively for the on-line production QC of consumer electronic devices, such as smart phones and tablet computers. They are also used widely in the development and testing of vehicle lighting, displays and instrument panels.

In the world of imaging photometers and colorimeters, there has been a trend towards ever larger image sensors. The theory sounds promising: the bigger the camera sensor, the greater the number of pixels, and in turn the higher the image resolution and the greater the image quality. Whereas as recently as five years ago 3 megapixels was considered high resolution, manufacturers are now marketing imaging photometers with upwards of 29 megapixels. This leads me to a seemingly rhetorical question: just how many megapixels are too many?

The answer may surprise you, as might my assertion that an imaging photometer can achieve the same (or even better) image quality as a model with twice as many pixels. Clearly, the size of your image sensor is not the full picture (pun intended).

Imaging Photometers & Colorimeters

Imaging photometers and colorimeters have revolutionised the way in which we measure the colour and luminance of displays, projected beams and instrument clusters. Compared with traditional, so-called “spot luminance meters”, they capture the colour and luminance (cd/m²) or illuminance (lux) at literally millions of points within the light source or illuminated object, revealing subtle yet important brightness and colour differences that might otherwise be missed by cursory examination with a spot meter.

Figure 2: Schematic Imaging Colorimeter

An imaging photometer (or colorimeter) is most simply described as a very high performance digital still camera (see Figure 2). However, compared with a standard DSLR camera, the CCD or CMOS sensor in an imaging photometer is spectrally matched to the human photopic observer function (the CIE V(λ) response of the eye). A so-called “photopic” filter is placed over the image sensor and scales the brightness of an object to how a human would perceive it, with green wavelengths registering as brighter, and blue and red wavelengths correspondingly dimmer.
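The photopic weighting described above can be sketched numerically. The Gaussian below is a commonly quoted approximation to the V(λ) curve (peaking near 555nm) and is used purely for illustration; it is not the function any particular instrument implements:

```python
import math

# Illustrative sketch: a widely quoted Gaussian approximation to the CIE
# photopic luminous efficiency function V(lambda), with wavelength in nm.
def v_lambda(wavelength_nm):
    um = wavelength_nm / 1000.0  # the fit below works in micrometres
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

# Equal radiance at three wavelengths is weighted very differently:
# green (~555 nm) registers as bright, blue and red as much dimmer.
for wl in (450, 555, 650):
    print(wl, round(v_lambda(wl), 3))
```

Running this shows the green sample weighted close to 1.0 while the blue and red samples fall below 0.1, which is exactly the behaviour the photopic filter imposes on the sensor.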

Other important differences between an imaging photometer and a DSLR camera include the correction of aberrations inherent in imaging systems (cosine and cosine4 intensity fall-off through the lens), pixel gain normalisation over the sensor array, linearity correction and the absolute photometric calibration of the instrument to traceable NMI standards. Basically, a DSLR takes pretty pictures, while an imaging photometer takes calibrated pictures encoded with the brightness of the object at every pixel in the image.

An imaging colorimeter functions in a similar way, but instead of a single filter that tunes the spectral response of the sensor to the V(λ) photopic response, a colorimeter employs a set of three or four coloured glass filters that tune the camera response to the CIE tristimulus colour matching functions of the eye. The tristimulus filters are generally deployed in a motorised filter wheel, such that an image is captured through one filter at a time, with the true colour image recreated by combining the component filtered images in the camera software.
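As a toy illustration of how the per-filter captures combine, the tristimulus values measured at a pixel through the X, Y and Z filters in turn can be reduced to CIE chromaticity coordinates. The numbers below are illustrative, chosen to sit near a D65 white point:

```python
# Toy sketch: reducing tristimulus values (captured through the X, Y and Z
# filters one at a time) to CIE 1931 chromaticity coordinates.
def chromaticity(X, Y, Z):
    total = X + Y + Z
    return X / total, Y / total

# Illustrative tristimulus values close to a D65 white point:
x, y = chromaticity(95.0, 100.0, 108.9)
print(round(x, 3), round(y, 3))  # -> 0.313 0.329
```

The same reduction is applied at every pixel, which is how millions of per-pixel colour measurements emerge from three or four sequential exposures.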

Image Resolution

Figure 3: Speedometer

A key metric when using an imaging photometer or colorimeter is the image resolution, or to put it another way, the number of camera pixels used to image each millimetre of the object. Consider the measurement of an automobile instrument panel (see Figure 3). The speedometer will typically have a diameter of approximately 120mm, but it will feature legends, symbols and text that may be only 1mm or less in “stroke width”. If we were to capture an image of the speedometer with a camera having a 1 megapixel (1024 x 1024 pixel) sensor, that would result in an image with 1024 pixels across a 120mm object width, equating to approximately 8.5 camera pixels per mm. With “edge effects”, we would consider that we have about 5 effective camera pixels over a 1mm object (see Figure 4). An edge effect occurs where a camera pixel sits at the boundary of a lit area and may be collecting light from the surroundings. Hence, for precise metrology, we tend to ignore those pixels in an image that are at or close to a boundary.

Figure 4: Speedometer Detail

It is widely accepted that you should aim for at least 10 camera pixels per mm for acceptable accuracy and repeatability with instrument panels and switch packs, so clearly in this example a 1 megapixel camera is insufficient for measuring the speedometer in a single image. By comparison, a 9 megapixel camera with 3388 horizontal pixels achieves a resolution of 28 pixels per mm across the same 120mm object. So clearly, more sensor pixels are better in terms of the ability to faithfully image and quantify the colour and luminance of small details within a larger object.
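The resolution arithmetic above is simple enough to sketch; the pixel counts, object width and the 10 pixels per mm rule of thumb are the figures quoted in the text:

```python
# Sketch of the pixels-per-mm arithmetic for imaging a 120 mm speedometer.
def pixels_per_mm(sensor_pixels_across, object_width_mm):
    """Camera pixels available per millimetre of object width."""
    return sensor_pixels_across / object_width_mm

one_mp = pixels_per_mm(1024, 120)   # 1 MP (1024 x 1024) sensor
nine_mp = pixels_per_mm(3388, 120)  # 9 MP sensor, 3388 horizontal pixels

print(round(one_mp, 1), round(nine_mp, 1))  # -> 8.5 28.2
print(one_mp >= 10, nine_mp >= 10)          # -> False True (10 px/mm rule)
```

Only the 9 megapixel sensor clears the 10 pixels per mm threshold for a single-image measurement of the full dial.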

The trend towards larger LCD instrument panels in vehicles has fuelled the development of imaging photometers with large, high resolution sensors, since measuring the full instrument panel in one image demands a multi-megapixel image sensor. However, the number of pixels is not the only parameter that determines the image quality, spatial resolution and colorimetric accuracy of imaging photometers and colorimeters. Having a high pixel resolution is important, but you can have too many pixels, as I will discuss later.

The Filter Path Length Error

Let’s return to the construction of an imaging colorimeter or photometer. As explained above, coloured glass filters are used to scale the spectral sensitivity of the camera to match (as closely as possible) the human visual response (photopic or tristimulus colorimetric). The filters transmit light at certain wavelengths, and absorb light at others so as to modify the native spectral sensitivity of the image sensor.

The Beer-Lambert Law tells us that the absorbance of a filter varies in linear proportion to its thickness (or more correctly, the path length of light through the filter), meaning that transmittance falls off exponentially with path length. The longer the optical path a ray of light travels through the filter, the more light is absorbed. If all rays of light passed through the filter at normal incidence, each would experience the same degree of absorption. However, the rays of light in an imaging photometer or colorimeter do not all pass through the filter in parallel: rays imaged towards the edges of the sensor traverse the filter at an angle, and hence along a longer path. This causes a shift in the spectral response of the camera at higher angles compared with the response to rays passing through at normal incidence. In other words, while the imaging photometer may match the human V(λ) photopic or tristimulus colour response along the optical axis, it will suffer an increasing colorimetric error as light is collected off-axis.
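The path-length effect can be sketched with a few lines of arithmetic. The filter thickness and absorption coefficient below are assumed values for illustration only, not figures from any real filter; the half angles match those discussed for the cameras later in this article:

```python
import math

# Sketch of the Beer-Lambert path-length effect through a colour filter.
# Absorption coefficient and thickness are assumed, illustrative values.
def transmittance(absorption_per_mm, thickness_mm, angle_deg=0.0):
    # An off-axis ray at angle theta travels thickness / cos(theta) of glass.
    path_mm = thickness_mm / math.cos(math.radians(angle_deg))
    return math.exp(-absorption_per_mm * path_mm)

on_axis = transmittance(0.3, 2.0)        # normal incidence
at_10deg = transmittance(0.3, 2.0, 10)   # small half angle to sensor corner
at_26deg = transmittance(0.3, 2.0, 26)   # large half angle to sensor corner
print(f"{on_axis:.3f} {at_10deg:.3f} {at_26deg:.3f}")  # -> 0.549 0.544 0.513
```

Because real filters have wavelength-dependent absorption, this extra absorption at high angles does not dim the image uniformly but shifts its spectral response, which is the colorimetric error described above.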

Table 1

Table 1 compares the physical characteristics of the WP6120 12 megapixel imaging colorimeter from Westboro Photonics with rival 12 and 29MP cameras. The angle subtended from the optical axis to the corners of the image sensor in the WP6120 is roughly 40% of that of the rival 12 and 29MP cameras (10° versus 26°). The larger the half angle, the greater the colorimetric error relative to the true photopic or tristimulus response of the human eye as light is captured towards the edges and corners of the image sensor. This error is eliminated where the spectrum of the light source used to calibrate the photometer matches that of the test source. The trouble with relying upon this is that the light source most commonly used to calibrate photometric equipment is a tungsten halogen lamp, whereas the light sources most often encountered today are white (phosphor-converted blue) LEDs or RGBW colour-tunable LEDs, the spectra of which differ markedly from that of a tungsten bulb.

We can conclude this discussion by stating that the errors caused by colour shifts at high angles through filters are an inherent limitation of any filter-based imaging light meter. However, the error is minimised by choosing an instrument with as small an image etendue as possible. The error can also be negated by calibrating the camera with a lamp whose spectrum exactly matches that which the camera will be used to measure. Manufacturers of calibration equipment have responded to this issue by developing spectrally tunable reference light sources that can be tuned to the same spectrum as the light source you wish to measure. An example is the Labsphere CCS-1100, a calibrated integrating sphere uniform light source with an LED light engine that can be tuned to yield almost any spectrum and colour temperature. This can be used to calibrate the imaging colorimeter and, in so doing, eliminates the filter shift error.

Lens Modulation Transfer Function (MTF)

Figure 5: Resolution Test Chart

One may be forgiven for thinking that the only specification that matters to some manufacturers of imaging photometers and colorimeters is the number of pixels in their sensor arrays. Whereas a few megapixels was considered high resolution five years ago, we now see instruments offered with sensors of 29 megapixels or more. As stated previously, for the precise metrology of small details within the wider field of view of a camera, more sensor pixels are very desirable. However, the sharpness of an image (and the accuracy of colour and luminance measurements) will also depend on the quality of the lens used with the camera. Think about it this way: if you were to take an abrasive material and badly scratch a lens, it wouldn’t matter whether you had 1 megapixel or 29 million pixels – you would still obtain a fuzzy image.

The metric used to define the quality of a lens is the Modulation Transfer Function, or MTF. MTF describes how well a lens reproduces contrast as the object spatial frequency (or object resolution) varies, and is defined as the ratio of relative image contrast to relative object contrast. Compared with the theoretical, diffraction-limited performance of a lens, the actual image quality will be reduced as a result of the aberrations inherent in the lens. A lens system with poor MTF will have difficulty resolving fine features in an object: both the image resolution and the absolute contrast will suffer. Figure 5 illustrates the problem of poor MTF and “line spreading”; the image on the left shows a resolution test object, while the image on the right shows the effects of line spreading (blurring) caused by poor MTF.
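The contrast-ratio definition above can be made concrete. Modulation is conventionally computed as Michelson contrast, (Imax − Imin) / (Imax + Imin); the pixel values below are invented for illustration:

```python
# Sketch: MTF as the ratio of image modulation to object modulation,
# using Michelson contrast. All intensity values here are illustrative.
def michelson(i_max, i_min):
    return (i_max - i_min) / (i_max + i_min)

def mtf(img_max, img_min, obj_max, obj_min):
    return michelson(img_max, img_min) / michelson(obj_max, obj_min)

# A perfect black/white bar target (object contrast 1.0) imaged with blur
# that lifts the dark bars to 20 counts and drops the bright bars to 180:
print(round(mtf(180, 20, 255, 0), 2))  # -> 0.8 (the "excellent" threshold)
```

As the bars in the target get finer, blur mixes more light between them, Imax and Imin converge, and the MTF at that spatial frequency falls – which is exactly what the curves in an MTF chart plot.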

Figure 6: MTF Chart for a 20mm lens
Figure 7: MTF Chart for a 35mm lens

Lens MTF is typically presented as a chart (see Figure 6 and Figure 7) with the sensor size plotted along the x-axis and the MTF plotted on the y-axis on a scale of zero to 1.0, with MTF curves presented for different object spatial frequencies (resolutions). An MTF of greater than 0.8 is considered “excellent”, while a “good” MTF is greater than 0.6. MTF values of less than 0.6 mean that the image starts to become less defined, rendering it unsuitable for photometric and colorimetric metrology.

The red arrow on the MTF chart shows the physical size of the Westboro Photonics WP6120 12MP sensor. The black arrow indicates the physical size of the competitor’s 29MP sensor. Despite the competitor’s sensor having nearly three times as many pixels, the MTF for the 29MP sensor is much worse through the 35mm lens. This means the usable resolution of the 29MP camera is lower than that of the 12MP system, with a reduced ability to resolve contrast differences.

There are a number of different commercially available lenses used by manufacturers of imaging photometers and colorimeters. Westboro Photonics uses Canon EF-mount and Kowa C-mount lenses because of their good MTF. These lenses typically have an MTF of > 0.6 up to about 30 lines per mm, depending on the lens in question and whether we are considering the tangential or sagittal view (the orientation of the lines with respect to the image sensor). Typically, camera lenses tend to have a reduced MTF when used to resolve images onto larger-area detectors.

In summary, the limitations of lens MTF explain why image sensor size and number of pixels are not the limiting factors in image quality. In the case of the 29MP camera, you are literally wasting money on surplus pixels, as the limitations of the optical system mean that this photometer is incapable of resolving better than 30 lines per mm. The use of a lens with poor MTF helps to explain why detailed images captured using a 29MP camera suffer from lower resolution (increased fuzziness, or a higher line spread function) than those recorded using the 12MP Westboro WP6120 colorimeter.
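A back-of-the-envelope sketch shows how a lens limit caps the useful pixel count. Nyquist sampling requires two pixels per line pair, so a lens that is only good to 30 lines per mm supports at most 60 pixels per mm of sensor. The sensor dimensions below are assumed for illustration, not taken from any datasheet:

```python
# Sketch: lens-limited useful pixel count. Nyquist sampling needs two
# pixels per line pair, so a 30 lp/mm lens supports at most 60 px/mm.
def useful_megapixels(width_mm, height_mm, lens_limit_lp_mm):
    px_per_mm = 2 * lens_limit_lp_mm  # Nyquist criterion
    return (width_mm * px_per_mm) * (height_mm * px_per_mm) / 1e6

# A hypothetical 36 x 24 mm sensor behind a lens that is only "good"
# (MTF > 0.6) up to 30 lines per mm:
print(round(useful_megapixels(36, 24, 30), 1))  # -> 3.1
```

On these assumed numbers, only about 3 megapixels of genuinely resolvable detail reach a full-frame-sized sensor through such a lens; any sensor pixels beyond that simply oversample the blur.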

Conclusion

Imaging photometers and colorimeters have become the instrument of choice for demanding applications in testing consumer electronic displays, avionics instrument panels & head-up displays and automotive instrument panels & switchpacks. To ensure the capture of high resolution images of small details in the display, large, multi-megapixel image sensors are an obvious choice. However, large image sensors have significant drawbacks that often offset their advantages. Filter shift degrades the photometric and colorimetric calibration of the camera as rays of light are imaged at high angles towards the edges and corners of the image sensor. Moreover, the lens Modulation Transfer Function (MTF) becomes the limiting factor in resolving fine detail in an object, rendering the surplus pixels of a large image sensor redundant. This explains why a 29MP imaging photometer can have a lower resolving power than a rival 12MP camera.
