Image tuning

Image Quality Tuning Terminology

Basic Terminology of Image Tuning:

Raw image - An uncompressed version of the image file. Raw files are so named because they have not yet been processed and still contain all the information needed for image processing.

YUV image - YUV is a color encoding system used as part of a color image pipeline. It encodes a color image so that the chrominance components can be carried at reduced bandwidth.

CMOS sensor - An image sensor consisting of an array of pixels, each containing a photodetector and an active amplifier. CMOS sensors are widely used in digital camera technologies such as cell phones, CCTV, and web cameras.


Bayer pattern - An arrangement of RGB color filters on a square grid of photosensors. Its particular pattern of red, blue, and green color filters lies above the millions of light-sensitive photosites on the surface of a sensor chip.


Image sensor - A device that captures the light passing through the camera lens, converts it into an electronic signal, and transmits it to an image signal processor, which transforms the electronic signal into a digital image.

Image signal processing (ISP) pipeline - the sequence of operations that converts a raw sensor image into a final digital image, performing steps such as demosaicing, noise reduction, auto exposure, autofocus, auto white balance, and image sharpening to enhance image quality.
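One of the first ISP stages, demosaicing, can be illustrated with a deliberately simplified sketch. This is not a production algorithm: each 2x2 RGGB quad is collapsed into a single RGB pixel (averaging the two greens), which halves resolution, whereas real ISPs interpolate full-resolution RGB values.

```python
# Minimal demosaicing sketch for an RGGB Bayer mosaic (illustration only).
# Each 2x2 quad [R G / G B] becomes one RGB pixel; real pipelines use
# edge-aware interpolation at full resolution instead.

def demosaic_rggb(mosaic):
    """mosaic: 2D list of raw values in an RGGB Bayer layout.
    Returns a half-resolution 2D list of (R, G, B) tuples."""
    rgb = []
    for y in range(0, len(mosaic), 2):
        row = []
        for x in range(0, len(mosaic[y]), 2):
            r = mosaic[y][x]
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2.0  # average the two greens
            b = mosaic[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb
```

For example, a single quad [[100, 50], [60, 20]] yields one pixel with R=100, G=55.0, B=20.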


Distortion is an aberration that causes straight lines to curve.  Distortion tends to be most serious in extreme wide angle, telephoto, and zoom lenses. It can be highly visible on tangential lines near the boundaries of the image, but it is not visible on radial lines.

Distortion can take on two forms: Barrel and Pincushion.

Lower distortion values are better; however, distortion is typically not a major issue for most consumer photography, such as camera phones, although it is critical in other areas such as architectural photography.
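Barrel and pincushion distortion are often modeled with a single radial term. The sketch below assumes the common one-coefficient form of the Brown-Conrady model, r_d = r * (1 + k1 * r^2); with this sign convention, negative k1 pulls points toward the center (barrel) and positive k1 pushes them outward (pincushion).

```python
# Single-coefficient radial distortion sketch (simplified Brown-Conrady
# model, an assumption of this example): r_d = r * (1 + k1 * r^2).
# k1 < 0 -> barrel distortion; k1 > 0 -> pincushion distortion.

def distort_radius(r, k1):
    """Map an undistorted normalized radius r to its distorted radius."""
    return r * (1.0 + k1 * r ** 2)
```

At the image corner (r = 1), k1 = -0.1 moves the point to radius 0.9 (barrel), while k1 = +0.1 moves it to 1.1 (pincushion).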

Luminance/color shading - Measurements of intensity and color non-uniformity in an image resulting from lens vignetting, IR-cut filters, sensor lenslets, and other image, illumination, and sensor irregularities.

Luminance and color shading are measured using a diffuse, uniform light source. We recommend three different color temperatures (~2800K, ~4100K, and ~6500K), one without IR content and two with.

Since luminance and color shading can vary in geometry and hue for each camera device, some manufacturers implement individual luminance and color shading calibration methods for both ~2800K and ~6500K so as to cover a wide range of color temperatures that the camera will see in the field.
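The flat-field calibration idea can be sketched as follows: capture the uniform light source, derive per-position gains that would flatten that capture, then apply the same gains to production images. This is a simplification; real calibrations store compact per-channel gain grids for each color temperature rather than a full-resolution map.

```python
# Flat-field shading-correction sketch (simplified; real calibrations use
# per-channel, per-color-temperature gain grids, not a full-size map).

def shading_gains(flat_field):
    """flat_field: 2D list captured from a diffuse, uniform light source.
    Returns a gain map that flattens the flat-field image."""
    peak = max(max(row) for row in flat_field)
    return [[peak / v for v in row] for row in flat_field]

def apply_shading(image, gains):
    """Multiply each pixel by its calibration gain."""
    return [[p * g for p, g in zip(irow, grow)]
            for irow, grow in zip(image, gains)]
```

Applying the gains back to the flat-field capture itself should produce a uniform image, which is a quick sanity check on the calibration.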

Sharpness is a measurement of an imaging system’s ability to render detail:

   - It is one of the most important performance metrics from the consumer’s perspective

   - Consumers expect increased megapixel counts to correlate with increased sharpness, although this is often not the case 

   - Camera sharpness is typically degraded as ISO speed increases because of increased software noise reduction

The SFRplus test chart is used to measure sharpness because it provides detailed sharpness results across the entire image plane.
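As a rough intuition for what a sharpness metric responds to (this is not the SFRplus/MTF method, just an illustrative stand-in), one can score an image by the energy of its second differences: a crisp edge produces large local curvature in pixel values, while a blurred edge spreads the transition out.

```python
# Crude relative sharpness score (illustration only; NOT the SFRplus/MTF
# measurement): mean squared horizontal second difference of luminance.
# Sharper edges -> larger second differences -> higher score.

def sharpness_score(gray):
    """gray: 2D list of luminance values. Returns a relative score."""
    total, count = 0.0, 0
    for row in gray:
        for x in range(1, len(row) - 1):
            d = row[x - 1] - 2 * row[x] + row[x + 1]  # discrete 2nd derivative
            total += d * d
            count += 1
    return total / count if count else 0.0
```

A hard edge such as [0, 0, 100, 100] scores far higher than the same transition spread over several pixels, matching the intuition that blur reduces measured sharpness.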

Color/White Balance Accuracy - Color accuracy is an important but ambiguous image quality factor. Many viewers prefer enhanced color saturation; highly accurate color is not necessarily pleasing. Nevertheless, it is important to measure a camera's color response: its color shifts, saturation, and the effectiveness of its White Balance algorithms.

Color/White Balance accuracy is measured with the X-Rite ColorChecker.

We recommend testing across at least two color temperatures (~2800K and ~6500K) and at varying light levels from normal to low, which can give insight into the camera’s color correction matrix and AWB tuning.
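Per-patch color error is commonly summarized as Delta E: the Euclidean distance between a measured patch and its reference value in CIELAB space. The sketch below assumes Lab values are already available (converting camera RGB to Lab requires the camera's color matrix and white point, which are omitted here) and uses the simple Delta E 1976 formula; refined variants such as Delta E 2000 are also widely reported.

```python
# Delta E 1976 sketch: Euclidean distance in CIELAB. Assumes measured
# and reference patches are already converted to (L*, a*, b*) tuples.

def delta_e76(lab_measured, lab_reference):
    """Smaller Delta E means more accurate color reproduction."""
    return sum((m - r) ** 2
               for m, r in zip(lab_measured, lab_reference)) ** 0.5
```

For example, a patch measured at (50, 0, 0) against a reference of (50, 3, 4) has Delta E = 5.0.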

Gamma (γ) is the exponent of the equation that relates scene luminance with image pixel level.

Gamma correction is used so that steps in pixel value correspond to much more even steps in brightness as perceived by a human viewer.
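A pure power-law sketch of gamma encoding and decoding is shown below. The gamma value of 2.2 is a common display assumption, not taken from this document, and real standards such as sRGB add a linear segment near black that this sketch omits.

```python
# Pure power-law gamma sketch on normalized [0, 1] values. GAMMA = 2.2 is
# an assumed common display gamma; real transfer functions (e.g. sRGB)
# add a linear toe near black.

GAMMA = 2.2

def gamma_encode(luminance, gamma=GAMMA):
    """Scene luminance -> pixel level; boosts dark tones so they get
    more code values, evening out perceived brightness steps."""
    return luminance ** (1.0 / gamma)

def gamma_decode(pixel, gamma=GAMMA):
    """Pixel level -> linear luminance (what the display applies)."""
    return pixel ** gamma
```

Encoding then decoding round-trips a value, and mid-gray scene luminance (0.18) encodes to a pixel level well above 0.18, which is exactly the dark-tone boost gamma correction provides.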

Lateral Chromatic Aberration (LCA) - Chromatic aberration (CA) occurs because the index of refraction of glass varies with the wavelength of light: glass bends different colors by different amounts. Although minimizing chromatic aberration is one of the goals of lens design, it remains a problem, most notably in ultra wide, long telephoto, and extreme zoom lenses. Lateral chromatic aberration (LCA) is color fringing that occurs because image magnification changes with wavelength. It is most visible near the edges of the image.


LCA is measured using the Imatest SFRplus test chart, and results are displayed as a percentage of the distance from the image center to the corner (percentage of sensor diagonal/2), corrected for the angle of the ROI with respect to the center. This measurement gives the best overall results because it is relatively independent of the measurement location and the number of pixels.

Noise is a random variation of image density, visible as grain in film and pixel level variations in digital images.  Noise can increase dramatically at low light levels, especially for sensors with small pixels.  There are two basic types of noise: 

   1) Temporal noise, which varies randomly each time an image is captured, and

   2) Fixed pattern noise, caused by sensor non-uniformities.


Noise may also be separated into noise in the luminance (non-color) channel and noise in the chrominance (color) channels. 

Because the pixel level of patches where noise is measured may vary, noise is best expressed as a Signal-to-Noise Ratio (S/N or SNR), where the signal S is the average patch value of the luminance (Y) channel, with Y = 0.3*R + 0.59*G + 0.11*B, and the noise N is the standard deviation of the patch value with slow variations from uneven lighting removed.

SNR is frequently expressed in decibels (dB), where SNR(dB) = 20*log10(S/N). SNR is more meaningful than simple noise measurements.
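The SNR definition above translates directly into code: compute Y per pixel with the stated weights, take S as the mean over the patch and N as the standard deviation, and report 20*log10(S/N). For simplicity this sketch omits the removal of slow variations from uneven lighting.

```python
import math

# SNR sketch implementing the definition above: Y = 0.3R + 0.59G + 0.11B,
# S = mean of Y over a nominally uniform patch, N = standard deviation.
# Slow-variation (uneven lighting) removal is omitted for brevity.

def patch_snr_db(patch):
    """patch: list of (R, G, B) tuples. Returns SNR in decibels."""
    ys = [0.3 * r + 0.59 * g + 0.11 * b for r, g, b in patch]
    s = sum(ys) / len(ys)                                  # signal: mean Y
    n = math.sqrt(sum((y - s) ** 2 for y in ys) / len(ys)) # noise: std dev
    return 20.0 * math.log10(s / n) if n else float("inf")
```

A gray patch alternating between values 100 and 102 has mean 101 and standard deviation 1, giving an SNR of 20*log10(101), roughly 40 dB.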

Inadequate temporal SNRs result in a grainy image and reduce video compression efficiency. 

Noise does not vary widely with color temperature because it is mainly a function of illuminance; it can therefore be analyzed at either ~2800K or ~6500K.

HDR - Optimizing the HDR algorithm improves camera performance in scenes with challenging lighting conditions.

Contact us for more information.
