The light-sensitive elements, called photosites, each create an electrical signal proportional to the amount of light striking them. All photosites on the imager chip eventually get linked to “picture elements” or “pixels” in a display. Each pixel shows a photosite’s collected light value on an 8-bit grayscale from 0 (darkest) to 255 (brightest). Such an imager is said to detect 256 shades of gray, with each image stored as thousands of individual light measurements.
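The 8-bit quantization can be sketched as follows. This is a minimal illustration, assuming photosite signals normalized to the range 0.0–1.0; real imagers work in ADC counts, and the normalization scheme here is an assumption.

```python
def signal_to_gray(signal: float) -> int:
    """Quantize a normalized photosite signal to an 8-bit grayscale value."""
    signal = min(max(signal, 0.0), 1.0)  # clamp to the valid range
    return round(signal * 255)           # 0 = darkest, 255 = brightest

# Each image is stored as thousands of such measurements; one short row:
row = [signal_to_gray(s) for s in (0.0, 0.5, 1.0)]
print(row)  # [0, 128, 255]
```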
Successful vision applications depend on creating contrast between the feature of interest and the background. Optimal contrast means the edge pixels that make up the feature of interest have different grayscale values from the surrounding background pixels — the greater the difference, the better.
Contrast is commonly created through proper lighting. In a good, high-contrast image, these edge pixels are present at the boundaries between the feature of interest and the background. Thus, if we can identify the edge pixels, we can identify the feature of interest. Vision tools look for these “edges” or locations where drastic changes exist in the grayscale values of neighboring pixels.
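The edge search described above can be sketched along a single row of pixels. The threshold of 50 grayscale levels is an arbitrary value chosen for illustration; real vision tools tune this to the application.

```python
def find_edges(row, threshold=50):
    """Return indices where neighboring grayscale values change drastically."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]

# A bright feature (values near 200) against a dark background (values near 30):
row = [30, 32, 31, 200, 205, 198, 30, 29]
print(find_edges(row))  # [3, 6] -- the feature's two boundaries
```

The two reported indices mark where the feature of interest begins and ends, which is exactly the information later tools use to locate the part.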
Grayscale vision sensors can usually also tell the difference between different-colored parts. Each color appears as a slightly different shade of gray in the image. In some cases, though, two colors may appear as the same shade. A color filter placed in front of the imager compensates for the similarity by letting only certain wavelengths of light through to the imager. For example, a red-bandpass filter paired with a white light helps differentiate between green and red parts. The filter lets only red light pass, so red parts appear bright while green parts appear dark.
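The red-filter effect can be modeled by describing each part's surface by how much red, green, and blue light it reflects. The parts and reflectance values below (on the 0–255 scale) are illustrative assumptions, not measured data.

```python
def through_red_filter(r, g, b):
    """Only the red component reaches the imager behind a red-bandpass filter."""
    return r

parts = {"red part":   (230, 20, 20),
         "green part": (20, 230, 20)}
for name, rgb in parts.items():
    print(name, "->", through_red_filter(*rgb))
# red part -> 230    (appears bright)
# green part -> 20   (appears dark)
```

The grayscale difference between 230 and 20 gives the sensor the contrast that the unfiltered image lacked.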
Neither green nor blue parts reflect red light, however, so both appear dark in the red-filtered image. Detecting three or more colors, or identifying a specific color, usually demands a color vision sensor. Most color vision sensors are actually grayscale imagers overlaid with a special filter. The filter alternates red-, green-, and blue-bandpass areas over each photosite in what is called a Bayer pattern. Thus each photosite responds to only red, green, or blue light. A special color-processing chip uses the imager data to determine the red, green, and blue content of each pixel.
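The Bayer pattern's per-photosite color assignment can be sketched as below, assuming the common RGGB tiling; actual filter layouts vary by imager.

```python
# The repeating 2x2 Bayer tile (RGGB layout assumed for illustration).
BAYER_RGGB = [["R", "G"],
              ["G", "B"]]

def photosite_color(row: int, col: int) -> str:
    """Return the single color the photosite at (row, col) responds to."""
    return BAYER_RGGB[row % 2][col % 2]

# The tile repeats across the whole imager:
for r in range(2):
    print([photosite_color(r, c) for c in range(4)])
# ['R', 'G', 'R', 'G']
# ['G', 'B', 'G', 'B']
```

Note that green appears twice per tile; the color-processing chip interpolates the two missing color values at every photosite to produce full red, green, and blue content for each pixel.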
Banner Engineering (bannerengineering.com) supplied information for this column.