Choosing lenses for machine vision

July 1, 2006
Designing a machine vision system is new territory for many engineers. Fortunately, it's a fairly systematic process, beginning with selecting a lens.

In a machine vision system, the lens' job is to gather light and focus it onto a sensing plane. How well it does that depends on its field of view and resolution, the two most critical qualities associated with a lens.

Field of view (FOV), with respect to a lens, is the portion, or area, of an image that appears in focus on the sensing plane. If the lens' FOV extends beyond the perimeter of the sensor — be it a CCD, CMOS, or other photoelectric device — the system's FOV, naturally, will be smaller than that of the lens.

Resolution, a measure of image sharpness, is the lateral distance between the closest discernible features. This number can be misleading, however, because low values represent high resolution. To eliminate confusion, resolution may be quantified in terms of spatial frequency, the number of discernible line pairs per millimeter (lp/mm) in an image, equal to one divided by the distance between lines. Units of spatial frequency are more intuitive because high values represent high resolution, and low values represent low resolution.
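To make the conversion concrete, here is a minimal Python sketch (illustrative only, not from the original article; the function names are ours) that converts between line spacing and spatial frequency using the reciprocal relationship described above:

# Spatial frequency (lp/mm) is the reciprocal of line spacing, per the convention above.
def spacing_um_to_frequency(spacing_um):
    # spacing in micrometers -> line pairs per millimeter
    return 1000.0 / spacing_um

def frequency_to_spacing_um(freq_lp_mm):
    # line pairs per millimeter -> spacing in micrometers
    return 1000.0 / freq_lp_mm

print(spacing_um_to_frequency(10.0))   # 10-um spacing -> 100.0 lp/mm (finer detail, higher frequency)
print(frequency_to_spacing_um(100.0))  # 100 lp/mm -> 10.0 um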

Pixel by pixel

A good way to understand how FOV and resolution work is to relate them to an application. Suppose an assembly robot uses machine vision to position a 10-mm bolt in a hole and drive it home. If the initial position tolerance is ±10 µm, then the minimum FOV should measure 10 mm, allowing the vision system to see the entire hole and locate its center. The resolution, then, must be 10 µm or finer to stay within the error margin.

Both FOV and resolution ultimately depend on the pixel, the smallest unit into which an image can be divided. Any vision sensor requires a minimum number of pixels, stated mathematically as:

Np = 2 × Fov/dobj

where Np is the number of pixels in a line across the sensor

Fov is the linear size of the system's FOV

dobj is the size of the object's smallest feature of interest

(The factor of two results from taking two adjacent pixels to identify the edge of an object: one to show where the object is, and the other to show where it is not.)

For the assembly application, the robot positions a bolt within 10 µm of a circular hole's center, so dobj is 10 µm. At this object resolution in a 10-mm FOV, the machine vision system requires a sensor with at least 2,000 × 2,000 pixels, or a four-megapixel camera.
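As a quick check of the arithmetic, this short Python sketch (variable names are ours, chosen to mirror the symbols above) applies the pixel-count equation to the assembly example:

# Np = 2 x Fov / dobj: two pixels are needed per smallest feature to detect an edge.
fov_um = 10_000.0    # 10-mm field of view, expressed in um
d_obj_um = 10.0      # smallest feature of interest, um

n_pixels = 2 * fov_um / d_obj_um
print(n_pixels)      # 2000.0 -> at least 2,000 x 2,000 pixels, a four-megapixel sensor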

After satisfying FOV requirements, the next step is choosing a lens that projects the FOV onto the sensor. The lens must provide the appropriate primary, or optical, magnification, Mp, which is determined by:

Mp = Wsensor/Fov

where Wsensor is the image sensor's physical width.

Using Mp, we can find dobj:

dobj = dimg/Mp

where dimg is the resolution of the real image projected onto the image sensor (the smallest resolvable feature size at the sensor).

Suppose the image sensor on this four-megapixel camera is 15.2 mm across. This means a magnification of no more than 1.52X is required. To determine the necessary resolution, rewrite the dobj equation as:

Robj = Mp × Rimg

where Robj and Rimg are the object-space and image-space resolutions, respectively, in lp/mm. This definition of resolution leads to the modulation transfer function (MTF), which is graphed as image contrast versus spatial frequency. MTF describes a lens' ability to transfer object detail, or contrast, to the image plane.
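A brief Python sketch, again with our own variable names, ties the magnification and resolution equations together for the example system:

# Mp = Wsensor / Fov: magnification needed to map a 10-mm FOV onto a 15.2-mm-wide sensor.
w_sensor_mm = 15.2
fov_mm = 10.0

m_p = w_sensor_mm / fov_mm
print(round(m_p, 2))          # 1.52 -> a magnification of at most 1.52X

# Robj = Mp x Rimg: object-space resolution in lp/mm scales with magnification.
def object_resolution(r_img_lp_mm, m_p):
    return m_p * r_img_lp_mm

The object_resolution helper simply encodes Robj = Mp × Rimg; the next section supplies the image-space value to plug into it.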

Limitations

Before obtaining the best resolution from a lens, designers must work around constraints imposed by the camera. Two such constraints are the number of pixels on the camera's image sensor and the sensor's width, both of which set the image-space resolution:

Rimg = Np/(2 × Wsensor)

For our example, a 15.2 × 15.2-mm sensor consists of 2,048 × 2,048 pixels, resulting in an image-space resolution of 67.4 lp/mm horizontally. With a lens magnification of 1.52X, the object-space resolution is 102.3 lp/mm, or a minimum feature size of 9.78 µm — in line with the required 10 µm object resolution.
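The same numbers fall out of a short Python check (a sketch under the example's assumptions; rounding differs slightly from the article's figures):

# Rimg = Np / (2 x Wsensor): image-space resolution set by pixel count and sensor width.
n_pixels = 2048
w_sensor_mm = 15.2
m_p = 1.52

r_img = n_pixels / (2 * w_sensor_mm)    # ~67.4 lp/mm image-space resolution
r_obj = m_p * r_img                     # ~102.4 lp/mm object-space resolution
feature_um = 1000.0 / r_obj             # ~9.8 um minimum feature, meeting the 10-um goal
print(round(r_img, 1), round(r_obj, 1), round(feature_um, 2))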

Whether resolution is limited by the camera or by the lens also depends on diffraction. Consider a diffraction-limited system in which an infinity-corrected objective lens and tube lens connect to a CCD camera. The image-space f-number determines the diffraction-limited spot diameter, given by:

δ = 1.22λ/NA = (2.44λ)(ƒ/#)

where δ is the diameter of a diffraction-limited spot

λ is the wavelength of light used to image the scene

NA is the lens' numerical aperture

ƒ/# is the f-number

If λ is 0.55 µm and ƒ/# is 10, then the resulting spot diameter is 13.42 µm. For a pixel size greater than this value, the system resolution is camera-limited. But when the pixel size is smaller, as in our example (15.2 mm across 2,048 pixels, or about 7.4 µm per pixel), system resolution is limited by the lens, in which case choosing a camera with a smaller pixel size does not affect object resolution.

Reaching the 10-µm resolution goal requires a larger lens to lower the ƒ/#. Solving the diffraction-limited spot equation for ƒ/# and inserting the desired value for δ gives ƒ/# = 7.45. Therefore, any lens large enough to bring the ƒ/# below 7.45 provides adequate resolution for the assembly robot's machine vision system.
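A final Python sketch evaluates the diffraction-limited spot equation both ways, first for the spot diameter and then for the maximum allowable f-number (the values match the example above):

# delta = 2.44 x lambda x (f/#): diffraction-limited spot diameter in image space.
wavelength_um = 0.55    # green light
f_number = 10.0

spot_um = 2.44 * wavelength_um * f_number
print(round(spot_um, 2))        # 13.42 um -> larger than the ~7.4-um pixels, so the lens limits resolution

# Solve the same equation for f/# at the desired 10-um spot diameter.
target_spot_um = 10.0
max_f_number = target_spot_um / (2.44 * wavelength_um)
print(round(max_f_number, 2))   # 7.45 -> any lens with f/# below this meets the goal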

These equations for a diffraction-limited vision system apply to two-lens as well as multi-element lens systems. Even with aberrations, they are useful for quickly evaluating whether a vision system meets specific requirements.

For more information, contact Edmund Optics at (800) 363-1992, visit edmundoptics.com, or write the editor [email protected]
