Tight Squeeze: Optical Imaging For Cramped Quarters

July 5, 2000
A six-step strategy helps fit imaging systems onto semiconductor equipment where space is at a premium.

JOHN STACK
President
Edmund Industrial Optics
Barrington, N.J.

Fundamental parameters of an imaging system include the depth of field, field of view, resolution, working distance, and sensor size. A magnification of one means the imaged area equals that of the image sensor. A magnification greater than one means the imaged area is less than the sensor area. The reverse is true for magnifications less than one.


A variety of factors contribute to the overall image quality, including resolution, image contrast, depth of field, perspective errors, and geometric errors.


Now more than ever, semiconductor equipment such as wirebonders and surface profilers requires integrated sensors that can monitor a process or locate material. These sensors are often optical imaging systems. Yet I have visited many semiconductor equipment manufacturers (employing entire groups of mechanical, electrical, and software engineers) who seldom have more than a single engineer in charge of optical systems. And he may have been elected to the position while on vacation.

I may be exaggerating, but armies of optical engineers are a rare luxury for most equipment manufacturers. Yet the need for integrating optics into machinery has never been greater. And because space is always at a premium within a fab clean room, system designers have little elbowroom. Often, integrating a vision system means snaking optical gear through equipment without interfering with the primary process, be it wirebonding, die packaging, or aligning wafers or registration marks before lithography or metrology.

A few tips can help manufacturers get up and running with imaging, even if they are unable to afford large optical engineering departments.

Image quality
The primary purpose of any imaging system is to get images with enough quality to extract necessary information. There is no single number that determines image quality. Factors that enter into judgments about image quality have a lot to do with the object you want to view.

The fundamental parameters of any imaging system include its field of view (FOV), working distance, resolution, depth of field (DOF), and sensor size. FOV is the viewable area of the object under inspection. In other words, it is the portion of the object that fills the camera sensor.

The distance from the front of the lens to the object under inspection is the working distance, and the minimum feature size that the system can discern is its resolution. Its DOF is the maximum object depth that can be maintained entirely in focus. It's also the amount of object movement (in and out of focus) allowable while maintaining an acceptable focus. The active area of the camera sensor is referred to as its size and is typically specified in the horizontal dimension. It enters into determining the degree of lens magnification needed to get a desired field of view.

Another useful descriptor is the primary magnification of the lens: the ratio of sensor size to field of view. Though handy, it is a derived quantity rather than a fundamental parameter.
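For readers who want to check the arithmetic, a short script (with made-up numbers, not taken from any particular system) shows how sensor size, field of view, and primary magnification tie together:

# Sketch: relating sensor size, field of view, and primary magnification.
# Values are illustrative only, not taken from any particular system.

def primary_magnification(sensor_width_mm, fov_width_mm):
    # Primary magnification = sensor size / field of view.
    return sensor_width_mm / fov_width_mm

# A nominal 1/2-in. format CCD is about 6.4 mm across horizontally.
sensor_width = 6.4   # mm, horizontal active area of the sensor
fov_width = 16.0     # mm, width of the object area to be imaged

pmag = primary_magnification(sensor_width, fov_width)
print(f"Primary magnification: {pmag:.2f}x")   # 0.40x; imaged area is larger than the sensor

# Flip it around: the field of view a given magnification delivers.
required_pmag = 1.0
print(f"FOV at {required_pmag:.1f}x: {sensor_width / required_pmag:.1f} mm")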

Also bearing on image quality are three other properties: image contrast, perspective errors, and distortion. The point of considering all these factors is to determine the minimum acceptable image quality. Defining the minimum image quality is crucial. Tightly packed optical systems all have one thing in common: They sacrifice lots of image quality to accommodate mechanical constraints. In addition, a good understanding of image quality requirements can avoid a lot of wheel spinning and extra costs.

Will it fit?
With minimum needs nailed down, it is time to find a combination of focal lengths and object/image distances that will work — or to determine that the requirements you've come up with are impractical.

The bad news is that this usually involves working through thin-lens equations. In addition, those equations can produce very misleading results. Fortunately, any modern optical company has optical design software that can quickly and easily provide a preliminary solution. Unless your problem is extremely complex, this service is usually free.
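For a quick sanity check before calling in the software, the first-order relations can be run in a few lines of code. The sketch below uses the standard thin-lens forms, 1/f = 1/s_o + 1/s_i and m = s_i/s_o, with made-up numbers; treat the output as a rough bound on the geometry, not a design.

# First-order feasibility check with the thin-lens equation.
# 1/f = 1/s_o + 1/s_i   and   m = s_i / s_o
# As noted above, thin-lens numbers can mislead for real, thick,
# multi-element lenses; this only bounds the geometry.

def image_distance(focal_length, object_distance):
    # Solve 1/f = 1/s_o + 1/s_i for the image distance s_i.
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

def check_fit(focal_length, object_distance, track_limit):
    s_i = image_distance(focal_length, object_distance)
    m = s_i / object_distance
    total_track = object_distance + s_i   # object-to-sensor length (thin lens)
    fits = "fits" if total_track <= track_limit else "does NOT fit"
    print(f"f={focal_length} mm, s_o={object_distance} mm -> "
          f"s_i={s_i:.1f} mm, m={m:.2f}x, track={total_track:.1f} mm ({fits})")

# Illustrative numbers only: a 25-mm lens at a 100-mm working distance,
# with 150 mm of straight-line space available for the optics.
check_fit(focal_length=25.0, object_distance=100.0, track_limit=150.0)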

A prototype of the configuration that these equations call for can employ off-the-shelf components. My own experience is that off-the-shelf prototyping is fast, inexpensive, and will confirm image quality requirements. Even the best optical designer cannot perfectly predict illumination effects and object surface qualities.

Another word of hard-won wisdom: set up the initial prototype in a straight line. The final optical system will, no doubt, contain numerous bends and twists. But the prototype should show the basic effects of lenses, apertures, CCDs and illumination. Familiarity with these qualities at this point prevents problems in debugging later on. Even taking time to build special fixtures that hold the system in a straight line is well worth the effort.

Finally, make sure aperture sizes are realistic. Chances are you've chosen lenses with diameters that won't fit in the mechanical space allotted for the optics. Use apertures to simulate the diameters you can realistically expect.

Illuminate!
Most imaging system failures arise because objects in the FOV are improperly illuminated. Insufficient illumination degrades contrast, and image quality suffers with it. Contrary to popular belief, contrast is more important than resolution in many imaging systems. (See "Looking at Modulation Transfer Functions for Machine Vision," MACHINE DESIGN, 4/20/00, pg. 78 for a more in-depth discussion of the relationship between contrast and resolution.)

An image appears well defined if its black details appear black and white details are white. The greater the difference in intensity between them, the better the contrast. But the object must be illuminated to give good contrast before the imaging system has a chance of transmitting a well-defined image.
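One convenient way to put a number on this is the standard Michelson contrast, C = (Imax − Imin)/(Imax + Imin). The snippet below, with made-up gray levels, shows how quickly the figure collapses when dark and bright details start to look alike:

# Michelson contrast: C = (I_max - I_min) / (I_max + I_min).
# Ranges from 0 (no contrast) to 1 (full contrast). Gray levels are made up.

def michelson_contrast(i_max, i_min):
    return (i_max - i_min) / (i_max + i_min)

print(michelson_contrast(i_max=220, i_min=30))    # ~0.76, crisp
print(michelson_contrast(i_max=140, i_min=110))   # ~0.12, washed out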

Illumination is all about geometry. Consider the relationship between lighting geometry and surface features in a few examples. Fluorescent linear or ring lamps can provide diffuse light from the front. This sort of lighting minimizes shadows and specular reflections, but also makes surface features less distinct.

Single-directional glancing-incidence lighting, as from fiber-optic light guides, goes to the opposite extreme. Surface defects and topology show up well, but there are also extreme shadows and bright spots. Directional illumination, from one or more fiber-optic light guides, offers more moderate properties: strong, relatively even lighting but with some shadows and glare. Ring lights, from fiber-optic or LED ring-light guides, reduce shadows and provide relatively even illumination. But ring lights can sometimes be difficult to mount. They may also create a circular glare problem with highly reflective surfaces.

Polarized lighting, via a regular light source with a polarizing filter attached, provides even illumination but diminished intensity through the polarizer. Diffused axial lighting, from LED axial illuminators or fiber-optic-driven axial adapters, provides shadow-free, even illumination with little glare. But this lighting mode requires an internal beam splitter, which reduces the intensity.

Structured light, from a line-generating laser diode or a fiber-optic line light guide, is useful for extracting surface features. The disadvantage of a laser is that objects of some colors may absorb the intense light and heat up.

Optical engineers love LEDs. The use of monochromatic LEDs solves a lot of imaging problems and simplifies optical designs. The main benefit is that use of just one light color avoids problems from chromatic aberration. As with most things, however, there is a price: LED illumination can be uneven and provide too little energy where it's needed. LED light may need to be reshaped, diffused, or directed by a lens.

Illumination debugging can be tricky. Two tools are strategic: a flat mirror and a chrome ball bearing. Their surfaces accurately show the location and intensity of illumination sources. And they are, of course, not affected by qualities of the object surface being illuminated.

With the basic optical path and illumination ironed out, it is time to make the system fit in its allotted space. Adding folds and combining optical paths looks easy on paper, but it can be a tolerancing and debugging hell. While I can't make this easy, I can mention some details to think about:

Mirror thickness affects image quality directly. It is tempting to specify ultrathin mirrors and beam splitters, but doing so makes it impossible for optical manufacturers to guarantee surface flatness and, thus, image quality. Just holding a thin mirror can deform it.

If you need surface flatness of a quarter wavelength or less, a good rule of thumb is to use a 6:1 ratio between surface size and thickness. Situations demanding thinner optics also need a great deal of care during the mounting of parts. Otherwise strain in the mechanical fixtures or from bonding can deform them.
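The rule of thumb boils down to one line of arithmetic. The mirror sizes below are illustrative only:

# The 6:1 rule of thumb for quarter-wave flatness: keep the largest surface
# dimension no more than about six times the substrate thickness.
# Mirror sizes below are illustrative only.

def min_thickness(surface_size_mm, aspect_ratio=6.0):
    return surface_size_mm / aspect_ratio

for size in (12.0, 25.0, 50.0):   # mirror diameter or diagonal, mm
    print(f"{size:4.0f} mm mirror -> at least {min_thickness(size):.1f} mm thick")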

One last point on mirrors: Mounting them from the front can alleviate the need for tight thickness tolerances.

Systems using infrared LEDs also use IR mirrors, but good ones take a little getting used to. High-quality IR mirrors are often gold, and are soft and easily damaged. It's best to discuss these issues with both suppliers and production personnel before the design hits the manufacturing floor.

Experienced designers allow for adjustments. Long optical paths can be sensitive to centering, boresight, and angular tolerances. Folding the optical path multiplies this problem by a factor of three. What works well in the lab may fail on the production floor. If possible, add gimbal adjustments in all folds. An X-Y adjustment in the CCD plane can help adjust for boresight errors. Where this is not possible, a stringent geometric tolerance analysis can minimize the effect.
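A quick calculation shows why the folds matter so much. A mirror tilted by a small angle deviates the reflected beam by twice that angle, so the image walks sideways by roughly the remaining path length times the tangent of the doubled angle. The numbers below are illustrative:

import math

# A mirror tilted by a small angle deviates the reflected beam by twice that
# angle, so the image walks sideways by about path_length * tan(2 * theta).
# Values below are illustrative.

def beam_walk_mm(tilt_arcmin, path_mm):
    theta = math.radians(tilt_arcmin / 60.0)    # arcminutes -> radians
    return path_mm * math.tan(2.0 * theta)

# A 3-arcminute mounting error on a fold mirror 300 mm ahead of the CCD:
print(f"{beam_walk_mm(3.0, 300.0):.2f} mm of image shift at the sensor")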

Don't ignore back reflections. When a beam splitter combines illumination and imaging optics into the same path, only 20 to 40% of the illumination is used. The rest passes out of the system. But when the stray light hits metal in the machine and reflects back into the optical system, you get problematic back reflections. Even black surfaces can reflect light. Cure this problem by baffling the excess light. Threaded barrels can really make a difference.
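A rough light budget, with illustrative values and a 50/50 splitter assumed, shows where the rest of the light ends up and how much there is to baffle:

# Rough light budget for coaxial illumination through a beam splitter.
# The source light meets the splitter twice: once heading toward the object
# and once returning toward the camera. Whatever takes the "wrong" branch at
# each pass leaves the optical path and must be baffled before it bounces off
# machine surfaces and back into the system. Values are illustrative only.

splitter_reflectance = 0.5    # fraction directed toward the object (first pass)
splitter_transmittance = 0.5  # fraction passed toward the camera (return pass)
object_reflectance = 0.6      # fraction of incident light the object returns

toward_object = splitter_reflectance
escapes_first_pass = 1.0 - splitter_reflectance            # exits past the splitter
returned = toward_object * object_reflectance
reaches_camera = returned * splitter_transmittance
bounced_toward_source = returned * (1.0 - splitter_transmittance)

print(f"Reaches the camera:        {reaches_camera:.0%}")      # 15% with these numbers
print(f"Escapes on the first pass: {escapes_first_pass:.0%}")  # baffle this
print(f"Heads back at the source:  {bounced_toward_source:.0%}")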

Production
Though most systems start strictly with off-the-shelf parts, they usually end up with some custom components. Common customizations are relatively inexpensive and take little time. They include edging lenses (to a smaller diameter) and resizing mirrors and beam splitters.

If custom lenses are a possibility, consider the quantity. For jobs that need at least 500 pieces of a single lens or doublet, custom lenses may make sense. Off-the-shelf lenses will probably be more economical for fewer pieces.

Determining whether to order custom or off-the-shelf compound CCD lenses is more complicated. If you need more than 250 pieces, a custom lens can really make sense. One advantage of a custom lens is that it can eliminate adjustable aperture stops and helical focus, reducing costs dramatically. In addition, most off-the-shelf lenses turn out to be larger than if designed for a specific application. So a custom lens can be a real advantage if you are working in a tight space. All in all, a good optical company can make ordering a compound CCD lens painless.
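If you want a first-cut break-even quantity before talking to a supplier, the arithmetic is simple. Every cost below is a placeholder, not a quote:

# Back-of-the-envelope break-even for custom vs. off-the-shelf lenses.
# All costs are placeholders; plug in real numbers from your optical supplier.

custom_nre = 12000.0        # hypothetical one-time tooling and design charge
custom_unit_cost = 40.0     # hypothetical per-piece price of the custom lens
catalog_unit_cost = 95.0    # hypothetical per-piece price off the shelf

# Custom wins once the per-piece savings pay back the one-time charge.
break_even_qty = custom_nre / (catalog_unit_cost - custom_unit_cost)
print(f"Break-even quantity: {break_even_qty:.0f} pieces")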

Finally, lenses should be adjustable too. The more leeway for focus and alignment, the easier life is on the production floor. At the very least, you must provide for focus adjustments.

What you can see depends on the light shed on the object. Different types of illumination can solve or create problems for an imaging system. Also used (but not illustrated here) is polarized illumination, which removes specularities and gives even lighting. The price paid is somewhat lower intensity through the polarizer.


Six steps to fewer optical design hassles

  • Define the required image quality
  • Determine whether the quality is feasible
  • Prototype
  • Place the lighting
  • Make it fit
  • Reduce production costs
