Tips Any Good? Weld Rod Inspector Can Tell

Nov. 23, 2008

A device for inspecting welding-rod tips uses machine vision to determine whether redressed tips are good enough to resume welding.

Neural ID

(650)288-1180

neuralid.com

Orbitform Group

orbitform.com

Traditional machine-vision equipment has had a hard time telling good welding tips from bad ones, particularly after a redressing operation. The problem is that assessment of tip health can be subjective. So humans have done a better job of deciding whether tips are ready to weld or need more work.

The Weld Tip System from the Orbitform Group, Jackson, Mich., is said to solve this problem. At the heart of the WTS is a special processor that makes decisions about weld tips using neural-net algorithms. Called Cure (for concurrent universal recognition engine), the processor chip comes from Neural ID in San Mateo, Calif. “Traditional machine vision is good at handling exact matching problems,” says Neural ID CEO Tim Carruthers. “But our approach is completely different. We use a sorting engine that can handle situations where there is a certain amount of subjectivity in making decisions.”

The Orbitform WTS consists of a camera and a PC-based computer in a hardened chassis. For speed, Cure algorithms are carried out in an FPGA, which lets vision operations take place in parallel. A prism sits at the camera lens to let the machine inspect both weld tips as the robotic cell tilts its end effector in front of the lens. The machine is typically shown at least 15 examples of redressed tips that have been given passing grades by human operators. After logging these references, the device is usually ready to go into autonomous mode and start assessing tips.
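Neither company publishes the internals of the Cure engine, but the workflow described here — log a set of human-approved reference tips, then judge new tips by how closely they resemble any reference — can be sketched in a few lines of Python. Everything below (class name, feature vectors, distance threshold, synthetic data) is illustrative, not Orbitform's or Neural ID's actual code.

```python
import numpy as np

class TipClassifier:
    """Toy reference-based classifier: a tip passes if its feature vector
    lies within a learned distance of any human-approved reference.
    Illustrative only -- not Neural ID's Cure algorithm."""

    def __init__(self, threshold=0.15):
        self.references = []          # feature vectors of approved tips
        self.threshold = threshold

    def learn(self, features):
        """Log one human-approved redressed tip as a reference."""
        self.references.append(np.asarray(features, dtype=float))

    def ready(self, min_examples=15):
        """The article cites roughly 15 approved examples before autonomous mode."""
        return len(self.references) >= min_examples

    def assess(self, features):
        """Pass the tip if it is close enough to any stored reference."""
        x = np.asarray(features, dtype=float)
        return min(np.linalg.norm(x - r) for r in self.references) <= self.threshold


# Synthetic demo: 15 approved tips clustered around a nominal feature vector
# (say, normalized diameter and shininess), then one good and one bad candidate.
rng = np.random.default_rng(0)
nominal = np.array([0.50, 0.70])
clf = TipClassifier()
for _ in range(15):
    clf.learn(nominal + rng.normal(scale=0.02, size=2))

if clf.ready():
    print(clf.assess([0.51, 0.69]))   # True  -- near the approved cluster
    print(clf.assess([0.80, 0.30]))   # False -- too far from any reference
```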

“In our approach to vision, preprocessing and feature extraction take place as with traditional machine vision. But WTS learns and sorts images using search and data-mining operations. It creates structure from existing data rather than by trying to break down what it sees into components,” says Carruthers. “For weld tips, various pass conditions can include chatter and deformity, variations in tip diameter, and different levels of opacity or shininess. The problem with conventional vision equipment is that it can learn what new tips look like and what dressed tips look like, but cannot learn what a new and dressed tip looks like.”
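As a rough illustration of the preprocessing and feature-extraction step Carruthers mentions, the sketch below pulls two of the cited pass criteria — tip diameter and shininess — out of a grayscale image with plain NumPy. The function name, brightness threshold, and synthetic test image are assumptions for the example, not the actual WTS pipeline.

```python
import numpy as np

def extract_tip_features(gray, bright_thresh=0.6):
    """Crude feature extraction from a normalized grayscale image (0..1) of a
    weld tip against a dark background. Returns (relative diameter, shininess).
    A stand-in for whatever preprocessing the WTS actually performs."""
    mask = gray > bright_thresh                 # segment the bright tip
    if not mask.any():
        return 0.0, 0.0
    cols = np.where(mask.any(axis=0))[0]        # columns containing tip pixels
    diameter = (cols.max() - cols.min() + 1) / gray.shape[1]   # width fraction
    shininess = float(gray[mask].mean())        # mean brightness of tip pixels
    return diameter, shininess


# Demo on a synthetic image: a bright 40-pixel-wide "tip" on a 100x100 dark frame.
img = np.zeros((100, 100))
img[20:80, 30:70] = 0.85
print(extract_tip_features(img))   # roughly (0.40, 0.85)
```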

A cell controller can read WTS status and take appropriate action. For example, if WTS fails a set of tips after a dressing operation, the controller can send the tips back to the dresser. Three straight failures for a set of tips trigger an alert for operator intervention.
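That retry behavior amounts to a small piece of controller logic: send failed tips back to the dresser, and escalate after three consecutive failures. The Python sketch below mirrors the description; the function name, return values, and retry-limit handling are hypothetical, not Orbitform's cell-controller API.

```python
def handle_dressing_result(passed, consecutive_failures, max_retries=3):
    """Toy cell-controller logic: failed tips go back to the dresser;
    three straight failures flag a human operator. Illustrative only."""
    if passed:
        return "resume_welding", 0
    consecutive_failures += 1
    if consecutive_failures >= max_retries:
        return "alert_operator", consecutive_failures
    return "redress_tips", consecutive_failures


# Example: three failed inspections in a row escalate to a human.
failures = 0
for result in (False, False, False):
    action, failures = handle_dressing_result(result, failures)
    print(action)        # redress_tips, redress_tips, alert_operator
```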

