
When human gestures direct machines

May 19, 2011
Using popular gaming technology, production lines will soon let workers direct equipment with speech and gestures

Authored by:
Rakesh Kumar
Global Industry Product Director, Manufacturing, Microsoft Dynamics ERP
Microsoft Corp.
Redmond, Wash.
Edited by Leslie Gordon
[email protected]
Resources:

Microsoft Corp.,
www.microsoft.com

Today’s factory employees are much more likely to leave home each morning with a smartphone than a lunch pail. Upon returning home, they snap up the TV remote, no longer to watch the welterweight title bout but rather, for instance, to stage a virtual boxing match with friends across the country. Smart technology in communications devices and advances in motion-sensing sports games have transformed the way members of today’s workforce live and communicate. That same technology is about to change the way employees work on the plant floor. Future workers can become more productive by using the same technology in the workplace that they enjoy at home.

In the near future, assembly-line workers will control their computers with simple hand motions instead of a mouse or keyboard. In place of the graphical user interface (GUI), which lets users click on icons with a mouse or keyboard to select an application or issue a command, the next generation of computers on the production line will be operated by a natural user interface (NUI). This type of interface, which began with touchscreens, has evolved to incorporate voice and gesture recognition as well as biometrics, such as iris scans and fingerprints, to identify users.

Gesture and voice recognition are the core aspects of gaming technology that let people kick a virtual ball, throw a punch in an electronic ring, or slam an on-screen serve just by carrying out those arm and body motions in front of TV sets equipped with a game player that operates with a sensor accessory. The innovation that drives such consumer technology is about to begin driving assembly lines.

How do gesture and voice recognition work?
Inside the sensing accessory reside a color video camera, a depth sensor that provides a 3D perspective, and an array of four microphones that isolates the players’ voices from extraneous room noise. Advanced software senses the layout of the room and tracks dozens of points on each player’s body, calculating body shape and skeletal structure. By referencing the distances between each individual’s joints, the software monitors motions and responds to movements.
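
As a rough illustration of the joint-based tracking just described, here is a minimal Python sketch that decides whether an arm is outstretched by comparing joint distances. The Joint structure, the coordinates, and the 0.95 threshold are hypothetical; a real sensor SDK would expose much richer skeleton data.

```python
# Minimal sketch of skeleton-based gesture detection. Joint names and the
# idea of comparing joint distances follow the description above; the data
# format and threshold are illustrative, not a real sensor API.
import math
from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # meters, in the sensor's coordinate frame
    y: float
    z: float  # depth reported by the 3D sensor

def distance(a: Joint, b: Joint) -> float:
    """Straight-line distance between two tracked joints."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def is_outstretched_arm(shoulder: Joint, elbow: Joint, wrist: Joint) -> bool:
    """Treat the arm as outstretched when the shoulder-to-wrist distance is
    close to the combined upper- and lower-arm lengths, i.e., the elbow is
    nearly straight."""
    full_reach = distance(shoulder, elbow) + distance(elbow, wrist)
    return distance(shoulder, wrist) > 0.95 * full_reach

# Example with made-up joint positions for a straightened right arm:
shoulder = Joint(0.0, 1.4, 2.0)
elbow = Joint(0.3, 1.4, 2.0)
wrist = Joint(0.6, 1.4, 2.0)
print(is_outstretched_arm(shoulder, elbow, wrist))  # True
```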

Gestures, often combined with biometrics, can be applied to a number of processes on the assembly line. For example, instead of logging into a machine with a user ID and password, workers can simply look into a device that scans their iris and uses that pattern to log them in. With a simple hand gesture, an employee can command the computer to start operations and, with a standard outstretched-arm signal, can direct it to stop. The machine can be programmed to ask for confirmation of such gesture-based commands, in which case the employee responds verbally with a “yes” to initiate the function.
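
This confirm-before-acting flow maps naturally onto a small event handler, sketched below. The event strings and the Machine class are hypothetical stand-ins for a real gesture and speech recognizer, and treating the stop gesture as an immediate, unconfirmed safety action is an assumption rather than anything the article specifies.

```python
# Illustrative event handler for gesture commands gated by voice
# confirmation. Event names and the Machine class are hypothetical.

class Machine:
    """Stand-in for a production-line controller."""
    def start(self):
        print("operations started")

    def stop(self):
        print("operations stopped")

    def prompt(self, msg):
        print(msg)

pending = None  # command queued while the system waits for a spoken "yes"

def handle_event(event: str, machine: Machine) -> None:
    global pending
    if event == "gesture:start":
        pending = machine.start        # queue the command, then ask
        machine.prompt("Start operations? Say 'yes' to confirm.")
    elif event == "gesture:outstretched_arm":
        machine.stop()                 # assumption: safety stop acts at once
        pending = None
    elif event == "voice:yes" and pending is not None:
        pending()                      # run the confirmed command
        pending = None

m = Machine()
for e in ["gesture:start", "voice:yes", "gesture:outstretched_arm"]:
    handle_event(e, m)
```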

In another example, a line worker may need to move partially finished goods to a pallet or another machine. NUIs will let workers indicate which subassemblies need to be moved and their destination. Robotics can take over from that point to complete the transfer, providing a safer process and a speedier operation.
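
One way such a system might resolve which station a worker is pointing at is to extend a ray from the shoulder through the wrist and pick the nearest registered destination. The geometry below is a hedged sketch with made-up station names and coordinates, not any vendor’s API.

```python
# Sketch: resolve a pointing gesture to the nearest registered destination
# by measuring each station's distance to the shoulder-to-wrist ray.
import math

def point_ray_distance(origin, direction, point):
    """Distance from a point to the ray origin + t*direction, t >= 0."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    px, py, pz = point
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    # Closest point on the ray (clamped so we never point "backward").
    t = max(0.0, (px - ox) * dx + (py - oy) * dy + (pz - oz) * dz)
    closest = (ox + t * dx, oy + t * dy, oz + t * dz)
    return math.dist((px, py, pz), closest)

def pointed_destination(shoulder, wrist, destinations):
    """destinations: mapping of station name -> (x, y, z) position."""
    direction = tuple(w - s for w, s in zip(wrist, shoulder))
    return min(destinations,
               key=lambda name: point_ray_distance(shoulder, direction,
                                                   destinations[name]))

stations = {"pallet_A": (2.0, 0.5, 3.0), "mill_2": (-1.5, 0.5, 2.5)}
print(pointed_destination((0.0, 1.4, 0.0), (0.4, 1.2, 0.6), stations))
# -> "pallet_A", the station nearest the arm's pointing ray
```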

The sensing technology embedded in the computer is not distracted by random movements or conversations of others on the plant floor. It can identify the individual who logged in and respond only to that person’s motions. Users can easily specify that the computer shut down when they are out of its sight for more than, say, 20 sec. Should a worker step away from a workstation for a few minutes, the computer will not respond to someone else unless it has been authorized to let another person (for example, the machine operator on the next shift) manage the equipment.
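
A minimal sketch of that sign-out-on-absence behavior, assuming the sensor delivers a per-frame set of tracked person IDs (the Station class, IDs, and 20-sec constant are illustrative):

```python
# Lock the station when the logged-in operator has been out of the
# sensor's view longer than a timeout; respond only to authorized people.
import time

TIMEOUT_S = 20.0  # the article's example: roughly 20 sec out of sight

class Station:
    def __init__(self, operator_id: str):
        self.operator_id = operator_id
        self.authorized = {operator_id}  # e.g., add the next shift's operator
        self.last_seen = time.monotonic()
        self.locked = False

    def on_frame(self, visible_ids: set[str]) -> None:
        """Called once per sensor frame with the IDs of tracked people."""
        now = time.monotonic()
        if self.operator_id in visible_ids:
            self.last_seen = now
            self.locked = False
        elif now - self.last_seen > TIMEOUT_S:
            self.locked = True  # ignore gestures until the operator returns

    def accept_gesture(self, person_id: str) -> bool:
        """Respond only to an authorized person while unlocked."""
        return not self.locked and person_id in self.authorized
```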

How will manufacturers benefit from gesture-based technology?
In the manufacturing plant, gesture-based interfaces offer several advantages over conventional interfaces. On the plant floor, employees often wear gloves and work in settings containing lots of dust, dirt, and grease. These conditions may present challenges for touchscreens, which can easily become smudged and hard to read. Workers usually need to remove their gloves before selecting an icon on a touchscreen.

A mouse and keyboard can also hinder productivity. Workers must pause their activities to guide the mouse or enter commands on the keyboard. Moreover, the devices are subject to considerable wear and tear and are exposed to contaminants that can clog their operation or cause breakage. A downed control device can even halt production until a repair or replacement is available.

In contrast, a gesture-based interface does not require touching the screen, so the devices remain cleaner and more functional. Productivity stays high because gestures can be incorporated in the workflow and completed quickly.

Removing keyboards and mice from factory floors eliminates the cost of buying and maintaining those devices; no separate input hardware is needed because workers supply the gestures themselves. Certainly, the plant will need to invest in some different skill sets to maintain the new embedded technology, but its maintenance will likely require considerably less time and cost than conventional systems do.

Another benefit is that gesture-based systems will demand only minimal training. Since a large proportion of the new generation of employees will be familiar with the technology from consumer applications, adopting it in the plant will be easy and natural.

The new generation of manufacturing employees now use gesture-based interfaces (through touchless technology or touchscreens) in their homes — not only for gaming but also to flip through and resize images on their mobile phones and to conduct mobile transactions. It’s natural to anticipate that people would prefer to use similar technology on the job as well.

Finally, interfaces relying on gestures can overcome the social and monetary costs of language barriers. Everyone uses the same “vocabulary” of gestures to operate the computers. Manufacturers in countries whose governments require business to be conducted in the native language can employ the same gestures there as anywhere else, because gestures are an inherently universal language. And training doesn’t entail explaining keyboard techniques and symbols.

Where will gesture-based technology appear first?

Currently, Microsoft’s gesture-based technology is a stand-alone complement to any of a wide range of business applications. Rather than being built into business software, it is available to be integrated into factory computer systems in the most logical and productive manner for each company or facility.

For example, the recently introduced Microsoft Dynamics AX 2012 solution for enterprise resource planning features a touch-based user interface on the shop floor. Independent software developers have expressed a strong interest in extending the technology to a gesture-based interface.

Gesture-based NUIs will probably first show up in facilities that produce heavy equipment, such as automobiles and large machine tools, as well as in the chemical industry. That’s because these factories use a wide range of substances that can dirty screens and gum up computer control units; NUIs would help maintain clean and continuous operations. The interfaces would also be highly effective in specialized settings such as cold rooms, where employees wear heavy gloves and have difficulty moving to and from computer controls. Because touch-based systems can cause product contamination, NUIs will also benefit pharmaceutical and food-processing operations, where hygiene is of the utmost importance.

What’s next?
Once gesture-based technology is incorporated into the plant-floor workflow, it will migrate into office settings for accounting, human resources, communications, and conferencing. The productivity and cost effectiveness of the new style of interface will generate rapid adoption, and the entire facility will be more streamlined.

As transformative as it appears to be, gesture-based technology is simply an early-stage NUI. Innovators around the world are working on even more-advanced NUI technologies. One is a system that tracks workers’ eyes and thereby anticipates what they may want to do next. Another uses the human body itself as an interface, deploying highly sensitive acoustic sensors in a band worn on the upper arm to pick up finger taps on the employee’s lower arm and translate them into computer commands.

© 2011 Penton Media, Inc.
