Just 20 years ago, useful humanoid robots were little more than science fiction. Most robots back then either looked like ordinary manufacturing equipment or were special devices for undersea exploration. But that's all changed with recent advances in technology that put the age of personal robots just around the corner. Recent announcements indicate it may already be here.
To the home-robot market comes the ER1 from Evolution Robotics, Pasadena, Calif. The company supplies the hardware and software; you supply a laptop PC. The unit comes unassembled or preassembled, with extruded-aluminum beams forming the main support structure. Software called the Robot Control Center runs on Windows 98 and up. Users can instruct the robot to respond to voice commands, take photographs or video of its environment, and send images to an e-mail address. It can also play music from a CD it recognizes, read books aloud from prerecorded audio, and send reminders to its owner.
"With the processing power now available in Pentium processors, and a patent-pending vision algorithm that can recognize objects in a realworld setting, ER1 is the first personal robot that can navigate and act based on vision," said Bill Gross, founder and executive chairman of Evolution Robotics. The company launched in February and announced its personal robot at the Electronic Entertainment Expo in May.
Most personal robots now available are little more than toys and electronic animal companions, such as the Furby and Sony's AIBO. The introduction of Lego MindStorms and AIBO in 1998 spurred much interest in owning robots. The ER1 aims instead to make robots useful and practical.
Home robots made their first appearance two decades ago. In the early 1980s, several manufacturers introduced home robots, including B.O.B. (for brains on board), Topo, and the Hero I from Heath (see MACHINE DESIGN, March 24, 1983, pp. 85-90). The relatively slow processors of the time limited the functions the robots could perform. Moore's Law, with processing power rising as prices drop, has since made affordable personal robots possible. The ER1 is the first reasonably priced consumer robot that will actually do something useful.
BRAINS BEHIND THE ’BOTS
The robot's primary sensor is the video camera, a standard, off-the-shelf Web cam with a maximum resolution of 640 × 480, up to a 30-fps capture rate, and a USB link to the laptop. A proprietary vision algorithm lets the robot see, recognize, and avoid running into objects. The bot can also recognize where it is, analyze images, and select features such as colors and edges for comparison against a database of instances it knows about. For example, it knows it has collided with a wall when the image stops moving.
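Evolution Robotics hasn't published its vision code, but this collision cue, an image that stops changing while the wheels keep turning, is simple to sketch. The following Python snippet uses OpenCV and frame differencing; the threshold values are invented for illustration:

```python
import cv2
import numpy as np

# Invented values for illustration: mean per-pixel difference below
# which consecutive frames count as "not moving," and how many such
# frames (about 0.5 s at 30 fps) signal a probable collision.
MOTION_THRESHOLD = 2.0
STALL_FRAMES = 15

def watch_for_collision(camera_index=0):
    """Flag a probable collision when the image stops moving
    even though the drive motors are still commanding motion."""
    cap = cv2.VideoCapture(camera_index)
    prev = None
    still_count = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            # Mean absolute difference between successive frames
            diff = np.mean(cv2.absdiff(gray, prev))
            still_count = still_count + 1 if diff < MOTION_THRESHOLD else 0
            if still_count >= STALL_FRAMES:
                print("Image stopped moving -- possible collision")
                still_count = 0
        prev = gray
    cap.release()
```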
The robot can be trained to recognize thousands of objects from examples it sees. The vision algorithm lets the robot recognize objects even if their orientation or lighting differs from the example. Recognition capabilities improve with a higher-resolution camera.
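The company's patent-pending algorithm is proprietary, but feature-based recognition in general works along these lines: extract local features that tolerate changes in orientation and lighting, then match them against a stored example. Here is a minimal sketch using OpenCV's ORB features as a stand-in (not the ER1's actual method); the example filename and match thresholds are hypothetical:

```python
import cv2

# Train: extract orientation- and lighting-tolerant local features
# from one example image of the object (hypothetical filename).
orb = cv2.ORB_create(nfeatures=500)
train_img = cv2.imread("fedex_logo_example.png", cv2.IMREAD_GRAYSCALE)
train_kp, train_desc = orb.detectAndCompute(train_img, None)

def recognizes(frame_gray, min_matches=25):
    """True if enough features in a camera frame match the example."""
    kp, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(train_desc, desc)
    # Keep only close matches; 40 is an invented distance cutoff
    good = [m for m in matches if m.distance < 40]
    return len(good) >= min_matches
```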
An optional second camera points down in front of the robot for more advanced obstacle avoidance. This lets the main camera concentrate on other tasks. An 802.11b wireless network card lets the robot be controlled from another PC. The software also supports a remote link to the robot if its owner happens to be away from home.
The camera and a separate motor controller plug into USB ports on the laptop. The software interprets the vision data and figures out the motor commands to send the motor controller. But this isn't exactly new. Back in the 1960s, robots such as Stanford's Shakey used cameras for vision. The difference is in the processing. Two DEC computers did the calculations for Shakey's vision algorithm. By contrast, the ER1 uses the laptop processor to handle the vision algorithm's number crunching and two Texas Instruments 16-bit DSPs for motor control.
Two independently controlled stepper motors from Shinano Kenshi of Japan (www.shinano.com) move the robot. Power can be controlled to save battery life and manage torque, which is about 100 oz-in. at 2 A.
The drive system uses a toothed belt, with a spur gear on the motor and a pulley on the wheel. The differential drive works so that any difference between left and right motor velocities makes the robot turn. The controller sends outputs to PWM-based H-bridges (two per motor) which power the stepper phases.
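In a differential drive, forward speed is the average of the two wheel speeds and turn rate is proportional to their difference. A minimal sketch of the kinematics, with wheel radius and track width assumed for illustration since the ER1's dimensions aren't given here:

```python
import math

WHEEL_RADIUS = 0.05   # m, assumed for illustration
TRACK_WIDTH = 0.30    # m, distance between wheels, assumed

def body_velocity(omega_left, omega_right):
    """Convert wheel angular velocities (rad/s) to body motion.

    Returns (v, w): forward speed in m/s and turn rate in rad/s.
    Equal wheel speeds drive straight; any difference turns the robot.
    """
    v_left = omega_left * WHEEL_RADIUS
    v_right = omega_right * WHEEL_RADIUS
    v = (v_left + v_right) / 2.0          # forward speed
    w = (v_right - v_left) / TRACK_WIDTH  # positive = turn left
    return v, w

# Example: right wheel 10% faster than left gives a gentle left turn
v, w = body_velocity(10.0, 11.0)
print(f"forward {v:.3f} m/s, turning {math.degrees(w):.1f} deg/s")
```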
The biggest engineering challenge for the ER1 was balancing cost and performance, according to Rich Diephuis, vice president of engineering. As an example, stepper motors beat out servos because they cost less, though they have some vibration problems. The motors are microstepped to compensate. They run 200 steps/rev with 64 microsteps/step, for a total of 12,800 steps/rev.
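That step arithmetic translates directly into motion commands. A small sketch, assuming a hypothetical wheel circumference and a 1:1 belt ratio (the ER1's actual gearing isn't specified here):

```python
FULL_STEPS_PER_REV = 200     # a standard 1.8-degree stepper
MICROSTEPS_PER_STEP = 64
STEPS_PER_REV = FULL_STEPS_PER_REV * MICROSTEPS_PER_STEP  # 12,800

# Assumed wheel circumference, for illustration only
WHEEL_CIRCUMFERENCE_M = 0.314  # roughly a 10-cm-diameter wheel

def microsteps_for_distance(distance_m):
    """Microsteps needed to roll the wheel a given distance,
    assuming the motor drives the wheel 1:1."""
    revs = distance_m / WHEEL_CIRCUMFERENCE_M
    return round(revs * STEPS_PER_REV)

print(microsteps_for_distance(1.0))  # ~40,764 microsteps per meter
```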
The robot moves on hardwood floors, linoleum, tiles, and relatively smooth carpets. It can also scoot across small thresholds, but not across thick carpets. Handling thick carpet would have required more powerful motors and bigger wheels, raising the cost.
Power comes from a 5.4-A-hr, 12-V lead-acid battery, good for nearly 3 hr of operation (an average draw of roughly 1.8 A, or about 22 W). Power to the motors can vary to lengthen battery life. Moving across a hardwood floor requires little torque, so it is an opportunity to cut power and conserve the battery. Reducing power reduces torque and speed, but it lets the robot run longer. Image processing can also speed up or slow down depending on the amount of CPU power available. The more capacity, the faster images are processed.
Users train the robot through a graphical interface. Training centers on IF-THEN statements. Autonomous movements include moving or rotating toward an object, moving a specific distance, or rotating a set number of degrees. There are 96 behavior paths which can be sequenced together to make up complex tasks. For instance, a sample program might announce the arrival of a FedEx truck. The robot would first look for the FedEx logo, then move forward 10 ft, turn 90°, and play a .wav file saying "FedEx has arrived." Sequences can be saved as .rbt files and called up individually later. Robot owners can also share files with others over the Web, where a file-sharing community has sprung up.
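The Robot Control Center is a graphical tool, but its underlying model, conditions paired with actions and chained into sequences, maps naturally onto code. Here is a hypothetical Python sketch of the FedEx example; every class and function name is invented for illustration, and none comes from the actual ER1 software:

```python
from dataclasses import dataclass
from typing import Callable, List

# Stub robot API: invented names standing in for the vision and
# motion layer the ER1 exposes through its graphical interface.
def sees_object(name: str) -> bool:
    return True  # pretend the camera just matched the object

def move_forward(feet: float):
    print(f"moving forward {feet} ft")

def turn(degrees: float):
    print(f"turning {degrees} degrees")

def play_wav(path: str):
    print(f"playing {path}")

@dataclass
class Behavior:
    condition: Callable[[], bool]   # IF ...
    action: Callable[[], None]      # THEN ...

def run_sequence(behaviors: List[Behavior]):
    """Fire each behavior's action once its condition holds."""
    for b in behaviors:
        while not b.condition():
            pass  # real code would poll sensors with a delay
        b.action()

# The FedEx example as three chained IF-THEN behaviors
fedex_arrival = [
    Behavior(lambda: sees_object("fedex_logo"),
             lambda: move_forward(10)),
    Behavior(lambda: True, lambda: turn(90)),
    Behavior(lambda: True, lambda: play_wav("fedex_arrived.wav")),
]
run_sequence(fedex_arrival)
```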
Actions can also be triggered after a given string of events, but the software doesn't yet support combining conditions with logical AND functions. To build sequences, several behaviors must be chained together. Users can program either directly on the laptop or on another PC in the house.
A gripper arm will be available soon. For now, engineers are experimenting with the arm in the lab. For instance, a robot equipped with a gripper arm can roll over to a small refrigerator with a handle mounted low to the ground. A hook on the robot arm grabs the door handle. The robot opens the door, picks up a soft drink or beer, closes the door, and brings it to its master. This sequence is actually about 40 behaviors linked together.
ROBOTS LIKE US
On the higher end of the robot spectrum is Honda's Asimo (Advanced Step in Innovative Mobility), representing the latest in biped, walking robots.
The VxWorks real-time embedded operating system from Wind River, Alameda, Calif. (www.windriver.com), controls all of Asimo's functions. The seemingly simple act of walking is the most complicated function Asimo performs. To walk, the robot must remain upright, manage a series of servomotors that control movement, and adjust to both the angle and terrain of the walking surface. The operating system also manages wireless sending and receiving of commands, along with data from camera images, gyroscopes, and accelerometers.
Asimo weighs 52 kg and stands approximately 4 ft tall. He can walk at about 1 mph and is powered by a nickel-zinc battery good for roughly 30 min of operation. His skin is a tough, lightweight magnesium alloy.
Honda has built on knowledge accumulated since the project's inception in 1986. "A single Asimo computer program must simultaneously run multiple tasks," says Asimo chief engineer Toru Takenaka. "For instance, there is a task to take control of leg balance and a task to operate the arms, plus there is the communication between the motors and computer enabling the actual movement of Asimo's joints, not to mention wireless communication with external systems."
The robot's movements mimic those of a human, with joints designed to articulate and move. Smooth walking comes via a real-time, predictive motion-control technology called i-Walk. It lets Asimo walk continuously while changing directions and improves stability in response to sudden movements. Earlier methods of walking were based on stored walking patterns. Different patterns were used for straight walking and for turning, with a slight, awkward pause during the transition.
The new method uses predictive movement control for more flexible, smoother, more natural walking. The key is adjusting the robot's center of gravity. When humans walk straight and start to turn a corner, they shift their center of gravity toward the inside of the turn. i-Walk predicts the next movement in real time and shifts the center of gravity in anticipation of the turn.
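Honda's i-Walk control is proprietary, but the anticipatory idea can be illustrated with a toy model: begin shifting the center-of-gravity target toward the inside of a turn a fraction of a second before the turn is commanded. All values below are invented for illustration:

```python
# Toy illustration of anticipatory center-of-gravity shifting;
# Honda's actual i-Walk controller is far more sophisticated.
LEAD_TIME = 0.4  # s, assumed: how early to start leaning into a turn

def cog_lateral_target(t, turn_start, turn_rate, max_lean=0.03):
    """Lateral CoG offset (m): begin leaning LEAD_TIME before the
    commanded turn, ramping toward the inside of the turn."""
    if t < turn_start - LEAD_TIME:
        return 0.0
    ramp = min(1.0, (t - (turn_start - LEAD_TIME)) / LEAD_TIME)
    side = -1.0 if turn_rate > 0 else 1.0  # lean into the turn
    return side * max_lean * ramp

# Sample: a left turn commanded at t = 2.0 s; the lean starts at 1.6 s
for t in (1.4, 1.7, 2.0, 2.5):
    print(t, round(cog_lateral_target(t, 2.0, turn_rate=0.5), 4))
```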
"We want Asimo to be applied in the home to perform tasks that we do everyday such as daily housework or heavy lifting," says Takenaka. "There will also be some use in public areas such as robots working as guides at museums or as body guards."
With a lease price of $150,000 per year, Asimo may be a bit out of the price range of your average consumer for now. But the falling price of computing power virtually ensures that one day soon more and more Asimo-type robots will become part of people's everyday lives.
ROBOTS ON THE WEB
There are dozens of Web sites devoted to all things robotic, from robot history and current developments to future technology and competitions. Use these Web sites as starting points.
• Honda http://world.honda.com/ASIMO (or go directly to http://world.honda.com/ASIMO/movies/index.html to view amazing videos of Asimo in motion.)
• Evolution Robotics www.evolution.com
• USC Robotics Research Lab www.robotics.usc.edu
• Lego MindStorms http://mindstorms.lego.com
• MIT Artificial Intelligence Laboratory www.ai.mit.edu
YOU'VE COME A LONG WAY, SHAKEY...
Shakey was the first mobile robot to reason about its actions. He was developed by Stanford Research Institute's Artificial Intelligence Center from 1966 through 1972.
Shakey had a TV camera, a triangulating rangefinder, and bump sensors, and was connected to DEC PDP-10 and PDP-15 computers via radio and video links. The computers handled the number crunching, up to 250,000 calculations/sec, for perception, world modeling, and action programs.
Low-level action routines took care of simple moving, turning, and route planning. Intermediate-level actions strung together low-level ones to accomplish more complex tasks. Higher-level programs could make and execute plans to achieve user-specified goals. The system also generalized and saved these plans for future use.
On a good day, it could formulate and execute, over a period of hours, plans involving moving from place to place and pushing blocks to achieve a goal.
HISTORY OF ROBOTS
Robots have a long and varied history beginning in the 19th century. They've been used in industry, exploration, and medicine as well as in more leisurely pursuits such as in entertainment and toys.
1801 – Joseph Jacquard invents a punch-card-operated textile machine, or a programmable loom.
1892 – Seward Babbitt designs a motorized crane with a gripper to remove ingots from a furnace.
1898 – Inventor Nikola Tesla demonstrates a radio-controlled robotic boat.
1921 – The term "robot" first appears in R.U.R., a play by the Czech writer Karel Capek. The term derives from the Czech "robota," meaning a serf or one in subservient labor.
1938 – Americans Willard Pollard and Harold Roselund design a programmable paint-spraying mechanism.
1941 – Isaac Asimov first uses the word "robotics" to describe the technology of robots and predicts the rise of the robot industry.
1948 – Norbert Wiener publishes "Cybernetics," describing the concept of communication and control in electronic, mechanical, and biological systems.
1954 – The first programmable robot is designed by George Devol, who coins the term Universal Automation, and then shortens it to Unimation, which becomes the name of the first robot company.
1956 – George Devol and Joseph Engelberger form the first robot company, Unimation.
1961 – The first industrial robot is installed in a General Motors production line in New Jersey.
1968 – Stanford Research Institute builds Shakey, a mobile robot with vision.
1970 – An electronic, computer-controlled robot arm is developed at Stanford University.
1973 – A minicomputer-controlled robot called the T3 (The Tomorrow Tool) is developed by Richard Hohn for Cincinnati Milacron.