
Drones learn to observe environment and steer clear

July 17, 2014
About a year and a half ago, maker-tinkerer and technical writer Paul Wallich created a buzz when he wrote about the drone he built to walk his son to the bus stop each morning. The unmanned aerial vehicle (UAV) took the form of a quadrotor (four-rotor) copter and did the job ... mostly. Wallich admitted to having trouble getting the UAV to power through windy days and avoid collisions.

Well, drone technology advances by leaps and bounds. Now researchers think they have a new way to make such UAVs and other vehicles more nimble in the face of environmental variables. The approach uses neuromorphic sensors (which are triggered by sudden events) instead of standard inertia-measuring sensors (such as accelerometers and gyroscopes) to track motion.

Even autonomous vehicles with cameras and controls need time to interpret camera data about the environment. Here, state-estimation algorithms first identify image features (usually boundaries between objects, detected through shade and color differences) and then select a subset unlikely to change with new perspectives. A dozen or so milliseconds later, the cameras fire again and the algorithm attempts to match current features to previous ones. Once the features are matched, the algorithm calculates the vehicle's change in position. The sampling takes 50 to 250 msec depending on how dramatically the environment changes, and the whole control cycle to correct course takes 0.2 sec or more, which is not fast enough to react to sudden changes in a vehicle's surroundings.
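To make that frame-based pipeline concrete, here is a minimal sketch in Python. The article doesn't say which feature detector these algorithms use, so OpenCV's ORB detector and a brute-force matcher stand in as illustrative assumptions:

```python
# Hypothetical sketch of the frame-based state-estimation loop described
# above; ORB features and brute-force matching are illustrative stand-ins,
# not the specific algorithms the researchers benchmark against.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def estimate_motion(prev_frame, curr_frame):
    """Match features between consecutive frames and estimate image shift."""
    kp1, des1 = orb.detectAndCompute(prev_frame, None)  # features: edges/corners
    kp2, des2 = orb.detectAndCompute(curr_frame, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.match(des1, des2)  # match current features to previous
    if len(matches) < 8:
        return None
    # Average displacement of matched features approximates the motion.
    shifts = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt)
              for m in matches]
    return np.mean(shifts, axis=0)

# Demo on synthetic frames: the second frame is the first shifted 3 px right.
frame1 = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
frame2 = np.roll(frame1, 3, axis=1)
print(estimate_motion(frame1, frame2))  # roughly [3, 0]
```

Every step here (detect, match, estimate) must finish before the next frame arrives, which is why the whole cycle stretches to tenths of a second.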

Neuromorphic-sensor-based designs give autonomous vehicles ultra-fast reaction times — something current models lack.

To address this limitation, Andrea Censi, a researcher at MIT's Laboratory for Information and Decision Systems, and colleagues have developed a way to supplement cameras with a neuromorphic sensor that takes measurements a million times a second.

Censi and colleagues presented the new algorithm at the International Conference on Robotics and Automation earlier this year. Vehicles running the algorithm can update their location every 0.001 sec, fast enough for nimble maneuvers. "Other cameras have sensors and a clock, so with a 30-frames-per-sec camera, the clock freezes all the values every 33 msec," says Censi, and then the values are read. In contrast, neuromorphic sensors let each pixel act as an independent sensor. "When a change in luminance is larger than a threshold, the pixel … communicates this information as an event and then waits until it sees another change."
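As a rough illustration of that per-pixel rule, here is a toy model in Python. The 15% threshold and the function names are assumptions for the example, not specifications of Censi's sensor:

```python
# Toy model of the event-generation rule Censi describes: each pixel fires
# an event only when its luminance changes by more than a threshold.
# THRESHOLD = 0.15 is an assumed value chosen for illustration.
import numpy as np

THRESHOLD = 0.15  # fractional luminance change that triggers an event

def pixel_events(prev_lum, curr_lum, t_usec):
    """Return (x, y, polarity, timestamp) events for pixels whose relative
    luminance change exceeds THRESHOLD; all other pixels stay silent."""
    change = (curr_lum - prev_lum) / np.maximum(prev_lum, 1e-6)
    ys, xs = np.nonzero(np.abs(change) > THRESHOLD)
    return [(int(x), int(y), int(np.sign(change[y, x])), t_usec)
            for x, y in zip(xs, ys)]

prev = np.full((4, 4), 100.0)
curr = prev.copy()
curr[1, 2] = 130.0  # one pixel brightens by 30% -> one positive event
print(pixel_events(prev, curr, t_usec=42))  # [(2, 1, 1, 42)]
```

Because quiet pixels produce no output at all, the data rate scales with how much the scene changes rather than with frame size, which is what makes microsecond updates practical.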

The algorithm tracks changes in luminance with 1-µsec resolution and supplements camera data with these events, so it doesn't need to identify features. Comparing a scene before and after a change is easier, because even dynamic environments don't change much over a microsecond. Nor does the algorithm match all the features in the previous and current scene at once; instead it generates hypotheses about how far the vehicle moved. Over time, the algorithm uses a statistical construct called a Bingham distribution to pick the hypothesis that's confirmed most often, tracking vehicle orientation more efficiently than other approaches.
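A much-simplified stand-in for that hypothesis-tracking idea appears below: each event votes for the candidate displacements that could explain it, and the most-confirmed hypothesis wins. The actual work maintains a Bingham distribution over orientation; plain vote counting here is an illustrative substitute, not the authors' method, and all names are hypothetical:

```python
# Simplified hypothesis voting: each incoming event votes for every small
# displacement that could map some recent event onto it. The real algorithm
# tracks orientation with a Bingham distribution; a Counter over discrete
# shifts is a deliberately crude approximation of "most-confirmed hypothesis".
from collections import Counter

def update_hypotheses(votes, event_xy, prev_event_xys, max_shift=2):
    """Add one event's votes; tiny per-microsecond motion keeps this cheap."""
    ex, ey = event_xy
    for px, py in prev_event_xys:
        dx, dy = ex - px, ey - py
        if abs(dx) <= max_shift and abs(dy) <= max_shift:
            votes[(dx, dy)] += 1
    return votes

votes = Counter()
prev_events = [(10, 10), (20, 15)]
for ev in [(11, 10), (21, 15), (11, 10)]:  # events consistent with +1 px in x
    update_hypotheses(votes, ev, prev_events)
print(votes.most_common(1))  # [((1, 0), 3)] -> vehicle moved ~1 px right
```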

Recent experiments with a small vehicle fitted with both a camera and an event-based sensor show the algorithm is as accurate as existing state-estimation algorithms. With that demonstrated, Censi says, the next step is to develop controls that decide what to do based on the state estimates.

What's most interesting is that the algorithm is said to work particularly well for making quadrotors with only onboard perception and control nimbler. So maybe it's time for Wallich to perfect his son-walking UAV at last.

About the Author

Elisabeth Eitel

Elisabeth is Senior Editor of Machine Design magazine. She has a B.S. in Mechanical Engineering from Fenn College at Cleveland State University. Over the last decade, Elisabeth has worked as a technical writer — most recently as Chief Editor of Motion System Design magazine.

