
Drones learn to observe environment and steer clear

July 17, 2014
About a year and a half ago, maker-tinkerer and technical writer Paul Wallich created a buzz when he wrote about the drone he built to walk his son to the bus stop each morning. The unmanned aerial vehicle (UAV) took the form of a quadrotor (four-rotor) copter, and did the job ... mostly — but Wallich admitted to having troubles with getting the UAV to power through windy days and avoid collisions.

Well, drone technology advances by leaps and bounds. Now researchers think they have a new way to make these UAVs and other vehicles more nimble in the face of environmental variables. The approach uses neuromorphic sensors, which are triggered by sudden events, instead of the standard inertia-measuring sensors (such as accelerometers and gyroscopes) to track motion.

Even autonomous vehicles with cameras and controls need time to interpret camera data about the environment. Here, state-estimation algorithms first identify image features (usually boundaries between objects, found through shade and color differences) and then select a subset unlikely to change with new perspectives. A few dozen msec later, the cameras fire again and the algorithm attempts to match current features to previous ones. Once the algorithm matches features, it calculates the vehicle's change in position. The sampling takes 50 to 250 msec depending on how dramatically the environment changes, and the whole control cycle to correct course takes 0.2 sec or more — not fast enough to react to sudden changes in a vehicle's surroundings.
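The frame-based pipeline described above can be sketched as a toy example. This is not the researchers' code — the feature sets, nearest-neighbor matching, and translation-only motion model are all simplifying assumptions for illustration:

```python
import numpy as np

def match_features(prev, curr, max_dist=5.0):
    """Match each feature from the previous frame to its nearest
    neighbor in the current frame (a stand-in for real matching)."""
    pairs = []
    for p in prev:
        d = np.linalg.norm(curr - p, axis=1)
        j = int(np.argmin(d))
        if d[j] < max_dist:
            pairs.append((p, curr[j]))
    return pairs

def estimate_motion(pairs):
    """Estimate the apparent shift as the mean feature displacement."""
    return np.mean([c - p for p, c in pairs], axis=0)

# Toy frame pair: every feature shifted by (2, 1) pixels between frames
prev = np.array([[10.0, 10.0], [40.0, 25.0], [70.0, 60.0]])
curr = prev + np.array([2.0, 1.0])
shift = estimate_motion(match_features(prev, curr))
```

Even in this stripped-down form, the structure shows why the cycle is slow: nothing can be estimated until a whole new frame arrives and every feature has been re-matched.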

Neuromorphic-sensor-based designs give autonomous vehicles ultra-fast reaction times — something current models lack.

To address this limitation, researcher Andrea Censi of MIT’s Laboratory for Information and Decision Systems and others have developed a way to supplement cameras with a neuromorphic sensor that takes measurements a million times a second.

Censi and colleagues presented the new algorithm at the International Conference on Robotics and Automation earlier this year. Vehicles running the algorithm can update location every 0.001 sec to make nimble maneuvers. “Other cameras have sensors and a clock, so with a 30-frames-per-sec camera, the clock freezes all the values every 33 msec,” says Censi — and then values are read. In contrast, neuromorphic sensors let each pixel act as an independent sensor. “When a change in luminance is larger than a threshold, the pixel … communicates this information as an event and then waits until it sees another change.”
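The per-pixel behavior Censi describes can be modeled in a few lines. This is a hypothetical sketch, not the sensor's firmware; the threshold value and log-luminance reference are assumptions, though working in log luminance is typical of event-based sensors:

```python
import math

class EventPixel:
    """One pixel of an event-based (neuromorphic) sensor: it fires
    only when log luminance changes by more than a threshold, then
    waits until it sees another change."""
    def __init__(self, threshold=0.1, initial=1.0):
        self.threshold = threshold
        self.ref = math.log(initial)  # luminance at the last event

    def observe(self, luminance, t_us):
        delta = math.log(luminance) - self.ref
        if abs(delta) > self.threshold:
            self.ref = math.log(luminance)
            polarity = 1 if delta > 0 else -1
            return (t_us, polarity)   # event: timestamp + sign of change
        return None                   # no change worth reporting

# Feed the pixel a luminance sample each microsecond; only two
# samples differ enough from the reference to trigger events.
pixel = EventPixel(threshold=0.1)
events = [e for t, lum in enumerate([1.0, 1.01, 1.3, 1.31, 0.9])
          if (e := pixel.observe(lum, t)) is not None]
```

The key contrast with a frame-based camera: there is no global clock freezing every pixel at once, so quiet parts of the scene generate no data at all.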

The algorithm tracks luminance-change events at 1-µsec resolution and uses them to supplement camera data, so it doesn't need to identify features. Comparing before and after a change is easier, because even dynamic environments change little over a µsec. Nor does the algorithm match all the features in the previous and current scenes at once — instead it generates hypotheses about how far the vehicle moved. Then, over time, the algorithm uses a statistical construct called a Bingham distribution to pick the hypothesis that's confirmed most often and track vehicle orientation more efficiently than other approaches.
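The hypothesis-voting idea can be illustrated with a toy stand-in. Censi's algorithm scores orientation hypotheses with a Bingham distribution; here a plain vote counter over candidate 2D shifts substitutes for that machinery, and the event format and candidate list are invented for the example:

```python
from collections import Counter

def most_confirmed_shift(events, candidates):
    """Toy hypothesis voting: each event (a pixel's old and new
    position) votes for every candidate motion consistent with it;
    the motion confirmed most often wins. (The real algorithm
    maintains orientation hypotheses under a Bingham distribution.)"""
    votes = Counter()
    for prev_xy, curr_xy in events:
        for shift in candidates:
            predicted = (prev_xy[0] + shift[0], prev_xy[1] + shift[1])
            if predicted == curr_xy:
                votes[shift] += 1
    return votes.most_common(1)[0][0]

# Three events, all consistent with the vehicle shifting by (1, 0)
events = [((3, 4), (4, 4)), ((7, 2), (8, 2)), ((5, 5), (6, 5))]
best = most_confirmed_shift(events, candidates=[(0, 0), (1, 0), (0, 1)])
```

Because each event arrives and votes independently, the estimate can be refined continuously instead of waiting for a full frame.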

Recent experiments with a small vehicle fitted with a camera and an event-based sensor show the algorithm is as accurate as existing state-estimation algorithms. With that done, Censi says, the next step is to develop controls that decide what to do based on the state estimates.

What's most interesting is that the algorithm is said to work particularly well for making quadrotors that rely solely on onboard perception and control more nimble. So maybe it's time for Wallich to perfect his son-walking UAV at last.

