
Book review: Motion Sensing for Autonomous Systems

Oct. 11, 2013
Motion Vision: Design of Compact Motion Sensing Solutions for Navigation of Autonomous Systems, from the Institution of Engineering and Technology, is now out in hardcover.
After years of work by scores of engineering teams worldwide, the dream of consumer driverless cars could soon become reality. Central to the crash-free operation of such vehicles is the array of motion-vision sensors that helps them avoid other objects.

One book that details such motion-vision technologies, Motion Vision: Design of Compact Motion Sensing Solutions for Navigation of Autonomous Systems, is now available in the U.S. Published by the British Institution of Engineering and Technology, Motion Vision is written for designers working in controls engineering who are looking to incorporate machine vision into their applications.
 
The book outlines the problem of motion estimation from biological, algorithmic, and digital perspectives. (Check out a preview of the first section on Google Books here.) It goes on to describe an algorithm that fits the motion-processing model as well as the hardware and software constraints. The algorithm is based on the optical flow constraint equation and introduces range information to resolve what's called the depth-velocity ambiguity, a capability that's key to autonomous navigation. This section is heavy stuff, but for those who are interested, there's copious information online about the constraint equation and (thanks, Wikipedia) optical flow in general.
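As a rough illustration, using the standard textbook form rather than the book's exact notation, the optical flow constraint equation is

I_x·u + I_y·v + I_t = 0

where I_x, I_y, and I_t are the spatial and temporal brightness derivatives at a pixel and (u, v) is that pixel's apparent image velocity. Image velocity alone can't tell a slow object nearby from a fast one far away, because both can produce the same flow on the sensor; that is the depth-velocity ambiguity, and it's why the authors fold range measurements into the estimate.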

Motion Vision also explains how to implement algorithms in digital hardware, including details related to initial motion-processing models, the hardware platform, and the system's global functional structure.

In Chapter 5, the book gives a thorough review of motion estimation to avoid collisions through the tracking of position and approximate velocity. It's a description of a few technologies already being put to work in other forms in the Google car, Caterpillar's self-driving haulers, and Komatsu's autonomous trucks.
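As a back-of-the-envelope illustration of why that tracking matters (not an example from the book): if a tracked obstacle sits 40 m ahead and successive measurements show the gap closing at 8 m/s, the time to collision is roughly 40 ÷ 8 = 5 s, the kind of figure a navigation controller needs in order to decide whether to brake or steer around.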

Motion Vision ends with a 100-page appendix that details all the circuitry and software of the FPGA-based controls-and-vision platform the authors use as a reference example. The appendix also covers the modular software design, so engineers reading the book can reuse pieces of it in hardware and I/O modules of their own specification.

About the Author

Elisabeth Eitel

Elisabeth is Senior Editor of Machine Design magazine. She has a B.S. in Mechanical Engineering from Fenn College at Cleveland State University. Over the last decade, Elisabeth has worked as a technical writer — most recently as Chief Editor of Motion System Design magazine.
