
Book review: Motion Sensing for Autonomous Systems

Oct. 11, 2013
Motion Vision: Design of Compact Motion Sensing Solutions for Navigation of Autonomous Systems is now out in hardcover.
After years of work by scores of engineering teams worldwide, the dream of consumer driverless cars could soon become reality. Central to the crash-free operation of such vehicles is the array of motion-vision sensors that helps them detect and avoid other objects.

One book that details such motion-vision technologies, Motion Vision: Design of Compact Motion Sensing Solutions for Navigation of Autonomous Systems, is now available in the U.S. Published by the British Institution of Engineering and Technology, Motion Vision is written for designers working in controls engineering and looking to incorporate machine vision into their application.
 
The book outlines the problem of motion estimation from biological, algorithmic, and digital perspectives. (A preview of the first section is available on Google Books.) It goes on to describe an algorithm that fits the motion-processing model along with the hardware and software constraints. This algorithm is based on the optical flow constraint equation and introduces range information to resolve what's called depth-velocity ambiguity, a resolution that's key to autonomous navigation. This section can be heavy stuff, but for those who are interested, there's copious information online about the constraint equation and (thanks, Wikipedia) optical flow in general.
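For reference, the classic optical flow constraint equation (a standard result, not something specific to this book) relates a pixel's brightness derivatives to its apparent motion:

    Ix·u + Iy·v + It = 0

Here Ix, Iy, and It are the image's spatial and temporal intensity derivatives, and (u, v) is the unknown pixel velocity. Because this single equation constrains two unknowns per pixel, and because image motion alone cannot distinguish a near, slow object from a distant, fast one, additional information such as the range data the authors introduce is needed to pin down real-world motion.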

Motion Vision also explains how to implement algorithms in digital hardware, including details related to initial motion-processing models, the hardware platform, and the system's global functional structure.

In Chapter 5, the book gives a thorough review of motion estimation for collision avoidance through the tracking of position and approximate velocity. It's a description of a few technologies already being put to work in alternative forms in the Google car, Caterpillar's self-driving haulers, and Komatsu's autonomous trucks.
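As a rough illustration of the kind of check such systems perform (a generic sketch in Python, not the book's algorithm; the function name, variables, and threshold are invented for the example), a tracked obstacle's relative distance and closing speed can be turned into a time-to-collision estimate:

    # Generic illustration only: estimate time to collision from a tracked
    # object's relative distance and closing speed. Not the book's algorithm;
    # names and the 3-second threshold are hypothetical.
    def time_to_collision(distance_m, closing_speed_mps):
        """Return seconds until impact, or None if the object is not closing."""
        if closing_speed_mps <= 0.0:
            return None  # holding distance or moving away
        return distance_m / closing_speed_mps

    # Example: an obstacle 40 m ahead closing at 8 m/s gives 5 s to react;
    # a planner might brake if that falls below some safety threshold.
    ttc = time_to_collision(40.0, 8.0)
    if ttc is not None and ttc < 3.0:
        print("Brake: collision predicted in %.1f s" % ttc)
    else:
        print("Clear for now: time to collision is", ttc, "s")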

Motion Vision ends with a 100-page appendix that details all the circuitry and software of the FPGA-based controls-and-vision platform the authors use as a reference example. The appendix also documents the modular software design, so engineers reading the book can reuse pieces of it in hardware and I/O modules of their own specification.
