Could Machine Learning Seal the Deal for Lifelike Bionic Limbs?

May 12, 2017
When it comes to fine motor control and natural movement in nerve-controlled prostheses, why not let machine learning do the thinking?

Research over the past ten years has already led to advances in voluntary nerve control over bionic limbs. In 2014, Les Baugh became the first bilateral amputee to gain control of two neuro-prosthetic arms. He underwent nerve-pathway reassignment surgery to redirect signals from his brain's primary motor cortex to a location in his upper back, then spent hours training with simulation software to learn how to control the arms before putting them on.

Breakthroughs like this make people wonder what's next for the technology. Baugh could train his primary motor cortex to control the gross movement of two arms, but the fine motor control demanded by a bionic hand is harder, and it requires complex decoding algorithms to read neuronal signals as intentions to move the prosthesis.

For this reason, many researchers see machine learning as a viable route to fine motor control in bionic hands. Scientists at Newcastle University presented a convolutional neural network (CNN) that uses machine vision to classify objects by the grip type needed to grasp them, and tested the CNN's ability to relay that grip type to a thought-controlled prosthetic limb.
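The Newcastle paper defines its own network architecture; as a rough illustration only, a minimal convolutional grip-type classifier might look like the sketch below (Python/PyTorch). The layer sizes, input resolution, class names, and the GripNet name are assumptions made for the example, not the published design.

    # Minimal sketch of a CNN grip-type classifier (PyTorch).
    # Layer sizes, input resolution, and class names are illustrative
    # assumptions, not the architecture published by the Newcastle team.
    import torch
    import torch.nn as nn

    GRIP_CLASSES = ["pinch", "tripod", "palmar_neutral", "palmar_pronated"]

    class GripNet(nn.Module):
        def __init__(self, num_classes: int = len(GRIP_CLASSES)):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=5, padding=2),   # grayscale snapshot in
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 32x32 -> 16x16
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 128),
                nn.ReLU(),
                nn.Linear(128, num_classes),                  # one logit per grip type
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    # One forward pass on a dummy 64x64 grayscale snapshot.
    model = GripNet()
    logits = model(torch.randn(1, 1, 64, 64))
    predicted_grip = GRIP_CLASSES[logits.argmax(dim=1).item()]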

The team trained their CNN on 500 virtual objects from the free Amsterdam Library of Object Images (ALOI). ALOI represents each object with 80 pictures taken at 5-degree angle increments; these pictures are processed as renderings that can be classified by grip type.

What the network learns about the 3D renderings is stored as weights in the individual nodes of the CNN's layers. When presented with entirely different real-world objects, the CNN could classify grip type with an accuracy of 75%. The team used their results to generate the free Newcastle Library of Grip Types.
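In evaluation terms, that 75% figure reduces to the fraction of held-out objects whose grip type the network predicts correctly. A hedged sketch, where model, test_images, and test_labels are hypothetical placeholders rather than the study's actual data:

    # Hedged sketch: grip-classification accuracy on held-out objects.
    # `model`, `test_images`, and `test_labels` are hypothetical placeholders.
    import torch

    @torch.no_grad()
    def grip_accuracy(model: torch.nn.Module,
                      test_images: torch.Tensor,
                      test_labels: torch.Tensor) -> float:
        model.eval()
        predictions = model(test_images).argmax(dim=1)
        return (predictions == test_labels).float().mean().item()

    # A result like 0.75 would match the accuracy reported above.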

The team also tested the CNN's ability to integrate with a myoelectric prosthesis, using a stereo-lens camera for depth perception. Trans-radial amputees evaluated the prosthesis in a clinical setting on objects spanning four grip classes. A small wrist movement, initiated through voluntary nerve commands, activated the stereo-lens camera; after the CNN classified the grip from the snapshot, the hand could move into the necessary position and pick up the object.
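Reduced to control logic, that loop is: wait for a muscle-signal trigger, take one snapshot, classify the grip, and preshape the hand. In the sketch below, read_emg, capture_snapshot, and preshape_hand are hypothetical stand-ins for the prosthesis and camera interfaces, and the trigger threshold is an assumed value.

    # Hedged sketch of the EMG-triggered grasp pipeline described above.
    # read_emg(), capture_snapshot(), and preshape_hand() are hypothetical
    # stand-ins for the study's prosthesis and camera interfaces.
    import torch

    EMG_TRIGGER_THRESHOLD = 0.5  # assumed normalized activation level

    def grasp_cycle(model, read_emg, capture_snapshot, preshape_hand):
        # 1. Wait for a voluntary wrist command above the trigger threshold.
        while read_emg() < EMG_TRIGGER_THRESHOLD:
            pass
        # 2. Take a single snapshot of the target object.
        image = capture_snapshot()  # e.g. a (1, 1, 64, 64) tensor
        # 3. Classify the required grip type from the snapshot.
        with torch.no_grad():
            grip_id = model(image).argmax(dim=1).item()
        # 4. Preshape the hand; the user completes the grasp voluntarily.
        preshape_hand(grip_id)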

With more research, deep neural networks could further improve the natural movement of prostheses. Paired with tools besides machine vision, machine learning could augment voluntary control in movements like walking and jumping.

Read the full methods and results of this experiment in the Journal of Neural Engineering.

Read more about the bionic arms used by Les Baugh with engineering resources provided by Johns Hopkins Applied Physics Laboratory.
