Control and functionality of prosthetics continue to make technological strides, creating more natural movements for amputees. One area that’s showing promise is neuroprosthetics, so much so that a recent breakthrough at the Johns Hopkins Advanced Physics Lab (APL) left even the research scientists conducting the study in shock: A bilateral amputee could control both arms just by imagining the desired motions. After rehab and training, Les Baugh, who lost both his arms in an electrical accident, was able to simultaneously control two prosthetic arms to fulfill everyday tasks.
It represents one of the latest advances stemming from the Revolutionizing Prosthetics 2009 program, funded by the U.S. Defense Advanced Research Projects Agency (DARPA), which aims to develop advanced neuroprosthetics that restore function, control, comfort, appearance, and feeling to amputees.
Scientists at APL developed an advanced prosthesis with the 27 degrees of freedom present in a natural arm, from the shoulder to the fingertips. As required by DARPA, the prosthesis is controlled naturally, with voluntary neural signals from the brain’s control centers. The arm is modular, making it reproducible and adaptable to all levels of arm amputation. APL teams collaborated with teams around the country to develop the sensors, software, electromechanical actuation, accurate neural-prosthetic interface, and more needed to build the FDA-approved prosthesis for clinical trials. Five years later, the prosthesis has proven successful.
For Baugh, clinical testing of the arm prototype began with targeted muscle re-innervation (TMR) surgery. In surgery, nerve bundles that once transmitted voluntary motion signals from the brain to muscles in the limb were relocated to innervate the pectoral muscles. The reassigned nerves send voluntary motion signals to the pectoral muscles, generating electric potentials in the pectorals that can be read with electrodes on the skin surface. These muscular electric potentials are called myoelectric signals.
Before clinical testing, the APL neural team determined which myoelectric signals are generated when a person envisions or intends a desired motion. When a person thinks about performing a motion, the signal transmits from the brain, through the nerves, and to the muscle, generating a myoelectric potential at a characteristic frequency and amplitude for that movement. After nerve reassignment, these motion signals arrive at the pectorals instead, and the resulting myoelectric potentials are picked up by the prosthetic’s electrodes placed on the chest.
Scientists and software engineers created pattern-recognition algorithms that map the frequency, amplitude, and location of myoelectric signals to the desired movement of the prosthesis. They then incorporated the algorithms into a Virtual Integration Environment (VIE), a realistic virtual training program that translates the amputee’s mental cues into the intended motions of a virtual limb.
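As a rough illustration of this kind of pattern recognition (not APL’s actual algorithm), the sketch below extracts a simple amplitude feature and a frequency proxy from each electrode channel, then matches the result against per-motion templates. The feature choices and all names here are assumptions for illustration only.

```python
import math

def mav(window):
    """Mean absolute value: a standard amplitude feature for EMG signals."""
    return sum(abs(x) for x in window) / len(window)

def zero_crossings(window):
    """Zero-crossing count: a crude stand-in for dominant frequency."""
    return sum(1 for a, b in zip(window, window[1:]) if a * b < 0)

def features(channels):
    """One (amplitude, frequency) feature pair per electrode channel."""
    return [f for w in channels for f in (mav(w), zero_crossings(w))]

def classify(feat, templates):
    """Nearest-centroid match against stored per-motion feature templates."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda motion: dist(feat, templates[motion]))
```

In use, `templates` would hold averaged feature vectors recorded while the wearer imagines each motion; `classify(features(channels), templates)` then returns the best-matching motion label for a new window of signal.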
After recovering from surgery, Baugh began training with the virtual prosthetic in the VIE. The VIE leveraged MATLAB to provide a graphical user interface, Simulink for real-time modeling and simulation, Delta3D’s Open Dynamics Engine for real-time collision detection, and MSMS to render a 3D virtual world. VIE training prepared Baugh for easier control when he tried the real prosthetic.
While Baugh completed his virtual training, the researchers fitted Baugh’s chest, back, and shoulders with a plastic jacket containing a socket for the new prostheses and electrodes for measuring myoelectric signals at the chest. The jacket would evenly distribute the weight of the prostheses, allow the intended degrees of freedom for the prosthetic shoulder, and align electrodes at the neural-prosthetic interface to read the myoelectric signals generated in the pectoral muscles. After nerve reassignment surgery, VIE training, and fitting, Baugh put on the jacket, inserted the prosthesis, and tried it out. He was able to pick up objects and move them.
In Phase II of the Revolutionizing Prosthetics 2009 program, researchers at Johns Hopkins designed the neuroprosthetic arm used in the clinical trial. Researchers at APL found that a person requires about 50 Wh of energy for daily upper-limb activity. A removable, rechargeable lithium-polymer battery, chosen for its high energy density, powers the prosthetic arm. The arm uses lightweight carbon fiber and titanium to reduce weight, and the gear train takes advantage of a friction planetary reduction stage to decrease noise and provide the required strength. Torque output can reach as high as 60 N-m in the upper arm.
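The 50 Wh figure makes it easy to sanity-check battery sizing. A minimal sketch, assuming hypothetical pack capacities (the source states only the daily energy budget):

```python
import math

# APL's finding: roughly 50 Wh covers a day of upper-limb activity.
DAILY_ENERGY_WH = 50.0

def runtime_hours(pack_wh, avg_power_w):
    """Hours of use from a pack of given energy at a given average draw."""
    return pack_wh / avg_power_w

def packs_per_day(pack_wh):
    """Swappable packs needed to cover the 50 Wh daily budget."""
    return math.ceil(DAILY_ENERGY_WH / pack_wh)
```

For instance, spread over a 16-hour waking day, the 50 Wh budget works out to an average draw of only about 3 W, which is why a single compact lithium-polymer pack is plausible for all-day wear.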
The Phase II upper limb uses electromechanical actuation with drives at nearly every joint. Electromechanical actuation was chosen because its parts are compact, safely insulated, modular, and readily available on the market, which helped the team stay within the 24 months allotted for Phase II design. Modular parts were manufactured by HDT Robotics, with components manufactured by Micro Waterjet. To eliminate messy wiring and improve the compactness of the arm, electronic and mechanical interfaces coexist at the joints. For instance, local small motor controllers (SMCs), drives, and motors are all built into the joints at the hand and wrist.
The prosthetic hand posed actuation-design challenges for APL researchers because it requires high torque to grasp objects but has limited room for components. DARPA required that all actuators in the hand have a common drivetrain to increase reliability and decrease weight and bulk. The development team chose an intrinsic, locally acting actuation system, rather than putting components in the forearm, to make the prosthetic available to trans-radial amputees (those with an amputation through the forearm, below the elbow).
The joints that connect the fingers to the palm (MCP joints) each have two drives, one for flexion/extension (finger curling) and another for abduction/adduction (finger spreading). Three-stage planetary drives were used for finger spreading and cycloidal drives for finger curling. Miniature brushless DC (BLDC) motors power the drives. Planetary and cycloidal transmissions were chosen because of their small packaging and high gear ratio for torque actuation. All finger joints have one drive for each degree of motion (DOM) in the joint and an embedded SMC for position and velocity control.
The thumb has four DOMs and uses three-stage planetary drives for actuation. That’s because they’re smaller than the cycloidal drives, yet have a similar gear ratio for high torque.
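The torque math behind these multi-stage drives is straightforward: stage ratios multiply, and joint torque scales with the total ratio minus drivetrain losses. The sketch below uses hypothetical per-stage ratios and efficiency, since the source does not publish the actual values:

```python
def overall_ratio(stage_ratios):
    """A multi-stage transmission's ratio is the product of its stages."""
    total = 1.0
    for r in stage_ratios:
        total *= r
    return total

def joint_torque(motor_torque_nm, stage_ratios, efficiency=0.8):
    """Output torque = motor torque x total gear ratio x efficiency."""
    return motor_torque_nm * overall_ratio(stage_ratios) * efficiency
```

Under these assumptions, a tiny BLDC producing 0.01 N-m behind a 64:1 three-stage planetary (4:1 per stage) at 80% efficiency would deliver roughly 0.5 N-m at the joint, which is how such small motors can produce a useful grasp.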
For precise motor control of the hand, SMCs are embedded at each joint. They all communicate directly with the main circuit board’s limb controller over an RS-485 communications bus. In contrast, large motor controllers (LMCs) each control more than one joint, including the four upper-arm joints and the three wrist joints. The LMCs are responsible for BLDC commutation, sensor signaling, and communication with the main circuit board’s limb controller. LMCs also monitor local joint sensors for temperature, torque, position, and current, including rotor position for motor commutation.
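The position and velocity control these embedded controllers perform can be sketched as a simple proportional-derivative loop on a toy one-joint model. The gains, time step, and acceleration-command model below are all assumptions for illustration, not the SMC firmware’s actual control law:

```python
def pd_command(target, pos, vel, kp=5.0, kd=2.0):
    """Proportional-derivative law: drive toward target, damp velocity."""
    return kp * (target - pos) - kd * vel

def settle(target, pos=0.0, vel=0.0, dt=0.01, steps=500):
    """Toy 1-DOF joint: treat the command as acceleration and integrate."""
    for _ in range(steps):
        acc = pd_command(target, pos, vel)
        vel += acc * dt   # semi-implicit Euler keeps the loop stable
        pos += vel * dt
    return pos
```

Each real SMC runs a loop like this locally at its joint, which is why only high-level position targets, not raw motor commands, need to travel over the shared RS-485 bus.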
While sensors on the hand and arm were used for control by the LMCs and SMCs, force sensors were also implemented to create the illusion of touch. When a sensor on the prosthetic came in contact with an object, a small vibrator positioned over the reinnervated pectoral nerves activated, creating the illusion of touch. The team is still developing cosmesis methods to make the arm look more natural. So far, the function seems to be down pat.