Randy Montoya
Sandia National Laboratories researchers (from left) Steve Verzi, William Severa, Brad Aimone and Craig Vineyard hold different versions of neuromorphic hardware platforms.

Spiking Software Streamlines AI on Neural Networks

March 7, 2019
The Whetstone software tool makes artificial intelligence algorithms more efficient, letting them work on smaller, less-power-hungry hardware, a major benefit for smartphones and self-driving cars.

Artificial neurons are essentially capacitors: they absorb and sum electrical charges, then release them in tiny bursts of electricity. Computer chips called "neuromorphic systems" assemble these neurons into large networks that mimic the human brain by sending electrical stimuli to neurons that fire in no predictable order. This contrasts with the lock-step procedure of most conventional computers, which follow pre-set electronic processes.
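The capacitor analogy can be made concrete with a minimal integrate-and-fire neuron. This is an illustrative sketch, not code from Whetstone or any particular neuromorphic chip; the threshold and leak values are arbitrary.

```python
class IntegrateAndFireNeuron:
    """Toy integrate-and-fire neuron: charge accumulates like a capacitor
    until a threshold is crossed, then the neuron spikes and resets."""

    def __init__(self, threshold=1.0, leak=0.05):
        self.threshold = threshold  # charge needed to fire
        self.leak = leak            # charge lost per time step
        self.charge = 0.0           # accumulated "capacitor" charge

    def step(self, input_current):
        """Absorb one input sample; return True if the neuron spikes."""
        self.charge = max(0.0, self.charge - self.leak) + input_current
        if self.charge >= self.threshold:
            self.charge = 0.0       # release the burst and reset
            return True
        return False
```

Feeding such a neuron a steady stream of small inputs produces occasional spikes rather than a continuous output, which is why a network of them fires in no predictable order.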

Because of their haphazard firing, neuromorphic systems are often slower than conventional computers, but they also require far less energy to operate. They also demand a different approach to programming; otherwise, their artificial neurons fire too often or too seldom, which has hindered efforts to commercialize them.

To solve this problem, computer engineers use spiking tools that let artificial neurons release energy in spikes, much like human neurons do. Researchers at Sandia National Laboratories developed a spiking tool they call Whetstone. It acts as supplemental computer code for conventional software training programs. Whetstone trains and sharpens artificial neurons so that they spike only when a sufficient amount of energy (data) has been collected. This training has improved standard neural networks and is being evaluated for use with neuromorphic systems, which have usually been trained in ad hoc ways rather than with a standardized method.
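The "sharpening" idea can be illustrated with an activation function whose transition region narrows over the course of training until it becomes a hard 0/1 spike. This is a hedged sketch of the general technique, not the actual Whetstone API; the parameterization below is an assumption for illustration.

```python
import numpy as np

def sharpened_activation(x, sharpness):
    """Bounded linear ramp that narrows toward a 0/1 step.

    At sharpness=0.0 the activation is a wide ramp over [0, 1], so
    gradients flow normally during training. At sharpness=1.0 it is a
    hard threshold at 0.5 -- a spiking yes/no decision. Gradually
    raising `sharpness` during training is the sketch of the idea.
    """
    x = np.asarray(x, dtype=float)
    width = 1.0 - sharpness              # width of the transition region
    if width == 0.0:
        return (x >= 0.5).astype(float)  # fully sharpened: pure spike
    lo = 0.5 - width / 2.0
    return np.clip((x - lo) / width, 0.0, 1.0)
```

A training loop would start at `sharpness=0.0` and step it toward `1.0`, so the network learns with smooth gradients but ends up producing the binary spikes neuromorphic hardware expects.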

Whetstone can be visualized as a way to control a class of talkative students tasked with identifying an object on their teacher's desk. Prior to Whetstone, the students sent a continuous stream of sensor input to their overwhelmed teacher, who had to listen to all of it before passing a decision into the neural system. Processing this huge amount of information requires a lot of computing power, which in turn demands more electrical power or offloading to cloud computing. Both options add time and cost to commercial AI products, reduce security and privacy, and make their acceptance less likely.

Under Whetstone, the newly strict teacher pays attention only to a simple "yes" or "no" from each student who raises a hand with a solution, rather than to everything they are saying. Suppose, for example, the goal is to determine whether a piece of green fruit on the desk is an apple. Each student is a sensor that may respond to a different quality of what makes up an apple: Does it have the right smell, taste, texture, and so on? Although a student looking for red may vote "no," another student looking for green would vote "yes." When the number of answers, either yea or nay, is electrically high enough to trigger the neuron's capacity to fire, that simple result, instead of endless waffling, enters the overall neural system.
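The classroom analogy reduces to counting binary votes against a firing threshold. The sketch below is purely illustrative; the feature names and threshold are invented for the example, not drawn from Whetstone.

```python
def neuron_fires(votes, threshold):
    """Return True when the count of yes-votes reaches the firing threshold."""
    return sum(votes) >= threshold

# Each "student" checks one quality of the fruit on the desk and
# casts a binary vote (1 = yes, 0 = no). Names are hypothetical.
votes = {
    "is_green": 1,
    "is_red": 0,
    "is_round": 1,
    "smells_like_apple": 1,
}
```

With a threshold of 3, `neuron_fires(votes.values(), 3)` fires; only the tally enters the network, not each student's running commentary.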

Although Whetstone's simplifications could potentially increase errors, the sheer number of participating neurons, often more than a million, provides enough information to statistically make up for the inaccuracies introduced by data simplification.

Whetstone works best when patched into programs meant to train new artificial intelligence equipment. That's because Whetstone doesn't have to overcome patterns already learned with established energy minimums.

Whetstone has been shown to let neural computer networks process information up to 100 times more efficiently than the current industry standard, say the Sandia researchers who developed it.

It also greatly reduces the amount of circuitry needed to perform autonomous tasks. This should help AI become more popular and useful for mobile phones, self-driving cars, and automated interpretation of images.

The largest AI companies have developed spiking tools for their own products, but none are as fast or efficient as Whetstone, according to the Sandia researchers. Those tools also usually work only with their makers' own hardware and software. Whetstone, in contrast, works on many neural platforms.

