
Robotic Decoy Designed to Foil Attacking Hackers

April 9, 2018
Georgia Tech researchers have turned the tables on hackers who have designs on factory robots.

Cybersecurity experts at the Georgia Institute of Technology have assembled a new tool in the fight against hackers: a decoy robot. They built “HoneyBot” to lure hackers into thinking they had taken control of a factory robot, but instead the robot gathers valuable information about them and helps the business protect itself from future attacks.

The four-wheeled robot is smaller than a breadbox. It can be monitored and controlled through the internet. But unlike other remote-controlled robots, the HoneyBot’s special ability is tricking its operators into thinking it is performing one task, when it’s actually doing something completely different.

“The idea behind a honeypot is that you don’t want the attackers to know they’re in a honeypot,” said Raheem Beyah, the Georgia Tech professor who led the research. “If the attacker is smart and looking out for honeypots, maybe they’d look at different sensors on the robot, such as an accelerometer or speedometer, to verify the robot is doing what it had been instructed. That’s why we will be spoofing that information. The sensors will indicate the robot accelerated from point A to point B.”
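
To make the spoofing idea concrete, here is a minimal, hypothetical sketch (in Python) of a honeypot controller that executes only harmless commands while returning fabricated position, speed, and acceleration readings for everything else. The command names, telemetry fields, and simple numbers are assumptions made for illustration; this is not the researchers' HoneyBot code.

    # Illustrative sketch only -- not the Georgia Tech HoneyBot implementation.
    # SAFE_COMMANDS, the telemetry fields, and the kinematics are assumptions.
    import time
    import random

    SAFE_COMMANDS = {"forward", "backward", "turn_left", "turn_right"}

    class HoneypotController:
        def __init__(self):
            self.position = 0.0   # metres travelled, as reported to the operator
            self.attack_log = []  # record of every command received

        def handle_command(self, command, distance=1.0):
            """Log the command; execute it only if it is harmless,
            otherwise fabricate telemetry saying it was executed."""
            self.attack_log.append((time.time(), command, distance))
            if command in SAFE_COMMANDS:
                # A real robot would drive its motors here.
                self.position += distance
                return self._telemetry(noise=0.0)
            # Unsafe command: the robot stays still but reports movement anyway.
            self.position += distance
            return self._telemetry(noise=0.05)

        def _telemetry(self, noise):
            # Report readings consistent with accelerating from point A to point B,
            # with a little noise so spoofed data does not look suspiciously clean.
            return {"position_m": self.position,
                    "speed_mps": 0.5 + random.uniform(-noise, noise),
                    "accel_mps2": 0.1 + random.uniform(-noise, noise)}

    if __name__ == "__main__":
        bot = HoneypotController()
        print(bot.handle_command("forward"))     # genuinely executed
        print(bot.handle_command("open_valve"))  # spoofed, but looks the same to the attacker
        print(bot.attack_log)                    # evidence gathered for defenders

In this sketch the attacker sees identical telemetry whether or not the robot actually moved, while every command is logged for the defenders.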

The Georgia Tech decoy robot is designed to trick hackers into thinking they have taken control of a factory robot, but the machine is actually gathering data on the hackers.

In a factory setting, such a robot could sit motionless in a corner, springing to life when a hacker gains access and serving as a visual indicator that a malicious actor is targeting the facility.

Rather than letting hackers run amok in the factory, the robot could follow harmless commands such as picking up objects, while stopping far short of doing anything dangerous.

So far, the researchers’ technique seems to be working. In experiments to test how convincing the false sensor data is to individuals remotely controlling the device, volunteers used a virtual interface to guide the robot through a maze; they could not see what was really happening. To entice volunteers to break the rules, the maze presented forbidden “shortcuts” that would let them finish it faster.

In the real maze back in the lab, no shortcut existed; if participants opted to take it, the robot instead remained still. Meanwhile, the volunteers, who had now unwittingly become hackers for the experiment, were fed simulated sensor data indicating the robot had passed through the shortcut and continued along.

Researchers wanted them to think the robot had taken the shortcut.

In surveys after the experiment, participants who actually controlled the device and those who were unwittingly fed simulated data about the fake shortcut rated the data as believable at similar rates.

