Meet Cassie, the first bipedal robot using machine learning to control a running gait on outdoor terrain. In fact, Cassie completed a 5K course (3.1 miles) on Oregon State University’s campus in just over 53 minutes.
Oregon State and Agility Robotics created Cassie and used a deep reinforcement learning algorithm to “teach” the robot how to run, a complex movement that demands flexibility and balance. Running comes naturally to most animals, but robots cannot acquire it without help.
Cassie was also able to run at an even higher speed in a controlled environment.
“Deep reinforcement learning is a powerful method in AI that opens up skills like running, skipping and walking up and down stairs,” said Yesh Godse, an undergraduate in OSU’s Dynamic Robotics Laboratory.
“The Dynamic Robotics Laboratory students in the OSU College of Engineering combined expertise from biomechanics and existing robot control approaches with new machine learning tools,” said OSU robotics professor Jonathan Hurst, in a university press release. “This type of holistic approach will enable animal-like levels of performance. It’s incredibly exciting.”
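The trial-and-error flavor of reinforcement learning described above can be illustrated with a toy sketch. This is not the OSU team's code: their work used deep reinforcement learning with neural-network policies on real hardware. The hypothetical "balance" task, the linear policy `a = w * s`, and the simple hill-climbing search below are all stand-in simplifications meant only to show the core loop of proposing a behavior, measuring its reward, and keeping what works.

```python
# Hypothetical toy example: learn a policy parameter w that drives a
# 1-D "balance" state toward zero. A stand-in for (much more complex)
# deep RL gait training; nothing here comes from the OSU project.
import numpy as np

rng = np.random.default_rng(0)

def episode_return(w, horizon=20):
    """Roll out a deterministic toy task with the linear policy a = w*s.
    Reward each step is -s**2, so staying balanced near 0 scores best."""
    s, total = 1.0, 0.0
    for _ in range(horizon):
        a = w * s                # policy: action proportional to state
        s = s + 0.1 * a          # simple dynamics: action nudges the state
        total += -s ** 2         # penalize distance from balance point
    return total

def train(iters=100, noise=0.5):
    """Hill-climbing policy search: perturb w, keep the perturbation
    only if the episode return improves. This mirrors the RL idea of
    reinforcing behaviors that earn higher reward."""
    w = 0.0                      # untrained policy: do nothing
    best = episode_return(w)
    for _ in range(iters):
        cand = w + noise * rng.normal()
        r = episode_return(cand)
        if r > best:             # accept only improvements
            w, best = cand, r
    return w, best

w, best = train()
```

By construction the search never accepts a worse policy, so the trained return is at least as good as the untrained one; a real deep RL system replaces the single parameter `w` with millions of neural-network weights and the toy dynamics with a physics simulator or the robot itself.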
In the press release, Hurst noted that walking robots would one day be a common sight—much like the automobile, and with a similar impact.
Within the 53-minute run, the team counted six and a half minutes of reset time after two falls: one caused by an overheated computer, the other by commanding Cassie to turn at too high a speed.
“Cassie is a very efficient robot because of how it has been designed and built, and we were really able to reach the limits of the hardware and show what it can do,” said Jeremy Dao, a Ph.D. student in the lab.
Cassie was developed under the direction of Jonathan Hurst with a 16-month, $1 million grant from the Defense Advanced Research Projects Agency (DARPA). The National Science Foundation has been funding the students’ efforts to explore machine learning with Cassie.