Research aimed at teaching robots to "see" as humans and animals do may soon make it possible to bag speeding motorists, track enemy planes, and help safeguard the nation's borders and vital resources without any chance of being detected. The technique could render obsolete today's "fuzz busters" and some military countermeasures. That's because, instead of painting a target with radar waves or laser beams, one or more cameras capture an image or series of images of the target, and special software integrates those images to interpret the scene.
"All it needs to do is view the object moving. The computer figures out the rest," says Warren Dixon, a Univ. of Florida assistant professor of mechanical and aerospace engineering. "We're trying to use both regular and infrared cameras so the system works at night or in adverse weather."
Many challenges remain. One is how to make a computer extract 3D information from the 2D images recorded by a video or still camera. People and animals perceive depth because their brains combine the snapshots taken by each eye. Two cameras can likewise produce stereo vision, but a computer can make sense of the images only when it knows the exact position of each camera relative to the target. Part of Dixon's research is developing the underlying mathematics and software to sidestep that spatial restriction.
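The stereo geometry at issue can be sketched with the classic disparity relation: once the cameras' separation (the baseline) and focal length are known, depth follows directly from how far a point shifts between the two views. The function and numbers below are illustrative assumptions, not part of Dixon's system.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by a rectified stereo camera pair.

    Standard pinhole relation Z = f * B / d, where f is the focal
    length in pixels, B the baseline (camera separation) in meters,
    and d the horizontal disparity of the point between the images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity_px

# A point shifted 35 px between views of a rig with a 10 cm baseline
# and a 700 px focal length sits 2 m away.
print(depth_from_disparity(700.0, 0.10, 35.0))  # → 2.0
```

This is exactly why the relative camera positions matter: an error in the baseline B scales every depth estimate by the same factor, which is the restriction Dixon's mathematics aims to relax.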
Police in moving or parked squad cars could use the computer-camera systems to bust speeders. A target must be within the camera's line of sight, and the power of the camera lens determines range. Such a system does not yet exist, but any video camera with the right software could be used, researchers say, adding that a prototype could be built within a year.
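In its simplest form, a camera-based speed estimate works by tracking how far a vehicle's image moves between frames, converting pixels to meters using the ground-plane scale at the vehicle's distance, and dividing by the frame interval. The function name and values below are hypothetical, included only to make the principle concrete.

```python
def speed_kmh(displacement_px: float, meters_per_px: float, fps: float) -> float:
    """Estimate vehicle speed from frame-to-frame image motion.

    displacement_px  -- how far the vehicle's image moved between frames
    meters_per_px    -- ground-plane scale at the vehicle's distance
    fps              -- camera frame rate (frames per second)
    """
    meters_per_second = displacement_px * meters_per_px * fps
    return meters_per_second * 3.6  # convert m/s to km/h

# A car whose image shifts 10 px per frame at 30 fps, at a distance
# where one pixel spans 0.05 m, is moving 15 m/s, i.e. 54 km/h.
print(speed_kmh(10.0, 0.05, 30.0))  # → 54.0
```

Note how lens power enters: a longer lens keeps meters_per_px small at greater range, which is why the article ties range to the camera lens.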
Autonomous robots fitted with the technology could help secure warehouses and shopping centers, continuously monitor borders, protect strategic assets, and even harvest orange groves. Soldiers could mount the cameras on tiny airborne drones and set them to look for and report movement of enemy forces. A goal of the five-year project is to make drones fly without the assistance of a remote human operator, navigating instead using signals from onboard cameras. Funding for the project comes from Eglin Air Force Base and the Air Force Office of Scientific Research.