Aptima develops neuroscience-inspired sense-making system for military robots

29 April 2013 (Last Updated April 29th, 2013 18:30)

A new knowledge-based, collaborative sense-making system has been developed by Aptima to boost intelligence and decision capabilities of the US military's unmanned platforms during combat operations.



Developed under a contract with the Defense Sciences Office of the US Defense Advanced Research Projects Agency (DARPA) and the US Army Research Laboratory, the Cognitive Patterns prototype draws inspiration from neuroscience to help robots better recognise, adapt to, and intelligently work with human operators in a variety of situations.

The ROS-compliant technology features three approaches, the first of which combines the robot's high-level knowledge with lower-level sensor data to help the platform recognise a situation on its own as far as possible, much as its human counterpart does.

When exposed to ambiguous information, or to scenarios that fail to match its existing knowledge, the system blends current concepts to generate new knowledge for the robot, similar to the mechanisms the human brain uses for visual perception.

The robot's high-level knowledge and ability to draw conclusions from its sensory data are then enhanced by a human operator through adjustments in the way it categorises objects, people, and environments.
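The three approaches described above can be illustrated with a simple Bayesian fusion sketch: top-down knowledge acts as a prior, bottom-up sensor evidence as a likelihood, and a human operator can adjust the categories. The class names, probabilities, and function are illustrative assumptions, not Aptima's actual design.

```python
# Hypothetical sketch: fusing top-down knowledge (prior) with bottom-up
# sensor evidence (likelihood), loosely modelled on the article's
# description of Cognitive Patterns. All names and values are illustrative.

def fuse(prior, likelihood):
    """Combine top-down priors with bottom-up sensor likelihoods (Bayes' rule)."""
    posterior = {c: prior[c] * likelihood.get(c, 0.0) for c in prior}
    total = sum(posterior.values())
    if total == 0:
        return None  # sensor data matches nothing the robot knows
    return {c: p / total for c, p in posterior.items()}

# Robot's high-level knowledge: how likely each object class is in context.
# A human operator could adjust these weights to re-tune categorisation.
prior = {"vehicle": 0.5, "person": 0.3, "debris": 0.2}

# Bottom-up sensor evidence for the current detection
likelihood = {"vehicle": 0.1, "person": 0.7, "debris": 0.2}

posterior = fuse(prior, likelihood)
best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 3))  # → person 0.7
```

A `None` result corresponds to the "images hitting its camera don't match what's in its brain" case the article describes, which is where the system would generate new knowledge or consult the operator.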

Aptima Cognitive Patterns contract principal investigator Webb Stacy said robots have traditionally been designed to work from the bottom up, so they are incapable of recognising new things despite being fitted with advanced sensors, which severely limits their usefulness.

"If the images hitting its camera don't match what's in its brain, they're unable to understand what would be clear to us, which requires lots of hand-holding."

"If the images hitting its camera don't match what's in its brain, they're unable to understand what would be clear to us, which requires lots of hand-holding," Stacy said.

By combining top-down and bottom-up processing, the system generates a rich situation model, shared between robot and human, that neither could have produced alone.

In addition, the robot requires interaction with an operator only when something unusual or unexpected occurs and when receiving mission orders.
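The escalation policy described above can be sketched as a simple confidence check: the robot proceeds autonomously while its situation model is confident, and queries the operator only when evidence is ambiguous or unmatched. The threshold value is an assumption for illustration, not a figure from the article.

```python
# Hypothetical sketch of anomaly-triggered operator interaction: act
# autonomously when one interpretation clearly dominates, ask a human
# otherwise. The cut-off is an illustrative assumption.

CONFIDENCE_THRESHOLD = 0.6  # assumed value, not from the article

def needs_operator(posterior):
    """Return True when no category is confident enough to act on alone."""
    if posterior is None:        # nothing in the knowledge base matched
        return True
    return max(posterior.values()) < CONFIDENCE_THRESHOLD

print(needs_operator({"vehicle": 0.9, "person": 0.1}))               # clear case
print(needs_operator({"vehicle": 0.4, "person": 0.35, "debris": 0.25}))  # ambiguous
```

Under this policy, routine detections never interrupt the operator; only unexpected or ambiguous situations, and incoming mission orders, involve a human.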

Working as an automated system, the prototype is expected to lead to a new class of robots with higher-level decision-making capabilities, reduced pre-mission preparation costs and less need for human intervention.


Image: US Army engineers deploy a robot to search for improvised explosive devices (IEDs) during a training session at the National Training Center in California, US. Photo: courtesy of Spc Ryan Hallock.
