NASA has upgraded its Mars Exploration Rover Opportunity with advanced autonomous capabilities.
The update allows Opportunity’s computer to examine images taken with an on-board wide-angle navigation camera and recognize rocks that meet specified criteria, such as rounded shapes or light colors. The rover can then center its narrower-angle panoramic camera on the selected target and snap multiple images through color filters.
The new system is called Autonomous Exploration for Gathering Increased Science, or AEGIS. “Without it, follow-up observations depend on first transmitting the post-drive navigation camera images to Earth for ground operators to check for targets of interest to examine on a later day,” explained Tara Estlin of NASA’s Jet Propulsion Laboratory.
“Because of time and data-volume constraints, the rover team may opt to drive the rover again before potential targets are identified or before examining targets that aren’t highest priority.”
According to Estlin, the first images taken by a Mars rover choosing its own target were of a rock approximately the size of a football, tan in color and layered in texture.
“Opportunity pointed its panoramic camera at this unnamed rock after analyzing a wider-angle photo taken by the rover’s navigation camera at the end of a drive on March 4,” said Estlin.
“[The computer then] decided that this particular rock, out of more than 50 in the navigation camera photo, best met the criteria that researchers had set for a target of interest: large and dark.”
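The article does not describe AEGIS’s actual image-analysis algorithm, but the selection step it reports, choosing the one rock out of more than 50 that best matches “large and dark,” can be illustrated with a toy sketch. Everything below (the field names, the weights, the scoring formula) is invented for illustration and is not NASA’s implementation:

```python
# Toy illustration only: AEGIS's real onboard analysis is far more
# sophisticated. This sketch scores hypothetical rock detections by
# size (larger is better) and darkness (lower brightness is better),
# mirroring the "large and dark" criteria described in the article.

def select_target(rocks, size_weight=0.5, darkness_weight=0.5):
    """Return the detection that best matches 'large and dark'.

    Each rock is a dict with invented fields: 'size' (pixel area)
    and 'brightness' (0-255 mean intensity).
    """
    max_size = max(r["size"] for r in rocks)

    def score(rock):
        size_score = rock["size"] / max_size          # 1.0 for the largest rock
        darkness_score = 1.0 - rock["brightness"] / 255.0  # 1.0 for pure black
        return size_weight * size_score + darkness_weight * darkness_score

    return max(rocks, key=score)

# Three hypothetical detections from one navigation-camera frame:
rocks = [
    {"name": "A", "size": 120, "brightness": 200},
    {"name": "B", "size": 300, "brightness": 60},   # large and dark
    {"name": "C", "size": 280, "brightness": 180},
]
print(select_target(rocks)["name"])  # → B
```

In this sketch the criteria are combined into a single weighted score; researchers could, in principle, re-weight it to prefer, say, light-colored or rounded rocks instead, which matches the article’s point that the target criteria are set by the team on the ground.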
Estlin also noted that Opportunity had pinpointed “exactly the target” NASA wanted it to find.
“This checkout went just as we had planned, thanks to many people’s work, but it’s still amazing to see Opportunity performing a new autonomous activity after more than six years on Mars.
“We spent years developing this capability on research rovers in the Mars Yard here at JPL. Six years ago, we never expected that we would get a chance to use it on Opportunity.”