NASA is inviting the public to explore Mars using images taken by the Curiosity rover and to help label rocks and other surface features of the Red Planet. The AI4Mars project is designed to improve the artificial intelligence algorithm that will help future Martian rovers navigate the planet's surface.
This could prevent Mars rovers from getting stuck, as happened to NASA's Spirit rover in 2010, which led the space agency to abandon the mission after seven years of service.
The project was designed by a team at NASA's Jet Propulsion Laboratory (JPL) and is hosted on the citizen science website Zooniverse, where volunteers label terrain features in thousands of pictures the Curiosity rover has taken.
NASA's AI4Mars Project
Volunteers who sign up on the AI4Mars project page on Zooniverse can look through thousands of images the rover has taken on Mars. They are then guided through a training exercise that helps them identify different types of surface features.
Afterward, they are shown a series of randomized images on which they can draw outlines around each feature, using one color for rocks, another for sand pits, and so on.
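Each of those hand-drawn outlines becomes a labeled training example. As a purely illustrative sketch in Python (the class, field names, and format here are hypothetical, not the actual Zooniverse or AI4Mars data structure), one annotation could be thought of as a polygon paired with a terrain label:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical label set, loosely following the terrain categories the
# project asks volunteers to mark (described later in the article).
TERRAIN_LABELS = ("sand", "consolidated_soil", "bedrock", "big_rock")

@dataclass
class TerrainAnnotation:
    """One volunteer-drawn outline on a single rover image (illustrative only)."""
    image_id: str                   # identifier of the Curiosity image
    polygon: List[Tuple[int, int]]  # outline vertices in pixel coordinates
    label: str                      # one of TERRAIN_LABELS

    def __post_init__(self):
        if self.label not in TERRAIN_LABELS:
            raise ValueError(f"unknown terrain label: {self.label}")

# Example: a volunteer outlines a patch of sand in one image.
example = TerrainAnnotation(
    image_id="curiosity_navcam_0001",
    polygon=[(120, 340), (180, 330), (200, 400), (130, 410)],
    label="sand",
)
```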
Hiro Ono, an AI researcher at JPL, said that training a deep learning algorithm typically requires hundreds to thousands of examples. The algorithms behind self-driving cars, for instance, are trained on numerous images of roads, signs, traffic lights, pedestrians, and other vehicles.
"Other public datasets for deep learning contain people, animals, and buildings - but no Martian landscapes," Ono said.
Terrain matters for getting around on Mars, as NASA learned when Spirit got stuck in a sand pit and its mission ended after seven years of exploring the planet. The Opportunity and Curiosity rovers have also gotten stuck in the past, but both were able to continue their missions.
That prompted efforts to improve the artificial intelligence algorithm the rovers use to identify rocks and other surface features. The team believes the best way to do that is to have humans go through thousands of images and label them.
Training Curiosity's Artificial Intelligence Algorithm
The Zooniverse database lets users draw boundaries around terrain features and choose one of four labels: sand, consolidated soil, bedrock, or big rocks. These labels are key to sharpening the Martian terrain-classification algorithm called Soil Property and Object Classification (SPOC).
Once it is fully up to speed, SPOC will be able to automatically distinguish between cohesive soil, high rocks, flat bedrock, and dangerous sand dunes, sending those labels along with the images back to Earth to make it easier to plan the rover's next moves.
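To give a rough sense of what "training a classifier on labeled terrain images" involves, here is a minimal Python sketch using PyTorch. It is not the actual SPOC system, which is far more sophisticated; the network, data, and four-class label set are stand-ins chosen only to mirror the categories described above.

```python
import torch
import torch.nn as nn

# Four terrain classes mentioned in the article; purely illustrative.
CLASSES = ["sand", "consolidated_soil", "bedrock", "big_rock"]

class TinyTerrainNet(nn.Module):
    """A small convolutional network mapping an image patch to a terrain class."""
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def train_step(model, optimizer, patches, labels):
    """One supervised update on a batch of (image patch, terrain label) pairs."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(patches), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = TinyTerrainNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Stand-in data: in practice the patches would come from Curiosity images
    # and the labels from the volunteer annotations gathered through AI4Mars.
    patches = torch.randn(8, 1, 64, 64)            # batch of grayscale patches
    labels = torch.randint(0, len(CLASSES), (8,))  # volunteer-provided labels
    print("loss:", train_step(model, optimizer, patches, labels))
```

The point of the sketch is simply that each volunteer label becomes one more supervised example, which is why the project needs so many people going through so many images.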
The scientists hope that the algorithm can become accurate enough to do other useful tasks, such as predicting how likely a rover's wheels are to slip on different surfaces.
Planning a single drive can take four to five hours and requires multiple people to write and review hundreds of lines of code, as well as collaborate with scientists. Geologists assess the terrain to predict whether Curiosity's wheels will slip, be damaged by sharp rocks, or get stuck in a sand dune.
Since Curiosity's high-gain antenna needs a clear line of sight to Earth to receive commands, planners also consider which way the rover will be pointed at the end of a drive, and they anticipate the shadows that will fall across the terrain along the way.