Launched last year for the Curiosity rover, AI4Mars will now allow members of the public to label features of scientific interest - such as objects obstructing the rover's path - in imagery taken by Perseverance, NASA says.
“Equipped with 19 cameras, Perseverance sends anywhere from dozens to hundreds of images to Earth each day for scientists and engineers to comb through for specific geological features,” the agency said.
“But time is tight: After those images travel millions of miles from Mars to Earth, the team members have a matter of hours to develop the next set of instructions, based on what they see in those images, to send to Perseverance.”
The goal is to create an algorithm that could help future rovers “pick out needles from the haystack” of data sent from Mars and navigate the terrain more easily.
When the project was first revealed for Curiosity, participants labelled nearly half a million images, using a tool to outline features – such as sand and rock – for NASA’s Jet Propulsion Laboratory (JPL) to look out for.
The labelled terrain images then helped the lab send the rover a more thorough set of instructions, making it less likely to get stuck.
The end result was an algorithm called Soil Property and Object Classification (SPOC), which could identify features correctly 98 per cent of the time, NASA said.
“It’s not possible for any one scientist to look at all the downlinked images with scrutiny in such a short amount of time, every single day,” said Vivian Sun, a JPL scientist.
“It would save us time if there was an algorithm that could say, ‘I think I saw rock veins or nodules over here,’ and then the science team can look at those areas with more detail.”
NASA’s Spirit rover, which was active for six years from 2004, got stuck in Martian terrain and fell silent in early 2010.
For over a year NASA tried to restore contact until, in May 2011, the agency declared it would cease attempts. By then, the rover had far exceeded its intended 90-day lifespan.
NASA said it is working on machine learning, similar to that used by self-driving cars on Earth, to enable Perseverance to identify dangerous terrain on its own.
JPL’s Annie Didier, who worked on the Perseverance version of AI4Mars, said this could serve several purposes in the future.
“With this algorithm, the rover could automatically select science targets to drive to,” she said.
The rover could also store numerous images onboard and send back to Earth only those of scientific interest, NASA said.
Hiro Ono, the JPL AI researcher who led the development of AI4Mars, said making the dataset publicly available allows the entire data science community to benefit.
“If someone outside JPL creates an algorithm that works better than ours using our dataset, that’s great, too,” he said. “It just makes it easier to make more discoveries.”
So far, almost 12,000 volunteers have joined the AI4Mars Perseverance project.
Bella Richards is a journalist who has written for several local newspapers, her university newspaper and a tech magazine, and completed her Bachelor of Communications (Journalism) at the University of Technology Sydney in 2020. She joined Momentum Media in 2021, and has since written breaking news stories across Space Connect, Australian Aviation and World of Aviation.
You can email Bella on: [email protected]