NASA's Curiosity rover captures stunning views on Mars that no human has ever seen in person. Its cameras survey the Red Planet's lonely, dry landscapes and send the images back to Earth for the mission team to analyze.
However, Curiosity's snapshots are not just a way for us to see what another world looks like; they also carry important data that informs future science. The rover's two main cameras have fixed focal lengths with no zoom: the left carries a 34mm lens, the right a 100mm telephoto lens.
How Curiosity Perceives Mars
The Mastcam, which includes both of these lenses, captures color images and video that researchers on Earth can merge into larger mosaics. In early March, NASA released a stunning panorama composed of 1.8 billion pixels' worth of photographs captured over the Thanksgiving holiday.
The data travels to Earth by radio, received by NASA's Deep Space Network of antennas located around the world. Image processing experts at the agency's Jet Propulsion Laboratory (JPL) then retrieve the images and process them with tools descended from the first digital image processing software, which JPL created in 1966.
One of those image processing specialists is Halle Gengl, whose job is to work out the orientation of the images, which arrive from Curiosity in strips of 100 lines of data, and stitch them together. The photographs are then sent to the rover team's researchers and published online alongside the raw images, she said.
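The stitching step Gengl describes can be illustrated with a minimal sketch. To be clear, this is a hypothetical reconstruction for illustration only: the strip-based downlink format, the sequence numbering, and the NumPy-based assembly below are assumptions, not JPL's actual pipeline.

```python
import numpy as np

STRIP_HEIGHT = 100  # each downlinked strip carries 100 lines of image data (per the article)

def assemble_image(strips):
    """Stack downlinked strips, in order, into one full image.

    `strips` is a list of (sequence_number, 2-D pixel array) pairs; strips
    may arrive out of order, so we sort by sequence number before stacking.
    """
    ordered = sorted(strips, key=lambda pair: pair[0])
    return np.vstack([data for _, data in ordered])

# Example: three 100-line strips of a 300x200 grayscale image, received out of order.
strips = [
    (2, np.full((STRIP_HEIGHT, 200), 30, dtype=np.uint8)),
    (0, np.full((STRIP_HEIGHT, 200), 10, dtype=np.uint8)),
    (1, np.full((STRIP_HEIGHT, 200), 20, dtype=np.uint8)),
]
image = assemble_image(strips)
print(image.shape)  # (300, 200)
```

The real pipeline must also handle dropped or corrupted strips and geometric alignment between camera pointings, which this sketch leaves out.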
Without the images snapped by Mastcam and the black-and-white Navcams, the rover would simply sit still on Mars, because it is not an autonomous probe. Instead, teams on Earth program its movements, and without images, controllers would have no way to tell Curiosity where to head next.
Those images have enabled the rover to travel about 13 miles across the surface of the Red Planet since touching down in 2012. At the moment, Curiosity is climbing a steep hill and studying it to learn more about the planet's geologic history.
New Images Come In Every Day
Curiosity landed in Gale Crater, a vast, dry ancient lake bed with a 16,404-foot mountain at its center. The crater may have been filled with streams and lakes billions of years ago, which is why NASA chose it as the probe's landing site in 2012.
“Every day, I feel like I’m vicariously living through robots on Mars. Especially for rovers, because we’re exploring a new location we’ve never been to. Seeing the images coming down, I can’t help but think, ‘I’m one of the first people to see this data of this location on Mars,’” Gengl said.
Every day for the last seven years, Curiosity's researchers have started their day by analyzing the latest images the rover managed to snap on Mars. The mission's broad science team includes 500 researchers from around the world, 40 percent of whom are based outside the United States. They help make decisions about the data collected by Curiosity's ten instruments.
“That data set provides our first look at what’s really there,” said Ashwin Vasavada, the mission's project scientist.
Curiosity captures new images every day, extends its robotic arm to place its instruments every few days, and drills for sample material every month, Vasavada said. With rovers and landers on Mars, as well as orbiters overhead, we can expect images from the planet's surface for a long time to come.