MORGANTOWN — West Virginia University scientists have developed a way for extraplanetary rovers to use nonvisual information to maneuver over treacherous terrain. The research aims to prevent losses like that of the Mars Exploration Rover Spirit, which stopped communicating after its wheels became trapped in invisibly shifting sands in 2010.
Space roboticist Cagri Kilic, a Statler College of Engineering postdoctoral research fellow in the Department of Mechanical and Aerospace Engineering at the WVU Navigation Laboratory, led the research on preventing slips and stumbles in planetary rovers, which will be featured in a Field Robotics paper he coauthored with aerospace engineering associate professors Yu Gu and Jason Gross.
Supported by funding from NASA’s Established Program to Stimulate Competitive Research, Kilic, Gu and Gross have found a way to help a rover feel its way forward, using only its existing sensors, when visual data is not available or reliable.
Darkness and extreme brightness can both make it hard for rovers to depend on visual data for navigation, but Kilic’s work also helps rovers in situations where the physical terrain itself is difficult to read by visual inspection: steep slopes, loose debris, layered sands, soft soil or salt flats like those of Jupiter’s moon Europa.
Many of those terrain features can be found at the burnt-coal ash piles in Point Marion, Pennsylvania, where Kilic’s team tests their software on WVU’s Pathfinder rover.
“The area was actually found when we were doing some tests for the Mars Society’s University Rover Challenge,” he said. “As soon as I saw the environment, I wanted to look at the chemical composition of the area because it was looking like Mars.”
In Point Marion, Kilic’s team puts Pathfinder, a lightweight, small-scale test rover, through its paces, testing algorithms that allow it to adjust its course or speed, for example, based on the information it gets from onboard instruments like accelerometers, gyroscopes, magnetometers and odometers, rather than on what it can detect through its camera lens. Those instruments tell Kilic’s software about orientation, velocity and position, helping the rover and the engineers who guide it understand and respond to the environment.
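The kind of proprioceptive navigation described above can be illustrated with a minimal dead-reckoning sketch: the rover advances its own position and heading estimate using only wheel odometry and a gyroscope, with no camera input. This is an illustrative simplification, not the team's actual software; the function name and the flat-ground, planar-pose assumptions are the author's own.

```python
import math

def dead_reckon(pose, wheel_distance, gyro_yaw_rate, dt):
    """Advance an (x, y, heading) pose estimate using only proprioceptive
    sensors: wheel odometry for distance traveled and a gyroscope for
    turning rate. No visual data is involved."""
    x, y, heading = pose
    heading += gyro_yaw_rate * dt            # integrate angular rate
    x += wheel_distance * math.cos(heading)  # project distance onto x
    y += wheel_distance * math.sin(heading)  # project distance onto y
    return (x, y, heading)

# Drive "forward" 1 m per step while the gyro reports no rotation.
pose = (0.0, 0.0, 0.0)
for _ in range(5):
    pose = dead_reckon(pose, 1.0, 0.0, 0.1)
print(pose)  # (5.0, 0.0, 0.0)
```

A real system would fuse accelerometer and magnetometer readings as well and track uncertainty, but the core idea is the same: the rover's internal sensors alone are enough to maintain an estimate of where it is and where it is pointed.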
“Mars rovers can understand if there is an obstacle in front of them,” Kilic said. “They can detect wheel slippage by using their cameras, they can tell if a wheel is spinning on a rock and so on. And they can adjust their navigation by changing their path, changing individual wheel speeds or stopping to wait for the command from the engineers on Earth.”
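The wheel-slip check Kilic describes can be done without cameras by comparing how far the wheel encoders say the rover traveled against how far the inertial sensors say the body actually moved. The sketch below is a hypothetical illustration of that comparison; the function, its 15% threshold and the example numbers are the author's assumptions, not values from the paper.

```python
def detect_slip(wheel_distance, inertial_distance, threshold=0.15):
    """Flag wheel slip when the distance reported by wheel encoders
    disagrees with the distance estimated from inertial sensing by
    more than a relative threshold (a hypothetical 15% here)."""
    if inertial_distance == 0.0:
        # Body did not move: any reported wheel travel is pure spin.
        return wheel_distance > threshold
    slip_ratio = abs(wheel_distance - inertial_distance) / inertial_distance
    return slip_ratio > threshold

# Wheels turned through 2.0 m of circumference, body moved only 1.2 m.
print(detect_slip(2.0, 1.2))   # True: likely spinning in loose material
print(detect_slip(1.0, 0.98))  # False: small mismatch, normal traction
```

When such a check trips, the responses the article lists follow naturally: change the planned path, change individual wheel speeds, or stop and wait for commands from engineers on Earth.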
Kilic stressed that when visual data is available, the rovers’ current visual navigation system is “almost perfect – 99% success rate. The problem is that it can only work when there are sufficient features in the environment.” The sameness of a landscape is what gives a rover trouble when it’s relying on sight to get around.
According to Kilic, it’s “homogeneous, visually-low feature environments similar to deserts, ocean or tundra on our planet” that are a problem for rovers not just on Mars, but also on Earth’s moon and potentially on Europa, where the presence of ice has excited scientific speculation about habitability. Kilic said he tried to make the technology “as general as possible for use in any robot on any extraterrestrial body.”
Wherever a rover can go in our solar system, Kilic’s algorithms can help protect it against a fall or entrapment.
“Of course, the software needs to be tuned for a particular rover, adjusting to its wheel dimensions, its inertial measurement unit characteristics, but it does not need any additional sensors,” he said.
Still, Kilic’s research specifically aims to benefit the rovers that are currently exploring Mars: Curiosity, Perseverance and Zhurong. Mars is Kilic’s priority because “Martian soil is exceptionally challenging for traversability. Even throughout a single drive, Mars rovers traverse on various terrains with different slopes.”
To realize that goal, Kilic will now conduct additional tests with different rovers. His method already achieves slip detection accuracy of more than 92% over distances of around 150 meters and consumes fewer computational resources than visual navigation, enabling rovers running Kilic’s software to travel faster and stop less often than when they rely on visual signals.
Although the research still has some distance to travel, Kilic said the results to date “show us that we” – and the rovers – “are on the right path.”