Research advances autonomous off-road navigation

By U.S. Army DEVCOM Army Research Laboratory Public Affairs, February 2, 2021

Army researchers developed RELLIS-3D, a novel dataset that provides the resources needed to develop more advanced algorithms and new research directions to ultimately enhance autonomous navigation in off-road environments. Shown here is the Warthog at an autonomous vehicle demo on one of RELLIS' off-road testing areas. (Photo Credit: Courtesy Texas A&M Engineering Experiment Station and George H.W. Bush Combat Development Complex)

ADELPHI, Md. -- Soldiers navigate a wide variety of terrains to successfully complete their missions. As human/agent teaming and artificial intelligence advance, the same flexibility will be required of robots to maneuver across diverse terrain and become effective combat teammates.

Army researchers developed a novel dataset that provides the resources needed to develop more advanced algorithms and new research directions to ultimately enhance autonomous navigation in off-road environments.

Existing autonomy datasets either represent urban environments or lack multimodal off-road data. Army researchers have changed the game by providing the data and research needed for autonomous agents to traverse more rugged environments, with characteristics similar to those possibly found on the battlefield, in ways never achieved before.

Phil Osteen (Photo Credit: David McNally)
Dr. Maggie Wigness (Photo Credit: Jhi Scott)

Researchers Phil Osteen and Dr. Maggie Wigness from the U.S. Army Combat Capabilities Development Command, now known as DEVCOM, Army Research Laboratory, in collaboration with the Texas A&M Engineering Experiment Station and George H.W. Bush Combat Development Complex, developed RELLIS-3D, an off-road dataset captured from a Clearpath Robotics Warthog platform.

This research is conducted in support of the lab’s Artificial Intelligence for Maneuver and Mobility, or AIMM, Essential Research Program.

According to the researchers, autonomous navigation systems that rely solely on LiDAR lack higher-level semantic understanding of the environment that could make path planning and navigation more efficient.

“Urban environments provide many structural cues, such as lane markings, cross-walks, traffic signals, roadway signs and bike lanes, that can be used to make navigation decisions, and these decisions often need to be in alignment with the rules of the road,” Wigness said. “These structured markings are designed to cue the navigation system to look for something such as a pedestrian in a crosswalk, or to adhere to a specification such as stopping at a red light.”

In off-road environments, she said, this is not necessarily the case, which can cause a much more significant class imbalance than those seen in urban datasets.

Off-road datasets contain non-asphalt surfaces that present completely different traversability characteristics than roadways, Osteen said. For example, a vehicle might have to decide between navigating through grass, sand or mud, any of which could present challenges for autonomous maneuver. Tall grass and short grass also differ greatly in traversability, with tall grass potentially hiding unseen obstacles, holes or other hazards.

Additionally, while roadway driving is not universally flat, for the most part roads are relatively smooth and at least locally flat. Off-road driving can contain uneven hills and sudden drops, and there are a larger number of potentially dangerous conditions (a small nearby cliff, loose rock, etc.) that could cause unsafe decisions from a navigation planner.

“These sudden and constant changes in terrain require an off-road navigation system to constantly re-evaluate its planning and control parameters,” Wigness said. “This is essential to ensure safe, reliable and consistent navigation.”

RELLIS-3D presents a new opportunity for off-road navigation. According to Osteen, the dataset can provide benefits in at least three ways.

First, he said, it provides data that can be used to evaluate navigation algorithms including SLAM in challenging off-road conditions, such as climbing and then descending a large hill.

Second, it provides annotated LiDAR and camera data that can be used to train machine learning networks to better understand the terrain conditions in front of a robot, which can facilitate improved path planning based on semantic information.

Finally, Osteen noted, the combination of annotated data from active LiDAR and passive camera sensors may lead to algorithms that are trained with both sensor types, but are ultimately deployed in a passive-only mission.
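The second benefit above, using semantic annotations to inform path planning, can be sketched in a few lines. The class names and cost values below are purely illustrative assumptions, not taken from the RELLIS-3D label set; the idea is simply that a planner can convert per-cell semantic labels into traversal costs, with unknown classes treated conservatively and impassable classes marked as infinite cost.

```python
# Hypothetical sketch: turning semantic terrain labels into a traversal
# costmap. Class names and cost values are illustrative only.

# Higher cost = harder to traverse; None marks a hard obstacle.
CLASS_COSTS = {
    "asphalt": 1.0,
    "grass": 2.0,
    "mud": 5.0,
    "bush": 8.0,
    "tree": None,  # impassable
}

def semantic_costmap(label_grid):
    """Map a 2D grid of semantic class names to traversal costs.

    Unknown classes get a conservative high cost; impassable
    classes become float('inf') so a planner avoids them entirely.
    """
    costmap = []
    for row in label_grid:
        cost_row = []
        for label in row:
            cost = CLASS_COSTS.get(label, 10.0)  # unknown -> conservative
            cost_row.append(float("inf") if cost is None else cost)
        costmap.append(cost_row)
    return costmap

grid = [["grass", "grass", "tree"],
        ["mud",   "grass", "asphalt"]]
print(semantic_costmap(grid))
```

A real system would compute these labels per LiDAR point or image pixel with a network trained on annotated data, then accumulate them into a gridded costmap for the planner.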

For Osteen and Wigness, now is the right time to be exploring this research in support of the Army and our Soldiers, as well as first responders in disaster relief scenarios.

“Private companies are racing to develop driverless cars, which they see as an enormous future market,” Osteen said. “Therefore, they have invested significant resources into ensuring that their driverless cars have the best autonomy in the industry. A large part of this autonomy development involves creating and using datasets for training perception algorithms. This is what we are doing with RELLIS-3D, with the distinct goal of improving the state-of-the-art in off-road autonomy.”

According to Wigness, the laboratory has been developing these unstructured dataset resources for the research community for the last couple of years, and RELLIS-3D is the next step.

The Robot Unstructured Ground Driving, or RUGD, dataset was publicly released in 2019, Wigness said. RUGD comprises annotations for imagery only, but was designed to provide the resources needed to get the larger academic and industry research community thinking about perception for autonomous navigation in off-road and unstructured environments.

RELLIS-3D takes the next step by publicly providing an entire sensor suite of data for off-road navigation, she said. This allows researchers to leverage multimodal data for algorithm development and can be used to evaluate mapping, planning and control in addition to perception components.

Wigness also noted that algorithm generalization across domains is still an open research area.

“On-road autonomous navigation and perception algorithms have shown great promise, but there is still so much potential in their performance,” Wigness said. “Addressing this and creating more generalizable solutions will allow us to deploy autonomous navigation into other environments and scenarios, e.g., disaster relief.”

The researchers have also seen interest and investment from other organizations, including DARPA, which recognize the value of developing these datasets for improving autonomous systems performance.

“Releasing off-road datasets that represent the complex off-road environments that the military and first responders will face is necessary to engage the greater research community, much in the way KITTI did for the on-road domain,” said Dr. Stuart Young, program manager in the DARPA Tactical Technology Office. “Statistical learning approaches require massive amounts of data to achieve the success that they have, and sufficient data in the off-road domain is critical to applying these approaches. Additionally, autonomous robotic behaviors will require robots to do more than just navigate off-road; they must also maneuver in a manner that uses the terrain for some behaviors.”

Young said the ARL dataset is a great initial step in understanding the type and volume of data that must be collected and in what environments.

“DARPA is working with ARL to better understand the detailed parameters associated with off-road navigation data collection, such that future data collections for the Army and DOD will be as efficient and useful as possible in all ground domains,” he said.

Moving forward, the researchers plan to develop and test new systems to learn from multiple distinct datasets simultaneously, such as KITTI and RUGD; use the synchronized multimodal data to enhance some of their research on costmap generation for navigation; research developing algorithms that are better able to rapidly re-train and improve performance in never-before-seen environments; and research fusing data from both camera and LiDAR to leverage the strengths of each modality.
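One ingredient of the camera-LiDAR fusion mentioned above is geometric: projecting 3D LiDAR points into the camera image so each point can pick up a semantic label from the image. The sketch below shows that standard pinhole projection; the intrinsic matrix and extrinsic transform are made-up example values, not RELLIS-3D calibration parameters.

```python
import numpy as np

# Illustrative sketch of one camera-LiDAR fusion step: project 3D LiDAR
# points into a camera image. Calibration values below are invented.

def project_points(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into pixel coordinates.

    points_lidar : (N, 3) points in the LiDAR frame
    T_cam_lidar  : (4, 4) rigid transform, LiDAR frame -> camera frame
    K            : (3, 3) pinhole camera intrinsic matrix
    Returns (M, 2) pixel coordinates for points in front of the camera.
    """
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])  # (N, 4) homogeneous
    cam = (T_cam_lidar @ homog.T).T[:, :3]              # points in camera frame
    cam = cam[cam[:, 2] > 0]                            # keep points ahead of camera
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]                     # perspective divide

# Toy example: identity extrinsics and a simple pinhole intrinsic matrix.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
T = np.eye(4)
pts = np.array([[0.0, 0.0, 10.0],   # straight ahead -> image center (320, 240)
                [1.0, 0.0, 10.0]])
print(project_points(pts, T, K))
```

With points mapped to pixels, a semantic label predicted in the image can be attached to each LiDAR point, which is one common way to combine the camera's rich appearance information with the LiDAR's precise geometry.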

RELLIS-3D is publicly released and available online.

Visit the laboratory's Media Center to discover more Army science and technology stories.

DEVCOM Army Research Laboratory is an element of the U.S. Army Combat Capabilities Development Command. As the Army’s corporate research laboratory, ARL is operationalizing science to achieve transformational overmatch. Through collaboration across the command’s core technical competencies, DEVCOM leads in the discovery, development and delivery of the technology-based capabilities required to make Soldiers more successful at winning the nation’s wars and come home safely. DEVCOM is a major subordinate command of the Army Futures Command.