PLAYA VISTA, Calif. - Imagine future American warfighters, in the midst of a mission, leveraging technology to maintain a new level of situational awareness. This may be possible thanks to a new suite of software tools that tap into what a Soldier or sailor sees and feels.
U.S. Army researchers developed a suite of tools under a decade-long research program that focused on how brain function and eye tracking can be used to predict situational awareness.
Researchers developed software that exploits gaze and physiological data to provide real-time estimates of human situational awareness, drawing on a systematic collection of measurements through what they call the lab streaming layer, or LSL. This data-collection ecosystem addresses the analytic difficulties that arise when combining information from different types of sensors.
It also synchronizes data from a suite of sensors that monitor eye movements, breathing patterns and other physiological responses during experiments designed to mimic realistic mission events.
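LSL is distributed as open-source software with bindings in several programming languages. The Python sketch below, using the pylsl bindings, illustrates the general pattern: one process publishes a sensor stream, and any other process can discover it and pull samples stamped on a shared clock. The stream name, channel layout and sample values are illustrative assumptions, not TACK's actual configuration.

```python
# Minimal sketch of LSL-style sensor synchronization with the open-source
# pylsl bindings. Stream names, channel layout and values are illustrative
# assumptions, not the TACK setup.
import threading
import time

from pylsl import StreamInfo, StreamOutlet, StreamInlet, resolve_byprop


def mock_eye_tracker():
    """Publish fake pupil-diameter samples on an LSL outlet at roughly 60 Hz."""
    info = StreamInfo(name="MockEyeTracker", type="Gaze",
                      channel_count=1, nominal_srate=60,
                      channel_format="float32", source_id="mock-gaze-001")
    outlet = StreamOutlet(info)
    for i in range(300):                       # about 5 seconds of data
        pupil_mm = 3.0 + 0.5 * (i % 10) / 10   # placeholder pupil size, in mm
        outlet.push_sample([pupil_mm])         # timestamped on the LSL clock
        time.sleep(1 / 60)


threading.Thread(target=mock_eye_tracker, daemon=True).start()

# Any consumer (a logger, a real-time state estimator, another lab's rig)
# can discover the stream and pull samples with shared-clock timestamps.
streams = resolve_byprop("type", "Gaze", timeout=5)
inlet = StreamInlet(streams[0])
offset = inlet.time_correction()               # maps sender clock to local clock

for _ in range(10):
    sample, timestamp = inlet.pull_sample(timeout=2.0)
    if sample is not None:
        print(f"pupil={sample[0]:.2f} mm at t={timestamp + offset:.4f} s")
```

Because every stream is stamped on the same clock, eye tracking, respiration, heart and brain signals recorded by different devices can be aligned after the fact without custom glue code for each sensor.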
Researchers use the software to quantify, predict and enhance squad-level shared situational awareness with Tactical Awareness via Collective Knowledge, or TACK.
“When we use the TACK software tools, we can know exactly when and what someone looked at and the physiological changes happening concurrently, including what their pupil size was, as well as signals from heart, brain and many other sensors,” said Dr. Russell Cohen Hoffing, a research scientist supporting TACK who works at the U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory’s western regional site in California.
Cohen Hoffing said he extensively relies on TACK tools and LSL to do data collection and analysis. He’s bringing together DEVCOM ARL colleagues with researchers from the U.S. Army Aeromedical Research Laboratory and the Naval Research Laboratory to find synergy and collaborate on experiments for multi-domain operations.
“The ability to integrate USAARL’s realistic helicopter pilot simulations with TACK’s dismounted environment, which can incorporate multiple humans in virtual or augmented reality scenarios alongside intelligent agents, is currently not possible but would be necessary to do virtual experimentation around a multi-domain Army-relevant scenario,” Cohen Hoffing said. “We could simulate helicopter pilots dropping off a dismounted team.”
Researchers developed LSL as part of the lab’s Cognition & Neuroergonomics Collaborative Technology Alliance, the Army’s flagship basic science research and technology transition program in the neurosciences. LSL is a multi-aspect data acquisition and synchronization software backbone that has been adopted by an industry partner, Neurobehavioral Systems, Inc., for integration into their commercial stimulus presentation tool.
LSL has also become a key integration and synchronization technology for a number of laboratory projects, including large-scale research efforts supported by Army-wide programs designed to address expected challenges within multi-domain operations. A growing number of academic labs around the world use LSL to create a unified ecosystem for human sensing, Cohen Hoffing said.
Dr. Jonathan Touryan, Army researcher and collaborative alliance manager of this decade-long research team, now leads TACK, which aims to improve warfighters’ situational awareness in teaming contexts that involve both Soldiers and intelligent agents like autonomous aerial and robotic systems.
“Obtaining and maintaining situational awareness in complex, dynamic environments is a critical component to ensuring force protection and mission success,” Touryan said. “Maintaining situational awareness is everyone’s responsibility.”
Army and Navy researchers are focusing their efforts on determining what to do with the data once it has been collected from the sensors.
“Without meaningful analysis of pupil size, for example, it is just a number of millimeters at any given time point,” Cohen Hoffing said.
Researchers at USAARL and NRL are beginning to integrate LSL into their research pipeline because it offers an easy method to synchronize sensors in a standardized format that is shareable, he said.
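LSL recordings are commonly saved in the XDF file format, which gives collaborating labs a standard, shareable artifact to analyze. The sketch below, assuming a hypothetical XDF recording and the open-source pyxdf reader, shows the kind of analysis Cohen Hoffing describes: converting raw pupil diameter into a baseline-corrected change around task events so the numbers carry meaning.

```python
# Sketch of a post-hoc analysis on a shared LSL recording: turning raw pupil
# diameter into an interpretable, baseline-corrected signal. The file name and
# stream names below are hypothetical.
import numpy as np
import pyxdf

streams, header = pyxdf.load_xdf("session01.xdf")    # hypothetical recording


def find_stream(streams, name):
    return next(s for s in streams if s["info"]["name"][0] == name)


gaze = find_stream(streams, "MockEyeTracker")        # pupil-diameter channel
events = find_stream(streams, "TaskMarkers")         # task event markers

pupil = np.asarray(gaze["time_series"])[:, 0]        # pupil diameter, mm
t = np.asarray(gaze["time_stamps"])                  # shared LSL clock, seconds

# For each task event, express pupil size as the change from a 1-second
# pre-event baseline: dilation relative to baseline is far more interpretable
# than a raw millimeter value at an arbitrary time point.
for marker, onset in zip(events["time_series"], events["time_stamps"]):
    baseline = pupil[(t >= onset - 1.0) & (t < onset)].mean()
    response = pupil[(t >= onset) & (t < onset + 2.0)].mean()
    print(f"{marker[0]}: pupil change {response - baseline:+.3f} mm")
```

Because the gaze samples and event markers share the same clock, the baseline and response windows can be cut directly from the timestamps without any per-device alignment step.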
Researchers at DEVCOM ARL used physiological sensors such as electroencephalography, or EEG, which measures the brain’s electrical activity, to build a human-interest detector. They also plan to create ways to estimate other states relevant to situational awareness, such as cognitive load and exploration versus exploitation.
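As a purely illustrative sketch of how such a detector might be assembled, and not DEVCOM ARL's actual method, the example below cuts fixation-locked EEG epochs using the shared LSL timestamps and extracts a simple amplitude feature that a standard classifier could use to estimate interest; the sampling rate, time windows and classifier choice are all assumptions.

```python
# Illustrative only: one simple way to turn synchronized EEG and gaze data
# into features for an interest classifier. Sampling rate, windows and the
# classifier are assumptions, not the lab's actual detector.
import numpy as np

FS = 256  # assumed EEG sampling rate, Hz


def fixation_epochs(eeg, eeg_times, fixation_onsets, length=0.6):
    """Cut EEG epochs starting at each fixation onset, using shared LSL timestamps.

    eeg: array of shape (n_samples, n_channels); eeg_times: matching timestamps.
    """
    n = int(length * FS)
    epochs = []
    for onset in fixation_onsets:
        start = np.searchsorted(eeg_times, onset)
        if start + n <= len(eeg_times):
            epochs.append(eeg[start:start + n])
    return np.stack(epochs)          # shape: (n_fixations, n_samples, n_channels)


def interest_features(epochs):
    """Mean amplitude 200-500 ms after fixation onset, per channel,
    a crude fixation-related-potential feature."""
    return epochs[:, int(0.2 * FS):int(0.5 * FS), :].mean(axis=1)


# With labels from the experiment (did the fixated object matter to the task?),
# a standard classifier could estimate "interest" for new fixations, e.g.:
#   from sklearn.linear_model import LogisticRegression
#   X = interest_features(fixation_epochs(eeg, eeg_times, onsets))
#   clf = LogisticRegression().fit(X, labels)
#   interest_probability = clf.predict_proba(X_new)[:, 1]
```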
“This new research efficiency will allow laboratories to move away from previous efforts spent on making custom software to synchronize other sensors,” Cohen Hoffing said. “Relying on LSL will allow them to focus on running experiments that aim to understand and interpret the sensor data and infer human states.”
Visit the laboratory's Media Center to discover more Army science and technology stories.
DEVCOM Army Research Laboratory is an element of the U.S. Army Combat Capabilities Development Command. As the Army’s corporate research laboratory, ARL is operationalizing science to achieve transformational overmatch. Through collaboration across the command’s core technical competencies, DEVCOM leads in the discovery, development and delivery of the technology-based capabilities required to make Soldiers more successful at winning the nation’s wars and coming home safely. DEVCOM is a major subordinate command of the Army Futures Command.