USARIEM partners to explore using virtual humans to measure cognitive performance

By Ms. Mallory Roussel (Natick), June 18, 2018

A Soldier talks to Ellie, a virtual human agent who is a part of SimSensei, a clinical decision support tool that can read expressions, speech patterns and body movements in order to detect signs of psychological distress. The developers of SimSensei... (Photo Credit: U.S. Army)

NATICK, Mass. (June 15, 2018) -- Facial expressions, body movements and features of speech can communicate volumes about a person's physical, cognitive and emotional states.

Army researchers from the U.S. Army Research Institute of Environmental Medicine, or USARIEM, have recently begun collaborating on developing and enhancing technologies that can accurately and objectively detect degraded cognitive performance in Soldiers, allowing unit leaders and medics to make informed mission decisions and assess who is operationally fit to fight.

Picture this: You turn on your mobile device or slip on a virtual reality headset and connect to Ellie, who is going to measure your cognitive performance. Ellie begins to ask you questions. Her calm, nonjudgmental demeanor and her responsive body language and facial expressions encourage you to open up. If you shift away or avoid eye contact, Ellie adjusts her approach, helping you to re-engage in the conversation. The conversation resembles the ones you have had with a medical provider, friend or family member.

The only twist to this story is that Ellie is not a real person. She is a virtual human working as a conversational agent.

Ellie was developed for SimSensei in 2011 by the University of Southern California Institute for Creative Technologies, or ICT, and the Army Research Laboratory, as part of a new generation of clinical decision support tools that detect signs of psychological distress by automatically analyzing human behavior. As a support tool, SimSensei would give military personnel and their families better awareness of and access to care while reducing the stigma of seeking help.

SimSensei enables engaging face-to-face conversations by combining two technologies: Multisense and Ellie. Multisense is state-of-the-art sensing technology that automatically tracks and analyzes facial expressions, body posture, speaking patterns and higher-level behavior descriptors, such as whether you are paying attention or fidgeting. As you converse with Ellie, Multisense tracks your behaviors and movements, information that is then used to guide Ellie so she can respond to you appropriately.
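SimSensei's actual pipeline is not public, but the fusion idea described above -- rolling per-frame behavior signals into a summary the agent can react to -- can be sketched in a few lines. Every name, field and threshold below is invented for illustration and is not the real SimSensei or Multisense API:

```python
from dataclasses import dataclass

# Hypothetical per-frame behavior signals of the kind a sensing layer
# like Multisense might extract; field names are illustrative only.
@dataclass
class Frame:
    gaze_on_agent: bool    # is the user making "eye contact" with the agent?
    fidget_score: float    # 0.0 (perfectly still) to 1.0 (constant movement)

def engagement(frames):
    """Fuse per-frame signals into a single engagement score in [0, 1]."""
    if not frames:
        return 0.0
    gaze = sum(f.gaze_on_agent for f in frames) / len(frames)
    calm = 1.0 - sum(f.fidget_score for f in frames) / len(frames)
    return 0.5 * gaze + 0.5 * calm

def choose_strategy(score):
    """Let the agent adapt: try to re-engage a distracted user, else continue."""
    return "re-engage" if score < 0.5 else "continue"

# A user who looks away and fidgets produces a low score,
# so the agent shifts its approach.
frames = [Frame(False, 0.8), Frame(False, 0.7), Frame(True, 0.6)]
print(choose_strategy(engagement(frames)))  # prints "re-engage"
```

A real system would fuse many more channels (facial action units, prosody, posture) and drive a dialogue policy rather than a two-way branch, but the shape -- sense, summarize, adapt -- is the same.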

While ICT originally created Ellie to facilitate mental health screening in deployed Soldiers, the developers are now working with scientists from USARIEM; Walter Reed Army Institute of Research, or WRAIR; the U.S. Military Academy at West Point and MIT Lincoln Laboratory, or MITLL, on expanding Ellie's capabilities.

According to Dr. Kristin Heaton, a USARIEM research psychologist who specializes in evaluating warfighter cognitive performance in diverse military training and operational environments, an enhanced Ellie could provide a novel and cost-effective approach for assessing warfighter cognitive performance.

"We are finding ways to improve Ellie's ability to detect changes in warfighter performance that can be indicative of an emerging illness or injury," Heaton said. "ICT has a powerful and innovative virtual human platform. We are working with them and our other collaborators to identify physiological, cognitive and emotional features that will help Ellie predict how well a warfighter is functioning at any given point in time."

SimSensei is based on more than 10 years of research by ICT on how facial and vocal cues correlate with psychological health. While some of this work can be applied to analyzing cognitive readiness, Heaton explained that more research is needed to pinpoint which subtle facial and vocal changes indicate degraded cognitive performance, which can impair attention and decision-making and jeopardize the mission.

"The brain coordinates speech and facial expressions in an exquisite way," Heaton said. "When cognitive performance is compromised, that coordination can start to break down. You can see this in a variety of speech disorders. Even when someone is tired, they sometimes sound as if they are intoxicated. They slur their words. Their voice might drift off, get softer or sound monotonous. Something similar could occur if a person has depression. They might also have muted facial expressions. Different disorders and stressors can change these facial expressions and speech."
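The vocal cues Heaton describes -- a softer voice, a flatter and more monotonous delivery -- map onto simple acoustic statistics: frame energy for loudness and pitch variability for monotony. A toy sketch of that mapping, with feature names and sample values invented purely for illustration:

```python
import statistics

def voice_features(frame_energies, frame_pitches_hz):
    """Summarize a recording as two coarse acoustic statistics:
    mean frame energy (loudness) and pitch standard deviation (variability)."""
    return {
        "mean_energy": statistics.mean(frame_energies),
        "pitch_stdev": statistics.stdev(frame_pitches_hz),
    }

# Illustrative numbers only: an alert speaker vs. the softer,
# flatter delivery described for a fatigued one.
alert = voice_features([0.8, 0.9, 0.7], [110, 140, 95, 160])
fatigued = voice_features([0.3, 0.2, 0.25], [118, 120, 119, 121])

print(fatigued["mean_energy"] < alert["mean_energy"])  # softer -> True
print(fatigued["pitch_stdev"] < alert["pitch_stdev"])  # more monotone -> True
```

Real systems estimate these values from raw audio with signal-processing front ends; the point here is only that "slurred, softer, monotonous" speech leaves measurable statistical traces.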

At USARIEM, Heaton has collaborated with MITLL to investigate the relationship between vocal and facial expression signals and Soldiers' physical and cognitive performance. Heaton is particularly interested in how combinations of facial muscle movements and speech characteristics may be used to distinguish changes in performance across the variety of military occupational and environmental conditions Soldiers encounter during training and on the battlefield.

With a simple tablet computer, the USARIEM team has been collecting high-quality video and audio recordings of study volunteers pronouncing syllables, reading scripted passages, answering open-ended questions and making various facial expressions in response to cues. These data will not only help USARIEM and MITLL researchers develop algorithms to detect changes in cognitive status under different exposure conditions, but they can also be used to enhance Ellie's programming.
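The article does not describe the team's algorithms, but one plausible shape for this kind of change detection is to compare a Soldier's current feature values against their own baseline recordings and flag large deviations. A minimal sketch, with the feature (speech rate) and threshold chosen only for illustration:

```python
import statistics

def flag_change(baseline, current, z_threshold=2.0):
    """Flag a measurement that deviates strongly from an individual's
    own baseline, expressed as a z-score against the baseline sessions."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    z = (current - mu) / sd
    return abs(z) > z_threshold

# Hypothetical per-session speech rates (syllables/sec) from baseline recordings.
baseline_speech_rate = [4.1, 4.3, 4.0, 4.2, 4.4]

print(flag_change(baseline_speech_rate, 3.1))   # markedly slower -> True
print(flag_change(baseline_speech_rate, 4.25))  # within normal range -> False
```

Comparing each person to their own baseline, rather than to a population norm, sidesteps the wide natural variation in speech and expression between individuals.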

"We need to understand what speech and facial expressions look like in a variety of exposure conditions, whether it be a single exposure or stressor or multiple stressors," Heaton said. "This information could then be used as a signal to identify individuals at elevated risk of becoming a casualty or sustaining an injury."

Virtual human agents like Ellie may become the new normal for the future force. According to Heaton, Ellie could be tremendously helpful for people who do not have easy access to medical facilities or treatment providers, or for those who fear the stigma of mental health services. Users who interact with Ellie in a private and anonymous setting have reported feeling less risk in honestly reporting and self-disclosing psychological issues, compared with talking to a real person or even checking off symptoms on the Post Deployment Health Assessment questionnaire. While these technologies are not intended to replace evaluation, diagnosis or treatment by a trained medical provider, they are cost-effective and accessible, available 24 hours a day, seven days a week, from anywhere in the world.

Looking ahead, Heaton indicated that future work will extend this technology to other applications, such as physical and cognitive performance coaching, educational tutoring, and health and wellness assessment and tracking.

"Virtual human agents like Ellie are more common now than they ever have been," Heaton said. "The idea is to make this technology work for the Soldier by providing a personalized bridge between the user and readiness- and wellness-enhancing resources."

Whether a Soldier is a seasoned general or a freshly recruited private, everyone is susceptible to decreased cognitive performance. Dr. Karl Friedl, the Army's senior scientist for physiology, stated that the research findings from the SimSensei project could open the door to other emerging technologies that accurately predict when a Soldier's cognitive status is declining, providing the Army with tools that can help prevent catastrophic events during missions and ensure a vigilant, lethal force that can win our nation's wars.

"In addition to Soldier avatar coaching applications, Multisense Ellie provides a research platform to determine the combination of physiological measures that reliably predict mood and cognitive status in stressful conditions," Friedl said. "These measures will form the basis of future man-machine interfaces for optimal human performance, such as adapting the complexity of information displays, and they can be incorporated into wearable monitors for Soldier readiness status."

Related Links:

USARIEM Facebook