SPRINGFIELD, Va. (Army News Service, Feb. 16, 2016) -- Army researchers are trying to better understand the types of stress Soldiers could encounter in combat that might cause degraded performance. They are also looking into ways to mitigate those stress factors, said Dr. Mike LaFiandra.
LaFiandra, chief of the dismounted warrior branch at Army Research Laboratory, or ARL, spoke at the National Defense Industrial Association-sponsored Human Systems Conference here, Feb. 9.
Other speakers focused on both machine and human performance goals.
Stressors impairing Soldier performance include dust, toxic fumes, fatigue from carrying heavy loads and other factors, LaFiandra said.
Special sensors are being used to measure the effects those variables have on performance, he said.
Once the information is quantified, the next step is exploring various mitigation strategies to prevent performance degradation, he said. Mitigation strategies that work could improve task performance by as much as 20 to 25 percent.
The methodology might seem pretty straightforward, but it's actually not, LaFiandra said.
Certain stressors have a much greater impact on Soldier performance than others. Understanding why those differences occur is just as important as understanding the types of stressors, he said.
Another challenge is selecting the right sensors, he said. While some sensors are small and non-invasive, others, such as face masks that measure oxygen uptake, are not, particularly in a field environment. Such invasive sensors might create stress of their own.
Identifying stressors and their causes is not always straightforward, LaFiandra said. For example, Soldiers on flight lines were found to experience a much higher than average loss of hearing.
One might conclude, LaFiandra said, that the loss of hearing was simply due to the noise of helicopters and jets. But further investigation found the cause to be slightly more complicated than that. An occupational health study found that the toxic effects of aircraft fumes compounded the effects of the aircraft noise in causing hearing loss. The study could later inform mitigation strategies for that hearing loss.
ARL is working with the Defense Advanced Research Projects Agency, the other services and Special Operations Command on a number of other stressor mitigation projects that show promise to improve Soldier performance.
Dr. Greg Zacharias, the U.S. Air Force's chief scientist and advisor to the Air Force chief of staff, spoke about human-machine relationships with autonomous vehicles, such as unmanned aerial vehicles, or UAVs, and unmanned ground vehicles, used by all of the services.
While these vehicles are called autonomous, he noted, they're really not, because a human is in the loop interacting with the systems.
While autonomous vehicles hold great promise for warfighters winning in a complex world, Zacharias raised concerns about how humans interact with those machines.
"Vigilance complacency" is one example of how a Soldier could team poorly with a UAV, he said. Long hours watching a blip on a computer screen can cause, and have caused, UAV operators to lose focus and miss errors.
A possible solution to vigilance complacency, he said, is to make the machine more aware of the operator, monitoring the operator's physiological state of alertness and providing some sort of warning when alertness levels decline below a certain point.
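The monitoring idea Zacharias describes could be sketched in code. The following is a minimal, hypothetical illustration, assuming the machine receives a stream of physiological alertness scores (on an illustrative 0-to-1 scale) and fires a warning once the score stays below a threshold for several consecutive readings; the function name, units and thresholds are all invented for the example.

```python
# Hypothetical sketch: the machine watches an operator's alertness score
# and raises a warning once it stays below a threshold for a sustained
# run of readings. Scale, threshold and streak length are illustrative.

def monitor_alertness(readings, threshold=0.6, sustained=3):
    """Return the index at which a warning should fire, or None.

    readings  -- sequence of alertness scores in [0, 1]
    threshold -- score below which the operator counts as inattentive
    sustained -- consecutive low readings required before warning
    """
    low_streak = 0
    for i, score in enumerate(readings):
        low_streak = low_streak + 1 if score < threshold else 0
        if low_streak >= sustained:
            return i  # fire the warning at this reading
    return None

# Example: alertness drifts downward over a long monitoring shift.
shift = [0.9, 0.8, 0.7, 0.55, 0.5, 0.45, 0.4]
print(monitor_alertness(shift))  # warning fires at index 5
```

Requiring a sustained streak, rather than alerting on a single low reading, is one simple way to keep a noisy physiological sensor from producing constant false alarms.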
Complexity is another potential problem. As gear becomes more complex, longer training time is needed for operators, he said.
Complex controls, displays and actions required by the operator increase workload and decision time. Eventually, operator performance could deteriorate to the point where the benefits of autonomy become lost.
A mitigation strategy for complexity would need to come in the early design phase of the system, with extensive user testing to determine how well the human interacts with the machine.
The right level of operator trust in the equipment is also important, Zacharias noted. If an operator puts too much trust in a UAV on autopilot, for instance, the operator might not notice a loss of speed and altitude, and a crash could result.
Not enough trust in a machine can also prove detrimental, he said. An operator who doesn't trust the machine and overrides its calculations can cause harm to the machine or the mission.
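One possible guard against the over-trust scenario is for the system itself, rather than the operator, to flag telemetry that drifts away from the commanded flight parameters. The sketch below is hypothetical; the parameter names, units and tolerance are invented for illustration.

```python
# Hypothetical sketch of a deviation alert: compare observed telemetry
# against commanded values and flag any parameter that has drifted by
# more than a fractional tolerance. All names and units are illustrative.

def check_deviation(commanded, observed, tolerance=0.10):
    """Return the parameters whose observed value has drifted more than
    `tolerance` (as a fraction) from the commanded value."""
    alerts = []
    for name, target in commanded.items():
        actual = observed.get(name, 0.0)
        if abs(actual - target) > tolerance * abs(target):
            alerts.append(name)
    return alerts

commanded = {"airspeed_kts": 80.0, "altitude_ft": 1200.0}
observed  = {"airspeed_kts": 78.0, "altitude_ft": 950.0}  # altitude sagging
print(check_deviation(commanded, observed))  # ['altitude_ft']
```

A check like this shifts part of the vigilance burden from the human to the machine, which is the general direction Zacharias outlines.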
FUTURE HUMAN-MACHINE ENDEAVORS
Zacharias thinks that future autonomous systems will be wired in ways similar to the human neural network. This, he said, will allow machines to have a better understanding of their human counterpart and humans will be able to better relate to their machines.
The chief scientist even thinks that systems can be designed with "flexible autonomy," whereby the operator can hand off tasks to the machine, or the machine can hand off certain tasks to the operator, based on mission demands and workload changes.
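The "flexible autonomy" concept could be illustrated with a toy task-allocation routine: when an estimated operator workload crosses a threshold, tasks that can be automated shift to the machine, and shift back when workload drops. This is a hypothetical sketch; the task names, workload scale and threshold are invented for the example.

```python
# Hypothetical sketch of "flexible autonomy": tasks move between operator
# and machine as the operator's estimated workload changes. The workload
# scale (0-1) and the handoff threshold are illustrative.

def assign_tasks(tasks, operator_load, handoff_threshold=0.7):
    """Split tasks between operator and machine.

    tasks             -- list of (name, automatable) pairs
    operator_load     -- current workload estimate in [0, 1]
    handoff_threshold -- load above which automatable tasks go to the machine
    """
    overloaded = operator_load > handoff_threshold
    operator, machine = [], []
    for name, automatable in tasks:
        if overloaded and automatable:
            machine.append(name)
        else:
            operator.append(name)
    return operator, machine

tasks = [("route planning", True),
         ("target confirmation", False),
         ("sensor sweep", True)]
print(assign_tasks(tasks, operator_load=0.85))
# (['target confirmation'], ['route planning', 'sensor sweep'])
```

Note that the non-automatable task stays with the operator regardless of load, matching the idea that handoffs are driven by mission demands as well as workload.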
The neural networked machine could become aware of itself, similar to the way humans are self-aware, and if the machine becomes damaged, it might even find ways to heal itself, he said.