JOINT BASE LANGLEY-EUSTIS, Va. -- The U.S. Army’s Mad Scientist Initiative hosted research fellows from The College of William & Mary’s Project on International Peace and Security program at Joint Base Langley-Eustis, Virginia, Feb. 28, 2020. The event is one of many that the Mad Scientist Initiative has hosted with other universities across the United States.
“This event, and other initiatives, allows Mad Scientist to broaden our view of the Operational Environment and expand understanding of the full extent of OE possibilities,” said Allison Kuntzman, Mad Scientist deputy director.
The College of William & Mary is a public research university located approximately 15 miles from JBLE in Williamsburg, Virginia, and the PIPS program is one of the premier undergraduate think tanks in the country. The program provides undergraduate fellows the chance to work with practitioners in the military and intelligence communities.
The proximity of the college provided an opportunity for students to present research to a diverse audience of military members and civilian personnel from JBLE. The work produced is also shown to policy officials and scholars at a year-end symposium in Washington, D.C.
The theme of the event was Generation Z and its impact on the Operational Environment and the changing character of warfare.
The students’ research focuses on emerging international security issues and the development of original policy recommendations to address these challenges. The topic areas presented at the event included weaponized information, artificial intelligence, and bio-convergence, representing a year’s worth of research by each of the fellows.
Megan Hogan’s research thesis, The Underlying Cost-Benefit Analysis Associated with the U.S. Developing and Maintaining Deepfake Technologies as a Capability to Deter, Deny, or Defeat any Adversary That Seeks to Harm U.S. National Interests, highlighted the use of deepfakes as extremely effective weapons of disinformation.
According to Hogan, "The Department of Defense has to choose between either continuing to restrict its research to developing video authentication algorithms or expanding its effort to include deepfake weaponization for coercive diplomacy and warfighting.”
Deepfakes are synthetic videos or audio recordings that convincingly mimic real people. As more people get their news and information from social media, a single deepfake video can reach millions of viewers globally.
As deepfakes become more sophisticated and harder to detect, their creators' disinformation efforts become more effective.
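The video-authentication research Hogan describes rests, at its simplest, on cryptographic integrity checking: media is signed at capture time, and any later alteration invalidates the signature. The sketch below is purely illustrative (the function names and the HMAC scheme are assumptions for demonstration, not DoD's actual method):

```python
import hmac
import hashlib

def sign_media(media_bytes: bytes, key: bytes) -> str:
    """Produce an HMAC-SHA256 tag over the raw media bytes at capture time."""
    return hmac.new(key, media_bytes, hashlib.sha256).hexdigest()

def verify_media(media_bytes: bytes, key: bytes, tag: str) -> bool:
    """Return True only if the media is byte-identical to what was signed."""
    expected = hmac.new(key, media_bytes, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels when checking the tag
    return hmac.compare_digest(expected, tag)

# A tampered copy fails verification even if it looks plausible to a viewer.
original = b"frame-data-from-camera"
tag = sign_media(original, key=b"shared-secret")
print(verify_media(original, b"shared-secret", tag))           # True
print(verify_media(b"frame-data-ALTERED", b"shared-secret", tag))  # False
```

Real provenance systems layer far more on top (per-frame signatures, trusted hardware, key distribution), but the core idea is the same: authentication binds content to its source so fabricated media cannot impersonate it.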
One way the DoD is trying to tackle deepfakes is through DARPA’s Media Forensics program.
According to the Defense Advanced Research Projects Agency, the MediFor program brings together world-class researchers to level the digital imagery playing field, which currently favors the manipulator, by developing technologies for the automated assessment of the integrity of an image or video and integrating them into an end-to-end media forensics platform.
"The participation of students like the ones at William and Mary allows us the opportunity to obtain perspectives from the next generation, and draws on their ability to inform us of current trends and how they might develop and impact our nation's security and relationships with allies in the future. The research they perform and the recommendations they give serve to inform U.S. government leaders on where they should be proactively focusing resources,” said Patricia DeGenarro, U.S. Army Training and Doctrine Command G-2, Operational Environment Center geopolitical analyst.
The Mad Scientist Initiative continually explores the future through collaborative partnerships and continuous dialogue with academia, industry, and government. The team directly contributes to the understanding of the OE and the changing character of warfare from now until 2028 (U.S. Army Training and Doctrine Command) and out to 2050 (Futures and Concepts Center).