ABERDEEN PROVING GROUND, Md. -- Paul Roche, mathematical statistician at the Army Evaluation Center, describes the role statistics plays in science, and in the Army, as being the 'watchdog' of scientific and technological advances.
Roche, a Virginia Tech graduate, has worked for AEC for six years and presently serves as a team member of the Army's Test and Evaluation Command's Systems Team, referred to as AST. He is currently pursuing a master's degree in Systems Engineering at Johns Hopkins University in Baltimore and expects to graduate in December.
The team comprises an AST chairperson; representatives from Integrated Product Support; Reliability, Availability and Maintainability; Survivability; and Modeling and Simulation; a Design of Experiments expert from the Army Evaluation Center; and military evaluators and testers from the Operational Test Command. The AST also works with program managers, end-users and Office of the Secretary of Defense representatives.
As part of this acquisition team, Roche and his colleagues receive a set of requirements to evaluate the system against. Based on the evaluation data needs, both developmental and operational test events are planned using design of experiments for systems going through the acquisition process, explained Nancy Dunn, Chief, Methodology Division, AEC. The test event data is then analyzed and provided in an evaluation report to Army acquisition decision makers to use as they determine how best to resource the Army's capabilities.
Roche recently returned from the Army Logistics University at Fort Lee, Va., where he briefed a group of military operational research analysts who were attending the FA49 Qualification Course about the Design of Experiments method that is mandated by the Defense Department.
The course focuses on preparing FA49 officers and civilians to become senior analysts throughout the Army and Department of Defense communities, explained Army Maj. James Henry, Operations Research/Systems Analysis, ORSA, Instructor at ALU.
"By inviting subject-matter-experts like Paul Roche as guest speakers, we expose students to relevant information about the sate-of-the-art research and current analysis being conducted by the ORSA community," Henry said.
He added that the guest speakers give students an opportunity to hear senior leader perspectives from across the joint military community. During the course, students see where ORSAs work, identify skills needed across the DoD and begin networking within the diverse community, according to Henry.
Roche, who recently served as a guest speaker, spoke to the group about how ORSAs navigate the decision-making process using the design of experiments methodology.
"The method enables us to use statistical algorithms to pinpoint what data points are the most feasible to test to learn the most about what affects a system's performance," he explained. Test and evaluation experts collaborate with PMs and other key players to determine what tests are the most feasible to conduct considering time, terrain, test costs and the test range's ability to conduct the test as the test event is being designed.
"We're looking to uncover the best way to test in the most efficient and accurate way possible," said Roche.
Much like the scientific method, the design of experiments method outlines steps that guide critical thinking early in the process to ensure testers gather the right data to uncover the right information about a system's performance. These steps include (a toy code sketch follows the list):
1. Establish a goal. How accurate are the rounds the system fires?
2. Establish a way to measure. How far away does the shot land from the target?
3. Identify the test inputs. Determine outside factors that may affect the accuracy of the shot, for example: temperature extremes, distance from the weapon system, and so on.
4. Develop the test run matrices using design of experiments.
5. Conduct developmental and operational tests.
6. Analyze the data.
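To make the steps concrete, here is a toy end-to-end sketch in Python. Every factor, level and number is invented, and the "test" in step 5 is simulated noise rather than real fire data; it only shows how the six steps hang together.

```python
import numpy as np

# Step 1 (goal): how accurate are the rounds the system fires?
# Step 2 (measure): miss distance from the target, in meters.
# Step 3 (inputs): assume temperature and range to target affect accuracy.
rng = np.random.default_rng(1)

# Step 4: a 2x2 full-factorial test matrix (low/high level of each factor).
temp = np.array([-10, -10, 35, 35])        # deg C
dist = np.array([500, 2000, 500, 2000])    # meters

# Step 5: "conduct" the test -- simulated miss distances stand in for data.
miss = 1.0 + 0.02 * (temp - 12) + 0.003 * (dist - 1250) + rng.normal(0, 0.2, 4)

# Step 6: analyze -- fit miss ~ b0 + b1*temp + b2*dist by least squares.
X = np.column_stack([np.ones(4), temp, dist])
coef, *_ = np.linalg.lstsq(X, miss, rcond=None)
print("intercept, temperature effect, range effect:", coef)
```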
Simply put, the method is about planning the experiment up front to pinpoint the most vital pieces of data needed to reveal the most about the system's performance.
"We're trying to apply more logic to how we set up the test to achieve a better analysis with less cost," said Roche. By using this process, Roche explained, he can uncover the same quality and accurate results with less test runs.
Working on a comprehensive team such as the AST requires interpersonal skills and the ability to talk through differences of opinion, he explained. That is the message he sends to college undergraduates pursuing degrees in mathematics.
"Keep in mind, most people are not statisticians, so you must be able to articulate your work to people outside the career field," he continued. "Otherwise, your input may be worthless because no one will understand it."
"Even in math, communication and interpersonal skills are key factors to success," said Roche. "In my job, I work on a team that is required to talk through differences of opinions to arrive at the best decision to ensure a thorough and effective test event."