By John T. Dillard, Col., USA (Ret.)
October 23, 2018
BEEN THERE, DONE THAT
Modeling and simulation is a PM's friend.
One thing I always felt pretty confident about during my acquisition career was the ability to see the obvious. That may not sound like much, but believe it or not, you'll encounter a lot of people who don't have this skill. However, seeing the unobvious … Ah, now that was the ability of only clairvoyants and psychics, I thought.
But guess what: The right investment by the program manager (PM) in modeling and simulation (M&S) can help you do just that. How can this be? After all, the primary weakness of all models is the same as their strength--simplicity.
Seems a real balancing act. Philosopher William of Ockham (see Ockham's Razor, or Occam's Razor) advised us to keep things as simple as we can. Einstein voiced the same caveat with, "Things should be as simple as possible, but no simpler." It was mathematician Norbert Wiener who said, "The best model of a cat is another, or preferably the same, cat," while statistician George Box advised, "Essentially all models are wrong, but some are useful."
These fellows weren't telling us not to model or simulate. They were simply warning us against excessive elaboration or build-out when modeling or simulating a product or system, since no system can be exactly represented by a simpler model.
So, how can a model that is not too complex and not too simple give us additional information we need to make technical or financial decisions?
I'll provide some examples. But you should know upfront that modeling and simulation can support you in your management decisions through all phases of the acquisition life cycle. In the main, M&S will likely do this by reducing your sample size of test articles, saving you money, as well as providing early discovery of technical anomalies. And yes, just like the modeling and simulation evangelists tell us, M&S will help you reduce time and risk as well, while increasing the quality and readiness of the fielded system.
Those are the expected payoffs to your investment. Still have some doubts? I don't blame you.
QUANTIFYING THE UNKNOWNS
There are any number of models across various knowledge domains: cost or financial models, models for spare parts and usage of consumables and, of course, models of the technical components of your developing system. Focusing here mostly on the latter, I was often struck by how many "unknown unknowns" were lurking out there during any particular phase of development.
It is often said that complexity is best defined as many, different, interconnected parts and their interactions. A key component of complexity is the uncertainty of those interactions, and that's where modeling and simulation gives us a hand. If we build our models right, with just enough granularity or detail, these efforts can actually save us from some misery.
We're probably all familiar with flight simulators, helping to train pilots in the operation of their systems. But along the development path, we have no finished system to emulate or simulate. So that's when the benefit of M&S can seem a little vague. And when you're having trouble achieving some key performance parameter, the last thing you want to hear is that more research, development, test and engineering (RDT&E) money is needed for some model when you know that even the actual system probably doesn't have enough funding. We have to see our investment in M&S as a risk-handling technique, since the worst we can do is proceed into uncertainty with a ready-fire-aim approach.
For example, during development of the Javelin anti-tank missile, engineers were uncertain as to whether the early design of the gimbaled seeker (the rotating assembly at the front of the missile that "sees" the target) would be able to hold on to the target view throughout its climbing and diving flight pattern for autonomous top attack. Using a scale model of the eventual missile's front end, the engineers literally mounted a gimbaled seeker contraption on the skids of a helicopter and flew a simulated path to determine whether there were enough degrees of freedom for the seeker's movement.
It might have been an extra expenditure of funds not anticipated. But it was the best approach before having a finished missile to integrate, allowing the project team to go forward with design of other components.
And during the downrange initial operational test of the Army Tactical Missile System, when the very first of only 15 missiles to be fired didn't hit the target and was scored a failure, it was our "hardware-in-the-loop" software model back at Redstone Arsenal, Alabama, that told us it flew right where it was supposed to--but had the wrong targeting data for the gunners to input.
History is replete with examples of people who failed to adequately model before they began construction, including the Tacoma Narrows Bridge collapse of 1940 (unexpectedly swaying wildly in the wind and recorded on film), and the retrofitting of Dubai's Palm Jumeirah Island in 2009, with breakwater crescent openings (a bit of an afterthought) to prevent internal water stagnation. The problem is no different with weapon systems.
M&S THROUGH THE WICKETS
During the materiel solutions analysis phase of the acquisition process, it's common for PMs to invest in force-on-force modeling to predict combat value using key performance parameters of the new system. Other concept studies include such modeling to aid the analysis of alternatives effort.
Later on, during technology maturation and risk reduction, our prototypes will be early design models to demonstrate technology readiness levels and reduce risk by means of technical and operational discoveries during simulations. M&S efforts during both of these phases inform us as well as the user community.
With uncertainty resolving over time, we enter the engineering and manufacturing development phase, where M&S investments are still required as we learn more about our more mature "engineering design models." Computer-aided design and manufacturing are invaluable as we refine requirements and design while using developmental testing to validate maturing models. These operational models also help us better realize the logistical support plans that previously had been conceptual.
At this point there is likely a divergence of models: the real-world mission kind (test articles) that can fly, roll or swim throughout the multidomain battlefield, and the hardware (and software)-in-the-loop models. The latter are typically being run within computers to predict performance in a huge variety of conditions and scenarios via Monte Carlo simulations (multivariate probability distributions).
Both serve to inform project stakeholders, especially you, the PM. Often, we have to cut a deal with the operational testers to let us use something less than a full-up system for destructive testing. No sense firing a .50-caliber machine gun at a nuclear submarine--just use a panel (skin) from the side of one. (Sound crazy? Just read the account in "The Pentagon Wars" of the time some folks wanted to fire a tank main gun round at a combat-loaded Bradley Fighting Vehicle, just to see what would happen.) PMs must negotiate the use of models instead, to reduce the costs of test articles.
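The Monte Carlo approach mentioned above can be sketched in a few lines. The following is a toy illustration only: the environmental variables, their distributions and the simple performance model are all invented for demonstration and do not represent any real weapon system.

```python
import random

def one_engagement(rng):
    """Draw one random set of conditions and resolve a single engagement.

    All variables and coefficients here are hypothetical.
    """
    wind = rng.gauss(0, 5)               # crosswind, m/s
    range_km = rng.uniform(1, 4)         # engagement range, km
    visibility = rng.uniform(0.5, 1.0)   # sensor degradation factor
    # Toy performance model: hit chance falls with wind and range,
    # rises with visibility; clamped to [0, 1].
    p_hit = max(0.0, min(1.0, 0.95 * visibility
                              - 0.02 * abs(wind)
                              - 0.05 * range_km))
    return rng.random() < p_hit

def estimate_hit_probability(trials=100_000, seed=1):
    """Run many randomized engagements and return the observed hit rate."""
    rng = random.Random(seed)
    hits = sum(one_engagement(rng) for _ in range(trials))
    return hits / trials

print(f"Estimated hit probability: {estimate_hit_probability():.3f}")
```

The point is not the numbers but the mechanism: by sampling thousands of random condition sets, the model exposes performance across a far wider envelope than any affordable number of live test articles could.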
SAVE THE BACON
Another bacon-saving episode--in this case, in the evolution of the Javelin missile project--was an engineering and manufacturing development field simulation that we felt would probably be more to satisfy bureaucrats' demands than to learn anything new about our system. How wrong we were! With immature models and prototypical hardware and software, we found out unequivocally that our FLIR (forward-looking infrared) sensor sensitivity specification was validated--and there would thus be zero room for design trades on that aspect of the system.
With the production and deployment phase drawing near, and with stereolithography now advancing into full 3D printing, output models will still be used to prove out production planning and manufacturing processes through factory simulations. Low-rate initial production test articles will no longer be models, but production-representative--the real thing. And, as with the Army Tactical Missile System initial operational test and evaluation example above, M&S can still save the day by revising as successful a test score that was initially judged a failure. In that same three-month major operational test event, on the very eve of system deployment to the Persian Gulf War, we were able to use more than 1,000 simulated fire missions, run solely by computer model, to convince testers and other stakeholders to support the full-rate production decision.
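Why did 1,000 simulated missions carry so much weight next to 15 live shots? The half-width of a binomial confidence interval shrinks with the square root of sample size. A minimal sketch, assuming a hypothetical observed success rate of 90 percent (the rate is invented for illustration; the sample sizes echo the article's example):

```python
import math

def ci_half_width(p_hat, n, z=1.96):
    """95% normal-approximation confidence half-width for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

p_hat = 0.90  # assumed observed success rate (illustration only)
for n in (15, 1_000):
    hw = ci_half_width(p_hat, n)
    print(f"n = {n:5d}: success rate {p_hat:.2f} +/- {hw:.3f}")
```

With 15 trials the uncertainty band is roughly plus or minus 15 percentage points; with 1,000 it narrows to about 2. That is the statistical leverage that lets simulated missions supplement a handful of expensive live firings in front of decision-makers.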
Of course, during the operations and support phase we will be getting user feedback to help us elaborate evolving ideas for more capability through modifications, tech insertion and so on. Here again, logistical support tracking, reliability, failure modes and corrective analysis will be accomplished by rigorous configuration management, at the heart of which are recorded design drawings that model the actual item in the hands of users. And, of course, the training simulators are the devices we have developed in parallel to help users gain proficiency before actual use.
So, without having to read a bunch of training material with boring terminology extolling the values of simulation-based acquisition, you have here what I hope is a convincing argument to make the investment when that open hand comes to you for M&S money. If done right, it's going to help you see the unobvious before disaster strikes--and save you time and money as well. M&S will do that principally by making it possible to reduce test article sample size and discover anomalies early. I guarantee it.
It also falls upon you to continue beyond the initial investment throughout the life cycle and keep elaborating your model of the actual system. How much you are able to apportion to whom will be a difficult trade, as there never seems to be enough RDT&E money to go around, even for the actual system effort underway. How thoroughly to build out your model will also be a tough financial and technical trade. This is the stuff management is made of--having to make those decisions about resource allocation, not knowing how much they're going to pay off. Your systems engineers and other technical folks can help here.
But make no mistake--if accomplished to the right degree, M&S will make it possible to tease out things to furnish the information you need. It might just save the day.
For more information, email the author at firstname.lastname@example.org.
JOHN T. DILLARD, COL., USA (RET.), managed major weapons development efforts for most of his 26-year career in the U.S. Army. He is now a senior lecturer in the Systems Engineering Department of the Graduate School of Engineering and Applied Sciences at the U.S. Naval Postgraduate School in Monterey, California, where he also serves as the technical representative for the Army's new Master of Science programs in Systems Engineering Management.
This article is published in the October-December 2018 issue of Army AL&T magazine.