In 2010, training developers at Fort Bragg were directed to organize a class that would train female Soldiers to join cultural support teams and support special-operations units in Afghanistan. The training developers didn't have much background information: just that, on a relatively short timeline, high-performing female Soldiers had to be taught to shoot, move and communicate in order to keep up with well-trained Special Forces or Ranger Regiment teams.
No course like this had ever existed for the Army, so there was no template to build from.
"We looked at the capability the operational force was asking for, did a little bit of analysis and went right into designing and developing the course," said Geoff Jones, who was one of the training developers who participated in this process.
"Within 90 days, we had a pilot course ready to be implemented," Jones said, "and we did this knowing that we'd be doing more analysis and design as we moved forward. We know the course was only going to get better."
As a result, a pilot course that could've taken six months to research, discuss and refine was launched in half that time. Knowing that no amount of work on paper could take the place of useful evaluation from course instructors and students, the training developers conducted a total of three pilot courses within their course-design timeline, making significant improvements each time based on their evaluations of the program.
Jones and his team relied on a formal process to keep the course's development on track, measuring evaluations and data points against pre-defined course requirements to gauge their design's strengths and weaknesses. They used a cyclical process of analyzing force requirements and student performance so that the course was appropriately designed, developed and implemented, and throughout each phase, internal and external evaluations ensured nothing was overlooked.
This process may not seem revolutionary, but it doesn't need to be in order to be effective. This is the process used by the U.S. Army John F. Kennedy Special Warfare Center and School to guide the development of training curriculum, as well as the conduct and evaluation of more than 70 qualification and advanced-skill courses.
By depending on these systems, which emphasize the use of student performance evaluations, SWCS is ensuring that its graduates are prepared to be members of the Army's special-operations regiments throughout their careers, said Grey Welborn, the deputy director of the Capabilities Development & Integration Directorate at SWCS.
These regiments make up the Army's Special Forces, Civil Affairs and Psychological Operations active-duty and Reserve-component force.
Conventional Army units rely on collective training events, where a unit comes together to train under the structure in which it will operate for a known mission. Special-operations forces, however, have historically front-loaded Soldiers' training earlier in their careers, so SOF units can react swiftly with little pre-mission training when they're needed, Welborn said. As the saying goes, competent special-operations forces cannot be created after an emergency occurs; thus, SWCS is responsible for creating that competent force, and for predicting what it must be competent in, so that the operational force is fully manned and trained at a high level.
The process used by SWCS training developers to create and improve courses is known as the ADDIE process -- which stands for analyze, design, develop, implement and evaluate.
In the beginning of this five-step cycle, SWCS leaders and training developers dig into the critical skills in which the operational force says its Soldiers must be competent, as well as the number of Soldiers who need to receive the training. This is the analyze phase of the ADDIE process. Fundamentally, the developers strip down and define the who, what, where, when and why of specific training needs.
From there, these training developers -- who are divided into teams within the CDID -- determine when, where and how that instruction should take place. This, the design phase, includes figuring out which courses a lesson should be integrated into, how instructors will be able to measure students' performance in the subject, and whether additional resources like training materials or extra instructors will be needed.
These first two phases predominantly occur in the CDID offices in Fort Bragg's Bryant Hall, but these staff members rely on close relationships with SWCS course instructors who will eventually inherit the course outlines and objectives, with the mission of ensuring their students learn the materials.
Instructors' perspectives become even more valuable to the CDID's civilian developers, many of whom are former special operators themselves, during the third phase: development. This is where the outlines drawn up in the previous phase begin to fill out as the instructors combine their operational experience with the developers' understanding of learning theory to write materials, produce media, validate materials and prep training facilities.
Course instructors take the reins for the fourth phase of the process, implementation, when they actually conduct a course by training students and administering tests and exercises.
And while it's true that this is a five-step cycle, there is no final phase. The remaining phase, evaluation, occurs throughout training analysis, design, development and implementation. Specific checkpoints, such as formal post-instructional conferences with all of a course's instructors, administrators and developers, provide holistic opportunities to identify and discuss course improvements, but even those conferences rely on each individual's diligence throughout the course.
"Who manages this process? Well, we all do," said Jones, who supervises training development for many more courses than just the Cultural Support Training course. "It's everybody's responsibility, across the organization."
"We almost use [ADDIE] is a rapid development process, because we inject ideas and lessons as we go," Jones said.
During and following course implementations, instructors feed data back to their training developers, who then revisit the first phase, analysis. Additionally, SWCS uses feedback from operational units regarding their Soldiers' post-graduation abilities, as well as new or changing requirements for special-operations missions. SWCS training developers use the new information to identify lingering gaps and weaknesses, which they then attempt to correct by refining a course through design and development.
"In the end, we don't end up with a circle where these steps are simply repeated," Welborn said. "What we get is a spiral, where we continue to get better and better as we move forward."
If you think the ADDIE process sounds like it makes sense, rest assured that you're not alone.
"It's an intuitive thing, all we've done is codify it," said Col. Robert Lutz, commander of the Special Warfare Medical Group (Airborne) at SWCS. "When I look back at how we designed training before this process was formally defined, I see that we've followed these steps, although I'm not sure we did it intentionally."
When Lutz took command of the SWMG(A) in 2010, he and his instructors realized the Special Operations Combat Medic course only covered about 70 percent of the critical tasks required by the U.S. Special Operations Command; those numbers were consistent with the results of the medics' performance on a SOCOM-generated test. While Special Forces medical sergeants stay at Fort Bragg for additional training, the first half of their year-long training is the SOCM course, which they attend alongside medics from the 95th Civil Affairs Brigade, 75th Ranger Regiment, 160th Special Operations Aviation Regiment, and Special Forces group support medics.
Using a system that turned out to fit into each step in the ADDIE process, the SWMG(A) cadre analyzed the critical tasks from SOCOM and measured them against the existing course's performance objectives. They found gaps where students didn't learn to manage and prioritize two or more patients at a time, and found an instance where students were having to forget advanced techniques and think on a simpler level to pass a basic national emergency medical technician exam.
"During the analysis phase we realized that students were only trained to manage one patient at a time," Lutz said. "In the real world they could have five, six or seven casualties at times, so they needed to be taught to establish priorities and evacuation requirements. Additionally, we taught them more than they needed to know to pass the EMT-Basic test, and we were actually having to tell them to forget things so that they could successfully take the national test without over-thinking it."
Through design and development, done primarily in-house at SWMG(A) because of the level of medical expertise among the school's cadre, the instructors planned a final field-training exercise in which students would be exposed to the stress of managing multiple casualties, and the course's schedule was rearranged to begin with the basic EMT exam before the students were taught more advanced material.
"The instructors showed me where the disconnect was in the tasks that we needed to train, and how it was better to rearrange the course's schedule," Lutz said. "The course changes made sense, they didn't involve any extra resources and it was the right thing to do."
Since the SWMG(A) runs eight overlapping, year-long medic training courses at a time, its cadre doesn't wait until graduation day to evaluate a class's performance.
"It's really a continuous process," Lutz said. "We evaluate ourselves each time a class completes a block of training. The instructors conduct an evaluation, identify any issues, and if they need to change things they'll present them to me." Once, with only five days until the next class began their EMT block of instruction, the instructors were able to analyze, design and develop changes, and get Lutz's approval, before beginning the block for the following class. The process was effective because the instructors understood the requirements and the process to evaluating and changing training.
"Instructors need to know what they're responsible for instructing, and who they're accountable to," Welborn said. "To achieve their job, instructors need to coordinate with doctrine writers and training developers so they know what's accepted to be the standard, but also so they can tell staff if that standard is no longer appropriate."
"It's a two-way relationship," Welborn said. "It's not just a matter of training developers feeding the material to the instructors."
Due to the advanced capabilities of SWCS students, complexity of special-operations theories and techniques, and importance of sending fully capable Soldiers to the operational force, SWCS training developers and instructors can't afford to perform their duties in a vacuum.
"We're a unique and complex institution, and we have a unique target audience," Jones said. "Our students are Army Soldiers, but they've also been assessed and selected to join the special-operations community."
"We're constantly looking for ways to improve," he said. "I just want to make sure we're doing that systematically. The ADDIE process helps keep us on track"