In 15 weeks, MCRP intern reduces Joint Munitions Command scorecard process 95 percent
May 31, 2013
ROCK ISLAND ARSENAL, Ill. -- Making the most of her 15 weeks and saving hundreds of hours for future users, Jasmine Vinson considers her Minority College Relations Program internship a success.
Co-sponsored by the Army Sustainment Command and the Joint Munitions Command, the MCRP develops collaborative programs within the commands, allowing minority-serving institutions (Historically Black Colleges and Universities, Tribal Colleges and Hispanic-serving institutions) to participate in the program and enhancing the future readiness of each command through these partnerships.
For the Joint Munitions Command's Enterprise Integration Office, Vinson's MCRP project is a real game changer.
Vinson is pursuing a bachelor's degree in computer science at Alabama A&M University, Huntsville, Ala. During her MCRP spring internship, she excelled at writing and instituting code that ultimately improved the JMC data quality scorecard for the Enterprise Integration Office.
The Enterprise Integration Office is responsible for the implementation and operation of the Logistics Modernization Program. Instituted in 2010, LMP is the Army's new method of managing Army materiel assets, replacing the 30-plus-year-old Commodity Command Standard System and the Standard Depot System.
"The LMP data quality scorecard is the tool JMC created to document the results of Logistics Modernization Program (LMP) critical data object assessments. LMP is an enabler that provides the Warfighter with the best products and logistics services at the best value," said Rhonda Fuller, JMC LMP Data Quality Lead. (Fuller served as a team leader to Vinson during her project.) In reality, the scorecard is a macro-enabled Microsoft Excel workbook; however, its application and reach are far more complex than the Windows system housing it.
"It uses data provided by Logistics Support Activity (LOGSA) and the sites. The data is extracted from the LMP system by LOGSA and is used by the sites to audit and determine the quality of the data in the system. The scorecard shows the 'health' and 'quality' of the data resident in our Enterprise Resource Planning (ERP) system," said Melissa Sharp, Information Technology Specialist.
Currently, nine JMC sites use the scorecard: headquarters JMC (Rock Island, Ill.); Anniston Munitions Center (Anniston, Ala.); Blue Grass Army Depot (Richmond, Ky.); Crane Army Ammunition Activity (Crane, Ind.); Hawthorne Army Depot (Hawthorne, Nev.); Letterkenny Munitions Center (Chambersburg, Pa.); McAlester Army Ammunition Plant (McAlester, Okla.); Pine Bluff Arsenal (Pine Bluff, Ark.); and Tooele Army Depot (Tooele, Utah).
The former scorecard process was redundant and left considerable room for human error. Vinson set out to incorporate as much improvement as possible, reducing the chance of injecting errors into an already tedious and meticulous process. As the Enterprise Integration Office reports, the original scorecard was designed for smaller data sample assessments.
Over time it became apparent to JMC that LOGSA data samples could contain several thousand records for the JMC sites to assess and record in the scorecard. Manually duplicating the errors found during assessment of the LOGSA data sample into the scorecard Excel workbook was time-consuming.
In addition, the more data entry required, the higher the probability of human error. If the number of records and/or critical data element errors in a site's scorecard did not match the LOGSA report assessment findings, it could take hours of analyzing hundreds to thousands of data records to determine where the differences were.
Often the differences were due to human error.
For Vinson, understanding the problem was just the tip of the iceberg. She drew on her educational background to determine the best solution, which included learning additional programming languages.
"At my school, the professors focus more on teaching us how to write code. I have taken several programming classes and once you learn one, it's easier for you to pick up different languages because they all have some similarities," said Vinson.
"My project was pretty intense so I would rate it a 7 (out of) 10 in difficulty. It takes a lot of time and patience to write programs in general but even more when it's a program that is necessary to help our government save money and improve efficiency," she said.
"The main part of the process that has been removed is the requirement of users to enter the same data twice and then to go on to manually check each record, line-by-line.
"Sometimes there are a couple hundred lines and a lot of times there are a few thousand lines that the user would have to go through and mark for errors," said Vinson.
Learning on the job was crucial to Vinson as well as the knowledge she brought to the project.
"Jasmine came to the internship with an IT (information technology) background and was willing to try new methods to achieve the desired outcome. Because she was so used to studying, the research that was required for the project was not difficult for her," said Sharp.
And according to Sharp, "She (Jasmine) built a prototype scorecard and developed and tested the code from her research. She also developed and tested the majority of the Visual Basic for Applications (VBA) code behind the Excel scorecards. This VBA code built all the formulas and conditional formatting for the scorecard templates, to make the process as easy as cutting/pasting from one worksheet to another without duplication of effort on the user's part."
Vinson acknowledges the biggest change was the coding itself; with that change, however, comes future consistency.
"The biggest change that ensures future consistency is in the code behind the data quality scorecard. Everything that users had to do previously is now being done automatically by using that code to put all critical data elements where they belong and mark all errors, therefore cutting the processing time down tremendously. And, also, creating a more accurate scorecard," said Vinson.
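The article does not include Vinson's actual VBA, which lives inside the Excel workbook. But the behavior she describes -- automatically placing each critical data element where it belongs and marking errors, instead of re-keying data and checking records line by line -- can be sketched in Python. The record layout, field names ("id," "nsn," "condition_code") and the shape of the findings data are illustrative assumptions, not the scorecard's real structure:

```python
# Illustrative sketch only: the real scorecard is VBA in a macro-enabled
# Excel workbook. Field names and data shapes here are assumptions.

def build_scorecard(sample, findings):
    """Copy each LOGSA sample record into a scorecard row and mark
    critical-data-element errors automatically, so users no longer
    re-enter data or check each record line by line."""
    scorecard = []
    for record in sample:
        record_id = record["id"]
        # findings maps record id -> set of critical data elements in error
        errors = findings.get(record_id, set())
        row = {"id": record_id}
        for element, value in record.items():
            if element == "id":
                continue
            row[element] = value                          # element placed automatically
            row[element + "_error"] = element in errors   # error marked automatically
        scorecard.append(row)
    return scorecard


# A two-record sample with one finding against record A2.
sample = [
    {"id": "A1", "nsn": "1305-00-123", "condition_code": "A"},
    {"id": "A2", "nsn": "1305-00-456", "condition_code": ""},
]
findings = {"A2": {"condition_code"}}

rows = build_scorecard(sample, findings)
```

The point of the sketch is the elimination of duplicate entry: the user supplies the sample and the assessment findings once, and every row and error flag is derived from them rather than typed a second time.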
"Jasmine's project plan was not entirely programming. The majority of the time spent on the project was on research and testing rather than developing code. To begin the project she learned what an Enterprise Resource Planning (ERP) system is and about the data quality accuracy dimension. Next she learned about the mechanics of the JMC LMP data quality scorecard and how it was programmed. Once she had an understanding of data quality and the scorecard, she created a list of the scorecard's strengths and weaknesses," said Fuller.
By improving the scorecard process, JMC sites and the Enterprise Integration Office benefit from the reduced amount of time required to input data and troubleshoot errors. The scorecard could also be exported to other Army Materiel Command sites for their use.
As Vinson outbriefed Brig. Gen. Kevin G. O'Connell, commander, Joint Munitions Command, in mid-April, she spoke about more than the technicality of her project. She developed the Visual Basic for Applications code to auto-search any assessment package for any critical data elements and errors.
But Vinson outlined her methodology as well.
"I learned many things while being an intern in the JMC but one of the most significant things is the importance of time management. Following a detailed work plan and setting goals, for me, is the best way to obtain solid results," said Vinson.
"I also gained great hands-on experience by getting the chance to apply everything I've learned so far in school and put it to work on something useful that will benefit many users at JMC sites for years to come," she continued.
More impressively, according to both Fuller and Sharp, the Enterprise Integration Office is currently working with the Continuous Process Improvement office on a submission for a non-gated Lean Six Sigma project.
"While we (Enterprise Integration) are currently documenting the actual savings per year, it is estimated that the user effort now is about 5 percent of what it was before. Factors being used to calculate the actual savings are the number of records in the assessment data sample (20 to 10,000), the number of critical data elements (4 to 50) for each record, the number of critical data elements in each record that are in error, and the complexity of the data object being assessed."
In simpler terms, following Vinson's redesign of the scorecard process, the new and improved scorecard will take users about 9.3 hours per year to complete, down from 224 hours per year: a 95.8 percent reduction in labor.
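Those figures are internally consistent; a quick check using the two annual-hours numbers reported in the article:

```python
# Quick arithmetic check of the labor-savings figures quoted in the article.
old_hours = 224.0   # annual hours under the manual scorecard process
new_hours = 9.3     # annual hours after the redesign

reduction_pct = (old_hours - new_hours) / old_hours * 100
print(round(reduction_pct, 1))  # prints 95.8
```

The remaining effort, 9.3 of 224 hours, is about 4 percent of the original, consistent with the roughly 5 percent user-effort estimate quoted above.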
The JMC Continuous Process Improvement (CPI) team also understands savings when it sees them. Consideration of Vinson's project as a "Just-Do-It" non-gated project lends a great deal of significance to what began as a 15-week internship.
Designation as a "Just-Do-It" signifies careful analysis of a problem, measured and relevant calculations, and a definitive solution--all of the things included in a green or black belt project, but without the formal process.
According to Chad DeWitte, JMC CPI team, "to date we've conducted a process mapping session and mapped out the current and new process for the JMC Data Quality Scorecard."
"A JDI non-gated or "Just-Do-It" non-gated project is a project that is outside the DMAIC (define, measure, analyze, improve and control) process, or non-standard. It's a streamlined approach which doesn't require all of the normal work associated with a green or black belt project. Non-gated projects are defined as "Just-Do-Its" that have been or will be implemented based on a project change," said DeWitte.
From its headquarters in Rock Island, Ill., JMC operates a nationwide network of conventional ammunition manufacturing plants and storage depots, and provides on-site ammunition experts to U.S. combat units wherever they are stationed or deployed. JMC's customers are U.S. forces of all military services, other U.S. Government agencies, and allied nations.