DCMA-MAN 3101-02
Program Support Analysis and Reporting
Office of Primary
Responsibility: Program Support Capability
New Issuance
Internal Control: Process flowchart and key controls are located on the Resource
Page of this manual.
Purpose: This issuance, in accordance with the authority in DoD Directive 5105.64:
• Outlines procedures on how the Agency will report on program performance and
anticipated performance to program, product, and project offices and the OSD in
accordance with Federal Acquisition Regulation 42.302(a)(67).
• Provides and defines procedures for Program Support analysis and reporting, to include
Program Assessment Reports, Program Notifications, Program Support Team and Support
Program Support Team inputs to Program Integrators and Support Program Integrators,
and Defense Acquisition Executive Summary reporting.
TABLE OF CONTENTS
SECTION 1: GENERAL ISSUANCE INFORMATION
1.1. Applicability.
1.2. Policy.
SECTION 2: RESPONSIBILITIES
2.1. Executive Director, Portfolio Management and Business Integration (PM&BI).
2.2. Director, Major Program Support (MPS) Division.
2.3. Director, Earned Value Management System (EVMS) Center.
2.4. Corporate Administrative Contracting Officer (CACO), Divisional Administrative Contracting Officer (DACO), or Administrative Contracting Officer (ACO).
2.5. Commanders/Directors, Operational Units.
2.6. Commanders/Directors, CMO.
2.7. Director, Special Programs.
2.8. Program Integrator.
2.9. Support Program Integrator.
2.10. PST and SPST Support Members.
2.11. Functional First Level Supervisors.
SECTION 3: PROGRAM ANALYSIS AND INPUTS TO THE PROGRAM INTEGRATOR
3.1. Program Analysis.
3.2. Electronic Functional Input Template (eFIT).
3.3. Production Supportability Table.
3.4. Contractor Business System Reporting.
3.5. EVM Analysis.
3.6. Prime Control of Subcontractor Assessment.
3.7. Suppliers Driving the Ratings.
3.8. SPI Support PAR.
APPENDIX 3A: CONTRACT DATA EVALUATION METRICS
APPENDIX 3B: DCMA COST AND SCHEDULE ESTIMATES AT COMPLETION
SECTION 4: PROGRAM REPORTING
4.1. Quarterly PAR.
4.2. Program Notification (PN).
4.3. DAES Assessment Input.
APPENDIX 4A: PAR NARRATIVE CRITERIA
APPENDIX 4B: CONTRACT PERFORMANCE ASSESSMENT
APPENDIX 4C: PRODUCTION ASSESSMENT
APPENDIX 4D: MANAGEMENT ASSESSMENT
GLOSSARY
G.1. Definitions.
G.2. Acronyms.
REFERENCES
TABLES
Table 1. DID Correlation
Table 2. Best Predictive EAC Performance Factors by Contract Completion Status
Table 3. PAR Report Months
Table 4. Cost Assessment Color Criteria
Table 5. DCMA Schedule Slip (Months) Color Criteria (CPA)
Table 6. DCMA Schedule Slip (Months) Color Criteria (PA)
Table 7. Production Assessment Criteria
Table 8. Production Assessment Checklist
Table 9. Management Assessment Criteria
FIGURES
Figure 1. Management Reserve Consumption
Figure 2. PAR Group Timeline
SECTION 1: GENERAL ISSUANCE INFORMATION

1.1. APPLICABILITY. This issuance applies to all DCMA Components, DCMA Operational
Units, and DCMA Contract Management Offices (CMO) involved with Program Support (unless
it conflicts with higher-level regulations, policy, guidance, waiver, or agreements, in which case
those take precedence). Requests for exception to this manual must be addressed through the
waiver process in the DCMA Manual (MAN) 501-01, “Policy Issuances Procedures.” This
manual applies to analysis and reporting on Major Programs, Non-Major Programs with
Reporting Requirements, and High Visibility Commodities.
a. Major Programs. The requirements of this manual apply to all programs identified as
Major Programs.
b. Non-Major Programs with Reporting Requirements.

(1) Section 3 – Section 3 requirements apply only as needed to meet the negotiated
reporting requirements (e.g., electronic Functional Input Template (eFIT), the Production
Supportability Table, Earned Value Management (EVM) analysis, the Prime Control of
Subcontractor Assessment (PCSA), the Program Assessment Report (PAR), the Support PAR,
the use of Program Support Teams (PST) and Support Program Support Teams (SPST)). The
PST Collaboration Site must be used for non-major programs with reporting requirements.
(2) Paragraph 4.1 – For PARs, only Executive Summaries are required. Other PAR
narrative fields that are not negotiated for inclusion must contain “Not Applicable”.
(3) Paragraphs 4.1.c and 4.2.f – Distribution of the PARs/Program Notifications (PN)
will be accomplished by local procedures unless the PAR/PN represents a program that is
contained within the approved distribution system per the Program Support Analysis and
Reporting User Guide. In this case, the PAR/PN must be uploaded to the distribution system.
d. High Visibility Commodities. The reporting requirements of this manual do not apply to
High Visibility Commodities; instead, refer to the High Visibility Commodity User Guide on the
Resource Page.
1.2. POLICY. It is DCMA policy to:

a. Deliver global acquisition insight for all programs and High Visibility Commodities by
providing objective, independent, relevant, timely and actionable information to the Acquisition
Enterprise.
b. Comply with Office of the Secretary of Defense (OSD) Defense Acquisition Executive
Summary (DAES) Deskbook and OSD DAES Guidelines when reporting on DAES programs,
specifically these 3 of 11 DAES assessment categories: Contract Performance Assessment
(CPA), Production Assessment (PA), and Management Assessment (MA).
SECTION 2: RESPONSIBILITIES
2.2. DIRECTOR, MAJOR PROGRAM SUPPORT (MPS) DIVISION. The Director, MPS
Division must:
b. Compile, publish, and disseminate the PAR Scoring Rubric and Rework Metric Results.
c. Ensure that support is provided to the Operational Units in the evaluation of PAR quality,
as needed.
2.3. DIRECTOR, EARNED VALUE MANAGEMENT SYSTEM (EVMS) CENTER. The
Director, EVMS Center must:

c. Provide an impact statement for any EVMS Corrective Action Request (CAR) issues.

d. Provide follow-up on system-related issues, such as data integrity concerns or system
performance.
2.5. COMMANDERS/DIRECTORS, OPERATIONAL UNITS. The Commanders/Directors,
Operational Units must:

b. Ensure the Operational Unit provides PAR score, feedback, and recommendations to the
PI, PI’s First Level Supervisor (FLS), and CMO Commander or Director.
c. Ensure the Operational Unit, in coordination with the CMO, jointly develop and
implement a get well plan when the PAR quality is below the acceptable threshold.
d. Ensure the Operational Unit performs a follow-up review of PAR quality when required.
e. Ensure the Operational Unit loads the DAMIR rework into the DAES Assessment Rework
Database.
f. Ensure the Operational Unit provides PAR health results to the MPS Division and the
CMO.
g. Promote the use of predictive analysis throughout the entire content of the PAR.
2.6. COMMANDERS/DIRECTORS, CMO. The Commanders/Directors, CMO must:

e. Promote the use of predictive analysis throughout the entire content of the PAR.
f. Special Programs CMOs only – comply with DCMA-INST 3101 and meet the intent of
this manual to the maximum extent practicable for all Special Access Programs (SAP)/Sensitive
Compartmented Information (SCI) contracts.
g. High Visibility Commodities – follow the High Visibility Commodity User Guide on the
Resource Page.
2.7. DIRECTOR, SPECIAL PROGRAMS. The Director, Special Programs must comply
with DCMA-INST 3101 and meet the intent of this manual to the maximum extent practicable
for all SAP/SCI contracts (including the designation of Lead CMO for SAP/SCI).
b. Drafts PNs.
2.10. PST AND SPST SUPPORT MEMBERS. The PST and SPST members must
document and submit functional inputs.
SECTION 3: PROGRAM ANALYSIS AND INPUTS TO THE PROGRAM INTEGRATOR

3.1. PROGRAM ANALYSIS.

a. Summary. The PI and SPI rely on documented inputs from PST and SPST members to
provide independent acquisition insight about the program to the DCMA customers. These
inputs (e.g., eFIT, EVM Analysis input, PCSA, narratives supporting CBS status or impact,
Production Supportability Table, and Support PARs) will reside in the program’s PST
Collaboration Site (see Resource Page).
b. Conduct Program Analysis. Based on program risk and resources, PST and SPST
members must perform their assigned surveillance and analysis activities identified in the
Program Support Plan (PSP) or Support Program Support Plan (SPSP) and functional
surveillance plan(s). In all cases, PST and SPST surveillance should be conducted with an
emphasis on the cost, schedule, and technical impacts to the program, in addition to assessing
contractor compliance to contractual and procedural requirements. The PI, SPI, PST, and SPST
must engage with the contractor as necessary to perform PSP/SPSP surveillance and analysis
activities.
3.2. ELECTRONIC FUNCTIONAL INPUT TEMPLATE (eFIT).

a. PST and SPST Document Program Analysis Results Using eFIT. The purpose of the
eFIT is to standardize PST and SPST member input for the creation of PARs and PNs by the PI
and SPI. All PST and SPST member inputs must be through the use of eFITs. Exceptions are
the EVM Analysis input, CBS status and impact statements, PCSA, Production Supportability
Table, and Support PAR. For the PST and SPST members not required to use the eFIT, refer to
the applicable section of this manual for input requirements. All eFITs must be completed in
accordance with the appropriate section of the Program Support Analysis and Reporting User
Guide found on the Resource Page. The eFIT must be provided via the program’s PST
Collaboration Site. Information included in the eFIT should provide justification and predictive
analysis in support of the PAR and PN assessments.
(1) Frequency. At a minimum, PST or SPST members must provide weekly eFITs or a
notification of “no input” to the PI or SPI in accordance with the PSP/SPSP. eFIT submissions
can evolve as additional information is gathered through observations, trending, and information
exchange. Periodic updates are encouraged for high risks and significant issues.
(2) Accessing the eFIT Template. The eFIT template, DCMA-Form (DCMAF) 3101-02-
02A, is located on the program’s PST Collaboration Site.
(3) eFIT Program Impact Rating. The eFIT is composed of several data input sections
with each section having multiple data elements. All fields marked with an asterisk in the eFIT
are mandatory fields. For any specific issue or risk, the generic rating definitions are:
(a) Green: Some minor problem(s) may exist, but appropriate solutions to those
problems are available, and none are expected to affect overall contract cost, schedule, and
performance requirements; and none are expected to require managerial attention or action.
(b) Yellow: Some event, action, or delay has occurred or is anticipated that may
impair progress against major contract objectives, and may affect the contractor’s ability to meet
overall cost, schedule, and performance requirements or other major contract objective.
(c) Red: An event, action, or delay has occurred or will occur that, if not corrected,
poses a serious risk to the contractor's ability to meet overall cost, schedule, and performance
requirements or other major contract objective.
(d) More detailed rating criteria are available in Appendices 4B, 4C, and 4D.
(4) Type. Risk, Issue, Opportunity, or Observation general guidelines are as follows:
(a) Risk. An event or condition (contractor or program) that has not yet occurred, but
may impact successful performance (cost, schedule, or technical). Risks can be either:
2. Program Office, DCMA, or contractor identified risks that the contractor is not
addressing. The risk and resulting impacts, due to lack of contractor action, will be assessed by
DCMA, as discussed in the “Documenting a Risk” section below.
(c) Opportunity. A future event or condition (contractor or program) that may result
in a benefit to the program (cost, schedule, or technical).
(d) Observation. A noteworthy occurrence that has not been identified as a Risk,
Issue, or Opportunity. Observations are not envisioned to have a significant effect to the
program cost, schedule, or technical performance.
(5) Input and Description. Functional Assessment and Predictive Analysis. DCMA’s
independent assessment should address the contractor’s root cause, the corrective action, and
the mitigation.
(6) Supervisor Review and Approval (Optional). Supervisory approval is not required for
eFIT submittal unless a CMO requirement exists.
b. eFIT Quality. The PST and SPST Functional Specialist’s supervisor must periodically
review a sample of that Functional Specialist’s eFITs to evaluate the quality and provide
appropriate feedback to the originator.
(1) Functional FLS Reviews eFITs Using eFIT Rubric. At least once per quarter, the
PST and SPST Functional FLS must review at least one eFIT for each of their Functional
Specialists assigned to a PST or SPST. To obtain the eFIT, go to the program’s PST
Collaboration Site. This review must be accomplished using the eFIT Rubric located on the
Resource Page. The Rubric automatically calculates the score; scores below 85 percent are not
acceptable.
(2) Functional FLS Uses eFIT Score as Part of Functional Feedback. If the resultant eFIT
score is below 85 percent, then the supervisor must discuss relevant opportunities to improve the
quality of future eFITs.
(3) Functional FLS Reviews Subsequent eFITs. If the eFIT score is less than 85 percent,
the FLS must perform reviews of subsequent eFITs to ensure that the quality has improved to an
acceptable level.
3.3. PRODUCTION SUPPORTABILITY TABLE.

a. The table provides insight into current and future contractor production performance. It
supports and traces to DCMA independent projected delivery dates used in the PAR. The
analysis details current delivery trends and creates an understanding of whether future
performance on remaining deliveries will be met, will slip, or will run ahead of schedule. The
table allows for quick quantification of items not meeting delivery dates and of schedule
underruns or overruns in terms of months. See the appropriate section of the Program Support Analysis and
Reporting User Guide on the Resource Page. The Production Supportability Table must be
developed at least quarterly to align with the PAR timeline but is not required for the following
situations:
(1) Low quantity deliveries (See the Program Support Analysis and Reporting User
Guide)
3.4. CONTRACTOR BUSINESS SYSTEM (CBS) REPORTING.

a. Summary. CBSs are the first line of defense against fraud, waste, and abuse in DoD
contracts. Approved business systems allow the contractor and the DoD to rely more
confidently upon the information produced, which helps manage programs more effectively.
For contracts subject to the Cost Accounting Standards and containing one or more of the six
business system clauses, the contractor must establish and maintain an acceptable CBS.
b. Cognizant ACO Documents Reasons for CBS Disapproval or Not Evaluated. CBS
status and status dates are populated from the Contractor Business Analysis Repository (CBAR).
The cognizant ACO responsible for the CBSs must document reasons for any “Disapproved” or
“Not Evaluated” CBS in the program’s PST Collaboration Site. The pertinent information
supplied must include (as applicable):
(2) CBS status of “Disapproved” or “Not Evaluated” must address why the CBS is not
approved.
(c) Identify whether or not a withhold was applied to the disapproved CBS. If a
withhold has been applied, identify:
1. Whether the payment withhold applies to the program (if it does not apply to
the program, still identify the withhold but state that it does not impact the program).
(2) For Disapproved Systems, include the reasons for disapproval (include the guideline
numbers and titles that resulted in the disapproval) and address the reliability of the EVM data
used in EVM analysis. If the data is determined to be unreliable, it will impact the CPA
rating.
(3) Impact(s) of transmitted Level III or IV CARs supporting EVMS disapproval (if
there is no impact, then state that); address actual or proposed date of submission for the CAP,
DCMA’s assessment of the contractor’s status towards closing the CAR, and estimated time for
follow-up review.
d. PI Reviews CBS Data and Determines Programmatic Impact. The PI, in conjunction
with the applicable ACO or EVMS Center, must identify any impacts to their specific program
resulting from significant CBS issues in the Business Systems Input tab of the program’s PST
Collaboration Site.
3.5. EVM ANALYSIS. EVM analysis is performed for programs that conduct program
reporting, contain contracts carrying the EVMS clause, and for which the contractor is submitting
EVM CDRLs. EVM analysis will be conducted using the Agency cost and schedule analysis
tools and user guides found on the Resource Page. The EVM Analyst is identified in the PSP or
SPSP.
a. Obtain and Evaluate Contractor EVM Data Submission. DCMA will use Defense
Cost and Resource Center (DCARC) EVM Central Repository (EVM-CR) to access official
contractor submissions for review and analysis unless the contract does not specify the use of
EVM-CR, in which case DCMA will receive the data directly from the contractor. Any EVM
data obtained from the contractor, which is not stored in EVM-CR, must be uploaded to the
Prime contract folder in the Electronic Document and Records Management (eDRMS). DCMA
EVM Analysts must register in DCARC as a “Reviewer” in order to access and provide
validation recommendations to the program office during the official review timeframe. See the
appropriate section of the Program Support Analysis and Reporting User Guide found on the
Resource Page for instructions on how to register with DCARC.
(1) Every month the EVM Analyst will download EVM-CR or contractor provided
Integrated Program Management Report (IPMR) or Contract Performance Report
(CPR)/Integrated Master Schedule (IMS) data and import the data into the appropriate DCMA
EVM analysis software. The EVM Analyst must ensure the data in the submitted IPMR or
CPR/IMS is complete, Data Item Description (DID) compliant, consistent, and reliable (Table 1).
The minimum data integrity metrics and tests for evaluating cost and schedule data are described
in Appendix 3A.
(2) For submissions in the EVM-CR, no later than (NLT) 10 business days after
contractor submission, the EVM Analysts must report any discrepancies and provide a
recommendation to the Program Office through the DCARC system as to whether the data is
complete and DID compliant. (See the appropriate section of the Program Support Analysis and
Reporting User Guide found on the Resource Page.) For other prime contractor submissions not
in the EVM-CR, the discrepancies will be directly reported to the program office through the PI.
The CMO may submit a contractual noncompliance or “other Contract Management”
documentation CAR pertaining to the missing or late EVM CDRL deliverable or for incorrect
CDRL data if the program office rejects the submission. This is not a system review; suspect
data that may indicate areas of concern with the EVM system will be forwarded to the EVMS
Center for review and determination of appropriate actions in accordance with DCMA-INST
210. The EVM Analyst must not create any EVMS CARs.
b. Determining Contract Risks or Issues. As part of the monthly analysis, the EVM
Analyst will evaluate variances and determine potential risks or issues.
(1) For those Work Breakdown Structure (WBS) elements with effort remaining, the
EVM Analyst must perform Cost Variance (CV), Schedule Variance (SV), and Variance at
Completion (VAC) analysis at the lowest reporting level in order to determine the WBS elements
that significantly contribute to the overall contract variances. This analysis is typically
performed with cumulative performance data; however, variance analysis employing current
period data may also be useful in identifying emerging trends that indicate current issues or may
signal potential risks.
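To make the screening concrete, the following is a minimal Python sketch of the lowest-level variance review described above, assuming Format 1 data has been exported to per-WBS records. The field names and the 10 percent flag threshold are illustrative assumptions, not an Agency tool schema or criterion.

```python
def screen_variances(wbs_rows, threshold_pct=10.0):
    """Flag incomplete lowest-level WBS elements whose cumulative CV, SV,
    or VAC exceeds the threshold as a percentage of the element's BAC."""
    flagged = []
    for r in wbs_rows:
        if not r["bac"] or r["bcwp_cum"] >= r["bac"]:   # no budget, or complete
            continue
        cv = r["bcwp_cum"] - r["acwp_cum"]    # Cost Variance
        sv = r["bcwp_cum"] - r["bcws_cum"]    # Schedule Variance
        vac = r["bac"] - r["eac"]             # Variance at Completion
        if any(abs(v) / r["bac"] * 100 > threshold_pct for v in (cv, sv, vac)):
            flagged.append((r["wbs"], cv, sv, vac))
    return flagged
```

The same screening can be run against current-period values to surface the emerging trends noted above.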
(2) To determine the list of WBS elements, the EVM Analyst must select incomplete
WBS elements that may impact the contract and may warrant surveillance. The information
for these WBS elements will be provided to the PST or SPST functional specialists during the
review of the Program Risk table IAW DCMA-MAN 3101-01.
(3) The PST or SPST functional specialist must work with the other PST or SPST
members to provide to the EVM Analyst through eFITs (paragraph 3.2):
(d) Any additional cost or schedule adjustments (impacts) to the contractor’s values.
c. DCMA Cost and Schedule Estimates. Independent DCMA Cost and Schedule
Estimates are of paramount importance to the acquisition community. See Appendix 3B for
additional details.
(1) Determine the Need for Revised Estimates. As part of the monthly analysis, the
EVM Analyst will make a determination as to whether DCMA Estimate at Completion
(EACDCMA) or Estimated Completion Date (ECDDCMA) needs to be reviewed outside of the
quarterly generation process. If any of the following conditions have occurred since the last time
these values were calculated, or if a Memorandum of Agreement (MOA) with the Program
Office requires one, perform an out-of-cycle calculation of the EACDCMA, DCMA Variance at
Completion (VACDCMA), or ECDDCMA and days of schedule slippage and update if necessary:
(2) Quarterly Schedule Analysis. On a quarterly basis, in preparation for the PAR
submission, the EVM Analyst must provide to the PI DCMA’s assessment of the contract’s
schedule performance. This analysis uses the EVM and duration-based schedule metrics to
evaluate the contract schedule and forecast any potential schedule delays to the contract or
program. Contributing factors affecting the contract milestone completion dates include: critical
path, driving paths, lag, float, margin use/consumption, subcontractors, rework, test failures,
schedule delays, contract mods, Over Target Schedule (OTS), and single point adjustments.
(a) Schedule Performance. The EVM Analyst must analyze the Baseline Execution
Index (BEI) and missed task trends, correlate the tasks to the PST Functional provided root
causes, identify any corrective actions, and evaluate the impact on the schedule.
(b) ECDDCMA. Prior to determining the ECDDCMA, the EVM Analyst must apply the
functional specialists’ and SPST EVM Analysts’ schedule adjustments to the appropriate tasks in
the IMS. These schedule adjustments will account for known or forecasted slippages based on
functional surveillance or SPST EVM analysis. After the EVM Analyst has made all known
adjustments, he or she will use the Agency schedule analysis tool or the constraint method to
determine the DCMA independent critical path and the ECDDCMA for each milestone or
contractual required events. After incorporating all functional schedule adjustments, the EVM
Analyst must upload a copy of the DCMA independent critical path and contractor’s critical path
to the “Program Documents” tab in the program’s PST Collaboration Site.
(3) Quarterly Cost Analysis. On a quarterly basis, in preparation for the PAR
submission, the EVM Analyst must submit DCMA’s assessment of the contract’s cost
performance to the PST Collaboration Site. Cost analysis evaluates the impact to contract
performance based on numerous factors (e.g., Management Reserve (MR) use, subcontractors,
rework, test failures, schedule delays, contract mods, labor rates, material costs, OTB, single
point adjustments) and forecasts any potential cost overruns to the contract or program. The
provided assessment is an independent evaluation of the contract’s performance and includes
insight from DCMA functional specialists.
(a) EACDCMA. The EACDCMA is an analysis of WBS elements at the lowest reporting
level and the risk/opportunities that may impact implementation of contract level requirements.
This analysis involves determining the reasonableness of the WBS level EAC with information
gained from PST/SPST functional surveillance and SPST EVM analysis inputs. In addition to
the WBS level analysis, the contract level analysis will account for factors such as estimates of
known or anticipated risk areas, planned risk reductions, or cost containment measures.
Challenge, when appropriate, the contractor’s analysis and explanations. Perform independent
analyses and surveillance to support these challenges.
1. Lowest WBS Level EACDCMA. The EVM Analyst must
incorporate the SPST EVM Analysts’ EACDCMA for the respective WBS elements. When a
subcontractor with a fixed price type contract is represented as an element on the prime
contractor’s WBS, the WBS level EAC estimate for the subcontractor cannot exceed the
subcontract ceiling price.
2. Lowest WBS Level EACDCMA Realism. The EVM Analyst must evaluate the
confidence of the EAC by comparing the WBS level VAC to the existing cumulative Cost
Variance (CVCUM), comparing the CPICUM to the “To Complete Performance Index” (TCPIEAC),
and comparing the WBS level EACDCMA to the optimistic and pessimistic range of EACs. This
will provide warning indicators of any WBS level EACDCMA not in alignment with past
performance trends and significant differences should be explained and noted at the WBS level
before generating the contract level EACDCMA.
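A sketch of these three realism comparisons, assuming the optimistic and pessimistic EAC bounds are supplied; the 0.10 divergence threshold is illustrative only, not an Agency criterion.

```python
def realism_flags(bac, bcwp_cum, acwp_cum, eac_dcma, eac_low, eac_high):
    """Return warning flags for a WBS-level EAC(DCMA), per the three
    comparisons above. Thresholds are illustrative."""
    flags = []
    cv_cum = bcwp_cum - acwp_cum              # cumulative Cost Variance
    vac = bac - eac_dcma                      # Variance at Completion
    if cv_cum < 0 and vac >= 0:
        flags.append("favorable VAC despite unfavorable cumulative CV")
    if acwp_cum > 0 and eac_dcma > acwp_cum:
        cpi_cum = bcwp_cum / acwp_cum
        # TCPI(EAC): efficiency needed on remaining work to achieve the EAC
        tcpi_eac = (bac - bcwp_cum) / (eac_dcma - acwp_cum)
        if abs(cpi_cum - tcpi_eac) > 0.10:    # illustrative threshold
            flags.append("TCPI(EAC) diverges from cumulative CPI")
    if not (eac_low <= eac_dcma <= eac_high):
        flags.append("EAC(DCMA) outside optimistic/pessimistic range")
    return flags
```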
3. Contract Level EACDCMA. The EVM Analyst must generate the contract level
EACDCMA by summing the lower WBS level EACDCMA values and adding the expected MR
usage and risk adjustments. Since MR does not form part of the PMB, the EVM Analyst must
add the expected MR usage and risk adjustments not included in the WBS level EACDCMA.
a. Risk Items. Integrate functional specialist dollarized cost impact with the
EACDCMA for the cost, schedule, and technical risk items the functional specialist is monitoring.
(b) VACDCMA.

1. WBS Level VACDCMA. The WBS level VACDCMA is the difference between
the WBS level Budget at Completion (BAC) and the respective EACDCMA. This determines the
performance drivers influencing the contract level VACDCMA. When reporting the results in the
“EVM Analysis” tab, summarize completed tasks into a summary line and focus on the top
current and future drivers.
If an OTB has occurred, also track the percentage of DCMA Variance at Completion
(VACDCMA%) to the original contract budget base.
(c) EACDCMA and VACDCMA Comparative Analysis to the Contractor Values. The
EVM Analyst must address the key reasons for significant differences between the EACDCMA and
the contractor’s EAC (EACKtr) value.
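A minimal Python sketch of the rollup and comparison just described; the WBS numbering, dollar figures, and contractor EAC are hypothetical.

```python
def contract_level_eac(wbs_eacs, expected_mr_usage, risk_adjustments):
    """Roll WBS-level EAC(DCMA) values up to the contract level, then add
    expected MR usage and risk adjustments not captured at the WBS level."""
    return sum(wbs_eacs.values()) + expected_mr_usage + sum(risk_adjustments)

eac_dcma = contract_level_eac(
    {"1.1": 42.0e6, "1.2": 61.5e6, "1.3": 12.3e6},  # dollars, illustrative
    expected_mr_usage=3.0e6,
    risk_adjustments=[1.2e6, 0.8e6],
)
eac_ktr = 115.0e6              # contractor's EAC (EACKtr), hypothetical
delta = eac_dcma - eac_ktr     # key drivers of this difference must be explained
```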
d. Providing EVM Analysis Insights. Using the program’s PST Collaboration Site, the
EVM Analyst must enter the analysis results into the “EVM Analysis” tab for the specific
program, contract or CLIN, and report month. This will provide the PI the required EVM
information defined in Appendix 4B. The DCMA 14-point assessment is no longer a
requirement but may be performed and provided per the MOA. In this case, the 14-point
assessment will be loaded to the PST Collaboration Site “Program Documents” tab.
3.6. PRIME CONTROL OF SUBCONTRACTOR ASSESSMENT (PCSA).

a. PCSA Development. Information supporting the PCSA table in the PAR must be
provided to the PI by the PST member identified in the PSP. The PCSA rating must be
determined using the PCSA tab within the PST Collaboration Site. Instructions can be found in
the appropriate section of the Program Support Analysis and Reporting User Guide found on the
Resource Page.
b. PCSA Ratings. Ratings must be verified and updated using multifunctional risk-based
surveillance execution results, audit findings, and input from external sources as applicable.
(2) Narrative of the prime contractor’s corrective and preventive actions for
resolving subcontractor management issues (e.g., CAR, CAP).
(3) Narrative of DCMA action and assessment of potential impact on the program.
Insert the narratives in the Notes section of the PCSA tab for use by the PI in completing the
PCSA section of the PAR.
d. PCSA Results. PCSA results are provided to the PI at least quarterly in accordance with
the PSP.
3.7. SUPPLIERS DRIVING THE RATINGS. The PI or SPI will use functional inputs, the
CAR eTool, and the Delegation eTool to populate the table.
3.8. SPI SUPPORT PAR. The SPI must develop a Support PAR for each Major Program
effort they have been delegated to support. The PAR template is utilized for this purpose;
however, the applicable content of the reporting is as specified in the delegation from their next
higher SPI or PI.
a. SPI Drafts Support PAR in Collaboration with SPST Functional Input. The SPI
must familiarize themselves with all of the inputs provided by their SPST members in their
program’s PST Collaboration Site. The content for these inputs will be determined by the
accepted Letter of Delegation (LOD) and may include Support PARs and other types of reports
from sub-tier CMOs, eFITs, EVM information, Production Supportability Tables, CBS, and
others.
(1) SPI Discussions to Reach Consensus on Predictive Analysis. It is imperative that the
SPI have discussions with their SPST members to reach a consensus on the impact the various
inputs have on DCMA’s predictive assessments. The Support PAR or alternative report must include
summarized information derived from functional input that contributes to the rating assessment
conclusions. This summary must be written by the SPI using the SPST members’ input.
(2) SPI Use of the PAR Template. The SPI must use the PAR Template found on the
Resource Page to create the Support PAR unless an alternative format was delegated. The
Support PAR, at a minimum, will contain the content specified in the applicable LOD. If the
PAR Template is used the SPI must label all non-applicable Support PAR elements as “N/A.”
Final Support PARs must be submitted by the due date specified in the LOD.
b. SPI Initiates CMO Review Process (Optional). A CMO review process is not required.
If the CMO adds a review process, the SPI must review and consider all comments provided
by any CMO reviewers. The SPI must update the draft PAR to include those comments as
needed. The review must not interfere with the PAR submission timelines identified by the PI.
c. Report Submitted. The SPI must load the Support PAR or alternate report into the
program’s “Program Documents” tab in the program’s PST Collaboration Site.
APPENDIX 3A: CONTRACT DATA EVALUATION METRICS

3A.1. COST DATA INTEGRITY INDICATORS. CPR/IPMR Data Integrity Indicators are
metrics designed to provide confidence in the quality of the data being reviewed rather than
insight into the performance of a contract. The EVM Analyst should report any WBS
elements exhibiting one of the conditions tested by these metrics.
a. BCWSCUM > BAC. The Budgeted Cost for Work Scheduled (BCWS) is the contract
budget time-phased over the period of performance. The summation of BCWS for all reporting
periods should equal the BAC. In other words, BCWS summation for all reporting periods
(BCWSCUM) should equal BAC on the month the contract is planned to complete. Both of these
values can be found on the IPMR/CPR Format 1. Due to this relationship, the value of
BCWSCUM should never exceed BAC. Errors in the EVM data may produce this condition,
making this check necessary. Compare the value of BCWSCUM to the value of BAC; if
BCWSCUM is greater than BAC, consider this an error in the EVM data. There is no plausible
explanation. There is no issue if the value of BCWSCUM is less than or equal to BAC.
b. BCWPCUM > BAC. The Budgeted Cost for Work Performed (BCWP) is the amount of
BCWS earned by the completion of work to date. Like the BCWSCUM, the Budgeted Cost for
Work Performed, cumulative (BCWPCUM), cannot exceed the value of BAC. The contract is
considered complete when BCWPCUM equals BAC. Compare the value of BCWPCUM to BAC.
If BCWPCUM is greater, then this is an error, otherwise there is no issue.
c. ACWP with No BAC. The Actual Cost of Work Performed (ACWP) is the total dollars
spent on labor, material, subcontracts, and other direct costs in the performance of the contract
statement of work (SOW). These costs are controlled by the accounting general ledger and
should reconcile between the accounting system and EVMS. Work should only be performed if
there is a clear contractual requirement. The BAC is required to be traceable to work
requirements in the contract SOW. If work is performed and the ACWP incurred without
applicable BAC, there may be a misalignment between the work and the requirements of the
contract. To test for this condition, simply review the IPMR/CPR Format 1 data for WBS
elements containing any instance of current or cumulative ACWP but no BAC. If there are
elements that meet these criteria, the contractor should provide justification. If no justification
is provided, consider this an error.
d. Negative BAC or EAC. BAC is the total budget assigned to complete the work defined
within the contract. Likewise, EAC is the Estimate at Completion of the work. A negative total
budget is not logical. To test for this condition simply examine the IPMR/CPR Format 1 data for
a BAC or EAC less than zero. This test should be performed at the reported WBS levels as well
as the total program level. A BAC or EAC less than zero should be considered an error.
f. Negative BCWS. It is not possible to re-plan more budget than has already been
time-phased to date. Therefore, there should not be an instance of negative BCWSCUM. To test
for this condition simply examine the current and cumulative sections of the IPMR/CPR
Format 1 for BCWSCUM or BCWSCUR less than zero.
g. BCWP with No ACWP. Since work or materials must be paid for, it is not possible to
earn BCWP without incurring ACWP. This condition may occur for elements using the Level of
Effort (LOE) Earned Value Technique (EVT). In this case, it would signify the support work
that was planned to occur is not occurring due to some delay. This metric can be calculated
using the IPMR/CPR Format 1 data. Inspect the elements on the report for any instance of
current or cumulative BCWP with a corresponding current or cumulative ACWP equal to zero.
i. Incomplete Work without ETC. If work has yet to be completed, there should be a
forecast of the remaining costs to be incurred. Determine if there are any elements that are
incomplete (BCWPCUM < BAC) and contain an ETC of zero. If this condition exists, consider it
an error.
j. ACWP on Completed Work. There may be valid reasons to incur cost (ACWP)
following the completion of work (BCWPCUM = BAC). However, this should not be considered
the norm. Review the IPMR/CPR Format 1 for the following:
• BCWPCUM = BAC
• BCWPCUR = 0
• ACWPCUR ≠ 0
Keep in mind there may be costs incurred in the month the element of work is completed. That
is why it is necessary to check for BCWPCUR; this ensures the work was completed in a prior
period, and if ACWPCUR returns a value other than zero, the metric is flagged.
k. BCWP with No BCWS. Since all budgeted work performed should have been
scheduled, occurrences of BCWP without BCWS should be commensurate with early starts in
the IMS. The values do not have to be equal since actual work will rarely match the baseline
work during project execution, but the values will equal at project completion. This metric can
be calculated using the IPMR/CPR Format 1 data. Inspect the elements on the report for any
instance of current or cumulative BCWP with a corresponding current or cumulative BCWS
equal to zero.
l. ACWPCUM > EAC. The EAC consists of two components, the actual costs incurred to
date (ACWPCUM) and the estimate of future costs to be incurred or the ETC. The ACWPCUM can
only be greater than EAC if the ETC is negative or extra cost incurred/recorded due to correction
of accounting, management, or ledger errors. There may be limited cases that would require a
negative ETC. Using the IPMR/CPR Format 1, examine the elements for any condition of
ACWPCUM greater than EAC. If this condition exists, adjust your EAC forecast to accommodate
the condition and refer the issue to the EVMS Center.
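The checks in this appendix lend themselves to automation. Below is a minimal Python sketch, assuming each Format 1 row has been parsed into a record; the field names (current-period fields suffixed _cur, cumulative fields _cum) are assumptions for illustration.

```python
def cost_integrity_errors(r):
    """Apply several of the Appendix 3A cost checks to one Format 1 WBS record."""
    errors = []
    if r["bcws_cum"] > r["bac"]:
        errors.append("BCWS(cum) exceeds BAC")
    if r["bcwp_cum"] > r["bac"]:
        errors.append("BCWP(cum) exceeds BAC")
    if (r["acwp_cur"] or r["acwp_cum"]) and not r["bac"]:
        errors.append("ACWP with no BAC")
    if r["bac"] < 0 or r["eac"] < 0:
        errors.append("negative BAC or EAC")
    if r["bcws_cum"] < 0 or r["bcws_cur"] < 0:
        errors.append("negative BCWS")
    if (r["bcwp_cur"] or r["bcwp_cum"]) and not r["acwp_cum"]:
        errors.append("BCWP with no ACWP (verify LOE)")
    if (r["bcwp_cur"] or r["bcwp_cum"]) and not r["bcws_cum"]:
        errors.append("BCWP with no BCWS (verify early starts)")
    if r["bcwp_cum"] < r["bac"] and r["eac"] - r["acwp_cum"] == 0:
        errors.append("incomplete work without ETC")
    if r["bcwp_cum"] == r["bac"] and not r["bcwp_cur"] and r["acwp_cur"]:
        errors.append("ACWP on completed work (investigate)")
    if r["acwp_cum"] > r["eac"]:
        errors.append("ACWP(cum) exceeds EAC")
    return errors
```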
3A.2. SCHEDULE DATA INTEGRITY INDICATORS.

a. Logic. This metric identifies incomplete tasks with missing logic links. It helps identify
how well or poorly the schedule is linked together. Any incomplete task that is missing a
predecessor and/or a successor is included in this metric.
b. Hard Constraints. This is a count of incomplete tasks with hard constraints in use.
Using hard constraints (e.g., Must-Finish-On (MFO), Must-Start-On (MSO), Start-No-Later-
Than (SNLT), and Finish-No-Later-Than (FNLT)) may prevent tasks from moving with their
dependencies and, therefore, prevent the schedule from being logic-driven. Soft constraints such
as As-Soon-As-Possible (ASAP), Start-No-Earlier-Than (SNET), and Finish-No-Earlier-Than
(FNET) enable the schedule to be logic-driven.
c. Invalid Dates. This area of analysis includes planned tasks that have a forecast
start/finish date prior to the IMS status date, completed tasks that have actual start/finish dates
beyond the IMS status date, incorrectly statused finish dates when a task is not complete, and
tasks that have riding start dates. There should not be any invalid dates in the schedule.
d. Critical Path Test. The purpose is to test the integrity of the overall network logic and,
in particular, the critical path. If the contract completion date (or other milestone) is not delayed
in proportion (assuming zero float) to the amount of intentional slip that is introduced into the
schedule as part of this test, then there is broken logic somewhere in the network. Broken logic
is the result of missing predecessors and/or successors on tasks where they are needed. The IMS
passes the Critical Path Test if the project completion date (or other task/milestone) shows a
negative total float number or a revised Early Finish date that is in proportion (assuming zero
float) to the amount of intentional slip applied.
e. Milestones with Duration. Includes milestones that are planned or in-progress whose
duration is greater than zero. Per the Earned Value Management System Interpretation Guide
(EVMSIG), milestone tasks should not have a duration.
f. Missing WBS. Activities without WBS values indicate poor planning and cause problems
in reporting information about that task. This metric includes only normal activities and
milestones that are planned, in-progress, or complete.
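A sketch of how several of these schedule health checks might be screened automatically, assuming IMS tasks have been exported to records; the field names and constraint labels mirror the metrics above but are assumptions, not a scheduling tool's actual schema.

```python
HARD_CONSTRAINTS = {"MFO", "MSO", "SNLT", "FNLT"}   # per paragraph b above

def schedule_health(tasks, status_date):
    """Screen IMS task records for the integrity conditions described above."""
    findings = []
    for t in tasks:
        incomplete = t["percent_complete"] < 100
        if incomplete and (not t["predecessors"] or not t["successors"]):
            findings.append((t["id"], "missing logic"))
        if incomplete and t.get("constraint") in HARD_CONSTRAINTS:
            findings.append((t["id"], "hard constraint"))
        if incomplete and t["forecast_start"] < status_date:
            findings.append((t["id"], "invalid (stale) forecast date"))
        if not incomplete and t["actual_finish"] > status_date:
            findings.append((t["id"], "actual finish beyond status date"))
        if t["is_milestone"] and t["duration"] > 0:
            findings.append((t["id"], "milestone with duration"))
        if not t.get("wbs"):
            findings.append((t["id"], "missing WBS"))
    return findings
```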
APPENDIX 3B: DCMA COST AND SCHEDULE ESTIMATES AT COMPLETION

3B.1. COST.
a. EAC. The EAC is the projection of the final cost of the contract/program. The process
starts with how much has already been expended (ACWP). It follows with how much work
remains, Budgeted Cost of Work Remaining (BCWR), and how much is needed to finish the
work as related to the amount of expenditures or costs incurred or recorded (“Actuals”). Thus
the EAC is made up of “actuals” and the Estimate (of the cost of the work left) To Complete
(ETC), where the ETC is the remaining work divided by a performance factor (PF):

EAC = ACWPCUM + ETC, where ETC = BCWR / PF

PF could be the Cost Performance Index (CPI), the Schedule Performance Index (SPI), CPI x
SPI, or a weighted combination x(CPI) + (1 - x)(SPI) with 0 < x < 1.0, as determined with
elaborated rationale (Table 2). Completion of an OTB/OTS may impact the utility of any
performance factor, and the impacts of the changes should be understood as part of the EAC
development.
The WBS level EACDCMA considers the remaining work, risks and opportunities at the WBS
reporting level, and risk adjustments that can be mapped to a specific WBS element.
(a) Performance Factor. Select the Efficiency Factor that best describes the contract
and WBS element being evaluated using Table 2. On large programs, not all contracts or even
WBS elements will utilize the same performance factor method to accurately estimate its
completion value. Although the method may differ by contract or WBS element, the methods
should remain consistent from reporting period to reporting period.
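A sketch of the EAC formula with the selectable performance factors named above; the weight x for the combined factor is illustrative, and the appropriate method for a given contract or WBS element comes from Table 2.

```python
def eac_with_pf(bac, bcwp_cum, acwp_cum, bcws_cum, method="cpi", x=0.5):
    """EAC = ACWP(cum) + BCWR / PF, with the PF options named above."""
    cpi = bcwp_cum / acwp_cum             # Cost Performance Index
    spi = bcwp_cum / bcws_cum             # Schedule Performance Index
    pf = {"cpi": cpi,
          "spi": spi,
          "cpi*spi": cpi * spi,
          "weighted": x * cpi + (1 - x) * spi}[method]
    bcwr = bac - bcwp_cum                 # Budgeted Cost of Work Remaining
    return acwp_cum + bcwr / pf
```

Whatever method is selected should be held constant between reporting periods, so period-over-period EAC movement reflects performance rather than methodology changes.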
(b) Risk Adjustments. The CPR/IPMR provides the contractors’ most likely EAC
(EACKtr) that accounts for some program/contract risk factors. It is important to review the
program/contract risk registry and determine if the risks included by the contractor in their most
likely EAC are reasonable. These risks may present a consequence in terms of either cost or
schedule. Technical risks may impact schedule and cost performances and initiate cost and
schedule risks. Risks, contingencies, and mitigation plans should be included in the schedule
and thereby in the cost system. Risks and issues not accounted for by the contractor or
adequately addressed through the performance factor should be considered. Participation in risk
management meetings between the contractor and the program office will facilitate this
understanding.
(a) WBS Level Rollup. To determine the EACDCMA for the contract, the WBS level
EACs must be summed to the contract level. The rollup is performed at the lower level first to
prevent skewing the data by averaging out the performance of individual WBS elements, which
occurs when the performance factor determination is conducted only at the contract level.
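For example (illustrative figures): element A has a BAC of $100K with BCWPCUM of $50K and ACWPCUM of $100K (CPI 0.50, EAC $200K); element B has a BAC of $1,000K with BCWPCUM of $900K and ACWPCUM of $900K (CPI 1.00, EAC $1,000K). Summing the WBS-level EACs yields $1,200K, while applying the contract-level CPI of 0.95 to the total BAC of $1,100K yields only about $1,158K; the contract-level calculation averages away element A's poor cost performance.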
(b) Risks and Opportunity Adjustments. Any PST-provided or known risk and
opportunity adjustments that were not included at the WBS level will be applied at the
contract level to adjust the EACDCMA.
1. The contractor is required to track debits and credits to MR over time. These
changes should be reflected in the IPMR Format 5. It is important to account for all the MR
debits and credits when calculating this metric. It is not simply the current value of MR divided
by the original value of MR. In fact, if there are significant credits to MR since program
inception, the current MR value might actually be greater than the original value, even if there
was a debit of some MR.
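One plausible formulation that honors this debit/credit guidance (an assumption for illustration; the Agency tool may define the metric differently):

```python
def mr_consumption_pct(original_mr, credits, debits):
    """MR consumed as a share of all MR made available (original plus
    credits); an assumed formulation per the guidance above."""
    available = original_mr + sum(credits)
    return 100.0 * sum(debits) / available if available else 0.0

# With original MR of 10.0, credits of [2.0], and debits of [4.0, 1.5],
# consumption is about 45.8 percent, even though current MR (6.5) is only
# 35 percent below the original value.
```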
(d) Realism. Calculate the EAC Realism of the contract level EAC for both the
DCMA’s and the contractor’s EAC. At the contract level, DCMA uses TAB instead of BAC in
the TCPI formula since historical trends show that most programs consume all the MR by the
end of the contract. The EAC realism value for both the Contractor and DCMA will be reported
to the PI as part of the EVM Analyst input.
b. VAC. VACDCMA is calculated at the contract level and represents the difference between
the TAB or Contract Budget Base and the estimated final cost. VAC is also measured in
percentage (%) and known as VAC percentage (VAC%). Why does DCMA use TAB instead of
BAC? At the contract level, as MR is used, it becomes part of the PMB, increasing the BAC.
Since most major programs historically use all their MR, the BAC at the completion of the
contract would equal the TAB. Given this, to be predictive, DCMA uses TAB to begin with to
reduce VAC fluctuations caused from applying MR. If a VAC is being calculated at the WBS
level, then BAC is still used as there are no WBS level TAB values.
VACDCMA% = (VACDCMA / TAB) x 100%

where VACDCMA = TAB - EACDCMA
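A short sketch of the contract-level VAC computation and the TAB-based TCPI realism check described above; function and parameter names are illustrative.

```python
def vac_dcma(tab, eac_dcma):
    """Contract-level VAC(DCMA) and VAC%, computed against TAB."""
    vac = tab - eac_dcma
    return vac, 100.0 * vac / tab

def tcpi_eac_tab(tab, bcwp_cum, eac, acwp_cum):
    """Contract-level TCPI for the realism check; TAB replaces BAC since
    programs historically consume all MR by contract end."""
    return (tab - bcwp_cum) / (eac - acwp_cum)
```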
3B.2. SCHEDULE.
a. Status. A variety of variables can be used to illustrate the current status of the schedule
(e.g., Status Date, ECD, Percent Complete, Activity Counts, and Remaining Duration).
b. Baseline Execution Index. The BEI metric is an IMS-based metric that calculates the
efficiency with which tasks have been accomplished when measured against the baseline tasks at
a Status Date. BEI compares the cumulative number of tasks completed to the cumulative
number of tasks with a baseline finish date on or before the status date of the reporting period.
BEI does not provide insight into tasks completed early or late (before or after the baseline finish
date), as long as the task was completed prior to the status date of the reporting period. Missed
Task metrics provide further insight into on-time performance.
(1) If the contractor completes more tasks than planned, then the BEI will be higher than
1.00, reflecting a higher task throughput than planned. A BEI less than 0.95 should be
considered a flag and requires additional investigation. The PST needs to investigate areas of
interest, including but not limited to:
(2) A consistently downward trending BEI and increasing missed task percentage can be
associated with variance trends. If these trends continue over the long run, the schedule may
become unreliable.
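A minimal Python sketch of the BEI and missed-task computations (missed tasks are discussed in paragraph c below) from IMS task records; the field names are illustrative assumptions.

```python
from datetime import date

def bei_and_missed(tasks, status_date):
    """BEI = cumulative tasks completed / cumulative tasks baselined to
    finish on or before the status date. Missed tasks are incomplete tasks
    baselined to finish before the status date."""
    baselined = [t for t in tasks if t["baseline_finish"] <= status_date]
    completed = sum(1 for t in tasks
                    if t.get("actual_finish") and t["actual_finish"] <= status_date)
    missed = [t["id"] for t in baselined if not t.get("actual_finish")]
    bei = completed / len(baselined) if baselined else 1.0
    missed_pct = 100.0 * len(missed) / len(baselined) if baselined else 0.0
    return bei, missed_pct, missed

tasks = [
    {"id": "T1", "baseline_finish": date(2017, 9, 1), "actual_finish": date(2017, 9, 5)},
    {"id": "T2", "baseline_finish": date(2017, 9, 15), "actual_finish": None},
]
print(bei_and_missed(tasks, date(2017, 9, 30)))   # -> (0.5, 50.0, ['T2'])
```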
c. Missed Tasks. Missed tasks are incomplete tasks with a baseline finish date that is before
the status date. A consistently high percentage of missed tasks requires examination of the
nature of the late completion dates. Use the following guidelines to make a proper determination:
(1) A high percentage of missed tasks could result from a series of tasks that are, on
average, only a few days late or, conversely, months late. This is a clue when developing an independent
forecast date. For example, if the contractor is 76 days late on average in completing tasks, then
the analyst can use this information as a partial basis for forecasting that the contractor will be 76
days late to program completion.
(2) If the missed tasks are always non-critical but the critical path tasks are consistently
completed on time, the high missed task percentage may not be a major concern.
(3) If the missed tasks are always on the critical path, then the low hit task percentage is
a major concern and should be tracked as a risk.
(4) A consistently high missed task percentage and a consistently good BEI could mean
the tasks are only slightly late on average and the actual finishes for those late tasks may never
impact the BEI beyond the current period.
d. Critical Path and ECDDCMA. The ECDDCMA provides predictive insight into project
completion. The analyst must ensure that the ECDDCMA is independent, is based on in-depth
critical path analysis, evaluates past performance, leverages multifunctional surveillance
results, and uses available metrics and tools to provide measurable data. In order to perform
this analysis, an analyst first identifies the critical path.
(1) Critical Path. The program critical path is the sequence of discrete tasks/activities in
the network that has the longest total duration through the contract. Discrete tasks/activities
along the critical path have the least amount of float/slack. Be wary of a contractor methodology
that states the critical path comprises all the tasks with zero or less total float. This is not
the same as the longest total duration with the least amount of total float (i.e., a single number,
not a range of numerical values). DCMA will utilize the contractor-submitted data to represent
the contractor’s values but, based on the PST Member inputs, may have to predict DCMA
forecasted completion dates that differ from the contractor’s, which may result in a different critical path.
(2) Event and Milestone ECD. The ECD is measured at each of the program milestones,
contractually required events, and the contract finish date. As a contract can have a large number
of milestones and events, it is recommended for reporting purposes to focus on those that exist
either on the critical path or are of significant interest to the customer. The contractor’s work
schedule determines how many days equals a month. For example, a contractor that does not
work weekends has approximately 22 work days in a calendar month. Measurement can also be
done using calendar days. The net effect of all the task slippages on the critical path and
therefore the contract will indicate by how much the ECD has deviated from the Baseline.
(a) Baseline Finish Date. The baseline finish date is the scheduled completion date
found in the “Baseline Finish” field of each task or milestone in the IMS.
(b) ECDKtr. The ECD is defined as the date found in the “Forecast Finish” field of a
properly networked IMS (which is typically the same as the date found in the “Early Finish”
field) of each milestone or task.
(c) ECDDCMA. The EVM Analyst will report to the PI the most recently
completed milestone or contractually required event, the next milestone or event, the
contract finish date, and any other tasks, milestones, or events that illustrate DCMA’s projection
of the contractor’s progress against the schedule.
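A small sketch of the work-day-to-month conversion described above; the 22-day month is the approximation given for a contractor that does not work weekends, and the 76-day figure reuses the missed-task example from paragraph c.

```python
WORK_DAYS_PER_MONTH = 22   # assumed 5-day work week; use the contractor's calendar

def slip_in_months(slip_work_days):
    """Convert critical-path slip measured in work days to months."""
    return slip_work_days / WORK_DAYS_PER_MONTH

print(round(slip_in_months(76), 1))   # -> 3.5 months for a 76-work-day slip
```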
SECTION 4: PROGRAM REPORTING

4.1. QUARTERLY PAR.

a. PI Drafts PAR. DCMA PARs provide our acquisition partners with a comprehensive
and unbiased assessment of the health of the program. PIs must adhere to the contents of the
Program Support Analysis and Reporting User Guide found on the Resource Page in the creation
of their PAR. PARs must be submitted quarterly according to the schedule for their
assigned grouping (i.e., A, B, or C) in Table 3. Program group assignments can be found in the
IWMS PAR tool. The due date for PAR approval is NLT the 6th business day of the month for
the group to which they are assigned. It is imperative that all parties responsible for PAR
development and approval comply with the PAR Group Timeline (Figure 2) to ensure comments
are received prior to the PAR approval submission as well as for timely PAR distribution for
OSD review.
(1) PI Reviews Functional and SPI Inputs. PIs must familiarize themselves with all of
the inputs provided by their PST members and SPIs in the program’s PST Collaboration Site.
Inputs may include Support PARs, reports from sub-tier CMOs, eFITs, EVM analysis, and
Production Supportability Tables.
(2) PI Collaborates with PST Functional Members and SPIs. It is imperative that the PI
have discussions with their PST functional members, SPIs, and applicable ACOs, DACOs, or
CACOs. This serves to better appreciate the inputs provided and to reach consensus on the
impact the various inputs have on DCMA’s predictive assessments pertaining to program cost
and schedule.
(3) PI Drafts PAR. To ensure the pre-populated PAR information is correct, the
PI must verify the Program Information is current. The CPA, PA, and MA synopses provide
senior leaders the bottom line up front (BLUF) by summarizing the primary drivers to the
aggregate assessment. They should be written at a strategic level and provide a summary
overview of DCMA’s perspective on program performance. Do not include any information that
is not mentioned in the assessment narratives. State the impacts, issues, and risks driving the
rating. Quantify contract and program impacts. Do not include the Assessment Color, rating
period, or program name in the synopsis.
(a) PAR Section 1. The PI must ensure that the PAR Section 1 data elements are
completed as follows.
3. DCMA MA. The MA provides a CBS assessment for the prime contractor(s).
The contractor must establish and maintain an acceptable CBS for contracts subject to the Cost
Accounting Standards and containing one or more of the six business system clauses. Refer to
Appendix 4D for more information.
4. Aggregate Rollup. Complete the Aggregate Ratings for the reporting month.
When multiple contracts exist, the Lead CMO or cognizant CMO will determine the aggregate
rollup method. Refer to the appropriate section of the Program Support Analysis and Reporting
User Guide located on the Resource Page.
(b) PAR Section 2. Supplemental Analysis for CPA, PA, and MA in PAR Section 1.
Complete this section of the PAR using the criteria found in Appendices 4B and 4C.
1. Supplemental Analysis for CPA, PA, and MA in PAR Section 1. Describe the
aggregate methodology used and any extenuating circumstances affecting the aggregate rating.
2. CPA Supplemental Analysis. Narrative that reinforces the drivers of the CPA
rating and supplements PAR Section 1. Include the following or “no additional information” if
these do not apply:
a. For contracts, CLINs, and DOs discussed in PAR Section 1.1 with ratings
driven by cost:
• If there has been an OTB, assess the VACDCMA% against the Contract
Budget Base to maintain visibility
b. Watch items that were not discussed in PAR Section 1.1 (CPA Narrative)
and do not currently impact the CPA rating; identify them as watch items.
a. Watch items that were not discussed in PAR Section 1.2 (PA Narrative)
and do not currently impact the PA rating; identify them as watch items.
(c) PAR Sections 3 and 4. The PI must ensure that the PAR Sections 3 and 4 data
elements are completed as follows.
1. Program Office Requested Data and Specific Reporting. Include any other
information that has been requested by the Program Office. This includes any Foreign Military
Sales (FMS) contract information that the Program Office has requested. If there is no requested
data or specific reporting, annotate “Not Applicable.”
b. PAR Review and Approval. After the draft PAR has been submitted for review in the
Integrated Workload Management System (IWMS), all applicable DCMA organizations (e.g.,
CMO Leadership, Operational Units, and Headquarters) have the opportunity to review the draft
PAR. Review comments must be inserted in the IWMS PAR comment feature within the
following 3 business days. The intent of using the IWMS comments section is to have a
common area to improve PAR quality before finalization. Visibility and documentation of these
comments and associated changes provide traceability and insight into possible training
shortfalls and gaps for future mitigation through manuals and other venues. CMO
personnel must manage PAR improvement efforts through IWMS. Edits cannot be made after
PAR approval. The CMO Commander or Director is ultimately responsible and is the approval
authority for PAR content. The PAR approval due date for these quarterly submissions is NLT
the 6th business day of the month for the program group to which they are assigned.
(1) PI Initiates the Agency Review Process. The PI must initiate the Agency PAR
review process per the appropriate section of the Program Support Analysis and Reporting User
Guide located on the Resource Page in accordance with the timeline in Figure 2.
(2) Program Management Office (PMO) Draft PAR Review. The PI must send a draft
copy of the quarterly PAR to the PMO for review and comment by the last business day of the
month. The PI must wait at least 3 business days from PMO submission to allow for PMO
feedback prior to PAR submission for approval. The PMO has the opportunity to comment prior
to the PI making the assessments available to DCMA management and possible dissemination
outside the Agency. The PI must inform the PMO that information contained within the
documents may need to be input into DAMIR if the program is subject to DAES reporting. If,
after the 3-day window, the PMO has not provided any comments, the PI may consider this
concurrence.
(3) PI Reviews Comments. The PI must review all comments for potential inclusion into
the PAR.
(4) PI Updates PAR. Based on the comments received from all reviewers, the PI updates
the draft PAR. The PI must not make any updates to the draft PAR based on PMO comments
which question our assumptions, conclusions, or independent assessment of the program. The PI
must, however, make updates to the draft PAR to correct any misstated facts.
(5) CMO Commander or Director Reviews PAR. The CMO Commander or Director or
designee must review the entire draft PAR submitted for approval. Returned PARs must have
comments explaining the reasons. Minimum elements to consider during review:
(c) Ensure the issue or risk, root cause, mitigation strategy, contract or program
impact, and DCMA analysis are clearly stated.
(6) CMO Commander or Director Approves PAR. The CMO Commander or Director or
designee must approve all PARs ready for release. There is no method for editing a PAR after
approval, as it becomes an official document. Further correction can only be accomplished by
cancelling the PAR and republishing a new one, or through PNs.
(7) PI Utilizes Customer Feedback. The PAR contains a survey link for customer
feedback. When a customer completes the survey, the feedback is provided to the CMO via
DCMA Headquarters. The CMOs should consider this feedback as an opportunity to improve
future reporting.
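The review cycle above is driven by business-day arithmetic (a 3-business-day comment window
and PAR approval NLT the 6th business day of the month). A minimal sketch of that arithmetic,
assuming a simple Monday through Friday workweek and ignoring federal holidays, which the
actual Agency calendar would exclude:

```python
from datetime import date, timedelta

def nth_business_day(year: int, month: int, n: int) -> date:
    """Return the n-th business day (Monday-Friday) of a month.
    Illustrative only: federal holidays are not excluded."""
    d = date(year, month, 1)
    count = 0
    while True:
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            count += 1
            if count == n:
                return d
        d += timedelta(days=1)

# Example: PAR approval is due NLT the 6th business day of the report month.
print(nth_business_day(2017, 11, 6))  # 2017-11-08
```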
(1) Operational Unit Identifies PARs to be Reviewed. The applicable Operational Unit
representative must develop a plan for evaluating the quality of the PARs under their cognizance.
(2) Operational Unit Reviews PAR Using Scoring Rubric and Rework Metric. The
Operational Unit reviews their PARs using the Scoring Rubric and Rework Metric according to
their plan.
(3) Operational Unit Provides PAR Score, Feedback, and Recommendations to the PI, PI
FLS, and CMO Commander or Director. The Operational Unit must provide the results of their
review to the CMO Commander or Director, the PI FLS, and the PI at a minimum. The results
of the review should include the completed PAR Rubric coupled with any recommendations for
improvement that should be incorporated into the next PAR.
(4) Operational Unit and CMO Jointly Develop and Implement Get Well Plan. If the
result of an Operational Unit PAR quality review is a score of less than 85 percent, a get well
plan must be jointly developed between the Operational Unit representative and representatives
from the CMO. The purpose of the get well plan is to develop an action-oriented approach that
imparts the requisite knowledge and skills so that subsequent PARs are of sufficient quality.
The plan must, at a minimum, include the steps to be taken to improve future PAR quality,
coupled with estimated dates by which each of these steps will be completed. The plan will then
be implemented by those dates.
(5) Operational Unit and CMO Utilize Scoring Rubric and Rework Metric. If the initial
PAR quality review resulted in a score of less than 85 percent, a follow-up review of a
subsequent PAR for that program must be completed by the applicable Operational Unit
representative using the complete Scoring Rubric and Rework Metric. The results of the follow-
up review must be distributed to the same CMO personnel as the original results.
(6) Operational Unit Loads PAR Scores. The Operational Units check out the PAR
Scoring Rubric spreadsheet from the DAES Strategic Metrics (PAR Rubric) library of the MPS
DAES Site and update the spreadsheet tabs corresponding to the programs being reviewed by
the last business day of the report month. A link to the MPS DAES Site is located on the
Resource Page.
(7) PM&BI Provides Agency Metrics to DCMA Director. The PM&BI Executive
Directorate must provide the DCMA Director with metrics that synopsize the results of the PAR
quality evaluations performed throughout the period.
e. PAR Blackout. The quarterly PAR submitted may contain source selection sensitive
information. Caution must be exercised for programs with ongoing source selections so that
competition sensitive information is not released. Source selection information, as defined in
Federal Acquisition Regulation 2.101 and 3.104-4, must be excluded from DAES Assessments.
Other information which could jeopardize the competitive nature of a successful source selection
must also be excluded. Only DCMA personnel with a need to know will have access to a PAR
for a source selection sensitive program. PIs should be aware of all circumstances where a
program is subject to source selection restrictions. Information pertaining to the establishment
of program Blackout is contained in DCMA-MAN 3101-01, "Program Support Life Cycle."
(1) The CMO has determined that the information in the previous PAR or PN has
changed in a significant way, not only when a rating changes.
(2) The CMO identifies an issue or risk that changes the CPA, PA, and/or MA ratings
unless those changes can be incorporated into the next PAR in a near-real time basis.
(3) There is any significant evolution to an event described in the immediately preceding
PAR or PN.
(4) A program related event has been submitted as an Agency Weekly Activity Report
(WAR) entry.
(1) PI/SPI Forwards the Completed PN for Review. The PI must submit the completed
PN to the Lead CMO Commander or Director, or their designee, for review. SPIs may draft PNs
for the PI but must upload them to the Program Notification tab in the program’s PST
Collaboration Site. The PI will review the SPI PN for content and the need to integrate it with
other draft PNs, and will determine whether the PN will be released. The PI then coordinates
any updates prior to submitting the PN to the Lead CMO for review.
(2) Lead CMO Commander or Director Review of PN. PNs must be reviewed by the
Lead CMO Commander or Director prior to release to ensure PN adequacy. When the Lead
CMO Commander or Director completes their review, they may approve, disapprove with
comments, or reject the PN. SPI PNs do not require formal approval by their CMO Commander,
since only approved PNs are releasable outside of DCMA.
(3) Lead CMO Commander or Director Communicates Changes to PI. If the Lead CMO
Commander or Director is either returning the PN back to the PI for changes or is deleting the
PN, specific comments should be made to the PI clearly describing the rationale for either
returning or deleting the record. If the PN is returned with comments, the PI must address those
comments and resubmit the PN for review and approval.
(1) PN Attachments. Do not include attachments unless they are absolutely necessary to
tell the story.
(2) PI Converts PN to Portable Document Format (PDF). Prior to approval, the PI must
convert the PN to a PDF.
(3) Lead CMO Commander or Director Approves PN. The Lead CMO Commander or
Director, or their designee, must digitally sign the PN when it is ready for release. The Lead
CMO Commander or Director will return the PN to the PI.
(4) Posting the PN. PNs will be loaded into the program’s PST Collaboration Site.
f. PN Distribution. PNs must be distributed to our external acquisition partners utilizing the
same means as PAR distribution.
a. Summary. Assessments for all DAES reportable programs are submitted to DAMIR on a
quarterly basis according to the Group Assignment. Each program has a CPA, PA, and MA
color rating, synopsis, and narrative assessment entered into DAMIR by the responsible CMO.
DAMIR access must be requested using DCMAF 3101-02-01, "DAMIR Access Request,"
following the guidelines outlined in the appropriate section of the Program Support Analysis and
Reporting User Guide. See the Resource Page for the form as well as the user guide. The DAES
assessments uploaded into DAMIR are derived entirely from the ratings, synopses, and Section 1
narratives of the PAR. All DAES reviews and comments must occur during the PAR review
process as described in paragraph 4.1.
b. PI Loads PAR Section 1 Ratings, Synopses, and Assessments into DAMIR. NLT the
7th working day of the month, the DAMIR Action Officer must create and enter a color rating,
synopsis, and narrative for each assessment area (CPA, PA, MA) for assigned programs.
a. Readability. The narratives must have a consistent flow (logic) and be clear, concise, and
primarily use active voice. Write PARs for Senior Leadership Review and always start with the
BLUF. Narratives must be written to convey the most significant rating driver first. Bullets or
subparagraphs may be useful for outlining multiple issues or events. Titles for paragraphs may
be useful in identifying main issues.
(1) Overall Formatting Guidelines. The PI must use the following formatting guidelines
throughout the PAR.
(b) Spell out all acronyms the first time used in each assessment area (CPA, PA,
MA), with the exception of assessment Synopses and acronyms identified in the “PAR –
Common Acronym List” document. See the DCMA-INST 3101 Resource Page.
(2) Dollar Rounding Conventions. Use the following dollar rounding conventions in
the PAR; a sketch of these conventions follows paragraph (3) below. The exception is within
tables, where a consistent dollar convention should be used (e.g., thousands or millions).
(a) Do Not Use Commas. Use the next higher convention (i.e., use $1.32B instead of
$1,322.12M).
(b) Use At Least One Non-Zero Digit Prior to Decimal. Have at least one non-zero
digit prior to the decimal (i.e., $2.32M instead of $0.002B). Use up to two decimal places.
(3) Synopses. The CPA, PA, and MA synopses provide senior leaders the BLUF by
summarizing the primary drivers to the aggregate assessment. They should be written at a
strategic level and provide a summary overview of DCMA’s perspective on program
performance. Synopses are limited to 350 characters, including spaces. Do not include any
information that is not mentioned in the assessment narratives. State the impacts, issues, and
risks driving the rating. Quantify contract and program impacts. Do not include the Assessment
Color, rating period, or program name in the synopsis.
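A minimal sketch of the dollar rounding conventions in paragraph (2) and the synopsis length
limit in paragraph (3); the function names and the thousands breakpoint are illustrative
assumptions, not Agency tooling:

```python
SYNOPSIS_LIMIT = 350  # characters, including spaces

def format_dollars(value: float) -> str:
    """Format dollars per the PAR conventions: no commas, at least one
    non-zero digit before the decimal, up to two decimal places."""
    for suffix, scale in (("B", 1e9), ("M", 1e6), ("K", 1e3), ("", 1.0)):
        scaled = value / scale
        if abs(scaled) >= 1:  # keeps a non-zero digit before the decimal
            return f"${scaled:.2f}{suffix}"
    return f"${value:.2f}"

def synopsis_fits(text: str) -> bool:
    """Check the 350-character synopsis limit (spaces count)."""
    return len(text) <= SYNOPSIS_LIMIT

print(format_dollars(1_322_120_000))  # $1.32B, not $1,322.12M
print(format_dollars(2_320_000))      # $2.32M, not $0.002B
print(synopsis_fits("Cost growth on the prime contract drives the rating."))  # True
```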
b. Content Criteria. Issue and Risk Assessments must contain the following information as
a minimum:
(1) Issue or Risk Description. Describe the issue or risk. When addressing issues or
risks that contribute to and/or drive the CPA, PA, and MA color criteria, include the
nomenclature of the material (e.g., name, description), the Prime Contract number(s), and the
Subcontractor name, if applicable. When discussing color changes in the ratings from the
previous PAR, an example format to use is “The (color) rating is (the same as, worse than, better
than) the last quarter’s PAR rating due to (risk or issue).”
(2) Root Cause. Document the root cause or source that resulted in the issue occurring;
include DCMA’s assessment of the root cause.
(3) Impact in Dollars and Days. Using DCMA predictive analysis, describe issue and
risk impacts, quantified in dollars and days. These dollar and day impacts, discovered from
functional surveillance, are critical for forecasting cost overruns, schedule slips, and executive
level decision making for programs and EVM reporting contracts.
(4) Contractor’s Mitigation Plan. Identify the contractor’s mitigation plan including any
corrective actions taken to resolve the issue or mitigate risk. Include the DCMA independent
assessment of the adequacy of contractor actions and the likely outcome.
(5) Path Forward. Document possible courses of action that DCMA or the PMO can take
to resolve the issue or mitigate the risk.
(7) Narrative References. Narratives in PAR Section 1 must not reference other sections
of the PAR.
4B.1. CPA Color. The minimum rating criteria, used for each contract/CLIN/DO/TI, are
provided in Table 4, Cost Assessment Color Criteria, and Table 5, DCMA Schedule Slip
(Months) Color Criteria (CPA).
b. For Disapproved EVMS or EVMS Level III or IV CAR. For an EVMS that is
disapproved or that has a transmitted EVMS Level III or IV CAR:
(1) Lead CMO Requests Impact Statement. The Lead CMO requests an impact statement
from the EVMS Center, which may be as brief as a statement that the disapproved EVMS does
not impact the CPA (e.g., no EVM reporting contracts on the program).
(2) Aggregate Assessment Color Default. The aggregate assessment color for CPA must
not be Green when the EVMS Center impact statement has indicated that the cost or schedule
EVM data used to rate the CPA is not reliable.
4B.2. CPA Narrative. The CPA narrative provides data, information, and analysis focused on
supporting the Assessment Color. The narrative is limited to 3,850 characters. Begin the
assessment with the Bottom Line; this could be a copy of the synopsis or similar expanded
statement. Address changes in Assessment Color from the previous PAR. When contract issues
or risks are discussed that would result in a worse rating than the Aggregate Assessment Rating,
then the Aggregate Rating must be briefly explained.
c. Maintain a consistent flow of information from the current period's PA and MA. For
example, if schedule delays drive the PA rating, or if a business system in the MA is disapproved
(e.g., EVMS), then incorporate or address the issues and risks from the PA and MA in the CPA.
d. For contracts requiring EVM reporting, for contracts, CLINs, DOs, and TIs where the
aggregate rating is Yellow or Red, include, at a minimum:
(2) A discussion of VAC_DCMA drivers; risks to the remaining effort and the availability
of MR to offset those risks; methodology; and observed trends indicating changes in future
performance
(3) EAC_Ktr, if there are significant differences compared with EAC_DCMA (i.e., more
than 5 percent); include the values and an explanation of the difference
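A minimal sketch of the 5 percent screen in item (3); treating EAC_DCMA as the comparison
base is an assumption for illustration:

```python
def eac_delta_significant(eac_ktr: float, eac_dcma: float,
                          threshold: float = 0.05) -> bool:
    """Flag contractor vs. DCMA EAC differences greater than 5 percent."""
    return abs(eac_ktr - eac_dcma) / eac_dcma > threshold

print(eac_delta_significant(eac_ktr=108e6, eac_dcma=100e6))  # True (8 percent)
```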
e. For contracts not requiring EVM reporting, for contracts, CLINs, DOs, and TIs where the
aggregate rating is Yellow or Red, include, at a minimum:
(a) Issues and risks that could jeopardize the contractor’s ability to meet contractual
requirements (e.g., staffing levels, labor rates, achievement of milestones, and technical goals)
(2) Disapproved EVMS or transmitted EVMS Level III or IV CARs that do not impact
the CPA rating.
4C.1. PA Color. Tables 6, 7, and 8 provide the minimum rating criteria for each EMD or
production contract, CLIN, DO, or TI listed in Table 1.0, Contract Aggregate Assessment, with
the ability to downgrade (e.g., Yellow to Red) based on further analysis. Consider the relevant
items in Table 7 that influence contractually required events, deliveries, and production
readiness. Expand on the root cause, impact, and how the issue affects contractually required
events. The Production Assessment incorporates current issues and risks, even when that risk is
projected to impact the program in the future. GFE or GFM that impacts the contract will be
incorporated into the PA ratings. FMS contracts are not included in the Aggregate Table, nor
incorporated into the aggregate PA ratings. Sustainment contracts and contracts that are fully
shipped or have all DD Form 250s processed may be excluded, if not applicable to the aggregate
rating.
4C.2. PA Narrative. The PA narrative provides data, information, and analysis focused on
supporting the Assessment Color. For aggregate assessments that are rated Green, provide
justification to support the assessment. If identifying issues and risks in a Green assessment,
consider downgrading to Yellow to raise awareness of the issues or risks. The narrative is
limited to 3,850 characters. Begin the assessment with the Bottom Line; this could be a copy of
the synopsis or similar expanded statement. Address changes in Assessment Color from the
previous PAR. When contract issues or risks are discussed that would result in a worse rating
than the Aggregate Assessment Rating, then the Aggregate Rating must be briefly explained.
(8) When corrective action or risk mitigation is projected to resolve an issue or risk in a
future quarter, identify this predicted rating improvement in the corresponding assessments in
Sections 1 and 2 of the PAR; however, do not incorporate it into the current period assessment
rating until the issue or risk has actually been resolved.
(9) Impacts of Transmitted Level III and IV CARs. Include in the narrative the actual or
proposed date of submission for the CAP and DCMA's assessment of the contractor's status
toward closing the CAR.
b. Maintain a consistent flow of information from the current period's CPA and MA.
4D.1. MA Color. The minimum rating criteria are provided in Table 9, Management
Assessment Criteria, to determine the Assessment Color. If an EVMS is disapproved or has a
transmitted EVMS Level III or IV CAR, request an impact statement from the EVMS Center to
determine impact to CPA rating and narrative.
GREEN: All six CBSs are Approved, Not Evaluated, or Not Applicable, and there
are no transmitted or draft Level III/IV CARs against a CBS.
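A minimal sketch of this Green criterion as a check; the status strings and system names are
illustrative assumptions:

```python
ACCEPTABLE = {"Approved", "Not Evaluated", "Not Applicable"}

def ma_green(cbs_statuses: dict, l3_l4_cars_against_cbs: int) -> bool:
    """Green when all six CBS statuses are acceptable and no transmitted
    or draft Level III/IV CARs exist against a CBS."""
    return (all(s in ACCEPTABLE for s in cbs_statuses.values())
            and l3_l4_cars_against_cbs == 0)

statuses = {"Accounting": "Approved", "EVMS": "Approved",
            "Estimating": "Not Evaluated", "MMAS": "Not Applicable",
            "Purchasing": "Approved", "Property": "Approved"}
print(ma_green(statuses, l3_l4_cars_against_cbs=0))  # True
```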
4D.2. MA Narrative. The MA narrative provides data, information, and analysis focused on
supporting the assessment. The narrative is limited to 2,500 characters. Begin the assessment
with the Bottom Line; this could be a copy of the synopsis or similar expanded statement.
Address changes in Assessment Color from the previous PAR. Non-Evaluated or Disapproved
CBS ratings must have the following in the MA Narrative:
(2) Address changes in Assessment Color from previous quarter in the first paragraph
(4) Disapproved Systems must include the drivers for disapproval (e.g., for EVMS,
include the guideline numbers and titles that resulted in the disapproval; for Material
Management and Accounting System, include the standard)
(5) Impact of transmitted Level III or IV CARs supporting CBS disapproval (if there is
no impact, state that); address actual or proposed date of submission for the CAP, DCMA’s
assessment of the contractor’s status towards closing the CAR, and estimated time for follow-up
review
(a) Whether the payment withhold applies to the program. If it does not apply to the
program, still identify the withhold but state that it does not impact the program.
(c) Whether the withhold is against progress payments, performance-based payments,
or interim payments billed under cost, labor-hour, or time-and-materials contracts
(7) Comment on significant CBS issues, their impact on individual contracts (if there is
no impact, state that), and the contractor's ability to execute the contract
GLOSSARY
G.1. DEFINITIONS. Unless otherwise noted, these terms and their definitions are for the
purpose of this issuance.
ACWP. The total dollars spent on labor, material, subcontracts, and other direct costs in the
performance of the contract SOW. These costs are controlled by the accounting general ledger
and should reconcile between the accounting system and EVMS. ACWP is independently
reported by the contractor’s accounting system. Simply stated: “actuals.”
BCWP. Dollarized value of all work actually accomplished in a given time period or Earned
Value. This is equal to the sum of the budgets for completed WPs, completed portions of open
WPs, apportioned effort earned on the base tasks, and the value of LOE activities. BCWP is not
realized until the work is completed.
BCWR. Represents that portion of the budget for work not yet accomplished within a Control
Account. It is the difference between the BAC and the BCWP_cum.
BCWS. Dollarized value of all work scheduled to be accomplished in a given time period or
Planned Value. The sum of the performance budgets for all work scheduled to be accomplished
within a given time period. This includes detailed WPs, apportioned effort, LOE packages,
planning packages, and Summary Level Planning Packages.
BEI. The BEI metric is an IMS-based metric that calculates the efficiency with which tasks
have been accomplished when measured against the baseline tasks at a Status Date. BEI tasks do
not include Summary or LOE tasks.
BEI = Tasks Completed / Baseline Count = (Qty of Tasks Completed) / (Qty of Tasks Completed + Qty of Tasks Missing Baseline Finish)
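A minimal sketch of the BEI computation under the definition above (names and values are
illustrative):

```python
def bei(tasks_completed: int, tasks_missing_baseline_finish: int) -> float:
    """BEI = completed tasks / baseline count, where the baseline count is
    completed tasks plus tasks that missed their baseline finish."""
    return tasks_completed / (tasks_completed + tasks_missing_baseline_finish)

print(bei(90, 10))  # 0.9: 90 of 100 baselined tasks finished on time
```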
CMT. The CMT reviews new contracts; performs an initial contract review; determines skill-set
and PST organizational requirements to support new major programs; and as deemed necessary
by the ACO, conducts a Post Award Orientation Conference with all CMT members assigned to
that contract.
Cognizant ACO. The administrative contracting officer responsible for performing the duties in
this manual; the term includes the DACO, CACO, and ACO.
CPI. CPI is an efficiency factor representing the relationship between the performance
accomplished (BCWP) and the actual cost expended (ACWP). CPR/IPMR Format 1 contains
the BCWP and ACWP data. CPI can be calculated for current period (monthly) or cumulative
(to date).
CPI_x = BCWP_x / ACWP_x, where x is current period (cur) or cumulative (cum).
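A minimal sketch of the CPI computation (illustrative values):

```python
def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index: earned value per dollar actually spent."""
    return bcwp / acwp

print(cpi(bcwp=950_000, acwp=1_000_000))  # 0.95: 95 cents earned per dollar spent
```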
Critical Path. Critical path is a sequence of discrete lower level tasks/activities in the network
that add up to the longest overall duration through an end point. The critical path determines the
shortest time possible to complete the contract. Any delay of an activity on the critical path
directly impacts the baselined completion date; i.e., there is no float on the critical path. Lower
level tasks/activities along the critical path have the least amount of float/slack (scheduling
flexibility) and cannot be delayed without delaying the finish time of the end point effort.
CV. The difference between BCWP and ACWP. It can be measured using cumulative (CUM)
or current (CUR) values at either the WP or the contract level. CPR/IPMR Format 1 contains the
BCWP and ACWP data as well as the correlating CVs. The CV% metric quantifies the
magnitude of the CV by dividing CV by BCWP and multiplying by 100. The formulas for
calculating CV and CV% are:
CV_x = BCWP_x − ACWP_x and CV_x% = (CV_x / BCWP_x) × 100, where x is current period
(cur) or cumulative (cum).
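A minimal sketch of CV and CV% (illustrative values):

```python
def cv(bcwp: float, acwp: float) -> float:
    """Cost Variance in dollars; negative indicates a cost overrun."""
    return bcwp - acwp

def cv_pct(bcwp: float, acwp: float) -> float:
    """Cost Variance as a percentage of earned value (BCWP)."""
    return cv(bcwp, acwp) / bcwp * 100

print(cv(950_000, 1_000_000))      # -50000.0
print(cv_pct(950_000, 1_000_000))  # about -5.26
```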
DAES. Principal mechanism for tracking programs between milestone reviews. It is both a
reporting and review process serving two primary purposes: (1) Provide awareness of the
execution status of all reporting programs, and (2) Provide assessments that enable identification
of emerging execution issues that warrant the attention of senior leadership.
DAMIR. OSD tool used to communicate program assessments and information across the DoD
Acquisition Enterprise.
MAIS. DoD acquisition program for an automated information system that is either designated
by the Milestone Decision Authority as a MAIS, or estimated to exceed certain dollar levels.
Major Programs. A term used by DCMA to identify those programs with specific reporting
requirements. Major Programs include (unless approved by exception):
• ACAT I/MDAPs
• DAES programs (excluding MAIS)
• Missile Defense Agency Ballistic Missile Defense System programs
• Strategic Systems Programs
• Additional programs or sub-programs designated by the PM&BI Executive Director.
MDAP. ACAT I programs are MDAPs. Programs estimated by the OUSD(AT&L) to require
eventual expenditure for Research, Development, Test and Evaluation of more than $365 million
(FY 2000 constant dollars) or procurement of more than $2.19 billion (FY 2000 constant
dollars), or those designated by the OUSD(AT&L) to be MDAPs.
MR Consumption Ratio = Percent Complete / Percent MR
Operational Unit. DCMA organizational entity charged with ensuring mission accomplishment
for their organization. For purposes of this manual only, Operational Units include: East,
Central and West Regions, the International Directorate, and the Special Programs Directorate.
Performance Assessments and Root Cause Analyses (PARCA). Carries out performance
assessments of MDAPs and conducts root cause analyses for those MDAPs with Nunn-McCurdy
breach status or when requested by senior DoD officials.
Percent Complete. Percent complete is the ratio of the completed work to date to the PMB or
BAC, expressed as a percentage. The formula for percent complete (%):
Percent Complete (%) = %comp = (BCWP_cum / BAC) × 100
Percent MR. Percent MR is the ratio of the MR used to the total MR added to the contract,
expressed as a percentage.
Percent MR = (Total Amount of MR Used / Total Amount of MR Added to the Contract) × 100
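A minimal sketch tying Percent Complete, Percent MR, and the MR Consumption Ratio together
(illustrative values):

```python
def percent_complete(bcwp_cum: float, bac: float) -> float:
    """Percentage of the budgeted work completed to date."""
    return bcwp_cum / bac * 100

def percent_mr(mr_used: float, mr_added: float) -> float:
    """Percentage of management reserve consumed."""
    return mr_used / mr_added * 100

def mr_consumption_ratio(pct_complete: float, pct_mr: float) -> float:
    """Below 1.0 suggests MR is being consumed faster than work completes."""
    return pct_complete / pct_mr

pc = percent_complete(bcwp_cum=40e6, bac=100e6)  # 40.0
pm = percent_mr(mr_used=3e6, mr_added=5e6)       # 60.0
print(mr_consumption_ratio(pc, pm))              # about 0.67
```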
Post Award Orientation Conference. A Post Award Orientation Conference may be held to
perform a detailed review of the contract, specifically highlighting and discussing complex terms
and conditions. The conference will ensure that all parties understand contractual requirements.
Predictive Analysis. The collection, examination, and synthesis of information and data from
our on-site presence which states (in terms of future cost, schedule, and performance) what we
forecast will happen based on our special knowledge of the supplier and program.
PI. Primary DCMA representative to the procuring customer; leads a PST comprised of
functional experts. The PI assesses contractor performance, predicts future performance, and
makes actionable recommendations related to future programmatic efforts.
PST. The PST is a matrixed multifunctional team led by a PI which supports a major acquisition
program. The PST may include functional specialists from contract administration, earned value
management, quality assurance, engineering, software, manufacturing and production, supply
chain management, as well as other functions.
Report Month. The month and year associated with the program’s group.
Reporting Level. The reporting level specified in the CDRL. Usually at least at Contract Work
Breakdown Structure (CWBS) level 3, except for high cost and high risk items, where the level
is established to ensure the necessary information for effective management control. It is not
necessary for the reporting levels in different legs of the CWBS to be the same.
Software Defects. Priority I and II software defect terminology is defined by the contractor's
Command Media and Quality Management System (QMS). Common software defect
categorization and definitions can be found in IEEE 12207.
SPI (Schedule Performance Index). An efficiency factor representing the relationship between
the performance achieved (Earned Value, or BCWP) and the Planned Value (BCWS).
CPR/IPMR Format 1 contains the BCWP and BCWS data. SPI can be calculated for current
period (monthly) or cumulative (to date).
SPI_x = BCWP_x / BCWS_x, where x is current period (cur) or cumulative (cum).
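A minimal sketch of the SPI computation (illustrative values):

```python
def spi(bcwp: float, bcws: float) -> float:
    """Schedule Performance Index: earned value per dollar of planned value."""
    return bcwp / bcws

print(spi(bcwp=900_000, bcws=1_000_000))  # 0.9: behind schedule
```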
SPI (Support Program Integrator). Primary DCMA representative to either the PI or the next
higher tier SPI. The SPI provides input to the PI concerning their independent assessment of the
program element(s) they have been delegated. The SPI leads an SPST comprised of functional
experts.
SPST. The SPST is a matrixed multifunctional team led by a Support Program Integrator which
supports a significant element, subcontract, or subsystem of a major acquisition program.
SV. SV is the difference between BCWP and BCWS. CPR/IPMR Format 1 contains the BCWP
and BCWS data as well as the correlating SVs. SV can be measured using cumulative (CUM) or
current (CUR) values at either the WP or the contract level. The SV% metric quantifies the
magnitude of the SV by dividing SV by BCWS and multiplying by 100. The formulas for
calculating SV and SV% are:
SV_x = BCWP_x − BCWS_x and SV_x% = (SV_x / BCWS_x) × 100
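A minimal sketch of SV and SV% (illustrative values):

```python
def sv(bcwp: float, bcws: float) -> float:
    """Schedule Variance in dollars; negative indicates behind schedule."""
    return bcwp - bcws

def sv_pct(bcwp: float, bcws: float) -> float:
    """Schedule Variance as a percentage of planned value (BCWS)."""
    return sv(bcwp, bcws) / bcws * 100

print(sv(900_000, 1_000_000))      # -100000.0
print(sv_pct(900_000, 1_000_000))  # -10.0
```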
TCPI. TCPI is the ratio of work remaining (BCWR) and future cost of work remaining (ETC).
TCPI_EAC = BCWR / ETC = (BAC* − BCWP) / (EAC − ACWP)
*For DCMA, the formula will use BAC at the WBS element level and TAB at the contract level.
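A minimal sketch of TCPI_EAC under the formula above (illustrative values; the BAC vs. TAB
selection is noted in the footnote):

```python
def tcpi_eac(bac: float, bcwp_cum: float, eac: float, acwp_cum: float) -> float:
    """Efficiency required on remaining work (BCWR) to achieve the EAC."""
    return (bac - bcwp_cum) / (eac - acwp_cum)

# $60M of budgeted work remains against $55M of estimated remaining cost,
# so the remaining work must be performed at a CPI of about 1.09.
print(tcpi_eac(bac=100e6, bcwp_cum=40e6, eac=105e6, acwp_cum=50e6))
```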
Watch Item. Any issue, risk or observation that is not currently driving the rating but
significant enough to report. Report watch items in Section 2 of the PAR.
G.2. ACRONYMS.
MA Management Assessment
MAIS Major Automated Information System
MDAP Major Defense Acquisition Program
MR Management Reserve
MRL Manufacturing Readiness Level
MSRA Manufacturing System Risk Assessment
PA Production Assessment
PAR Program Assessment Report
PCA Physical Configuration Audit
PCSA Prime Control of Subcontractor Assessment
PDR Preliminary Design Review
PF Performance Factor
PI Program Integrator
PM&BI Portfolio Management & Business Integration Executive Directorate
PMB Performance Measurement Baseline
PMO Program Management Office
PN Program Notification
REFERENCES
DCMA-INST 3101, “Program Support,” July 28, 2017
DCMA-INST 210, “Earned Value Management System (EVMS) – Standard Surveillance
Instruction (SSI),” February 29, 2012, as amended
DCMA-INST 501, “Policy Issuances Program,” April 13, 2017
DFARS 252.234-7001, "Notice of Earned Value Management System"
DFARS 252.234-7002, "Earned Value Management System"
DFARS 252.242-7004, "Material Management and Accounting System"
DFARS 252.242-7006, "Accounting System Administration"
DFARS 252.244-7001, "Contractor Purchasing System Administration"
DFARS 252.245-7003, "Contractor Property Management System Administration"
DI-MGMT-81466A, “Contract Performance Report (CPR),” March 30, 2005
DI-MGMT-81650, “Integrated Master Schedule (IMS),” March 30, 2005
DI-MGMT-81861A, “Integrated Program Management Report (IPMR),” September 16, 2015
DoDD 5105.64, “Defense Contract Management Agency (DCMA),” January 10, 2013
FAR 42.302(a), “Contract Administration Functions”
GAO-09-3SP, “Cost Assessment Guide: Best Practices for Estimating and Managing Program
Costs,” March 2009