
DCMA MANUAL 3101-02

PROGRAM SUPPORT ANALYSIS AND REPORTING

Office of Primary
Responsibility: Program Support Capability

Effective: November 22, 2017

Releasability: Cleared for public release

New Issuance

Implements: DCMA-INST 3101, “Program Support”

Incorporates: DCMA-INST 205, “Major Program Support,” December 4, 2013


DCMA-INST 406, “Defense Acquisition Executive Summary
(DAES),” July 1, 2013
DCMA-EA PAM 200-1, “Earned Value Management System
(EVMS) – Program Analysis Pamphlet (PAP),” October 29, 2012

Internal Control: Process flowchart and key controls are located on the Resource
Page of this manual.

Labor Codes: Located on Resource Page

Resource Page: https://360.dcma.mil/sites/policy/PS/SitePages/3101r.aspx

Approved by: David H. Lewis, VADM, USN, Director



Purpose: This issuance, in accordance with the authority in DoD Directive 5105.64:
• Outlines procedures on how the Agency will report on program performance and
anticipated performance to program, product, and project offices and the OSD in
accordance with Federal Acquisition Regulation 42.302 (a)(67).
• Provides and defines procedures for Program Support analysis and reporting, to include
Program Assessment Reports, Program Notifications, Program Support Team and Support
Program Support Team inputs to Program Integrators and Support Program Integrators,
and Defense Acquisition Executive Summary reporting.


TABLE OF CONTENTS
SECTION 1: GENERAL ISSUANCE INFORMATION .............................................................................. 5
1.1. Applicability. .................................................................................................................... 5
1.2. Policy. ............................................................................................................................... 5
SECTION 2: RESPONSIBILITIES ......................................................................................................... 7
2.1. Executive Director, Portfolio Management and Business Integration (PM&BI). ............ 7
2.2. Director, Major Program Support (MPS) Division........................................................... 7
2.3. Director, Earned Value Management System (EVMS) Center. ....................................... 7
2.4. Corporate Administrative Contracting Officer (CACO), Divisional Administrative
Contracting Officer (DACO), or Administrative Contracting Officer (ACO). .................. 7
2.5. Commanders/Directors, Operational Units....................................................................... 7
2.6. Commanders/Directors, CMO. ......................................................................................... 8
2.7. Director, Special Programs. .............................................................................................. 8
2.8. Program Integrator. ........................................................................................................... 8
2.9. Support Program Integrator. ............................................................................................. 9
2.10. PST and SPST Support Members. .................................................................................. 9
2.11. Functional First Level Supervisors. ................................................................................ 9
SECTION 3: PROGRAM ANALYSIS AND INPUTS TO THE PROGRAM INTEGRATOR ............................ 10
3.1. Program Analysis. ........................................................................................................... 10
3.2. Electronic Functional Input Template (EFIT). ................................................................ 10
3.3. Production Supportability Table. .................................................................................... 12
3.4. Contractor Business System Reporting. ......................................................................... 13
3.5. EVM Analysis................................................................................................................. 14
3.6. Prime Control of Subcontractor Assessment. ................................................................. 19
3.7. Suppliers Driving the Ratings. ........................................................................................ 19
3.8. SPI Support PAR. ........................................................................................................... 20
APPENDIX 3A: CONTRACT DATA EVALUATION METRICS ............................................................. 21
APPENDIX 3B: DCMA COST AND SCHEDULE ESTIMATES AT COMPLETION .................................. 24
SECTION 4: PROGRAM REPORTING ................................................................................................ 30
4.1. Quarterly PAR. ............................................................................................................... 30
4.2. Program Notification (PN).............................................................................................. 35
4.3. DAES Assessment Input. ................................................................................................ 37
APPENDIX 4A: PAR NARRATIVE CRITERIA .................................................................................. 38
APPENDIX 4B: CONTRACT PERFORMANCE ASSESSMENT .............................................................. 40
APPENDIX 4C: PRODUCTION ASSESSMENT.................................................................................... 43
APPENDIX 4D: MANAGEMENT ASSESSMENT................................................................................. 46
GLOSSARY ..................................................................................................................................... 48
G.1. Definitions...................................................................................................................... 48
G.2. Acronyms. ...................................................................................................................... 52
REFERENCES .................................................................................................................................. 55

TABLES
Table 1. DID Correlation ............................................................................................................. 15


Table 2. Best Predictive EAC Performance Factors by Contract Completion Status ................. 24
Table 3. PAR Report Months....................................................................................................... 30
Table 4. Cost Assessment Color Criteria ..................................................................................... 40
Table 5. DCMA Schedule Slip (Months) Color Criteria (CPA) .................................................. 40
Table 6. DCMA Schedule Slip (Months) Color Criteria (PA) .................................................... 43
Table 7. Production Assessment Criteria ..................................................................................... 43
Table 8. Production Assessment Checklist .................................................................................. 44
Table 9. Management Assessment Criteria ................................................................................. 46

FIGURES
Figure 1. Management Reserve Consumption ............................................................................. 26
Figure 2. PAR Group Timeline.................................................................................................... 30


SECTION 1: GENERAL ISSUANCE INFORMATION

1.1. APPLICABILITY. This issuance applies to all DCMA Components, DCMA Operational
Units, and DCMA Contract Management Offices (CMO) involved with Program Support (unless
it conflicts with higher-level regulations, policy, guidance, waiver, or agreements, in which case
those take precedence). Requests for exception to this manual must be addressed through the
waiver process in the DCMA Manual (MAN) 501-01, “Policy Issuances Procedures.” This
manual applies to analysis and reporting on Major Programs, Non-Major Programs with
Reporting Requirements, and High Visibility Commodities.

a. Major Programs. The requirements of this manual apply to all programs identified as
Major Programs.

b. Non-Major Programs with Reporting Requirements. The requirements of this manual
apply to all Non-Major Programs with reporting requirements as clarified below:

(1) Section 3 – Section 3 requirements apply only as needed to meet the negotiated
reporting requirements (e.g., electronic Functional Input Template (eFIT), the Production
Supportability Table, Earned Value Management (EVM) analysis, the Prime Control of
Subcontractor Assessment (PCSA), the Program Assessment Report (PAR), the Support PAR,
the use of Program Support Teams (PST) and Support Program Support Teams (SPST)). The
PST Collaboration Site must be used for non-major programs with reporting requirements.

(2) Paragraph 4.1 – For PARs, only Executive Summaries are required. Other PAR
narrative fields that are not negotiated for inclusion must contain “Not Applicable”.

(3) Paragraphs 4.1.c and 4.2.f – Distribution of the PARs/Program Notifications (PN)
will be accomplished by local procedures unless the PAR/PN represents a program which is
contained within the approved distribution system per the Program Support Analysis and
Reporting User Guide. In this case the PAR/PN must be uploaded to the distribution system.

(4) Paragraph 4.3 – Not applicable.

c. National Aeronautics and Space Administration (NASA) Programs. The requirements of
this manual do not apply to NASA programs. For NASA programs, refer to DCMA-MAN
3101-03, “NASA Support.”

d. High Visibility Commodities. The reporting requirements of this manual do not apply to
High Visibility Commodities; refer instead to the High Visibility Commodity User Guide on the
Resource Page.

1.2. POLICY. It is DCMA policy to:

a. Deliver global acquisition insight for all programs and High Visibility Commodities by
providing objective, independent, relevant, timely and actionable information to the Acquisition
Enterprise.


b. Comply with Office of the Secretary of Defense (OSD) Defense Acquisition Executive
Summary (DAES) Deskbook and OSD DAES Guidelines when reporting on DAES programs,
specifically these 3 of 11 DAES assessment categories: Contract Performance Assessment
(CPA), Production Assessment (PA), and Management Assessment (MA).

c. Execute this manual in a safe, efficient, effective, and ethical manner.


SECTION 2: RESPONSIBILITIES

2.1. EXECUTIVE DIRECTOR, PORTFOLIO MANAGEMENT AND BUSINESS INTEGRATION (PM&BI). The Executive Director, PM&BI, must:

a. Ensure that Agency metrics are provided to the DCMA Director.

b. Attend DAES or Service meetings as appropriate.

2.2. DIRECTOR, MAJOR PROGRAM SUPPORT (MPS) DIVISION. The Director, MPS
Division must:

a. Ensure that DCMA Defense Acquisition Management Information Retrieval (DAMIR)
inputs are validated and released.

b. Compile, publish, and disseminate the PAR Scoring Rubric and Rework Metric Results.

c. Ensure that support is provided to the Operational Units in the evaluation of PAR quality,
as needed.

d. Manage the DAES or Service meeting preparation process.

2.3. DIRECTOR, EARNED VALUE MANAGEMENT SYSTEM (EVMS) CENTER. The Director, Earned Value Management System (EVMS) Center must:

a. Perform all EVM System surveillance.

b. Ensure supporting information concerning any “Disapproved” or “Not Evaluated” EVMS
Contractor Business System (CBS) is provided to the Program Integrator (PI).

c. Provide an impact statement for any EVMS Corrective Action Request (CAR) issues.

d. Provide follow-up on system-related issues, such as data integrity concerns or system
performance.

2.4. CORPORATE ADMINISTRATIVE CONTRACTING OFFICER (CACO), DIVISIONAL ADMINISTRATIVE CONTRACTING OFFICER (DACO), OR ADMINISTRATIVE CONTRACTING OFFICER (ACO). The CACO, DACO, or ACO
must ensure required information concerning any “Disapproved” or “Not Evaluated” CBS is
provided.

2.5. COMMANDERS/DIRECTORS, OPERATIONAL UNITS. The Commanders or Directors, Operational Units must:


a. Ensure the Operational Unit representatives provide an independent evaluation of the
quality of the CMO developed PARs.

b. Ensure the Operational Unit provides PAR score, feedback, and recommendations to the
PI, PI’s First Level Supervisor (FLS), and CMO Commander or Director.

c. Ensure the Operational Unit and the CMO jointly develop and implement a get-well plan
when the PAR quality is below the acceptable threshold.

d. Ensure the Operational Unit performs a follow-up review of PAR quality when required.

e. Ensure the Operational Unit loads the DAMIR rework into the DAES Assessment Rework
Database.

f. Ensure the Operational Unit provides PAR health results to the MPS Division and the
CMO.

g. Promote the use of predictive analysis throughout the entire content of the PAR.

2.6. COMMANDERS/DIRECTORS, CMO. The Commanders or Directors, CMO must:

a. Ensure their CMO complies with the requirements of this manual.

b. Review and approve PARs and PNs.

c. Promote DAMIR assessments to PM&BI for validation and release.

d. When applicable, ensure Production Supportability Tables are used.

e. Promote the use of predictive analysis throughout the entire content of the PAR.

f. Special Programs CMOs only – comply with DCMA-INST 3101 and meet the intent of
this manual to the maximum extent practicable for all Special Access Programs (SAP)/Sensitive
Compartmented Information (SCI) contracts.

g. High Visibility Commodities – follow the High Visibility Commodity User Guide on the
Resource Page.

2.7. DIRECTOR, SPECIAL PROGRAMS. The Director, Special Programs must comply
with DCMA-INST 3101 and meet the intent of this manual to the maximum extent practicable
for all SAP/SCI contracts (including the designation of Lead CMO for SAP/SCI).

2.8. PROGRAM INTEGRATOR. The Program Integrator must:

a. Draft and submit PARs and PNs.


b. Load DAMIR assessments.

c. Review PST functional inputs.

d. Review Support Program Integrator (SPI) Support PARs.

e. Review PAR and PN comments.

f. Distribute PARs and PNs.

2.9. SUPPORT PROGRAM INTEGRATOR. The Support Program Integrator must:

a. Draft and submit Support PARs.

b. Draft PNs.

c. Review SPST functional inputs.

2.10. PST AND SPST SUPPORT MEMBERS. The PST and SPST members must
document and submit functional inputs.

2.11. FUNCTIONAL FIRST LEVEL SUPERVISORS. The Functional First Level Supervisors must review and score functional eFITs.


SECTION 3: PROGRAM ANALYSIS AND INPUTS TO THE PROGRAM INTEGRATOR

3.1. PROGRAM ANALYSIS.

a. Summary. The PI and SPI rely on documented inputs from PST and SPST members to
provide independent acquisition insight about the program to the DCMA customers. These
inputs (e.g., eFIT, EVM Analysis input, PCSA, narratives supporting CBS status or impact,
Production Supportability Table, and Support PARs) will reside in the program’s PST
Collaboration Site (see Resource Page).

b. Conduct Program Analysis. Based on program risk and resources, PST and SPST
members must perform their assigned surveillance and analysis activities identified in the
Program Support Plan (PSP) or Support Program Support Plan (SPSP) and functional
surveillance plan(s). In all cases, PST and SPST surveillance should be conducted with an
emphasis on the cost, schedule, and technical impacts to the program, in addition to assessing
contractor compliance to contractual and procedural requirements. The PI, SPI, PST, and SPST
must engage with the contractor as necessary to perform PSP/SPSP surveillance and analysis
activities.

3.2. ELECTRONIC FUNCTIONAL INPUT TEMPLATE (EFIT).

a. PST and SPST Documents Program Analysis Results Using eFIT. The purpose of the
eFIT is to standardize PST and SPST member input for the creation of PARs and PNs by the PI
and SPI. All PST and SPST member inputs must be through the use of eFITS. Exceptions are
the EVM Analysis input, CBS status and impact statements, PCSA, Production Supportability
Table, and Support PAR. For the PST and SPST members not required to use the eFIT, refer to
the applicable section of this manual for input requirements. All eFITs must be completed in
accordance with the appropriate section of the Program Support Analysis and Reporting User
Guide found on the Resource Page. The eFIT must be provided via the program’s PST
Collaboration Site. Information included in the eFIT should provide justification and predictive
analysis in support of the PAR and PN assessments by answering the following questions:

• What is the issue, risk, opportunity, or observation (identification)?
• Why does DCMA believe it is an issue, risk, or opportunity (independent assessment)?
• Why does it matter (impact – current and future)?
• What is DCMA’s suggested course of action for the government?
• What is the contractor’s root cause and corrective action?
• What is DCMA’s assessment of the contractor’s root cause, corrective action, mitigation
plan details, Estimated Completion Date (ECD), and cost impact, if known?

(1) Frequency. At a minimum, PST or SPST members must provide weekly eFITs or a
notification of “no input” to the PI or SPI in accordance with the PSP/SPSP. eFIT submissions
can evolve as additional information is gathered through observations, trending, and information
exchange. Periodic updates are encouraged for high risks and significant issues.


(2) Accessing the eFIT Template. The eFIT template, DCMA-Form (DCMAF) 3101-02-02A, is located on the program’s PST Collaboration Site.

(3) eFIT Program Impact Rating. The eFIT is composed of several data input sections
with each section having multiple data elements. All fields marked with an asterisk in the eFIT
are mandatory fields. For any specific issue or risk, the generic rating definitions are:

(a) Green: Some minor problem(s) may exist, but appropriate solutions to those
problems are available, and none are expected to affect overall contract cost, schedule, and
performance requirements; and none are expected to require managerial attention or action.

(b) Yellow: Some event, action, or delay has occurred or is anticipated that may
impair progress against major contract objectives, and may affect the contractor’s ability to meet
overall cost, schedule, and performance requirements or other major contract objective.

(c) Red: An event, action, or delay has occurred or will occur that, if not corrected,
poses a serious risk to the contractor's ability to meet overall cost, schedule, and performance
requirements or other major contract objective.

(d) More detailed rating criteria are available in Appendices 4B, 4C, and 4D.

(4) Type. Risk, Issue, Opportunity, or Observation general guidelines are as follows:

(a) Risk. An event or condition (contractor or program) that has not yet occurred, but
may impact successful performance (cost, schedule, or technical). Risks can be either:

1. Program Office, DCMA, or contractor identified risks that the contractor is
addressing. The risk and contractor action will be independently assessed by DCMA, as
discussed in the “Documenting a Risk” section below.

2. Program Office, DCMA, or contractor identified risks that the contractor is not
addressing. The risk and resulting impacts, due to lack of contractor action, will be assessed by
DCMA, as discussed in the “Documenting a Risk” section below.

(b) Issue. An event or condition (contractor or program) that resulted in a negative
impact to the program (cost, schedule, or technical).

(c) Opportunity. A future event or condition (contractor or program) that may result
in a benefit to the program (cost, schedule, or technical).

(d) Observation. A noteworthy occurrence that has not been identified as a Risk,
Issue, or Opportunity. Observations are not envisioned to have a significant effect on the
program’s cost, schedule, or technical performance.

(5) Input and Description. Functional Assessment and Predictive Analysis. DCMA’s
independent assessment of the contractor root cause, the corrective action, and mitigation should
consider:


• Do we concur or not with the root cause?
• What other additional insights into the root cause do we have, if any?
• Is the recovery strategy executable?
• Will it achieve the desired results?
• Is the contractor executing to their plans?

(6) Supervisor Review and Approval (Optional). Supervisory approval is not required for
eFIT submittal unless a CMO requirement exists.

b. eFIT Quality. The PST and SPST Functional Specialist’s supervisor must periodically
review a sample of their Functional Specialist eFITs to evaluate the quality and provide
appropriate feedback to the originator.

(1) Functional FLS Reviews eFITs Using eFIT Rubric. At least once per quarter, the
PST and SPST Functional FLS must review at least one eFIT for each of their Functional
Specialists assigned to a PST or SPST. To obtain the eFIT, go to the program’s PST
Collaboration Site. This review must be accomplished using the eFIT Rubric located on the
Resource Page. The Rubric automatically calculates the score; scores below 85 percent are not
acceptable.

(2) Functional FLS Uses eFIT Score as Part of Functional Feedback. If the resultant eFIT
score is below 85 percent, then the supervisor must discuss relevant opportunities to improve the
quality of future eFITs.

(3) Functional FLS Reviews Subsequent eFITs. If the eFIT score is less than 85 percent,
the FLS must perform reviews of subsequent eFITs to ensure that the quality has improved to an
acceptable level.

3.3. PRODUCTION SUPPORTABILITY TABLE.

a. The table provides insight into current and future contractor production performance. It
supports and traces to the DCMA independent projected delivery dates used in the PAR. The
analysis details current delivery trends and builds an understanding of whether future
performance on remaining deliveries will be met, slip, or run ahead of schedule. The table
allows for quick quantification of items not meeting delivery dates and of schedule underruns or
overruns in terms of months. See the appropriate section of the Program Support Analysis and
Reporting User Guide on the Resource Page. The Production Supportability Table must be
developed at least quarterly to align with the PAR timeline but is not required for the following
situations:

(1) Low quantity deliveries (See the Program Support Analysis and Reporting User
Guide)

(2) Sustainment deliveries

(3) Engineering & Manufacturing Development (EMD) contract/Contract Line Item Number (CLIN)/Delivery Orders (DO)


(4) Contract Data Requirements List (CDRL) Deliveries

b. Upload Production Supportability Table(s). The Production Supportability Table(s),
when completed, must be uploaded to the program’s PST Collaboration Site. This is the
supporting analysis for the DCMA final projections included in the PAR.

3.4. CONTRACTOR BUSINESS SYSTEM REPORTING.

a. Summary. CBSs are the first line of defense against fraud, waste, and abuse in DoD
contracts. Approved business systems allow the contractor and the DoD to rely more
confidently upon the information produced, which helps manage programs more effectively.
For contracts subject to the Cost Accounting Standards and containing one or more of the
following six business system clauses, the contractor must establish and maintain an acceptable
CBS:

(1) DFARS 252.215-7002, Cost Estimating System Requirements

(2) DFARS 252.234-7002, Earned Value Management System (EVMS)

(3) DFARS 252.242-7004, Material Management and Accounting System

(4) DFARS 252.242-7006, Accounting System Administration

(5) DFARS 252.244-7001, Contractor Purchasing System Administration

(6) DFARS 252.245-7003, Contractor Property Management System Administration

b. Cognizant ACO Documents Reasons for CBS Disapproval or Not Evaluated. CBS
status and status date are populated from the Contractor Business Analysis Repository (CBAR).
The cognizant ACO responsible for the CBSs must document reasons for any “Disapproved” or
“Not Evaluated” CBS in the program’s PST Collaboration Site. The pertinent information
supplied must include (as applicable):

(1) Pertinent information regarding DCMA’s determination of CBS approval or
disapproval or plans for the cognizant ACO to issue a final determination.

(2) For a CBS status of “Disapproved” or “Not Evaluated,” an explanation of why the
CBS is not approved.

(a) For Disapproved Systems, include the reasons for disapproval.

(b) Impact(s) of transmitted Level III or IV Corrective Action Requests (CARs)
supporting CBS disapproval (if there is no impact to the reporting program, then state that);
address the actual or proposed date of submission for the Corrective Action Plan (CAP),
DCMA’s assessment of the contractor’s status towards closing the CAR, and the estimated time
for follow-up review.


(c) Identify whether or not a withhold was applied to the disapproved CBS. If a
withhold has been applied, identify:

1. Whether the payment withhold applies to the program (If it does not apply to
the program, still identify the withhold but state that it does not impact program).

2. The estimated withhold percent amount.

3. Whether the withhold is against progress payments, performance-based
payments, or interim payments billed under cost, labor-hour, or time and materials contracts.

(d) Upcoming reviews planned for a CBS.

c. EVMS Center Documents Reasons for EVMS Disapproval or Not Evaluated
Ratings. The EVMS Center may supplement ACO information concerning any “Disapproved”
or “Not Evaluated” EVMS CBS in the Business Systems Input tab of the program’s PST
Collaboration Site. If provided, the supplemental information must include as applicable:

(1) Reason the EVMS is not approved.

(2) For Disapproved Systems, the reasons for disapproval (include the guideline
numbers and titles that resulted in the disapproval) and the reliability of the EVM data used in
EVM analysis. If the data is determined to be unreliable, it will impact the CPA rating.

(3) Impact(s) of transmitted Level III or IV CARs supporting EVMS disapproval (if
there is no impact, then state that); address actual or proposed date of submission for the CAP,
DCMA’s assessment of the contractor’s status towards closing the CAR, and estimated time for
follow-up review.

(4) Upcoming reviews planned for an EVMS CBS.

d. PI Reviews CBS Data and Determines Programmatic Impact. The PI, in conjunction
with the applicable ACO or EVMS Center, must identify any impacts to their specific program
resulting from significant CBS issues in the Business Systems Input tab of the program’s PST
Collaboration Site.

3.5. EVM ANALYSIS. EVM analysis is performed for programs conducting program
reporting whose contracts contain the EVMS clause and for which the contractor is submitting
EVM CDRLs. EVM analysis will be conducted using the Agency cost and schedule analysis
tools and user guides found on the Resource Page. The EVM Analyst is identified in the PSP or
SPSP.

a. Obtain and Evaluate Contractor EVM Data Submission. DCMA will use Defense
Cost and Resource Center (DCARC) EVM Central Repository (EVM-CR) to access official
contractor submissions for review and analysis unless the contract does not specify the use of
EVM-CR, in which case DCMA will receive the data directly from the contractor. Any EVM


data obtained from the contractor, which is not stored in EVM-CR, must be uploaded to the
Prime contract folder in the Electronic Document and Records Management (eDRMS). DCMA
EVM Analysts must register in DCARC as a “Reviewer” in order to access and provide
validation recommendations to the program office during the official review timeframe. See the
appropriate section of the Program Support Analysis and Reporting User Guide found on the
Resource Page for instructions on how to register with DCARC.

(1) Every month the EVM Analyst will download EVM-CR or contractor provided
Integrated Program Management Report (IPMR) or Contract Performance Report
(CPR)/Integrated Master Schedule (IMS) data and import the data into the appropriate DCMA
EVM analysis software. The EVM Analyst must ensure the data in the submitted IPMR or
CPR/IMS is complete, Data Item Description (DID) compliant, consistent, and reliable (Table 1).
The minimum data integrity metrics and tests for evaluating cost and schedule data are described
in Appendix 3A.

Table 1. DID Correlation

Name                                             CPR/IMS DIDs     IPMR DID
Format 1 - Work Breakdown Structure              DI-MGMT-81466A   DI-MGMT-81861A
Format 2 - Organizational Categories             DI-MGMT-81466A   DI-MGMT-81861A
Format 3 - Baseline                              DI-MGMT-81466A   DI-MGMT-81861A
Format 4 - Staffing                              DI-MGMT-81466A   DI-MGMT-81861A
Format 5 - Explanations and Problem Analyses     DI-MGMT-81466A   DI-MGMT-81861A
Format 6 - Integrated Master Schedule            DI-MGMT-81650    DI-MGMT-81861A
Format 7 - Electronic History and Forecast File  N/A              DI-MGMT-81861A

(2) For submissions in the EVM-CR, no later than (NLT) 10 business days after
contractor submission, the EVM Analysts must report any discrepancies and provide a
recommendation to the Program Office through the DCARC system as to whether the data is
complete and DID compliant. (See the appropriate section of the Program Support Analysis and
Reporting User Guide found on the Resource Page.) For other prime contractor submissions not
in the EVM-CR, the discrepancies will be directly reported to the program office through the PI.
The CMO may submit a contractual noncompliance or “other Contract Management”
documentation CAR pertaining to the missing or late EVM CDRL deliverable, or for incorrect
CDRL data if the program office rejects the submission. This is not a system review; suspect
data that may indicate areas of concern with the EVM system will be forwarded to the EVMS
Center for review and determination of appropriate actions in accordance with DCMA-INST
210. The EVM Analyst must not create any EVMS CARs.

b. Determining Contract Risks or Issues. As part of the monthly analysis, the EVM
Analyst will evaluate variances and determine potential risks or issues.

(1) For those Work Breakdown Structure (WBS) elements with effort remaining, the
EVM Analyst must perform Cost Variance (CV), Schedule Variance (SV), and Variance at
Completion (VAC) analysis at the lowest reporting level in order to determine the WBS elements
that significantly contribute to the overall contract variances. This analysis is typically
performed with cumulative performance data; however, variance analysis employing current


period data may also be useful in identifying emerging trends that indicate current issues or may
signal potential risks.

(2) To determine the list of WBS elements, the EVM Analyst must select incomplete
WBS elements that may impact the contract and that potentially need surveillance. The information
for these WBS elements will be provided to the PST or SPST functional specialists during the
review of the Program Risk table IAW DCMA-MAN 3101-01.

(3) The PST or SPST functional specialist must work with the other PST or SPST
members to provide to the EVM Analyst through eFITs (paragraph 3.2):

(a) An independent root cause.

(b) Feasibility of CAPs.

(c) Estimated recoverability of the variances.

(d) Any additional cost or schedule adjustments (impacts) to the contractor’s values.

c. DCMA Cost and Schedule Estimates. Independent DCMA Cost and Schedule
Estimates are of paramount importance to the acquisition community. See Appendix 3B for
additional details.

(1) Determine the Need for Revised Estimates. As part of the monthly analysis, the
EVM Analyst will make a determination as to whether DCMA Estimate at Completion
(EACDCMA) or Estimated Completion Date (ECDDCMA) needs to be reviewed outside of the
quarterly generation process. If any of the following conditions have occurred since the last time
these values were calculated, or if a Memorandum of Agreement (MOA) with the Program
Office requires one, perform an out-of-cycle calculation of the EACDCMA, DCMA Variance at
Completion (VACDCMA), or ECDDCMA and days of schedule slippage and update if necessary:

(a) Over Target Baseline (OTB) or Over Target Schedule (OTS).

(b) Notable contractor Estimate at Completion (EACKtr) change without a
corresponding Total Allocated Budget (TAB) change.

(c) Missed previously reported milestone or contractually required event.

(2) Quarterly Schedule Analysis. On a quarterly basis, in preparation for the PAR
submission, the EVM Analyst must provide to the PI DCMA’s assessment of the contract’s
schedule performance. This analysis uses the EVM and duration-based schedule metrics to
evaluate the contract schedule and forecast any potential schedule delays to the contract or
program. Contributing factors affecting the contract milestone completion dates include: critical
path, driving paths, lag, float, margin use/consumption, subcontractors, rework, test failures,
schedule delays, contract mods, Over Target Schedule (OTS), and single point adjustments.


(a) Schedule Performance. The EVM Analyst must analyze the Baseline Execution
Index (BEI) and missed task trends, correlate the tasks to the PST Functional provided root
causes, identify any corrective actions, and evaluate the impact on the schedule.
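A minimal sketch of a BEI computation over IMS task records follows; the dictionary keys are assumptions about how the schedule export is structured, and the metric is the standard count-based form (tasks actually finished versus tasks baselined to finish by the status date):

```python
from datetime import date

def baseline_execution_index(tasks, status_date: date) -> float:
    """BEI = tasks completed / tasks baselined to finish by the status
    date. Values well below 1.0 indicate the contractor is completing
    baselined work late."""
    baselined = sum(1 for t in tasks
                    if t["baseline_finish"] is not None
                    and t["baseline_finish"] <= status_date)
    completed = sum(1 for t in tasks
                    if t["actual_finish"] is not None
                    and t["actual_finish"] <= status_date)
    return completed / baselined if baselined else 1.0
```

Trending this value month over month, together with the missed-task counts, supports the correlation to root causes described above.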

(b) ECDDCMA. Prior to determining the ECDDCMA, the EVM Analyst must apply the
functional specialists’ and SPST EVM Analysts’ schedule adjustments to the appropriate tasks in
the IMS. These schedule adjustments will account for known or forecasted slippages based on
functional surveillance or SPST EVM analysis. After the EVM Analyst has made all known
adjustments, he or she will use the Agency schedule analysis tool or the constraint method to
determine the DCMA independent critical path and the ECDDCMA for each milestone or
contractually required event. After incorporating all functional schedule adjustments, the EVM
Analyst must upload a copy of the DCMA independent critical path and contractor’s critical path
to the “Program Documents” tab in the program’s PST Collaboration Site.

(c) Contractually Required Events/Milestone Analysis. Include analysis of the
DCMA projected completion dates and trends of contractually required events and milestones.
Determine what WBS elements or tasks are driving DCMA’s projected completion date, any root
causes, the contractor’s proposed mitigations, and CAPs. Further explain the differences
between DCMA estimates and the contractor’s values.

(3) Quarterly Cost Analysis. On a quarterly basis, in preparation for the PAR
submission, the EVM Analyst must submit DCMA’s assessment of the contract’s cost
performance to the PST Collaboration Site. Cost analysis evaluates the impact to contract
performance based on numerous factors (e.g., Management Reserve (MR) use, subcontractors,
rework, test failures, schedule delays, contract mods, labor rates, material costs, OTB, single
point adjustments) and forecasts any potential cost overruns to the contract or program. The
provided assessment is an independent evaluation of the contract’s performance and includes
insight from DCMA functional specialists.

(a) EACDCMA. The EACDCMA is an analysis of WBS elements at the lowest reporting
level and the risk/opportunities that may impact implementation of contract level requirements.
This analysis involves determining the reasonableness of the WBS level EAC with information
gained from PST/SPST functional surveillance and SPST EVM analysis inputs. In addition to
the WBS level analysis, the contract level analysis will account for factors such as estimates of
known or anticipated risk areas, planned risk reductions, or cost containment measures.
Challenge, when appropriate, the contractor’s analysis and explanations. Perform independent
analyses and surveillance to support these challenges.

1. Lowest WBS Level EACDCMA. The lowest WBS level EACDCMA is composed of
two parts: the EVM formula based performance and the PST/SPST members’ cost adjustments
based on surveillance. The EVM Analyst must select the appropriate EAC methodology for
each WBS element (e.g., cumulative Cost Performance Index (CPIcum), 3 period average CPI
(CPI3 period average), composite, linear regression, manual). The methodology selected should only
change at significant transition points of that element and not vary from report period to report
period (Appendix 3B). The EVM Analyst will then adjust the formula based WBS element EAC
with the PST/SPST members’ cost adjustments. For prime contractor WBS elements that are
delegated to SPST EVM Analysts, the PST EVM Analyst assigned to the prime contract must


incorporate the SPST EVM Analysts’ EACDCMA for the respective WBS elements. When a
subcontractor with a fixed price type contract is represented as an element on the prime
contractor’s WBS, the WBS level EAC estimate for the subcontractor cannot exceed the
subcontract ceiling price.

2. Lowest WBS Level EACDCMA Realism. The EVM Analyst must evaluate the
confidence of the EAC by comparing the WBS level VAC to the existing cumulative Cost
Variance (CVCUM), comparing the CPICUM to the “To Complete Performance Index” (TCPIEAC),
and comparing the WBS level EACDCMA to the optimistic and pessimistic range of EACs. This
will provide warning indicators of any WBS level EACDCMA not in alignment with past
performance trends and significant differences should be explained and noted at the WBS level
before generating the contract level EACDCMA.
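These three realism tests can be expressed directly from Format 1 quantities. A sketch follows, with an illustrative tolerance (this manual does not prescribe WBS-level numeric thresholds):

```python
def eac_realism_warnings(bcwp_cum, acwp_cum, bac, eac_dcma, tolerance=0.05):
    """Warn when a WBS-level EAC(DCMA) is out of line with demonstrated
    performance. The tolerance is an illustrative assumption."""
    warnings = []
    cpi_cum = bcwp_cum / acwp_cum if acwp_cum else None     # demonstrated efficiency
    to_go = eac_dcma - acwp_cum
    tcpi_eac = (bac - bcwp_cum) / to_go if to_go else None  # efficiency the EAC implies
    # An EAC requiring future efficiency well above demonstrated efficiency is suspect.
    if cpi_cum is not None and tcpi_eac is not None and tcpi_eac - cpi_cum > tolerance:
        warnings.append("TCPI(EAC) exceeds CPI(cum); EAC may be optimistic")
    # A favorable VAC on top of an unfavorable cumulative CV implies recovery
    # of variance already incurred, which needs explicit justification.
    cv_cum = bcwp_cum - acwp_cum
    vac = bac - eac_dcma
    if vac > 0 and cv_cum < 0:
        warnings.append("Positive VAC despite negative CV(cum)")
    return warnings
```

The optimistic/pessimistic range comparison would be done the same way, bracketing the EACDCMA with the lowest and highest formula-based estimates.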

3. Contract Level EACDCMA. The EVM Analyst must generate the contract level
EACDCMA by summing the lower WBS level EACDCMA values and then adding the expected MR
usage and risk adjustments. Since MR does not form part of the PMB, expected MR usage and
risk adjustments are not included in the WBS level EACDCMA values and must be added at the
contract level (a sketch of this roll-up follows the sub-items below).

a. Risk Items. Integrate functional specialist dollarized cost impact with the
EACDCMA for the cost, schedule, and technical risk items the functional specialist is monitoring.

b. MR Usage. Use the contractor provided MR log to compare the contract
percent complete to the percentage of MR usage. Determine if the rate of MR usage is sufficient
to complete the contract when compared to known risks. Even if the cost of the known risks is
less than the remaining MR, the rate of MR usage might still suggest that all of the MR will be
consumed in determining the contract level EACDCMA value. The rationale and methodology for
the MR usage amount should be documented.

c. Contract level EACDCMA Realism. Evaluate the confidence of the contract
level EACDCMA using the same three tests as the WBS level realism check in paragraph
3.5.c.(3).(a).2. If the contract level EACDCMA realism is outside the established thresholds, then
verify the WBS level calculations and adjustments. If the deviation is supportable by inputs of
known issues or a recovery, then include the explanation to substantiate the EACDCMA. Since the
EACDCMA directly affects the program assessment and is one factor that the ACO uses in
determining the loss ratio and reductions in progress payments on applicable contracts, it is
important to be realistic.
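A sketch of the contract-level roll-up described above; all names and numbers are hypothetical:

```python
def contract_level_eac(wbs_eacs, expected_mr_usage, risk_adjustments, tab):
    """Roll WBS-level EAC(DCMA) values up to the contract level. MR and
    risk dollars sit outside the WBS-level estimates and are added here;
    tab is the Total Allocated Budget."""
    eac = sum(wbs_eacs.values()) + expected_mr_usage + sum(risk_adjustments)
    vac = tab - eac  # contract-level VAC(DCMA): negative projects an overrun
    return eac, vac

# Hypothetical example:
eac, vac = contract_level_eac(
    wbs_eacs={"1.1": 4_200_000, "1.2": 2_750_000},
    expected_mr_usage=300_000,
    risk_adjustments=[150_000],
    tab=7_000_000,
)
print(f"EAC(DCMA) = {eac:,.0f}; VAC(DCMA) = {vac:,.0f}")  # 7,400,000; -400,000
```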

(b) VACDCMA Analysis. The VACDCMA identifies either a projected overrun or
underrun. The EVM Analyst calculates the VACDCMA at the WBS and contract level.

1. WBS Level VACDCMA. The WBS level VACDCMA is the difference between
the WBS level Budget at Completion (BAC) and the respective EACDCMA. This determines the
performance drivers influencing the contract level VACDCMA. When reporting the results in the
“EVM Analysis” tab, summarize completed tasks into a summary line and focus on the top
current and future drivers.

2. Contract level VACDCMA. The contract level VACDCMA is the difference
between the TAB and the contract level EACDCMA. The VACDCMA provides independent and
predictive insight into how much DCMA believes the contract will overrun or underrun on cost.


If an OTB has occurred, also track the percentage of DCMA Variance at Completion
(VACDCMA%) to the original contract budget base.

(c) EACDCMA and VACDCMA Comparative Analysis to the Contractor Values. The
EVM Analyst must address the key reasons for significant differences between the EACDCMA and
the contractor’s EAC (EACKtr) value.

d. Providing EVM Analysis Insights. Using the program’s PST Collaboration Site, the
EVM Analyst must enter the analysis results into the “EVM Analysis” tab for the specific
program, contract or CLIN, and report month. This will provide the PI the required EVM
information defined in Appendix 4B. The DCMA 14 point assessment is no longer a
requirement but may be performed and provided per the MOA. In this case, the 14 point
assessment will be loaded to the PST Collaboration Site “Program Documents” tab.

3.6. PRIME CONTROL OF SUBCONTRACTOR ASSESSMENT. The PCSA is a DCMA
assessment of the prime contractor’s compliance to established subcontractor management
processes, procedures, and controls for each specific program. Do not aggregate assessments
above the program level.

a. PCSA Development. Information supporting the PCSA table in the PAR must be
provided to the PI by the PST member identified in the PSP. The PCSA rating must be
determined using the PCSA tab within the PST Collaboration Site. Instructions can be found in
the appropriate section of the Program Support Analysis and Reporting User Guide found on the
Resource Page.

b. PCSA Ratings. Ratings must be verified and updated using multifunctional risk-based
surveillance execution results, audit findings, and input from external sources as applicable.

c. PCSA Ratings Determinations. PCSA rating determinations other than Confident
require:

(1) Narrative of what the prime contractor specifically failed to do to mitigate a
subcontractor performance issue.

(2) Narrative of the prime contractor’s corrective and preventive actions for
resolving subcontractor management issues (e.g., CAR, CAP).

(3) Narrative of DCMA action and assessment of potential impact on the program.
Insert the narratives in the Notes section of the PCSA tab for use by the PI in completing the
PCSA section of the PAR.

d. PCSA Results. PCSA results are provided to the PI at least quarterly in accordance with
the PSP.

3.7. SUPPLIERS DRIVING THE RATINGS. The PI or SPI will use functional inputs, the
CAR eTool, and the Delegation eTool to populate the Suppliers Driving the Ratings table.


3.8. SPI SUPPORT PAR. The SPI must develop a Support PAR for each Major Program
effort they have been delegated to support. The PAR template is utilized for this purpose;
however, the applicable content of the reporting is as specified in the delegation from their next
higher SPI or PI.

a. SPI Drafts Support PAR in Collaboration with SPST Functional Input. The SPI
must familiarize themselves with all of the inputs provided by their SPST members in their
program’s PST Collaboration Site. The content for these inputs will be determined by the
accepted Letter of Delegation (LOD) and may include Support PARs and other types of reports
from sub-tier CMOs, eFITs, EVM information, Production Supportability Tables, CBS, and
others.

(1) SPI Discussions to Reach Consensus on Predictive Analysis. It is imperative that the
SPI have discussions with their SPST members to reach a consensus on the impact the various
inputs have on DCMA’s predictive assessments. The Support PAR or alternative report must
include summarized information derived from functional input that contributes to the rating
assessment conclusions. This summary must be written by the SPI using the SPST members’ input.

(2) SPI Use of the PAR Template. The SPI must use the PAR Template found on the
Resource Page to create the Support PAR unless an alternative format was delegated. The
Support PAR, at a minimum, will contain the content specified in the applicable LOD. If the
PAR Template is used the SPI must label all non-applicable Support PAR elements as “N/A.”
Final Support PARs must be submitted by the due date specified in the LOD.

b. SPI Initiates CMO Review Process (Optional). A CMO review process is not required.
If the CMO adds a review process, the SPI must review and consider all comments provided
by any CMO reviewers. The SPI must update the draft PAR to include those comments as
needed. The review must not interfere with the PAR submission timelines identified by the PI.

c. Report Submitted. The SPI must load the Support PAR or alternate report into the
“Program Documents” tab in the program’s PST Collaboration Site.


APPENDIX 3A: CONTRACT DATA EVALUATION METRICS

3A.1. COST DATA INTEGRITY INDICATORS. CPR/IPMR Data Integrity Indicators are
metrics designed to provide confidence in the quality of the data being reviewed rather than
insight into the performance of a contract. The EVM Analyst should report any WBS elements
exhibiting one of the conditions tested for by these metrics.

a. BCWSCUM > BAC. The Budgeted Cost for Work Scheduled (BCWS) is the contract
budget time-phased over the period of performance. The summation of BCWS for all reporting
periods should equal the BAC. In other words, BCWS summation for all reporting periods
(BCWSCUM) should equal BAC in the month the contract is planned to complete. Both of these
values can be found on the IPMR/CPR Format 1. Due to this relationship, the value of
BCWSCUM should never exceed BAC. Errors may exist in EVM data resulting in this condition,
making it necessary to perform this check. Compare the value of BCWSCUM to the value of
BAC; if BCWSCUM is greater than BAC, consider this an error in the EVM data, as there is no
plausible explanation for it. There is no issue if BCWSCUM is less than or equal to BAC.

b. BCWPCUM > BAC. The Budgeted Cost for Work Performed (BCWP) is the amount of
BCWS earned by the completion of work to date. Like the BCWSCUM, the Budgeted Cost for
Work Performed, cumulative (BCWPCUM), cannot exceed the value of BAC. The contract is
considered complete when BCWPCUM equals BAC. Compare the value of BCWPCUM to BAC.
If BCWPCUM is greater, then this is an error, otherwise there is no issue.

c. ACWP with No BAC. The Actual Cost of Work Performed (ACWP) is the total dollars
spent on labor, material, subcontracts, and other direct costs in the performance of the contract
statement of work (SOW). These costs are controlled by the accounting general ledger and
should reconcile between the accounting system and EVMS. Work should only be performed if
there is a clear contractual requirement. The BAC is required to be traceable to work
requirements in the contract SOW. If work is performed and the ACWP incurred without
applicable BAC, there may be a misalignment between the work and the requirements of the
contract. To test for this condition, simply review the IPMR/CPR Format 1 data for WBS
elements containing any instance of current or cumulative ACWP but no BAC. If there are
elements that meet these criteria, the contractor should provide justification. If no justification is
provided, consider this an error.

d. Negative BAC or EAC. BAC is the total budget assigned to complete the work defined
within the contract. Likewise, EAC is the Estimate at Completion of the work. A negative total
budget is not logical. To test for this condition simply examine the IPMR/CPR Format 1 data for
a BAC or EAC less than zero. This test should be performed at the reported WBS levels as well
as the total program level. A BAC or EAC less than zero should be considered an error.

e. Negative BCWSCUM or Negative BCWSCUR. The BCWS is the time-phased contract
budget. The summation of BCWS for all reporting periods equals the total contract BAC. When
the initial baseline is established there should be no instances of negative BCWS. However, as
work progresses there may be legitimate reasons for re-planning of budget. Changes to the
baseline may result in a negative value for budget in the current reporting period (BCWSCUR). It


is not possible to re-plan more budget than has already been time-phased to date. Therefore,
there should not be an instance of negative BCWSCUM. To test for this condition simply examine
the current and cumulative sections of the IPMR/CPR Format 1 for BCWSCUM or BCWSCUR less
than zero.

f. Negative BCWPCUM or Negative BCWPCUR. There may be negative BCWP due to an
incorrect determination of earned value. To test for this condition, simply examine the current
and cumulative sections of the IPMR/CPR Format 1 for BCWPCUM or BCWPCUR less than zero.

g. BCWP with No ACWP. Since work or materials must be paid for, it should not be
possible to earn BCWP without incurring ACWP. An exception is elements using the Level of
Effort (LOE) Earned Value Technique (EVT); in that case, the condition would signify that the
support work that was planned to occur is not occurring due to some delay. This metric can be
calculated using the IPMR/CPR Format 1 data. Inspect the elements on the report for any
instance of current or cumulative BCWP with a corresponding current or cumulative ACWP
equal to zero.

h. Completed Work with Estimate To Complete (ETC). Work is considered complete
when an element’s BCWPCUM equals the element’s BAC; the ETC is the to-complete portion of
the EAC. The ETC should be zero if the work is complete, as there should be no projected
future cost left to incur. Look for completed elements (BCWPCUM = BAC) with an ETC other
than zero. This condition may exist if labor or material invoices are lagging behind and haven’t
been paid yet. Be sure to adjust your EAC forecast to accommodate this error and refer the issue
to the EVMS Center.

i. Incomplete Work without ETC. If work has yet to be completed, there should be a
forecast of the remaining costs to be incurred. Determine if there are any elements that are
incomplete (BCWPCUM < BAC) and contain an ETC of zero. If this condition exists, consider it
an error.

j. ACWP on Completed Work. There may be valid reasons to incur cost (ACWP)
following the completion of work (BCWPCUM = BAC). However, this should not be considered
the norm. Review the IPMR/CPR Format 1 for the following:
• BCWPCUM = BAC
• BCWPCUR = 0
• ACWPCUR ≠ 0
Keep in mind there may be costs incurred in the month the element of work is complete. That is
why it is necessary to check for BCWPCUR; this ensures the work was completed in a prior
period. If ACWPCUR returns a value other than zero, the metric is flagged.

k. BCWP with No BCWS. Since all budgeted work performed should have been
scheduled, occurrences of BCWP without BCWS should be commensurate with early starts in
the IMS. The values do not have to be equal since actual work will rarely match the baseline
work during project execution, but the values will be equal at project completion. This metric can
be calculated using the IPMR/CPR Format 1 data. Inspect the elements on the report for any
instance of current or cumulative BCWP with a corresponding current or cumulative BCWS
equal to zero.


l. ACWPCUM > EAC. The EAC consists of two components, the actual costs incurred to
date (ACWPCUM) and the estimate of future costs to be incurred or the ETC. The ACWPCUM can
only be greater than EAC if the ETC is negative or if extra cost was incurred/recorded due to correction
of accounting, management, or ledger errors. There may be limited cases that would require a
negative ETC. Using the IPMR/CPR Format 1, examine the elements for any condition of
ACWPCUM greater than EAC. If this condition exists, adjust your EAC forecast to accommodate
the condition and refer the issue to the EVMS Center.
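Several of the cost checks above reduce to simple comparisons once Format 1 has been parsed. A sketch covering a subset of the conditions follows; the dictionary keys are assumptions about the parsed data layout:

```python
def cost_data_integrity_flags(e):
    """Return the integrity conditions a single Format 1 WBS row trips.
    Paragraph references are to 3A.1; only a subset is shown, and the
    current-period variants of each test work the same way."""
    flags = []
    if e["bcws_cum"] > e["bac"]:
        flags.append("BCWS(cum) > BAC")              # 3A.1.a
    if e["bcwp_cum"] > e["bac"]:
        flags.append("BCWP(cum) > BAC")              # 3A.1.b
    if e["acwp_cum"] > 0 and e["bac"] == 0:
        flags.append("ACWP with no BAC")             # 3A.1.c
    if e["bac"] < 0 or e["eac"] < 0:
        flags.append("Negative BAC or EAC")          # 3A.1.d
    if e["bcws_cum"] < 0:
        flags.append("Negative BCWS(cum)")           # 3A.1.e
    complete = e["bcwp_cum"] == e["bac"]
    etc = e["eac"] - e["acwp_cum"]
    if complete and etc != 0:
        flags.append("Completed work with ETC")      # 3A.1.h
    if not complete and etc == 0:
        flags.append("Incomplete work without ETC")  # 3A.1.i
    if e["acwp_cum"] > e["eac"]:
        flags.append("ACWP(cum) > EAC")              # 3A.1.l
    return flags
```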

3A.2. SCHEDULE DATA INTEGRITY INDICATORS. To begin the analysis, exclude
completed tasks, LOE tasks, subprojects (called summary tasks in MS Project), and milestones.
These metrics provide the analyst with a framework for asking educated questions and support
forecasting of schedule completion and estimates at completion. Identified concerns may be
issues of compliance and will be referred to the EVMS Center for follow-up.

a. Logic. This metric identifies incomplete tasks with missing logic links. It helps identify
how well or poorly the schedule is linked together. Any incomplete task that is missing a
predecessor and/or a successor is included in this metric.

b. Hard Constraints. This is a count of incomplete tasks with hard constraints in use.
Using hard constraints (e.g., Must-Finish-On (MFO), Must-Start-On (MSO), Start-No-Later-Than
(SNLT), and Finish-No-Later-Than (FNLT)) may prevent tasks from moving with their
dependencies and, therefore, prevent the schedule from being logic-driven. Soft constraints such
as As-Soon-As-Possible (ASAP), Start-No-Earlier-Than (SNET), and Finish-No-Earlier-Than
(FNET) enable the schedule to be logic-driven.

c. Invalid Dates. This area of analysis includes planned tasks that have a forecast
start/finish date prior to the IMS status date, completed tasks that have actual start/finish dates
beyond the IMS status date, incorrectly statused finish dates when a task is not complete, and
tasks that have riding start dates. There should not be any invalid dates in the schedule.

d. Critical Path Test. The purpose is to test the integrity of the overall network logic and,
in particular, the critical path. If the contract completion date (or other milestone) is not delayed
in proportion (assuming zero float) to the amount of intentional slip that is introduced into the
schedule as part of this test, then there is broken logic somewhere in the network. Broken logic
is the result of missing predecessors and/or successors on tasks where they are needed. The IMS
passes the Critical Path Test if the project completion date (or other task/milestone) shows a
negative total float number or a revised Early Finish date that is in proportion (assuming zero
float) to the amount of intentional slip applied.

e. Milestones with Duration. Includes milestones that are planned or in-progress whose
duration is greater than zero. Per the Earned Value Management System Interpretation Guide
(EVMSIG), milestone tasks should not have a duration.

f. Missing WBS. Activities without WBS values indicate poor planning and cause problems
in reporting information about that task. This metric includes only normal activities and
milestones that are planned, in-progress, or complete.
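
Several of these indicators can likewise be screened mechanically before the analyst's review. A minimal illustrative sketch, covering paragraphs 3A.2.a through 3A.2.c only and assuming IMS tasks exported as dictionaries with hypothetical field names:

# Illustrative screen for schedule data-integrity indicators a through c.
# Assumes completed, LOE, summary, and milestone tasks are already excluded.
HARD_CONSTRAINTS = {"MFO", "MSO", "SNLT", "FNLT"}

def schedule_flags(task, status_date):
    flags = []
    # a. Logic: incomplete tasks need both a predecessor and a successor
    if not task["predecessors"] or not task["successors"]:
        flags.append("missing logic link")
    # b. Hard constraints keep the schedule from being logic-driven
    if task["constraint"] in HARD_CONSTRAINTS:
        flags.append("hard constraint in use")
    # c. Invalid dates: unstarted tasks should not forecast before the status date
    if task["actual_start"] is None and task["forecast_start"] < status_date:
        flags.append("forecast start earlier than status date")
    return flags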

APPENDIX 3B: DCMA COST AND SCHEDULE ESTIMATES AT COMPLETION

3B.1. COST.

a. EAC. The EAC is the projection of the final cost of the contract/program. The process
starts with how much has already been expended (ACWP), then considers how much work
remains, the Budgeted Cost of Work Remaining (BCWR), and how much it will cost to finish
that work relative to the expenditures or costs incurred or recorded ("actuals") to date. The
EAC is therefore made up of "actuals" and the Estimate (of the cost of the work left) To
Complete (ETC).

Estimate at Completion = EAC = ACWP + ETC


where ETC = (BAC − BCWP) / PF + Risks/Opportunities + ⋯

where the Performance Factor (PF) could be the Cost Performance Index (CPI), Schedule
Performance Index (SPIx), CPI x SPIx, or x(CPI) + (1−x)(SPIx) with 0 < x < 1.0, as determined
with elaborated rationale (Table 2). Completion of an OTB/OTS may impact the utility of any
performance factor, and the impacts of the changes should be understood as part of the EAC
development.
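
As a worked illustration with hypothetical values: if BAC = $100M, BCWPCUM = $40M, and ACWPCUM = $50M, the cumulative CPI is 40/50 = 0.80. Using cumulative CPI as the PF and setting risk/opportunity adjustments aside, ETC = (100 − 40)/0.80 = $75M, so EAC = 50 + 75 = $125M. A less efficient cost history (lower CPI) inflates the estimate for the remaining work and therefore the EAC.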

Table 2. Best Predictive EAC Performance Factors by Contract Completion Status

EAC Performance Factor (PF)      | Early (0%-40%) | Middle (20%-80%) | Late (60%-100%) | Comment
CPI, Cumulative                  | X | X | X | Assumes the contractor will operate at the same efficiency for the remainder of the contract; typically forecasts the lowest possible EAC
CPI, 3-Period(b) Avg             | X | X | X | Weights current performance more heavily than cumulative past performance
CPI, 6-Period(b) Avg             |   | X | X | Weights current performance more heavily than cumulative past performance
CPI, 12-Period(b) Avg            |   | X | X |
CPI x SPIx, Cumulative           |   | X | X | Usually produces the highest EAC
CPI x SPIx, 6-Period(b) Avg      |   | X | X | A variation of this formula (CPI6 x SPI6) has also proven accurate
SPICUM, Cumulative               | X |   |   | Assumes schedule will also affect cost, but is more accurate early in the contract than later
Regression                       |   |   | X | Using CPI that decreases within 10% of its stable value can be a good predictor of final costs and should be studied further
Weighted, x(CPI) + (1−x)(SPICUM) | X | X |   | Weights cost and schedule; statistically the most accurate, especially when using 50% CPI + 50% SPICUM(a)

(a) According to DOD comments based on the work of David S. Christensen.
(b) Changed Month to Period.

(1) WBS Level EAC. Performed at the WBS reporting level and rolled up to the
contract level. This EAC includes ACWP and ETC. ETC comprises the rollup of all the
remaining work, risks and opportunities at WBS reporting level, and risk adjustments that can be
mapped to a specific WBS element.

(a) Performance Factor. Select the Efficiency Factor that best describes the contract
and WBS element being evaluated using Table 2. On large programs, not all contracts or even
WBS elements will utilize the same performance factor method to accurately estimate their
completion value. Although the method may differ by contract or WBS element, the methods
should remain consistent from reporting period to reporting period.

(b) Risk Adjustments. The CPR/IPMR provides the contractors’ most likely EAC
(EACKtr) that accounts for some program/contract risk factors. It is important to review the
program/contract risk registry and determine if the risks included by the contractor in their most
likely EAC are reasonable. These risks may present a consequence in terms of either cost or
schedule. Technical risks may impact schedule and cost performance and give rise to cost and
schedule risks. Risks, contingencies, and mitigation plans should be included in the schedule
and thereby in the cost system. Risks and issues not accounted for by the contractor or
adequately addressed through the performance factor should be considered. Participation in risk
management meetings between the contractor and the program office will facilitate this
understanding.

(c) Realism. The determination of the estimate of work to complete (ETC) is directly
proportional to the budgeted cost of work remaining (BCWR). Similar to CPI, the remaining
work performance index, or To Complete Performance Index (TCPI), is the ratio of the work
remaining (BCWR) to the future cost of the work remaining (ETC). Evaluating this ratio can
help determine which WBS level EACs (contractor or DCMA) are not in line with past
performance and need further review to verify that DCMA is provided a reasonable estimate,
based on the contractor's historical CPI. A mathematical difference between TCPI and
cumulative CPI of 0.05 to 0.1 is used as an early warning indicator that the forecasted
completion cost could be unrealistic, stale, or not recently updated. It is important to remember
that this is only a guide for focusing analysis.
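
As a hypothetical illustration of this realism check: with BAC = $100M, BCWPCUM = $60M, and a reported ETC of $35M, TCPI = BCWR/ETC = (100 − 60)/35 ≈ 1.14. If the cumulative CPI is 0.90, the difference of roughly 0.24 far exceeds the 0.05 to 0.1 early warning band, suggesting the ETC assumes a future efficiency well above demonstrated performance and warrants further review.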

(2) Contract Level. Determine the EACDCMA at the contract level.

(a) WBS Level Rollup. To determine the EACDCMA for the contract, the WBS level
EACs must be summed to the Contract level. The rollup is performed at the lower level first to
prevent skewing the data, which would result from averaging out the performance of individual
WBS elements by determining the performance factor only at the contract level.

(b) Risks and Opportunity Adjustments. Any PST provided or known risk and
opportunity impacts adjustments that were not included at the WBS level will be applied at the
contract level to adjust the EACDCMA.

(c) MR Consumption. MR Consumption provides insight into how quickly the MR
is being depleted. Approved MR requests result in adjustments to the baseline, in terms of both
time and the allocation or loading of additional budget into the PMB or BAC; the BAC or PMB
value may increase or decrease as a result. MR is increased or decreased depending on the type
of modifications, task revisions, reprogramming, replanning, rate changes, and other
factors. MR consumption is measured by dividing the program percent complete (%comp) by
the percentage of MR (%MR) used to date.

1. The contractor is required to track debits and credits to MR over time. These
changes should be reflected in the IPMR Format 5. It is important to account for all the MR
debits and credits when calculating this metric. It is not simply the current value of MR divided
by the original value of MR. In fact, if there are significant credits to MR since program
inception, the current MR value might actually be greater than the original value, even if there
was a debit of some MR.

2. The resulting MR consumption value should be equal to 1.0 ± 0.1. A value
greater than 1.0 indicates that the MR is possibly being withheld too conservatively, while a
value less than 1.0 indicates that there may not be enough MR to support the program through
completion. If the rate of MR usage is high, then it may indicate the original PMB did not
contain the necessary budget for accomplishing the contract SOW. Graphical representations
depicting MR consumption for both cases are shown in Figure 1. It is important to monitor
and trend MR use over time when performing predictive analysis. Determine whether all the
MR is likely to be used when making the decision for the EACDCMA.
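
As a hypothetical example: a program that is 60 percent complete and has used 75 percent of its MR (net of all debits and credits) has an MR consumption of 0.60/0.75 = 0.80. Because this falls below the 1.0 ± 0.1 band, there may not be enough MR remaining to support the program through completion.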

Figure 1. Management Reserve Consumption

[Figure: chart plotting %MR used and %comp over time, with the resulting MR Consumption ratio and the lower MR consumption margin; graphic not reproducible in text.]

(d) Realism. Calculate the EAC Realism of the contract level EAC for both the
DCMA’s and the contractor’s EAC. At the contract level, DCMA uses TAB instead of BAC in
the TCPI formula since historical trends show that most programs consume all the MR by the
end of the contract. The EAC realism value for both the Contractor and DCMA will be reported
to the PI as part of the EVM Analyst input.

b. VAC. VACDCMA is calculated at the contract level and represents the difference between
the TAB or Contract Budget Base and the estimated final cost. VAC is also measured in
percentage (%) and known as VAC percentage (VAC%). Why does DCMA use TAB instead of
BAC? At the contract level, as MR is used, it becomes part of the PMB, increasing the BAC.
Since most major programs historically use all their MR, the BAC at the completion of the
contract would equal the TAB. Given this, to be predictive, DCMA uses TAB to begin with to
reduce VAC fluctuations caused by applying MR. If a VAC is being calculated at the WBS
level, then BAC is still used as there are no WBS level TAB values.

VACDCMA% = (VACDCMA / TAB) × 100%
where VACDCMA = TAB − EAC
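
For example, with a hypothetical TAB of $500M and an EACDCMA of $560M, VACDCMA = 500 − 560 = −$60M, and VACDCMA% = (−60/500) × 100% = −12%, which would fall in the Yellow band of Table 4.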

c. Comparative Analysis to Contractor Values. Explain DCMA's methodology used to
generate the EAC value and explain significant differences between the contractor's EAC and
DCMA's.

3B.2. SCHEDULE.

a. Status. A variety of variables can be used to illustrate the current status of the schedule
(e.g., Status Date, ECD, Percent Complete, Activity Counts, and Remaining Duration).

b. Baseline Execution Index. The BEI metric is an IMS-based metric that calculates the
efficiency with which tasks have been accomplished when measured against the baseline tasks at
a Status Date. BEI compares the cumulative number of tasks completed to the cumulative
number of tasks with a baseline finish date on or before the status date of the reporting period.
BEI does not provide insight into tasks completed early or late (before or after the baseline finish
date), as long as the task was completed prior to the status date of the reporting period. Missed
Task metrics provide further insight into on-time performance.

(1) If the contractor completes more tasks than planned, then the BEI will be higher than
1.00, reflecting a higher task throughput than planned. A BEI less than 0.95 should be
considered a flag and requires additional investigation. The PST needs to investigate areas of
interest including, but not limited to:

• What is causing the work to not be performed on time?
• Are the missed tasks on the critical path?
• Is there an impact to cost? If so, then what is the projected impact?
• Is there a monthly trend?
• What is the contractor doing to remedy the situation?
• What is DCMA doing to track performance?

(2) A consistently downward trending BEI and increasing missed task percentage can be
associated with variance trends. If these trends continue over the long run, the schedule may
become unreliable.
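
A minimal sketch of the BEI computation (see the Glossary for the formula), assuming IMS tasks as dictionaries with hypothetical field names and with summary and LOE tasks already excluded:

# Illustrative BEI: tasks completed versus tasks baselined to finish
# on or before the status date. A result below 0.95 is a flag.
def baseline_execution_index(tasks, status_date):
    completed = sum(
        1 for t in tasks
        if t["actual_finish"] is not None and t["actual_finish"] <= status_date
    )
    baseline_count = sum(
        1 for t in tasks
        if t["baseline_finish"] is not None and t["baseline_finish"] <= status_date
    )
    return completed / baseline_count if baseline_count else 1.0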

c. Missed Tasks. Missed tasks are incomplete tasks with a baseline finish date that is
before the status date. A consistently high percentage of missed tasks must be examined with
regard to the nature of the late completion dates. Use the following guidelines to make a proper
determination:

(1) A high percentage of missed tasks could result from a series of tasks that are, on
average, only a few days late or, conversely, months late. This is a clue when developing an
independent forecast date. For example, if the contractor is 76 days late on average in
completing tasks, then the analyst can use this information as a partial basis for forecasting that
the contractor will be 76 days late to program completion.

(2) If the missed tasks are always non-critical but the critical path tasks are consistently
completed on time, the high missed task percentage may not be a major concern.

(3) If the missed tasks are always on the critical path, then the low hit task percentage is
a major concern and should be tracked as a risk.

(4) A consistently high missed task percentage and a consistently good BEI could mean
the tasks are only slightly late on average and the actual finishes for those late tasks may never
impact the BEI beyond the current period.

d. Critical Path and ECDDCMA. The ECDDCMA provides predictive insight into project completion.
The analyst must ensure that the ECDDCMA is independent, based on in-depth critical path
analysis, evaluates past performance, leverages multifunctional surveillance results, and uses
available metrics and tools to provide measureable data. In order to perform this analysis, an
analyst first identifies the critical path.

(1) Critical Path. The program critical path is the sequence of discrete tasks/activities in
the network that has the longest total duration through the contract. Discrete tasks/activities
along the critical path have the least amount of float/slack. Be wary of contractor methodology
that states the critical path is comprised of all the tasks with zero or less total float. This is not
the same as the longest total duration with the least amount of total float (i.e., a single number,
not a range of numerical values). DCMA will utilize the contract-submitted data to represent the
contractor's values but, based on PST Member inputs, may have to predict DCMA forecasted
completion dates that differ from the contractor's, which may result in a different critical path.

(2) Event and Milestone ECD. The ECD is measured at each of the program milestones,
contractually required events, and the contract finish date. As a contract can have a large number
of milestones and events, it is recommended for reporting purposes to focus on those that exist
either on the critical path or are of significant interest to the customer. The contractor’s work
schedule determines how many days equal a month. For example, a contractor that does not
work weekends has approximately 22 work days in a calendar month. Measurement can also be
done using calendar days. The net effect of all the task slippages on the critical path and
therefore the contract will indicate by how much the ECD has deviated from the Baseline.

(a) Baseline Finish Date. The baseline finish date is the scheduled completion date
found in the “Baseline Finish” field of each task or milestone in the IMS.

(b) ECDKtr. The ECD is defined as the date found in the “Forecast Finish” field of a
properly networked IMS (which is typically the same as the date found in the “Early Finish”
field) of each milestone or task.

(c) ECDDCMA. The EVM Analyst will report to the PI the most recently
completed milestone or contractually required event, the next milestone or event, the contract
finish date, and any other tasks, milestones, or events that illustrate DCMA's projection of the
contractor's progress to the schedule.

(3) Comparative Analysis to Contractor Values. Provide the PI with an explanation of
any significant difference between the Contractor and DCMA estimates as well as an
explanation of the root cause of any slippages.

SECTION 4: PROGRAM REPORTING

4.1. QUARTERLY PAR.

a. PI Drafts PAR. DCMA PARs provide our acquisition partners with a comprehensive
and unbiased assessment of the health of the program. PIs must adhere to the contents of the
Program Support Analysis and Reporting User Guide found on the Resource Page in the creation
of their PAR. PARs must be submitted quarterly according to the schedule for their
assigned grouping (i.e., A, B, or C) in Table 3. Program group assignments can be found in the
IWMS PAR tool. The due date for PAR approval is NLT the 6th business day of the month for
the group to which they are assigned. It is imperative that all parties responsible for PAR
development and approval comply with the PAR Group Timeline (Figure 2) to ensure comments
are received prior to the PAR approval submission as well as for timely PAR distribution for
OSD review.

Table 3. PAR Report Months


Group A Feb May Aug Nov
Group B Mar Jun Sep Dec
Group C Apr Jul Oct Jan

Figure 2. PAR Group Timeline

(1) PI Reviews Functional and SPI Inputs. PIs must familiarize themselves with all of
the inputs provided by their PST members and SPIs in the program’s PST Collaboration Site.
Inputs may include Support PARs, reports from sub-tier CMOs, eFITs, EVM analysis, and
Production Supportability Tables.

(2) PI Collaborates with PST Functional Members and SPIs. It is imperative that the PI
have discussions with their PST functional members, SPIs, and applicable ACOs, DACOs or
CACOs. This serves to better appreciate the inputs provided and to reach consensus on the
impact the various inputs have on DCMA's predictive assessments pertaining to program cost
and schedule.

(3) PI Drafts PAR. To ensure the pre-populated PAR information is correct, the
PI must verify the Program Information is current. The CPA, PA, and MA synopses provide
senior leaders the bottom line up front (BLUF) by summarizing the primary drivers to the
aggregate assessment. They should be written at a strategic level and provide a summary
overview of DCMA’s perspective on program performance. Do not include any information that
is not mentioned in the assessment narratives. State the impacts, issues, and risks driving the
rating. Quantify contract and program impacts. Do not include the Assessment Color, rating
period, or program name in the synopsis.

(a) PAR Section 1. The PI must ensure that the PAR Section 1 data elements are
completed as follows.

1. DCMA CPA. The CPA provides an aggregate assessment of the program's
health based on cost and schedule, using current technical issues or risks analyses, within the
context of contracts administered by DCMA. The assessment must reflect DCMA’s independent
predictive analysis. Complete this section of the PAR using the criteria found in Appendix 4B.
The CPA has two entries, Assessment Color and Assessment Narrative.

2. DCMA PA. The PA provides an independent assessment for contracts
administered by DCMA, regarding the contractor's ability to meet all required production goals
and requisite capabilities for manufacturing, assembly, and integration, including hardware and
software. EMD contracts will also be assessed with respect to their prototypes or impacts on the
production contract deliveries. Complete this section of the PAR using the criteria found in
Appendix 4C. The PA has two entries, Assessment Color and Assessment Narrative.

3. DCMA MA. The MA provides a CBS assessment for the prime contractor(s).
The contractor must establish and maintain an acceptable CBS for contracts subject to the Cost
Accounting Standards and containing one or more of the six business system clauses. Refer to
Appendix 4D for more information.

4. Aggregate Rollup. Complete the Aggregate Ratings for the reporting month.
When multiple contracts exist, the Lead CMO or cognizant CMO will determine the aggregate
rollup method. Refer to the appropriate section of the Program Support Analysis and Reporting
User Guide located on the Resource Page.

(b) PAR Section 2. Supplemental Analysis for CPA, PA, and MA in PAR Section 1.
Complete this section of the PAR using the criteria found in Appendices 4B and 4C.

1. Supplemental Analysis for CPA, PA, and MA in PAR Section 1. Describe the
aggregate methodology used and any extenuating circumstances affecting the aggregate rating.

2. CPA Supplemental Analysis. Narrative that reinforces the drivers of the CPA
rating and supplements PAR Section 1. Include the following or “no additional information” if
these do not apply:
a. For contracts, CLINs, and DOs discussed in PAR Section 1.1 with ratings
driven by cost:

• Substantiate the methodology for EACDCMA development; include the values for risks and MR use and other assumptions and rationale for developing the EACDCMA

• If there has been an OTB, assess the VACDCMA% against the Contract Budget Base to maintain visibility

b. Watch items that were not discussed in PAR Section 1.1 (CPA Narrative)
and do not currently impact the CPA rating; identify them as watch items.

3. PA Supplemental Analysis. Supplemental analysis that reinforces the drivers
of the PA rating. Include the following or "no additional information" if these do not apply:

a. Watch items that were not discussed in PAR Section 1.2 (PA Narrative)
and do not currently impact the PA rating; identify them as watch items.

b. Any risks or issues identified in the analysis of the Production Supportability Table.

4. MA Supplemental Analysis. Include analysis that reinforces the drivers of the
MA rating, along with any additional details derived from the Contract Business Analysis
Repository (CBAR) or the ACO that may warrant attention. Use this section to report any major
subcontracts with disapproved CBS that may directly or indirectly impact the program. If there
is no additional information to report, then state "No additional information."

5. Supply Chain Supplemental Analysis. Provide a synopsis of the narrative that
identifies what Supply Chain aspects are impacting the rating of the program.

(c) PAR Sections 3 and 4. The PI must ensure that the PAR Sections 3 and 4 data
elements are completed as follows.

1. Program Office Requested Data and Specific Reporting. Include any other
information that has been requested by the Program Office. This includes any Foreign Military
Sales (FMS) contract information that the Program Office has requested. If there is no requested
data or specific reporting, annotate “Not Applicable.”

2. Accuracy of PI Information. To ensure the pre-populated PAR information is
correct, the PI must review the information for currency prior to submitting the PAR for review
and approval.

3. Attachments. Do not upload attachments unless they are absolutely necessary
to tell the story or required by specific written requirements. This includes Support PARs,
eFITs, or EVM inputs.

b. PAR Review and Approval. After the draft PAR has been submitted for review in
Integrated Workload Management System (IWMS), all applicable DCMA organizations (e.g.,
CMO Leadership, Operational Units, and Headquarters) have the opportunity to review the draft
PAR. Review comments must be inserted in the IWMS PAR comment feature within the
following 3 business days. The intent of using the IWMS comments sections is to have a
common area to improve PAR quality before finalization. Visibility and documentation of these
comments and associated changes provide traceability, documentation, and insight to possible
training shortfalls and gaps for future mitigation through manuals and other venues. CMO
personnel must manage PAR improvement efforts through IWMS. Edits cannot be made after
PAR approval. The CMO Commander or Director is ultimately responsible and is the approval
authority for PAR content. The PAR approval due date for these quarterly submissions is NLT
the 6th business day of the month for the program group to which they are assigned.
(Subparagraph 1)

(1) PI Initiates the Agency Review Process. The PI must initiate the Agency PAR
review process per the appropriate section of the Program Support Analysis and Reporting User
Guide located on the Resource Page in accordance with the timeline in Figure 2.

(2) Program Management Office (PMO) Draft PAR Review. The PI must send a draft
copy of the quarterly PAR to the PMO for review and comment by the last business day of the
month. The PI must wait at least 3 business days from PMO submission to allow for PMO
feedback prior to PAR submission for approval. The PMO has the opportunity to comment prior
to the PI making the assessments available to DCMA management and possible dissemination
outside the Agency. The PI must inform the PMO that information contained within the
documents may be required to be input into DAMIR if subject to DAES reporting. If, after the
3-day window, the PMO has not provided any comments, the PI may consider this concurrence.

(3) PI Reviews Comments. The PI must review all comments for potential inclusion into
the PAR.

(4) PI Updates PAR. Based on the comments received from all reviewers, the PI updates
the draft PAR. The PI must not make any updates to the draft PAR based on PMO comments
which question our assumptions, conclusions, or independent assessment of the program. The PI
must, however, make updates to the draft PAR to correct any misstated facts.

(5) CMO Commander or Director Reviews PAR. The CMO Commander or Director or
designee must review the entire draft PAR submitted for approval. Returned PARs must have
comments explaining the reasons. Minimum elements to consider during review:

(a) Evaluate overall structure, readability, and an active voice format.

(b) Ensure PAR is free from spelling and grammatical errors.

(c) Ensure the issue or risk, root cause, mitigation strategy, contract or program
impact, and DCMA analysis are clearly stated.

(d) Ensure consistent data formats, to include date and dollar conventions.

(e) Ensure real-time, objective communication with the PM on development and
content of the program assessments.

(6) CMO Commander or Director Approves PAR. The CMO Commander or Director or
designee must approve all PARs ready for release. There is no method for editing a PAR after
approval as it becomes an official document. Further correction can only be accomplished by
cancelling and republishing a new PAR or through PNs.

(7) PI Utilizes Customer Feedback. The PAR contains a survey link for customer
feedback. When a customer completes the survey, the feedback is provided to the CMO via
DCMA Headquarters. The CMOs should consider this feedback as an opportunity to improve
future reporting.

c. PAR Distribution. PARs must be distributed to our external acquisition partners in
accordance with the appropriate section of the Program Support Analysis and Reporting User
Guide located on the Resource Page within 1 business day after PAR approval.

d. PAR Quality. The Operational Unit representatives must provide an independent
evaluation of the quality of PARs developed by the CMOs. The purpose of this is to focus
Agency efforts on improving the quality of its program reporting. These quality reviews are
coupled with engagement with CMO personnel to facilitate improvement through collaboration
when a PAR is considered to be of insufficient quality. CMOs have the option to use the PAR
Scoring Rubric to perform a self-assessment.

(1) Operational Unit Identifies PARs to be Reviewed. The applicable Operational Unit
representative must develop a plan for evaluating the quality of the PARs under their cognizance.

(2) Operational Unit Reviews PAR Using Scoring Rubric and Rework Metric. The
Operational Unit reviews their PARs using the Scoring Rubric and Rework Metric according to
their plan.

(3) Operational Unit Provides PAR Score, Feedback, and Recommendations to the PI, PI
FLS, and CMO Commander or Director. The Operational Unit must provide the results of their
review to the CMO Commander or Director, the PI FLS, and the PI at a minimum. The results
of the review should include the completed PAR Rubric coupled with any recommendations for
improvement that should be incorporated into the next PAR.

(4) Operational Unit and CMO Jointly Develop and Implement Get Well Plan. If an
Operational Unit PAR quality review results in a score of less than 85 percent, then a get well
plan must be jointly developed between the Operational Unit representative and representatives
from the CMO. The purpose of the get well plan is to develop an action-oriented approach to
impart the requisite knowledge and skills so that subsequent PARs are developed to a sufficient
quality. The plan must, at a minimum, include the steps to be taken to
improve future PAR quality coupled with estimated dates in which each of these steps will be
completed. The plan will then be implemented by the estimated dates.

(5) Operational Unit and CMO Utilize Scoring Rubric and Rework Metric. If the result
of the initial PAR quality review was a score of less than 85 percent, then a follow-up review of
a subsequent PAR for that program must be completed by the applicable Operational Unit
representative using the complete Scoring Rubric and Rework Metric. The results of the follow-
up review must be distributed to the same CMO personnel as the original results.

(6) Operational Unit Loads PAR Scores. The Operational Units check out the PAR
Scoring Rubric spreadsheet from the DAES Strategic Metrics (PAR Rubric) library of the MPS
DAES Site and update the spreadsheet tabs, corresponding to the programs being reviewed, by
the last business day of the report month. A link to the MPS DAES Site is located on the
Resource Page.

(7) PM&BI Provides Agency metrics to DCMA Director. The PM&BI Executive
Directorate must provide the DCMA Director with metrics that synopsize the results of the PAR
quality evaluations performed throughout the period.

e. PAR Blackout. The quarterly PAR submitted may contain source selection sensitive
information. Caution must be given to programs with ongoing source selections so that
competition sensitive information is not released. Source selection information, as defined in
Federal Acquisition Regulation 2.101 and 3.104-4, must be excluded from DAES Assessments. Other
information which could jeopardize the competitive nature of a successful source selection must
also be excluded. Only DCMA personnel with a need to know will have access to a PAR for a
source selection sensitive program. PIs should be aware of all circumstances where a program is
subject to source selection restrictions. Information pertaining to the establishment of program
Blackout is contained in DCMA-MAN 3101-01, “Program Support Life Cycle.”

4.2. PROGRAM NOTIFICATION (PN).

a. Summary. PNs are used to communicate updates and observations in addition to an
independent assessment of significant issues, risks, or opportunities affecting a program when
the reporting timeframe is outside the standard PAR cycle. The PN, just as with the PAR, is an
official DCMA product of record. However, the PN is a near-real time reporting tool allowing
for the dissemination of actionable acquisition insight in a more streamlined fashion than the
PAR. A PN is required to officially inform other customers even if this information has already
been provided to the Program Office by other means. SPIs utilize PNs in much the same fashion
as PIs. However, SPI PNs must be supplied to the next higher tier SPI or the PI as applicable.
The PI’s CMO Commander or Director is the release authority for any PN that is based on or
contains SPI PN information.

b. Minimum Conditions Requiring a PN. PNs must be generated when:

(1) The CMO has determined that the information in the previous PAR or PN has
changed in a significant way, not only when a rating changes.

(2) The CMO identifies an issue or risk that changes the CPA, PA, and/or MA ratings
unless those changes can be incorporated into the next PAR in a near-real time basis.

(3) There is any significant evolution to an event described in the immediately preceding
PAR or PN.

(4) A program related event has been submitted as an Agency Weekly Activity Report
(WAR) entry.

(5) Any significant updates to previously reported forecasted dates.

(6) Upon stakeholder request.

c. PI or SPI Develops PN. The PI or SPI develops the PN according to the appropriate
section of the Program Support Analysis and Reporting User Guide located on the Resource Page.

d. PN Review. The following process must be used for PN review.

(1) PI/SPI Forwards the Completed PN for Review. The PI must submit the completed
PN to the Lead CMO Commander or Director, or their designee, for review. SPIs may draft PNs
for the PI but must upload them to the Program Notification tab in the program’s PST
Collaboration Site. The PI will review the SPI PN for content and the need to integrate it with
other draft PNs, and will determine whether the PN will be released. The PI then coordinates any
updates prior to the PI submitting the PN to the Lead CMO for review.

(2) Lead CMO Commander or Director Review of PN. PNs must be reviewed by the
Lead CMO Commander or Director prior to release to ensure PN adequacy. When the Lead
CMO Commander or Director completes their review, they may approve, disapprove with
comments, or reject the PN. SPI PNs do not require formal approval by their CMO Commander
since only the approved PNs will be releasable outside of DCMA.

(3) Lead CMO Commander or Director Communicates Changes to PI. If the Lead CMO
Commander or Director is either returning the PN back to the PI for changes or is deleting the
PN, specific comments should be made to the PI clearly describing the rationale for either
returning or deleting the record. If the PN is returned with comments, the PI must address those
comments and resubmit the PN for review and approval.

e. Approve PN. The following process must be used for PN approval.

(1) PN Attachments. The intention is not to include any attachments unless they are
absolutely necessary to tell the story.

(2) PI Converts PN to Portable Document Format (PDF). Prior to approval, the PI must
convert the PN to a PDF.

(3) Lead CMO Commander or Director Approves PN. The Lead CMO Commander or
Director, or their designee, must digitally sign the PN when it is ready for release. The Lead
CMO Commander or Director will return the PN to the PI.

(4) Posting the PN. PNs will be loaded into the program’s PST Collaboration Site.

f. PN Distribution. PNs must be distributed to our external acquisition partners utilizing the
same means as PAR distribution.

g. PI Incorporates PN Information in Next PAR. Information contained in PNs must be
addressed in the next PAR for continuity purposes. For example, if the CMO has used a PN to
identify the disapproval of a CBS and the decision was subsequently made to exercise a withhold
of payment, then this should be addressed in the next PAR.

4.3. DAES ASSESSMENT INPUT.

a. Summary. All DAES reportable programs are submitted to DAMIR on a quarterly basis
according to the Group Assignment. Each program has a CPA, PA, and MA color rating,
synopsis, and narrative assessment entered into DAMIR by the responsible CMO. DAMIR
access must be requested using DCMAF 3101-02-01, “DAMIR Access Request” following the
guidelines outlined in the appropriate section of the Program Support Analysis and Reporting
User Guide. See the Resource Page for the form as well as the user guide. The DAES
assessments uploaded into DAMIR are derived entirely from the ratings, synopses and Section 1
narratives of the PAR. All DAES reviews and comments must occur during the PAR review
process as described in paragraph 4.1.

b. PI Loads PAR Section 1 Ratings, Synopses, and Assessments into DAMIR. NLT the
7th working day of the month, the DAMIR Action Officer must create and enter a color rating,
synopsis, and narrative for each assessment area (CPA, PA, MA) for assigned programs.

c. CMO Commander, Director, or Deputy (Supervisor) Reviews Input Assessments in
DAMIR. NLT the 7th working day of the month, the CMO Commander, Director, or Deputy
(Supervisor) reviews the DAMIR Input Assessments for the same criteria in Appendices 4B and
4C, and paragraph 4.1.a. Upon review completion, the PI (Action Officer) and the CMO
Commander, Director, or Deputy (Supervisor) must promote all DAMIR assessments to PM&BI
by selecting the “Release” button.

d. PM&BI Releases DAES Assessment in DAMIR. The MPS Division is responsible to


perform a final review of all program assessments and execute the release authority role to
OUSD (AT&L) in DAMIR. This role must ensure all program assessments required for the
month have been promoted by the CMO Commander, Director, or designee. Release Authority
will ensure no obvious errors are present prior to releasing the program synopsis and assessments
NLT the 8th working day of the month. The MPS Division has authority to make changes,
updates, and release assessments to facilitate on-time release in DAMIR. Any changes made by
the MPS Division will require a notification to the OU and CMO.

e. Operational Unit Provides DAES Rework to PM&BI/Lead CMO/CMO. Operational
Units must record rework results on the Resource Page.

APPENDIX 4A: PAR NARRATIVE CRITERIA

4A.1. PAR GUIDELINES.

a. Readability. The narratives must have a consistent flow (logic) and be clear, concise, and
primarily use active voice. Write PARs for Senior Leadership Review and always start with the
BLUF. Narratives must be written to convey the most significant rating driver first. Bullets or
subparagraphs may be useful for outlining multiple issues or events. Titles for paragraphs may
be useful in identifying main issues.

(1) Overall Formatting Guidelines. The PI must use the following formatting guidelines
throughout the PAR.

(a) Use a consistent date format throughout the PAR narratives.

(b) Spell out all acronyms the first time used in each assessment area (CPA, PA,
MA), with the exception of assessment Synopses and acronyms identified in the “PAR –
Common Acronym List” document. See the DCMA-INST 3101 Resource Page.

(2) Dollar Rounding Conventions. Use the following dollar rounding conventions in
your PAR. The exception is within tables, when consistent dollar convention should be used
(e.g., thousands or millions).

(a) Do Not Use Commas. Use the next higher convention (i.e., use $1.32B instead of
$1,322.12M).

(b) Use At Least One Non-Zero Digit Prior to Decimal. Have at least one non-zero
digit prior to the decimal (i.e., $2.32M instead of $0.002B). Use up to two decimal places.

(c) Different Assessment Narrative Convention is Acceptable. A different
convention is acceptable in the assessment narratives if it makes sense (i.e., The DCMA EAC is
$13.28B with a VAC of $112.45M).
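
These conventions can be captured in a small helper; the following Python sketch is purely illustrative and not an Agency tool (the thousands "K" suffix is an assumption beyond the conventions stated above):

# Illustrative helper applying the PAR dollar rounding conventions:
# no commas, at least one non-zero digit before the decimal, and up
# to two decimal places. The "K" suffix is an assumption.
def par_dollars(value):
    for scale, suffix in ((1e9, "B"), (1e6, "M"), (1e3, "K"), (1, "")):
        if abs(value) >= scale:
            return f"${value / scale:.2f}{suffix}"
    return "$0.00"

print(par_dollars(1_322_120_000))  # $1.32B, not $1,322.12M
print(par_dollars(2_320_000))      # $2.32M, not $0.002B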

(3) Synopses. The CPA, PA, and MA synopses provide senior leaders the BLUF by
summarizing the primary drivers to the aggregate assessment. They should be written at a
strategic level and provide a summary overview of DCMA’s perspective on program
performance. Synopses are limited to 350 characters, including spaces. Do not include any
information that is not mentioned in the assessment narratives. State the impacts, issues, and
risks driving the rating. Quantify contract and program impacts. Do not include the Assessment
Color, rating period, or program name in the synopsis.

b. Content Criteria. Issue and Risk Assessments must contain the following information as
a minimum:

(1) Issue or Risk Description. Describe the issue or risk. When addressing issues or
risks that contribute to and/or drive the CPA, PA, and MA color criteria, include the
nomenclature of the material (e.g., name, description), the Prime Contract number(s), and the

Subcontractor name, if applicable. When discussing color changes in the ratings from the
previous PAR, an example format to use is “The (color) rating is (the same as, worse than, better
than) the last quarter’s PAR rating due to (risk or issue).”

(2) Root Cause. Document the root cause or source that resulted in the issue occurring;
include DCMA’s assessment of the root cause.

(3) Impact in Dollars and Days. Using DCMA predictive analysis, describe issue and
risk impacts, quantified in dollars and days. These dollar and day impacts, discovered from
functional surveillance, are critical for forecasting cost overruns, schedule slips, and executive
level decision making for programs and EVM reporting contracts.

(4) Contractor’s Mitigation Plan. Identify the contractor’s mitigation plan including any
corrective actions taken to resolve the issue or mitigate risk. Include the DCMA independent
assessment of the adequacy of contractor actions and the likely outcome.

(5) Path Forward. Document possible courses of action that DCMA or the PMO can take
to resolve the issue or mitigate the risk.

(6) Assessments Disclosure Criteria. Assessments for programs under solicitation,
selection of sources, or award of contracts must not disclose contractor bid, proposal, or source
selection information before the award of a Federal agency procurement contract to which the
information relates.

(7) Narratives References. Narratives in PAR Section 1 must not reference other sections
in the PAR.

APPENDIX 4B: CONTRACT PERFORMANCE ASSESSMENT

4B.1. CPA Color. The minimum rating criteria, used for each contract/CLIN/DO/TI, are
provided in Table 4, Cost Assessment Color Criteria and Table 5, DCMA Schedule Slip
(Months) Color Criteria (CPA).

Table 4. Cost Assessment Color Criteria

VACDCMA% against TAB (EVM only):
• Red: < -15%
• Yellow: > -15% and < -10%
• Green: > -10%

Table 5. DCMA Schedule Slip (Months) Color Criteria (CPA)

Contractually Required Events / Delivery Slips:
• Green: < 1 month
• Yellow: > 1 and ≤ 3 months
• Red: > 3 months

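The thresholds in Tables 4 and 5 translate directly into code; a minimal illustrative sketch (function names are hypothetical, and the tables do not define the exact boundary values):

# Illustrative mapping of the Table 4 and Table 5 thresholds to colors.
def cost_color(vac_pct):
    # VAC_DCMA% against TAB (EVM only); boundary handling is assumed
    if vac_pct < -15:
        return "Red"
    return "Yellow" if vac_pct < -10 else "Green"

def schedule_color(slip_months):
    # Contractually required events / delivery slips; boundaries assumed
    if slip_months > 3:
        return "Red"
    return "Yellow" if slip_months > 1 else "Green"
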
a. Changing the CPA Assessment Color. The PI may change the CPA Assessment Color
due to additional analysis/factors, with an explanation provided in the assessment narrative. For
example, items delivered but conditionally accepted or accepted with a waiver or deviation, a
disapproved CBS, test failures, or high use of MR could lead a PI to downgrade the CPA
Assessment Color.

b. For Disapproved EVMS or EVMS Level III or IV CAR. For an EVMS that is
disapproved or has a transmitted EVMS Level III or IV CAR:

(1) Lead CMO Requests Impact Statement. The Lead CMO requests an impact statement
from the EVMS Center which may be as little as a statement that disapproved EVMS does not
impact the CPA (e.g., no EVM reporting contracts on program).

(2) Aggregate Assessment Color Default. The aggregate assessment color for CPA must
not be Green when the EVMS Center impact statement has indicated that the cost or schedule
EVM data used to rate the CPA is not reliable.

(3) Impacts Explanation in Assessment Narrative. An explanation is included in the
assessment narrative of impacts to the analysis (i.e., determination of EAC).

c. Government Furnished Equipment (GFE) or Government Furnished Material (GFM)
Incorporated into CPA Ratings. GFE or GFM that impacts the contract will be incorporated
into the CPA ratings. FMS contracts are not included in the Table, nor incorporated into the
aggregate CPA ratings. Sustainment contracts and contracts that are fully shipped or have all
DD Form 250s processed may be excluded, if not applicable to the aggregate rating.

4B.2. CPA Narrative. The CPA narrative provides data, information, and analysis focused on
supporting the Assessment Color. The narrative is limited to 3,850 characters. Begin the
assessment with the Bottom Line; this could be a copy of the synopsis or similar expanded
statement. Address changes in Assessment Color from the previous PAR. When contract issues
or risks are discussed that would result in a worse rating than the Aggregate Assessment Rating,
then the Aggregate Rating must be briefly explained.

a. Based on DCMA’s independent assessment of current and future impacts, comment on


contracts, CLINs, DOs, or Technical Instructions (TI) driving the Aggregate Assessment Rating
include:

(1) Impact of significant cost, schedule, and technical issues or risks.

(2) Issue or risk description.

(3) Root cause.

(4) Contractor mitigation strategy.

(5) DCMA’s independent assessment of that mitigation strategy.

(6) Supplier or subcontractor name for supplier issues or risks.

(7) Issues and risks are listed in order of significance.

b. For shipbuilding programs, specify the GFE subprogram being assessed.

c. Maintain a consistent flow of information from current period’s PA and MA. For
example, if schedule delays drive the PA rating, or if a business system in the MA is disapproved
(e.g., EVMS), then incorporate or address the issues and risks from the PA and MA in the CPA.

d. For contracts requiring EVM reporting, for contracts, CLINs, DOs, and TIs where the
aggregate rating is Yellow or Red, include, at a minimum:

(1) EACDCMA, VACDCMA, and VACDCMA% values

(2) A discussion of VACDCMA drivers; risks to remaining effort and the availability of
MR to offset any risks, methodology, and observed trends indicating changes in future
performance

(3) EACKtr, if there are significant differences compared to EACDCMA (i.e., more than 5
percent); include values and an explanation of the difference

(4) A discussion of any known effects on program milestones or contractually required
events, including projected schedule slips

(5) An evaluation of performance estimates and trends against program performance
thresholds, objectives, and Technical Performance Measures (TPM) impacting or potentially
impacting contract cost and schedule.

e. For contracts not requiring EVM reporting, for contracts, CLINs, DOs, and TIs where the
aggregate rating is Yellow or Red, include, at a minimum:

(1) An assessment of contract schedule performance, identifying contractual deliveries
and events that will be late or are projected late to schedule, by using the schedule assessment
criteria in Table 5 and by incorporating:

(a) Issues and risks that could jeopardize the contractor’s ability to meet contractual
requirements (e.g., staffing levels, labor rates, achievement of milestones, and technical goals)

(b) Forecasts on remaining deliveries based on current delivery trends or functional
assessments; include the independent DCMA expected delivery date and quantified number of
units to be delivered (delivery assessments do not need to include CDRLs)

(c) Evaluations of performance estimates and trends against program performance
thresholds, objectives, and TPMs impacting or potentially impacting contract cost and schedule.

(2) Disapproved EVMS or transmitted EVMS Level III or IV CARs that do not impact
the CPA rating.

APPENDIX 4C: PRODUCTION ASSESSMENT

4C.1. PA Color. Tables 6, 7 and 8 provide the minimum rating criteria for each EMD or
production contract, CLIN, DO, or TI listed in the “Table 1.0, Contract Aggregate Assessment”
with the ability to downgrade (e.g., Yellow to Red) due to further analysis. Consider the relevant
items in Table 7 which influence contractually required events, deliveries, and production
readiness. Expand on the root cause, impact, and how the issue affects contractually required
events. The Production Assessment incorporates current issues and risks, even when that risk is
projected to impact the program in the future. GFE or GFM that impacts the contract will be
incorporated into the PA ratings. FMS contracts are not included in the Aggregate Table, nor
incorporated into the aggregate PA ratings. Sustainment contracts and contracts that are fully
shipped or have all DD Form 250s processed may be excluded, if not applicable to the aggregate
rating.

Table 6. DCMA Schedule Slip (Months) Color Criteria (PA)

Contractually Required Events / Delivery Slips:
• Green: < 1 month
• Yellow: > 1 and ≤ 3 months
• Red: > 3 months

Table 7. Production Assessment Criteria

Color Rating Criteria for Prime and Supply Chain

GREEN:
• No non-business system transmitted Level III/IV CARs exist
• No open Failure Review Board (FRB) action impacting delivery or contractual events
• Open Priority II Software Defect closure rate meets supplier plan with no projected impact to SW build delivery or delivered product capabilities
• No open Priority I Software Defect

YELLOW:
• One or more non-business system transmitted Level III/IV CARs exist; or
• One or more draft Level III/IV CARs exist that impact or may impact delivery schedules or requisite capabilities for manufacturing, assembly, or integration; or
• Open Priority II Software Defect closure rate impacts planned SW build delivery date or delivered product capabilities; or
• Open Priority I Software Defect exists, with no projected impact to the SW build delivery date or delivered product capabilities; or
• Open FRB action impacting delivery or contractual test events, with Root Cause and CAP identified

RED:
• One or more transmitted Level III/IV CARs exist that impact or may impact delivery schedules or requisite capabilities for manufacturing, assembly, or integration; or
• Open Priority I Software Defect exists that impacts planned SW build delivery date or delivered product capabilities; or
• Open FRB action impacting delivery or contractual test events, with no Root Cause or CAP identified; or
• Product decertification by Program Executive Officer or Program Office; or
• Non-acceptance of product by DCMA

Table 8. Production Assessment Checklist

• Contractor-Supplier Relationships: Contractor has controls in place for the oversight and surveillance of subcontract/supplier efforts and evaluates its sources for cost, quality, and technical performance. Make-or-buy program is applied in the best interest of the Government.
• Engineering Design/Configuration Stability: Are requirements stable? Are there test issues requiring redesign? Are product specifications changing to accommodate cost/schedule? Are there open requests for deviations, waivers, or variances? Are there open major Engineering Change Proposals? What is the status of TPMs? Are there issues/risks associated with system engineering reviews that could impact production (e.g., Preliminary Design Review (PDR), Critical Design Review (CDR), Physical Configuration Audit (PCA), Functional Configuration Audit (FCA), or testing)?
• Production Schedule Status: What is the program production status (ahead of or delinquent to schedule)? What is the impact of this performance (e.g., will not meet established schedule, achieving entrance/exit criteria)?
• Supply Chain: Are there issues/risks associated with subcontractors impacting the prime contractor? What is the impact to cost, schedule, and technical performance? Are these issues/risks being addressed by the prime or subcontractor, and are the corrective actions preventing them from adversely impacting the program?
• Production Resources: Does the contractor have adequate production resources and manufacturing capacity to meet production goals?
• Product Quality Issues: Are there any process capability issues that could impact production/production readiness? Are there any open PQDRs? Are the PQDRs closed in a reasonable time? Evaluate scrap/rework rates, First Time Through, First Time Yield, and First Pass Yield as indicators that further analysis or improvements may be necessary.
• Quality Mgmt System (QMS): Does the contractor adhere to its QMS and follow its own policies? Has there been a company audit that identified findings which could impact the product?
• Manufacturing Readiness Level (MRL): Is the contractor ready for production? What is the MRL for the contractor? What is DCMA's Manufacturing Risk Assessment (MSRA)?

4C.2. PA Narrative. The PA narrative provides data, information, and analysis focused on
supporting the Assessment Color. For aggregate assessments that are rated Green, provide
justification to support the assessment. If identifying issues and risks in a Green assessment,
consider downgrading to Yellow to raise awareness of the issues or risks. The narrative is
limited to 3,850 characters. Begin the assessment with the Bottom Line; this could be a copy of
the synopsis or similar expanded statement. Address changes in Assessment Color from the
previous PAR. When contract issues or risks are discussed that would result in a worse rating
than the Aggregate Assessment Rating, then the Aggregate Rating must be briefly explained.

a. Based on DCMA’s independent assessment of current and future impacts, comment on


contracts, CLINs, DOs, or TIs driving the Aggregate Assessment Rating include:

(1) Impact of significant cost, schedule, and technical issues or risks.

(2) Issue or risk description.

(3) Root cause.

(4) Contractor mitigation strategy.

(5) DCMA’s independent assessment of that mitigation strategy.

(6) Supplier or subcontractor name for supplier issues or risks.

(7) Issues and risks are listed in order of significance.

(8) When corrective action or risk mitigation is projected to resolve an issue or risk in a
future quarter, this predictive analysis rating change improvement must be identified in the
corresponding assessments in Sections 1 and 2 of the PAR, but is not incorporated into the
current period assessment rating until the issue or risk has actually been resolved.

(9) Impacts of transmitted Level III and IV CARs. The narrative should address the
actual or proposed date of submission for the CAP and DCMA's assessment of the contractor's
status towards closing the CAR.

b. Maintain a consistent flow of information from current period’s CPA and MA.

APPENDIX 4D: MANAGEMENT ASSESSMENT

4D.1. MA Color. The minimum rating criteria are provided in Table 9, Management
Assessment Criteria, to determine the Assessment Color. If an EVMS is disapproved or has a
transmitted EVMS Level III or IV CAR, request an impact statement from the EVMS Center to
determine impact to CPA rating and narrative.

Table 9. Management Assessment Criteria

Color Rating Criteria for CBS Status

GREEN: All six CBSs are Approved, Not Evaluated, or Not Applicable, and there are no
transmitted or draft Level III/IV CARs against a CBS.

YELLOW: All six CBSs are Approved or Not Evaluated, AND:
• There is a draft Level III/IV CAR against a CBS; or
• An initial determination has been issued to the contractor identifying significant deficiencies in a CBS approval/disapproval process; or
• The CBS is under a legacy review with a Level III/IV CAR, but no final determination to disapprove the system has been made.

RED: One or more CBSs are disapproved or have a transmitted Level III/IV CAR against a
CBS not under a legacy review.

4D.2. MA Narrative. The MA narrative provides data, information, and analysis focused on
supporting the assessment. The narrative is limited to 2,500 characters. Begin the assessment
with the Bottom Line; this could be a copy of the synopsis or similar expanded statement.
Address changes in Assessment Color from the previous PAR. Not Evaluated or Disapproved
CBS ratings must include the following in the MA Narrative:

(1) A summary statement.

(2) Changes in Assessment Color from the previous quarter, addressed in the first paragraph.

(3) Pertinent information regarding DCMA’s determination of CBS approval or disapproval,
or plans for the ACO to issue a final determination.

(4) Disapproved systems must include the drivers for disapproval (e.g., for EVMS,
include the guideline numbers and titles that resulted in the disapproval; for Material
Management and Accounting System, include the standard).

(5) Impact of transmitted Level III or IV CARs supporting CBS disapproval (if there is
no impact, state that); address the actual or proposed date of submission for the CAP, DCMA’s
assessment of the contractor’s status toward closing the CAR, and the estimated time for a
follow-up review.

(6) Identify if a withhold applies to the disapproved CBS. If so, discuss:

(a) Whether the payment withhold applies to the program. If it does not apply to the
program, still identify the withhold but state that it does not impact the program.

(b) The estimated withhold percentage or amount.

(c) Whether the withhold is against progress payments, performance-based payments,
or interim payments billed under cost, labor-hour, or time-and-materials contracts.

(7) Comment on significant CBS issues, their impact on individual contracts (if there is
no impact, state that), and the contractor’s ability to execute the contract.

(8) Upcoming reviews planned for a CBS.


GLOSSARY

G.1. DEFINITIONS. Unless otherwise noted, these terms and their definitions are for the
purpose of this issuance.

ACAT I. Programs categorized as Major Defense Acquisition Program (MDAP) or Major
Automated Information System (MAIS) programs that have been designated ACAT I by the
Milestone Decision Authority.

ACWP. The total dollars spent on labor, material, subcontracts, and other direct costs in the
performance of the contract SOW. These costs are controlled by the accounting general ledger
and should reconcile between the accounting system and EVMS. ACWP is independently
reported by the contractor’s accounting system. Simply stated: “actuals.”

BCWP. Dollarized value of all work actually accomplished in a given time period or Earned
Value. This is equal to the sum of the budgets for completed WPs, completed portions of open
WPs, apportioned effort earned on the base tasks, and the value of LOE activities. BCWP is not
realized until the work is completed.

BCWR. Represents that portion of the budget for work not yet accomplished within a Control
Account. It is the difference between the BAC and the BCWPCUM.

BCWS. Dollarized value of all work scheduled to be accomplished in a given time period or
Planned Value. The sum of the performance budgets for all work scheduled to be accomplished
within a given time period. This includes detailed WPs, apportioned effort, LOE packages,
planning packages, and Summary Level Planning Packages.
BEI. An IMS-based metric that calculates the efficiency with which tasks have been
accomplished, measured against the baseline tasks at a Status Date. BEI tasks do not include
Summary or LOE tasks.

$$\text{BEI} = \frac{\text{Tasks Completed}}{\text{Baseline Count}} = \frac{\text{Qty of Tasks Completed}}{\text{Qty of Tasks Completed} + \text{Qty of Tasks Missing Baseline Finish}}$$
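As a worked illustration of the BEI formula, here is a short sketch with notional task counts; the function name and values are hypothetical.

```python
# Hypothetical BEI calculation; the counts are invented for the example.
def baseline_execution_index(tasks_completed: int, tasks_missing_baseline_finish: int) -> float:
    """BEI = tasks completed / (tasks completed + tasks missing their baseline finish)."""
    baseline_count = tasks_completed + tasks_missing_baseline_finish
    return tasks_completed / baseline_count

# 170 tasks complete; 30 tasks baselined to finish by the Status Date but not complete:
print(round(baseline_execution_index(170, 30), 2))  # 0.85 -> executing behind the baseline plan
```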
CMT. The CMT reviews new contracts; performs an initial contract review; determines skill-set
and PST organizational requirements to support new major programs; and as deemed necessary
by the ACO, conducts a Post Award Orientation Conference with all CMT members assigned to
that contract.

Cognizant ACO. The administrative contracting officer responsible for performing the duties in
this manual; includes the DACO, CACO, and ACO.

CPI. CPI is an efficiency factor representing the relationship between the performance
accomplished (BCWP) and the actual cost expended (ACWP). CPR/IPMR Format 1 contains
the BCWP and ACWP data. CPI can be calculated for current period (monthly) or cumulative
(to date).
$$\text{CPI}_x = \frac{\text{BCWP}_x}{\text{ACWP}_x}$$

where x is current period (cur) or cumulative (cum).
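As a worked example of the CPI formula, a minimal sketch with notional Format 1 values follows; the figures are assumptions for illustration.

```python
# Minimal CPI sketch; the dollar values are notional.
def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index: earned value divided by actual cost."""
    return bcwp / acwp

# Works for current period or cumulative values, as long as both inputs match.
print(round(cpi(bcwp=950_000.0, acwp=1_000_000.0), 2))  # 0.95 -> $0.95 earned per dollar spent
```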

Critical Path. Critical path is a sequence of discrete lower level tasks/activities in the network
that add up to the longest overall duration through an end point. The critical path determines the
shortest time possible to complete the contract. Any delay of an activity on the critical path
directly impacts the baselined completion date; i.e., there is no float on the critical path. Lower
level tasks/activities along the critical path have the least amount of float/slack (scheduling
flexibility) and cannot be delayed without delaying the finish time of the end point effort.

CV. The difference between BCWP and ACWP. It can be measured using cumulative (CUM)
or current (CUR) values at either the WP or the contract level. CPR/IPMR Format 1 contains the
BCWP and ACWP data as well as the correlating CVs. The CV% metric quantifies the
magnitude of the CV by dividing CV by BCWP and multiplying by 100. The formulas for
calculating CV and CV% are:
$$\text{CV}_x = \text{BCWP}_x - \text{ACWP}_x \qquad \text{CV}_x\% = \frac{\text{CV}_x}{\text{BCWP}_x} \times 100$$

where x is current period (cur) or cumulative (cum).
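The following sketch computes CV and CV% from the same notional values; it is illustrative only.

```python
# Illustrative CV and CV% calculation; values are made up.
def cost_variance(bcwp: float, acwp: float) -> tuple[float, float]:
    """Return (CV, CV%) where CV = BCWP - ACWP and CV% = CV / BCWP * 100."""
    cv = bcwp - acwp
    return cv, cv / bcwp * 100

cv, cv_pct = cost_variance(bcwp=950_000.0, acwp=1_000_000.0)
print(f"CV = {cv:,.0f}, CV% = {cv_pct:.1f}")  # CV = -50,000, CV% = -5.3 (unfavorable)
```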
DAES. Principal mechanism for tracking programs between milestone reviews. It is both a
reporting and review process serving two primary purposes: (1) Provide awareness of the
execution status of all reporting programs, and (2) Provide assessments that enable identification
of emerging execution issues that warrant the attention of senior leadership.

DAMIR. OSD tool used to communicate program assessments and information across the DoD
Acquisition Enterprise.

MAIS. DoD acquisition program for an automated information system that is either designated
by the Milestone Decision Authority as a MAIS, or estimated to exceed certain dollar levels.

Major Programs. A term used by DCMA to identify those programs with specific reporting
requirements. Major Programs include (unless approved by exception):
• ACAT I/MDAPs
• DAES programs (excluding MAIS)
• Missile Defense Agency Ballistic Missile Defense System programs
• Strategic Systems Programs
• Additional programs or sub-programs designated by the PM&BI Executive Director.

MDAP. ACAT I programs are MDAPs. Programs estimated by the OUSD(AT&L) to require
eventual expenditure for Research, Development, Test and Evaluation of more than $365 million
(FY 2000 constant dollars) or procurement of more than $2.19 billion (FY 2000 constant
dollars), or those designated by the OUSD(AT&L) to be MDAPs.

Memorandum of Agreement. The program Memorandum of Agreement is a bi-lateral or
multi-lateral document endorsed by the CMO Commander or Director and PMO Manager, which
identifies the goals of DCMA support.

MR Consumption Ratio. MR Consumption Ratio is the ratio of Percent Complete to Percent MR.

$$\text{MR Consumption Ratio} = \frac{\text{Percent Complete}}{\text{Percent MR}}$$
Operational Unit. DCMA organizational entity charged with ensuring mission accomplishment
for their organization. For purposes of this manual only, Operational Units include: East,
Central and West Regions, the International Directorate, and the Special Programs Directorate.

Performance Assessments and Root Cause Analyses (PARCA). Carries out performance
assessments of MDAPs and conducts root cause analyses for those MDAPs with Nunn-McCurdy
breach status or when requested by senior DoD officials.

Percent Complete. Percent complete is the ratio of the amount of work completed to date to the
PMB or BAC, expressed as a percentage. The formula for percent complete:

$$\text{Percent Complete} = \frac{\text{BCWP}_{\text{cum}}}{\text{BAC}} \times 100$$
Percent MR. Percent MR is the ratio of MR used to the amount of MR added to the contract,
expressed as a percentage.

$$\text{Percent MR} = \frac{\text{Total Amount of MR Used}}{\text{Total Amount of MR Added to the Contract}} \times 100$$
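A short sketch tying together the Percent Complete, Percent MR, and MR Consumption Ratio entries above; all input values are notional assumptions.

```python
# Notional computation of Percent Complete, Percent MR, and the
# MR Consumption Ratio defined earlier in this glossary.
def percent_complete(bcwp_cum: float, bac: float) -> float:
    return bcwp_cum / bac * 100

def percent_mr(mr_used: float, mr_added: float) -> float:
    return mr_used / mr_added * 100

def mr_consumption_ratio(pct_complete: float, pct_mr: float) -> float:
    return pct_complete / pct_mr

pc = percent_complete(bcwp_cum=6_000_000, bac=10_000_000)  # 60.0 percent complete
pm = percent_mr(mr_used=400_000, mr_added=500_000)         # 80.0 percent of MR used
print(round(mr_consumption_ratio(pc, pm), 2))  # 0.75 -> MR consumed faster than work is earned
```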
Post Award Orientation Conference. A Post Award Orientation Conference may be held to
perform a detailed review of the contract, specifically highlighting and discussing complex terms
and conditions. The conference will ensure that all parties understand contractual requirements.

Predictive Analysis. The collection, examination, and synthesis of information and data from
our on-site presence that states (in terms of future cost, schedule, and performance) what we
forecast will happen based on our special knowledge of the supplier and program.

PI. Primary DCMA representative to the procuring customer and leads a PST comprised of
functional experts. PI assesses contractor performance, predicts future performance, and makes
actionable recommendations related to future programmatic efforts.

PST. The PST is a matrixed multifunctional team led by a PI which supports a major acquisition
program. The PST may include functional specialists from contract administration, earned value
management, quality assurance, engineering, software, manufacturing and production, supply
chain management, as well as other functions.

Report Month. The month and year associated with the program’s group.

Reporting Level. The reporting level specified in the CDRL. Usually at least Contract Work
Breakdown Structure (CWBS) level 3, except for high-cost and high-risk items, where the level is
established to ensure the necessary information for effective management control. It is not
necessary for the reporting levels in different legs of the CWBS to be the same.

Software Defects. Priority I and II software defect definitions are determined by contractor
Command Media and the Quality Management System (QMS). Common software defect
categorizations and definitions can be found in IEEE 12207.


SPIX. The Schedule Performance Index (SPIX) is an efficiency factor representing the
relationship between the performance achieved (Earned Value or BCWP) and the Planned Value
(BCWS). CPR/IPMR Format 1 contains the BCWP and BCWS data. SPIX can be calculated for
the current period (monthly) or cumulative (to date).

$$\text{SPIX}_x = \frac{\text{BCWP}_x}{\text{BCWS}_x}$$

where x is current period (cur) or cumulative (cum).
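A minimal sketch of the SPIX calculation with notional values follows; the figures are illustrative assumptions.

```python
# Minimal SPIX sketch; the dollar values are notional.
def spix(bcwp: float, bcws: float) -> float:
    """Schedule Performance Index: earned value divided by planned value."""
    return bcwp / bcws

print(round(spix(bcwp=950_000.0, bcws=1_050_000.0), 3))  # 0.905 -> behind schedule
```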
SPI. Primary DCMA representative to either the PI or the next higher tier SPI. The SPI
provides input to the PI concerning their independent assessment of the program element(s) they
have been delegated. The SPI leads a SPST comprised of functional experts.

SPST. The SPST is a matrixed multifunctional team led by a Support Program Integrator which
supports a significant element, subcontract, or subsystem of a major acquisition program.

SV. SV is the difference between BCWP and BCWS. CPR/IPMR Format 1 contains the BCWP
and BCWS data as well as the correlating SVs. SV can be measured using cumulative (CUM) or
current (CUR) values at either the WP or the contract level. The SV% metric quantifies the
magnitude of the SV by dividing SV by BCWS and multiplying by 100. The formulas for
calculating SV and SV% are:

$$\text{SV}_x = \text{BCWP}_x - \text{BCWS}_x \qquad \text{SV}_x\% = \frac{\text{SV}_x}{\text{BCWS}_x} \times 100$$

where x is current period (cur) or cumulative (cum).
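The corresponding SV and SV% sketch, again with notional data:

```python
# Illustrative SV and SV% calculation; values are made up.
def schedule_variance(bcwp: float, bcws: float) -> tuple[float, float]:
    """Return (SV, SV%) where SV = BCWP - BCWS and SV% = SV / BCWS * 100."""
    sv = bcwp - bcws
    return sv, sv / bcws * 100

sv, sv_pct = schedule_variance(bcwp=950_000.0, bcws=1_050_000.0)
print(f"SV = {sv:,.0f}, SV% = {sv_pct:.1f}")  # SV = -100,000, SV% = -9.5 (behind plan)
```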

TCPI. TCPI is the ratio of the work remaining (BCWR) to the estimated cost of the work
remaining (ETC).

$$\text{TCPI}_{\text{EAC}} = \frac{\text{BCWR}}{\text{ETC}} = \frac{\text{BAC}^{*} - \text{BCWP}}{\text{EAC} - \text{ACWP}}$$

*For DCMA, the formula will use BAC at the WBS element level and TAB at the contract level.
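As a final worked example, the sketch below compares TCPI against cumulative CPI, a common check on whether an EAC is realistic; all figures are notional.

```python
# Hypothetical TCPI check; the dollar figures are invented.
def tcpi(bac: float, bcwp: float, eac: float, acwp: float) -> float:
    """TCPI(EAC) = (BAC - BCWP) / (EAC - ACWP); substitute TAB for BAC at the contract level."""
    return (bac - bcwp) / (eac - acwp)

needed = tcpi(bac=10_000_000, bcwp=6_000_000, eac=10_500_000, acwp=6_600_000)
cpi_cum = 6_000_000 / 6_600_000
print(round(needed, 2), round(cpi_cum, 2))  # 1.03 0.91 -> the EAC assumes a large efficiency gain
```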

Watch Item. Any issue, risk, or observation that is not currently driving the rating but is
significant enough to report. Report watch items in Section 2 of the PAR.


G.2. ACRONYMS.

ACAT Acquisition Category
ACO Administrative Contracting Officer
ACWP Actual Cost of Work Performed
ACWPCUM Actual Cost of Work Performed, cumulative

BAC Budget at Completion
BCWP Budgeted Cost for Work Performed
BCWPCUM Budgeted Cost for Work Performed, cumulative
BCWR Budgeted Cost of Work Remaining
BCWS Budgeted Cost for Work Scheduled
BCWSCUM Budgeted Cost for Work Scheduled, cumulative
BCWSCUR Budgeted Cost for Work Scheduled, current reporting period
BEI Baseline Execution Index
BLUF Bottom Line Up Front

CACO Corporate Administrative Contracting Officer
CAP Corrective Action Plan
CAR Corrective Action Request
CBS Contractor Business System
CDR Critical Design Review
CDRL Contract Data Requirements List
CLIN Contract Line Item Number
CMO Contract Management Office
CMT Contract Management Team
CPA Contract Performance Assessment
CPI Cost Performance Index
CPICUM Cost Performance Index, cumulative
CPR Contract Performance Report
CV Cost Variance
CWBS Contract Work Breakdown Structure

DACO Divisional Administrative Contracting Officer
DAES Defense Acquisition Executive Summary
DAMIR Defense Acquisition Management Information Retrieval
DCARC Defense Cost and Resource Center
DCMA-INST DCMA Instruction
DID Data Item Description
DO Delivery Order

EAC Estimate at Completion
EACDCMA DCMA’s Estimate at Completion
EACKtr contractor’s Estimate at Completion
ECD Estimated Completion Date
ECDDCMA DCMA’s Estimated Completion Date
eFIT Electronic Functional Input Template


EMD Engineering & Manufacturing Development
ETC Estimate to Complete
EVM Earned Value Management
EVM-CR EVM Central Repository
EVMS Earned Value Management System
EVMSIG Earned Value Management System Interpretation Guide

FCA Functional Configuration Audit
FLS First Level Supervisor
FMS Foreign Military Sales
FRB Failure Review Board

GFE Government Furnished Equipment
GFM Government Furnished Material

IMS Integrated Master Schedule
IPMR Integrated Program Management Report
IWMS Integrated Workload Management System

LOD Letter of Delegation
LOE Level of Effort

MA Management Assessment
MAIS Major Automated Information System
MDAP Major Defense Acquisition Program
MR Management Reserve
MRL Manufacturing Readiness Level
MSRA Manufacturing System Risk Assessment

NLT No Later Than

OSD Office of the Secretary of Defense
OTB Over Target Baseline
OTS Over Target Schedule
OUSD(AT&L) Office of the Under Secretary of Defense for Acquisition, Technology,
and Logistics

PA Production Assessment
PAR Program Assessment Report
PCA Physical Configuration Audit
PCSA Prime Control of Subcontractor Assessment
PDR Preliminary Design Review
PF Performance Factor
PI Program Integrator
PM&BI Portfolio Management & Business Integration Executive Directorate
PMB Performance Measurement Baseline
PMO Program Management Office
PN Program Notification


PQDR Product Quality Deficiency Report
PSP Program Support Plan
PST Program Support Team

QMS Quality Management System

SAP Special Access Program
SCI Sensitive Compartmented Information
SPI Support Program Integrator
SPIX Schedule Performance Index
SPST Support Program Support Team
SOW Statement of Work
SV Schedule Variance

TAB Total Allocated Budget
TCPI To Complete Performance Index
TI Technical Instruction
TPM Technical Performance Measure

VAC Variance at Completion
VACDCMA DCMA’s Variance at Completion

WBS Work Breakdown Structure
WP Work Package

REFERENCES
DCMA-INST 3101, “Program Support,” July 28, 2017
DCMA-INST 210, “Earned Value Management System (EVMS) – Standard Surveillance
Instruction (SSI),” February 29, 2012, as amended
DCMA-INST 501, “Policy Issuances Program,” April 13, 2017
DFARS Subpart 252.234-7001, “Notice of Earned Value Management System”
DFARS Subpart 252.234-7002, “Earned Value Management System”
DFARS Subpart 252.242-7004, “Material Management and Accounting System”
DFARS Subpart 252.242-7006, “Accounting System Administration”
DFARS Subpart 252.244-7001, “Contractor Purchasing System Administration”
DFARS Subpart 252.245-7003, “Contractor Property Management System Administration”
DI-MGMT-81466A, “Contract Performance Report (CPR),” March 30, 2005
DI-MGMT-81650, “Integrated Master Schedule (IMS),” March 30, 2005
DI-MGMT-81861A, “Integrated Program Management Report (IPMR),” September 16, 2015
DoDD 5105.64, “Defense Contract Management Agency (DCMA),” January 10, 2013
FAR 42.302(a), “Contract Administration Functions”
GAO-09-3SP, “Cost Assessment Guide: Best Practices for Estimating and Managing Program
Costs,” March 2009

