Final DCIPS Report
NATIONAL ACADEMY OF
PUBLIC ADMINISTRATION
For the U.S. Congress and the Department of Defense
June 2010
PANEL
* Academy Fellow
Officers of the Academy
______________________________________________________________________________
The views expressed in this report are those of the Panel. They do not necessarily reflect the
views of the Academy as an institution.
FOREWORD
The terrorist attacks of September 11, 2001, brought the missions and roles of the United States
intelligence community into the public consciousness in ways previously unseen. Subsequent
attack attempts around the world, both successful and unsuccessful, including the 2009 Christmas
Day attempt and the recent Times Square bombing attempt in New York, have continued to
demonstrate the importance of high-performing intelligence personnel in protecting and
strengthening our nation’s security.
At the core of the intelligence apparatus are dedicated men and women entrusted with the most
important and sensitive missions related to the national security of the United States. They
include the civilian employees of the U.S. Department of Defense’s (DoD) intelligence
components—50,000 strong—who work tirelessly in the public interest at the National Security
Agency, Defense Intelligence Agency, National Geospatial Intelligence Agency, Defense
Security Service, National Reconnaissance Office, and the intelligence elements of the Army,
Navy, Air Force, and Marine Corps.
These public servants work to achieve missions that are more critical than ever: defending our
nation and thwarting attack. The Academy Panel recognized that the manner in which the
performance of these employees is assessed and rewarded is as important to how—and how
well—they do their work as it is to recruitment and retention. A culture that encourages “connecting
the dots” and finding new ways to look at “dots” cannot be built on a system that rewards
longevity over performance. The Panel concluded that a performance-based pay system that
provides recognition for individual as well as collaborative performance can produce more
robust discussion and better intelligence products that will significantly strengthen our ability to
thwart attacks.
For the past four years, DoD has engaged in the design and implementation of the Defense
Civilian Intelligence Personnel System (DCIPS), which is intended to unify the DoD intelligence
components under a single human resources management system. DCIPS represents
transformational cultural change that requires paying as much attention to the system’s
implementation as to its design. The National Academy Panel recognized the soundness of
DCIPS’ design, the urgency of the effort, as well as the need to make certain changes in its
planned implementation prior to moving forward. This report provides key recommendations
aimed at encouraging greater collaboration among the intelligence components, restoring and
building employee trust in DCIPS and, most importantly, strengthening personal accountability
in the performance of agency missions. The stakes have never been higher, and our
intelligence personnel deserve nothing less.
The National Academy was pleased to conduct this review for Congress and the Secretary of
Defense. I want to thank the Academy Panel for its excellent and diligent work and the study
team for its significant contributions. In addition, I wish to acknowledge the vital assistance
provided by the Undersecretary of Defense for Intelligence and his staff, as well as the DoD
intelligence components and the Office of the Director of National Intelligence. Finally, my
appreciation goes to those personnel who provided access to critical information and contributed
their insights through interviews, focus groups, agency forums, colloquia, and online dialogue.
Their work should not go unnoticed by the American public, which owes them a significant debt
of gratitude.
Jennifer L. Dorn
President and Chief Executive Officer
TABLE OF CONTENTS
FOREWORD................................................................................................................................ iii
ACRONYMS ................................................................................................................................ ix
Background ................................................................................................................................. 1
Methodology ............................................................................................................................... 3
Organization of the Report.......................................................................................................... 4
Introduction ................................................................................................................................. 7
Introduction ............................................................................................................................... 21
National Intelligence Civilian Compensation Program ............................................................ 24
Equity ........................................................................................................................................ 41
Internal Equity/Organizational Contribution ........................................................................ 41
Pay Pools............................................................................................................................... 43
External/Market Equity......................................................................................................... 44
Conclusion................................................................................................................................. 56
Recommendations ..................................................................................................................... 56
Introduction ............................................................................................................................... 59
Preparedness.............................................................................................................................. 62
Engagement........................................................................................................................... 63
Accountability....................................................................................................................... 64
Resources .............................................................................................................................. 64
Governance ........................................................................................................................... 65
Information Access ............................................................................................................... 66
Outreach................................................................................................................................ 69
Feedback ............................................................................................................................... 71
Planning ................................................................................................................................ 72
Delivery................................................................................................................................. 73
Inclusion................................................................................................................................ 76
Work Stream Planning and Coordination ............................................................................. 78
HR Business Processes and Procedures................................................................................ 79
Tools and Technology Infrastructure.................................................................................... 80
Structured Approach ............................................................................................................. 81
Progress ..................................................................................................................................... 83
Differentiating Performance ................................................................................................. 84
Pay for Performance ............................................................................................................. 86
Cost Management ................................................................................................................. 87
Fairness ................................................................................................................................. 88
Transparency......................................................................................................................... 89
Trust ...................................................................................................................................... 89
Work Stream Planning and Status ........................................................................................ 90
Performance Management System Execution ...................................................................... 90
Employee Support for DCIPS............................................................................................... 90
Conclusions ............................................................................................................................... 91
Introduction ............................................................................................................................... 93
Employee Perceptions of DCIPS .......................................................................................... 93
NGA Experiences with Pay-for-Performance ...................................................................... 93
CHAPTER 6: THE WAY FORWARD FOR DCIPS........................................................... 105
Table 1-1. Status of DCIPS Implementation Efforts at DoD Intelligence Components ............... 2
Table 2-1. Major Efforts to Link Federal Pay to Performance .................................................... 13
Table 3-1. Summary of DCIPS Design Assessment.................................................................... 21
Table 3-2. Design Principles for Performance-Based Compensation Systems ........................... 23
Table 3-3. DCIPS Performance Elements ................................................................................... 37
Table 3-4. Conversion of Average Rating to Evaluation of Record............................................ 40
Table 3-5. Evaluation of Record and Performance Payout Eligibility ........................................ 42
Table 3-7. Comparison of DCIPS and NSPS Design .................................................................. 53
Table 4-1. Overview of the OPM Objectives-Based Assessment Framework............................ 61
Table 5-1. NGA Indices and Comparative Results.................................................................... 101
APPENDICES
ACRONYMS
OUSD(I) Office of the Under Secretary of Defense for Intelligence
P4P Pay-for-Performance
PAA Performance Appraisal Application
PM PRA Performance Management Performance Review Authority
PP PRA Pay Pool Performance Review Authority
PMO Program Management Office
POC Point of Contact
RMSG Resource Management Sub-group
SMART Specific, Measurable, Achievable, Relevant, Time-framed
SME Subject Matter Experts
Team DCIPS Review Team
TSA Transportation Security Administration
USD(I) Under Secretary of Defense for Intelligence
USD(P&R) Under Secretary of Defense for Personnel and Readiness
USMC United States Marine Corps
EXECUTIVE SUMMARY
The terrorist attacks of September 11, 2001, ushered in an era of fundamental change to the
Intelligence Community (IC), and underscored the urgent need for improvements in the way its
agencies assess and manage their human resources. Studies conducted in the wake of the attacks
concluded that agencies missed or misinterpreted signals pointing to a major terrorist attack, and
that they failed to “connect the dots” linking the actions of the 9/11 terrorists to the plot.
Creating a unified human capital framework that encourages individuals and intelligence
agencies to work together toward a common goal became a cornerstone of the reform efforts. By
implementing a human resources management system that more directly links pay to
performance, the Department of Defense is seeking to improve both individual and
organizational performance through greater cooperation and collaboration that will ultimately
lead to better intelligence products. These products will enable America’s military, security,
and law enforcement personnel to better perform their jobs and thwart attacks.
In this way, at its most fundamental level, the Defense Civilian Intelligence Personnel System
(DCIPS) is intended to help protect the national security interests of the United States. Lee
Hamilton, former Congressman, Chairman of the House Permanent Select Committee on
Intelligence, and Vice Chairman of the 9/11 Commission, highlighted for the Academy Panel the
importance of a unified, performance-based IC personnel system to the nation’s ability to defend
against terrorist attacks.
DCIPS was the result of an effort to develop a unified, performance-based human resources
management system for nine U.S. Department of Defense (DoD) intelligence components,
whose collective mission is to protect the national security of the United States. The system is in
various stages of implementation in each of the components, and is ultimately expected to affect
more than 50,000 employees.
In large part due to perceptions that DCIPS could result in unfair treatment of minorities and
women, the National Defense Authorization Act (NDAA) for Fiscal Year 2010 directed the
Secretary of Defense, the Director of the Office of Personnel Management, and the Director of
National Intelligence (DNI) to designate an independent organization to conduct a review of
DCIPS. In anticipation of the review, NDAA suspended the base-pay setting portions of the
DCIPS’ performance-based compensation system until December 31, 2010; however, it
preserved DoD’s authority to award bonuses, maintain a pay-band structure, and implement the
performance evaluation process under DCIPS.
Selected in January 2010 to conduct the review, the National Academy of Public Administration
(Academy) appointed an expert Panel to assess and make recommendations regarding DCIPS’
design, implementation, and impact.
The Academy Panel applauds the effort that the Office of the Under Secretary of Defense for
Intelligence (OUSD(I)) has made to enhance the ability of the DoD intelligence components to
accomplish their mission by creating and implementing DCIPS. The creation and introduction
of DCIPS have been approached with great seriousness, hard work, and creativity. The Panel
has been impressed both with the DCIPS system and the people who work within it.
Unlike the General Schedule (GS) compensation system it is intended to replace, DCIPS
provides performance-based compensation increases in lieu of tenure-based salary increases. It
also differs from the GS system in providing for a stronger, more rigorous performance
management system, and places positions in five broad pay bands rather than the 15 GS grade
levels. Significantly, DCIPS retains the Merit Systems Principles of Section 2301 of Title 5,
United States Code. This means that employees covered by DCIPS continue to have the same
protections and safeguards from unfair treatment as all other federal employees.
The Panel found no indication that DCIPS is creating problems related to diversity or fair pay.
In fact, the Panel concluded that there is nothing inherent in the DCIPS design that would lead to
such negative impacts. The analysis of data from the National Geospatial Intelligence Agency (NGA) shows that disparities in the ratings of
minorities and women compared to other employees existed before and after DCIPS was
implemented. These long-standing disparities may be caused by biases of individual managers
or may accurately reflect differences in individual performance. In either event, they are clearly
not attributable to inherent flaws in either the design or implementation of DCIPS. The Panel
has recommended further analysis of the results of the NGA implementation of DCIPS to
determine whether individual managers are engaging in unfair practices or treatment of certain
classes of employees.
With regard to the three key focus areas of its investigation, the Panel finds that:
• The design of DCIPS is fundamentally sound and conforms to accepted principles for
designing performance-based compensation systems, including appropriate equity
considerations and internal checks and balances to ensure fairness. The use of a tailored
occupational framework, a single pay band structure, a rigorous performance management
system, separate performance management and pay determination processes, and its
planned process for ongoing system evaluation all contribute to the strength of DCIPS’
design. The Panel has identified a number of areas where improvements can be made, but
considers these to be opportunities to further tailor, strengthen, and refine a system that is
fundamentally sound, rather than rectifications of fatal design problems.
• Implementation of DCIPS has been flawed. OUSD(I) must establish a stronger foundation
for organizational change. In particular, leadership in every component must visibly
demonstrate that it fully supports the system. Further, OUSD(I) leadership must allocate
sufficient staff time and other resources to develop a more comprehensive implementation
strategy; a stronger system of governance and accountability; clearer messages; and refined
business rules, tools and training.
• It is too soon to draw conclusions about the impact of DCIPS, due to the limited amount
of experience with the system. Only one DoD intelligence component has fully
implemented it, and the NDAA suspended significant portions of the system for this year.
The Panel recommends further analysis of NGA’s 2009 performance evaluations and payouts
to determine if there are issues regarding protected classes that warrant further attention.
However, the Panel finds nothing inherent in the DCIPS’ design that would lead to negative
impacts with regard to career progression or diversity.
Based on these findings and the mission-critical nature of this effort, the Panel recommends that
OUSD(I) act with urgency to address the implementation issues that have been identified, and
phase in the DCIPS performance-based compensation elements based on readiness assessments
of the remaining DoD intelligence components.
CHAPTER 1
BACKGROUND
Using the National Intelligence Civilian Compensation Program (NICCP) framework as guidance, DoD developed a human resources management
system, the Defense Civilian Intelligence Personnel System (DCIPS), which includes new
compensation and performance management processes. DCIPS is designed to provide a single
system for DoD intelligence components that rewards individual performance contributing to the
organization’s mission, and that enhances the ability of those components to attract and retain
high-performing candidates.2
DCIPS is designed specifically for intelligence components and other DoD intelligence positions
designated by the Under Secretary of Defense for Intelligence (USD(I)), including those at the
Defense Intelligence Agency (DIA), National Geospatial Intelligence Agency (NGA), National
Reconnaissance Office (NRO), National Security Agency (NSA), Defense Security Service
(DSS), Office of the Under Secretary of Defense for Intelligence (OUSD(I)), and intelligence
elements of the Army, Air Force, Navy, and Marine Corps.
The components adopted all or parts of DCIPS and were scheduled for complete adoption, as
indicated in Table 1-1.
1
The Intelligence Reform and Terrorism Prevention Act of 2004, among other items, directed the DNI to establish
common personnel standards for IC personnel. See Pub. L. 108-458, Sec. 102A(f)(3)(a), Dec. 17, 2004.
2
DoD Worldwide HR Conference. DCIPS PowerPoint briefing, July 2009.
Table 1-1. Status of DCIPS Implementation Efforts at DoD Intelligence Components: FY 2010

Columns: Performance Management | Band Structure | Personnel Policies |
Pay Pools* (Bonuses Only) | Pay Pools* (Salary Increases and Bonuses) | No DCIPS Pay Pools+

DIA             X   X   X   X
Navy/Marines    X   X   X   X
OUSD(I)         X   X   X   X
DSS             X   X   X   X
Army            X   X   X   X
Air Force       X   X   X   X
NRO (DoD)       X   X   X   X
NSA             X   X   X
NGA             X   X   X   X

* Pay pools for FY 2010 were established in 2009.
Responding to concerns by Members of Congress that DCIPS could result in unfair
treatment of minorities and women, Congress suspended implementation of portions of the
system’s performance-based compensation authorities in the FY 2010 National Defense
Authorization Act (NDAA), from October 28, 2009, through December 31, 2010.3 The NDAA
permitted DoD to continue with implementation of DCIPS’ performance management aspects.
In addition, the NDAA directed that the Secretary of Defense, Director of the U.S. Office of
Personnel Management, and DNI designate an independent organization to review DCIPS and
submit a final report and recommendations to the Secretary of Defense and the Congressional
oversight committees by June 1, 2010.
The NDAA specified that the following issues be assessed during the course of the review:
3
The National Defense Authorization Act for Fiscal Year 2010, Pub. L. No. 111-84, Sec. 1114, 2009.
• The adequacy of the training, policy guidelines, and other preparations afforded in
connection with transitioning to that system.
Selected in January 2010 to conduct the review, the National Academy of Public Administration
(Academy) formed an expert Panel for that purpose. The Panel’s findings, conclusions, and
recommendations are discussed in the following chapters of this report.
METHODOLOGY
The Academy study team organized its data collection efforts around issues related to DCIPS’
design, implementation, and impact; it conducted the assessment in a manner consistent with
guidance provided in the U.S. Office of Personnel Management’s (OPM) handbook for
evaluating alternative personnel systems (APS), the Alternative Personnel Systems Objectives-
Based Assessment Framework Handbook (OPM Framework). It augmented this framework with
additional assessment criteria, including the Academy’s own design principles,4 and specifically
assessed the extent to which DCIPS retains and promotes merit systems principles. In addition,
the team evaluated and applied lessons learned from public literature on alternative pay systems
and recently implemented federal compensation systems.
The Academy used numerous techniques to collect qualitative and quantitative data from a wide
range of sources. These techniques included:
• Open forums at DoD intelligence component sites that allowed employees to express
their views of DCIPS directly to the study team. Every DoD intelligence component
hosted at least one site visit;
• An online dialogue tool that obtained input from program stakeholders and
employees throughout the organization. The tool, open from March 8 to April 9,
2010, received comments from more than 1,800 employees;
• Interviews with senior officials from every DoD intelligence component;
• Two focus groups of senior DoD intelligence component managers at the Senior
Executive, GS-15, or equivalent levels;
• A focus group of DoD intelligence component HR managers held at a national
DCIPS Conference attended by representatives from each intelligence component;
• Two colloquia of subject matter experts (SME) with experience in public and private
sector performance-based compensation systems, including two members of the Defense
Business Board panel that reviewed the National Security Personnel System in 2009;5
and
• Presentations made by senior DoD, IC, HR officials, and other experts at meetings of
the Academy Panel and Panel member discussions with these senior officials.
4
National Academy of Public Administration, Recommending Performance Based Federal Pay, May 2004.
5
Described in more detail in Chapter 2.
The study team also collected and reviewed a wide variety of documents and background
materials related to DCIPS, performance management, and performance-based compensation
systems.
Four study team members attended the national DCIPS conference hosted by the OUSD(I)
Human Capital Management Office (HCMO). This conference focused on effective
implementation of the temporary pay system (“DCIPS Interim”) established during the NDAA
suspension of certain DCIPS pay authorities. The team members attended every session, met
with groups of HR leaders, and participated in several one-on-one discussions.
This report presents the Panel’s findings, conclusions, and recommendations in the following
sequence:
6
Mock pay pools are conducted to allow organizations to experience the pay processes prior to full implementation
and make adjustments, as necessary.
• Chapter 6: The Way Forward for DCIPS. Presents the Panel’s overall findings and
recommendations regarding whether and how DCIPS should proceed.
• Chapter 7: Panel Recommendations. Provides a consolidated list of the Panel’s
recommendations for ease of reference.
CHAPTER 2
INTRODUCTION
The terrorist attacks of September 11, 2001, introduced fundamental changes to the IC, including
pressure to change the way its agencies manage their human resources. Studies conducted in the
wake of the attacks noted that U.S. intelligence agencies missed or misinterpreted signals
pointing to a major terrorist attack and that they failed to “connect the dots” linking the actions
of the 9/11 terrorists to the plot.
The consensus emerging from these studies was that the historical difficulty IC agencies,
both civilian and military, have had in sharing information and working collaboratively
contributed significantly to this failure. The studies suggested that closer working relationships among the
agencies would strengthen national intelligence operations and, by extension, assist in protecting
national security. The studies also concluded that a common human capital framework was an
important mechanism for bringing about closer IC working relationships and collaboration.7
During the same period, performance-based compensation systems were being introduced in the
federal government as a replacement for the decades-old GS pay system. Advocates view these
systems, widely used in the private sector, as an important tool for driving change. The premise
is that rewarding employees with salary increases and bonus payments for results, rather than for
longevity on the job, improves organizational results. As former OPM Director Linda Springer
noted in 2005, the federal government “is not doing anything that’s new, that hasn’t been done
by millions and millions of people for decades” by adopting performance-based compensation.8
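To make that premise concrete, the sketch below contrasts a tenure-based step increase with a
rating-weighted share of a fixed pay-pool budget. It is purely illustrative: the share rule, names,
ratings, salaries, and 3 percent budget are hypothetical assumptions for this example and are not
drawn from DCIPS or any other federal pay policy.

# Purely illustrative sketch (not the DCIPS payout algorithm): contrasting a
# tenure-based step increase with a performance-weighted pay-pool payout.
# All names, ratings, salaries, and the 3 percent budget are hypothetical.

def tenure_based_increase(salary: float, step_pct: float = 0.03) -> float:
    """Longevity model: every eligible employee receives the same percentage
    step increase regardless of performance rating."""
    return salary * step_pct


def performance_based_payouts(employees, pool_budget: float) -> dict:
    """Performance model: each employee receives a share of a fixed pool
    proportional to a salary-and-rating weight (hypothetical share rule)."""
    weights = {e["name"]: e["salary"] * e["rating"] for e in employees}
    total_weight = sum(weights.values())
    return {name: pool_budget * w / total_weight for name, w in weights.items()}


if __name__ == "__main__":
    staff = [
        {"name": "Analyst A", "salary": 80_000, "rating": 4.6},
        {"name": "Analyst B", "salary": 80_000, "rating": 3.0},
        {"name": "Analyst C", "salary": 95_000, "rating": 3.8},
    ]
    # Same total cost either way: 3 percent of payroll.
    budget = 0.03 * sum(e["salary"] for e in staff)
    print({e["name"]: round(tenure_based_increase(e["salary"]), 2) for e in staff})
    print({k: round(v, 2) for k, v in performance_based_payouts(staff, budget).items()})

Under the hypothetical share rule, the same payroll dollars shift toward higher-rated employees
rather than being spread uniformly by tenure, which is the behavioral change advocates describe.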
The intersection of these two forces—the need to strengthen collaboration among intelligence
agencies and increased use of performance-based compensation systems—coupled with ODNI
efforts to respond to Congressional direction to adopt a common human resources framework,
laid the foundation for DCIPS. The effort to implement it across the DoD intelligence
components began in 2008 and 2009.
The Secretary of Defense was given authority to establish common personnel policies for
Department of Defense (DoD) intelligence components in 1996.9 In 1997, the Office of the
Undersecretary of Defense for Personnel and Readiness (OUSD (P&R)) and the Assistant
7
Report of the National Commission on Terrorist Attacks Upon the United States (9/11 Commission), 2004, p. 414
8
Government Executive. Aug. 4, 2005. http://www.govexec.com/story_page.cfm?articleid=31908
9
Public Law 104-201. National Defense Authorization Act for Fiscal Year 1997.
Secretary of Defense for Command, Control, and Counterintelligence developed the basic
policies.
By 1999, the effort had resulted in a functioning IC Assignment Program (ICAP), which
produced rotational assignment guidelines for aspiring Senior Executive Service candidates
across the IC. These guidelines were loosely tied to the Defense Leadership and Management
Program (DLAMP), largely because funding from that effort could offset the cost of backfilling rotational
assignments within DoD. The governing board included representatives from across the IC, as
well as the OUSD (P&R).
During the same period, NGA (first known as the National Imagery and Mapping Agency) was
created to bring together six predecessor organizations with disparate civilian personnel systems.
NGA chose a single HR system to streamline administration and establish its identity.
In 1998, the Office of the Secretary of Defense authorized NGA to conduct a five-year pilot test
that was later extended. Widely regarded as a success, the “Total Pay Compensation” program
substantially influenced the design of the ODNI Pay Modernization framework and provided
underlying principles for what would become DCIPS. By the time DCIPS was being considered
for expansion to other intelligence components, NGA had almost a decade’s worth of experience
with this type of performance-based management system.
The 9/11 terrorist attacks set in motion efforts to determine how the United States was caught by
surprise and to establish steps to prevent this type of attack from happening again. The first study,
conducted by the House Permanent Select Committee on Intelligence and the Senate Select
Committee on Intelligence, found that:
The important point is that the Intelligence Community, for a variety of reasons,
did not bring together and fully appreciate a range of information that could have
greatly enhanced its chances of uncovering and preventing Usama Bin Ladin’s
plan to attack these United States on September 11, 2001.10
This was followed by the 2004 9/11 Commission report, which identified structural barriers to
performing joint intelligence work:
10
Report of the Joint Inquiry into the Terrorist Attacks of September 11, 2001, Dec. 2002, p. 33.
11
Ibid, p. 77.
National intelligence is still organized around the collection disciplines of the
home agencies, not the joint mission. The importance of integrated, all-source
analysis cannot be overstated. Without it, it is not possible to “connect the dots.”
No one component holds all the relevant information.12
The Commission recommended the establishment of a National Intelligence Director that would
have, among other powers, the authority to:
…set personnel policies to establish standards for education and training and
facilitate assignments…across agency lines.13
With the Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA), Congress adopted
most of the 9/11 Commission’s recommendations, including creation of the DNI. This new
position would report to the President and have broad responsibilities for intelligence issues,
including the ability to establish HR policies for the IC that would serve the purposes of:
The ODNI would also “…prescribe, in consultation with…other agencies or elements of the
intelligence community, and the heads of their respective departments, personnel policies and
12
Final Report of the National Commission on Terrorist Attacks Upon the United States (9/11 Commission Report),
2004, p. 408.
13
Ibid, p. 414.
14
Intelligence Reform and Terrorism Prevention Act Of 2004, Sec. 102(A)(3)(f)(1-5), Public Law 108–458, 118
Stat. 3649, Dec. 17, 2004.
programs applicable to the intelligence community…”15 President George W. Bush signed the
IRTPA into law in December 2004.
In February 2004, President Bush signed an Executive Order creating the Commission on the
Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, more
widely known as the WMD Commission.16 The commission studied the intelligence failures
behind the IC’s conclusion, reached prior to the March 2003 initiation of Operation Iraqi Freedom,
that Iraq had been developing WMDs.
The WMD Commission recommended that the DNI establish a central HR authority for the IC;
create a uniform system for performance evaluations and compensation; develop a more
comprehensive and creative set of performance incentives; direct a “joint” personnel rotation
system; and establish a National Intelligence University.17
Based on guidance provided by the 9/11 and WMD Commissions, the ODNI developed a
Strategic Human Capital Plan in 2006 that identified the major challenges to building a strong IC
human resources program. These challenges included:
ODNI officials concluded that the GS pay system, created in the 1940s, was inadequate to meet
the challenges that the IC now faced. Among other things, ODNI believed that the IC workforce
had changed significantly since the system was created and that the system no longer aligned
with modern notions of performance-based compensation. Clerks who rarely changed jobs or
positions had been replaced by highly skilled and specialized knowledge workers who were more
mobile. Further, the GS system
15
Ibid.
16
Executive Order 13328, Feb. 6, 2004.
17
The Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction.
Report to the President of the United States, Mar. 31, 2005, p. 321.
rewarded longevity over performance, and pay increases were delayed as employees built time
within grade.18
In cooperation with Cabinet departments and agencies with authority to establish pay systems for
IC employees, ODNI began to design an overarching framework that moved away from the GS
system and toward more performance-based systems. The resulting pay modernization
framework had two fundamental elements at its core:
The following chart identifies the IC agencies and the names of their respective Pay
Modernization programs based on the framework:
18
National Intelligence Civilian Compensation Program. Intelligence Community (IC) Pay Modernization Key
Facts. PowerPoint briefing, May 15, 2008.
Role of the Office of the Under Secretary of Defense (Intelligence)
Operating under the Pay Modernization framework and the NGA model, OUSD(I) began to
develop its own human capital system in 2006 for the DoD intelligence components, including
itself, the National Security Agency, Defense Intelligence Agency, National Reconnaissance
Office, Defense Security Service, NGA, Army Intelligence, Navy Intelligence, Marine Corps
Intelligence, and Air Force Intelligence.
This new performance-based compensation system, the Defense Civilian Intelligence Personnel
System (DCIPS), was developed through a joint effort involving all DoD intelligence
components. In 2007, the decision was made to use a phased approach, with the components
implementing all or portions of DCIPS over several years.19 Figure 2-2 identifies major events
along the path of DCIPS’ development.
19
See Table 1-1 for further detail regarding this phased approach.
OTHER FACTORS THAT IMPACTED DCIPS
Linking employee pay more closely to job performance is not new to the federal government. A
table published by the Merit Systems Protection Board’s (MSPB) Office of Policy Evaluation identifies some of the major efforts
to bring this about over the past half century.
2002   Homeland Security Act creates Department of Homeland Security and provides
       authority for it to design its own pay system
2003   • National Defense Authorization Act for fiscal year 2004 grants DoD authority to
         develop and implement a new pay system
       • Human Capital Performance Fund established
2004   SES performance-based compensation plan implemented
A piecemeal approach to granting pay authorities to federal agencies has led to what OPM
Director John Berry has described as the “balkanization” of federal pay systems.
Proponents of performance-based compensation typically argue that:

• Not all employees are equal; some contribute much more than others. The GS step-
increase system rewards longevity, not performance;
• Funds are limited and it is necessary to make the best use of the available money; across-
the-board or general salary increases do not represent the best use of funds;
• It will enhance recruiting among the “millennial” generation of workers who are more
accustomed to instant feedback and recognition and would not be content with a tenure-
based system;
• It helps reinforce the performance management system by putting some amount of
potential pay increase or bonuses at risk;
• The prospect of pay increases as an effective motivator is a deeply entrenched value in
the United States;
• Performance-based compensation is virtually universal for white-collar workers outside
the public sector, and has proven effective in driving organizational performance in the
private sector; and
• Most of the demonstration projects have implemented performance-based, broadband pay
systems, and OPM evaluations have concluded that these interventions have produced
improvements to agency results-oriented performance culture and the ability to recruit
and retain a high-quality workforce.21
Critics of performance-based compensation counter that:

• Gains in productivity and mission performance must exceed the costs of performance
measurement if performance-based compensation is to work. Because performance
measurement in federal work is imprecise, there is little evidence that these systems are
worth the costs;
• Federal work is multidimensional, done in teams, and subject to multiple supervisors and
multiple objectives. Linking pay to individual performance has potentially negative
consequences: undermining teamwork, levels of cooperation, and even relationships
among teams within an organization;
20
Excerpted from Pay for Performance: A Guide for Federal Managers. Howard Risher. IBM Center for the
Business of Government, Nov. 2004.
21
Testimony of former OPM Director Linda Springer before the House Subcommittee on Federal Workforce, Postal
Service, and the District of Columbia, July 31, 2007.
22
Excerpted from Pay for Performance. A.C. Hyde. The Public Manager. 2008, supplemented by Academy
research.
• Partisan politics could play an increasing role in the bureaucracy, which could have a
potential impact on the “neutral competence” of the public service;
• Giving managers additional flexibility to set pay can aggravate existing biases in the
system;
• The GS pay system can accomplish all of the goals of performance-based compensation
without the disruption; and
• Most (performance-based compensation) plans share two attributes: They absorb vast
amounts of management time and resources, and they make everybody unhappy.23
On balance, the arguments both for and against performance-based compensation have
merit. In any event, the Panel believes that a decision to implement such a system must be
weighed very carefully, and a decision to move forward must be made in the context of what is
most appropriate for the mission and environment of the agency.
Congress provided authority for DoD to develop the National Security Personnel System (NSPS),
and implementation began in 2006. NSPS replaced the GS grade and step system with a pay band system intended to provide a more
flexible, mission-based approach that linked individual performance to mission and
organizational goals. NSPS created new policies for establishing pay levels, tenure, hiring,
reassignment, promotion, collective bargaining, pay, performance measurement, and recognition.
The 2003 legislation authorizing NSPS included highly controversial provisions dealing with
labor management issues that resulted in federal litigation. The courts eventually decided in
favor of DoD which, over unions’ objections, continued with its implementation plans.
By 2009, some 211,000 DoD non-intelligence employees were covered under this new system.
Union opposition remained strong, however, and Congress reversed the labor management
decisions in 2008. By then, the relationship between DoD and its labor unions was characterized
by one union official as follows:
23
William Mercer, Leader to Leader, Winter 1997, p. 611.
24
The National Defense Authorization Act for Fiscal Year 2010, Pub. L. No. 111-84, section 1114, 2009.
Our delegates…(believe) the whole intention of NSPS was to bust unions and
dismantle the federal civil service…(I)f it’s any way related to NSPS, it’s going to
be toxic, it’s not going to have employee buy-in.25
Adding to the controversy, as recently reported in the press, a 2008 report found that white
employees received higher average performance ratings, salary increases, and bonuses under
NSPS than employees of other races and ethnicities, and that raises and bonuses were sometimes
inconsistent with corresponding performance ratings.26
In 2009, DoD asked the Defense Business Board, an independent advisory body that operates
under the provisions of the Federal Advisory Committee Act, to establish a task group to conduct
a review of NSPS. The task group was to provide recommendations to help DoD determine “if
the underlying design principles and methodology for implementation (of NSPS) are reflected in
the program objectives; whether the program objectives are being met; and whether NSPS is
operating in a fair, transparent, and effective manner…”27
The Review of the National Security Personnel System, published in June 2009, called for DoD
to “reconstruct” NSPS in a way that challenged the system’s assumptions and design. The report
stated that a fix would not be sufficient to solve the problems that NSPS faced. It stopped short
of calling for the abolishment of NSPS but recommended that the existing moratorium on
transitions of more work units into NSPS be continued.
The report recommended that DoD engage the workforce in the reconstruction; re-commit to
partnership and collaboration with the unions; and commit to strategic management and
investment in career civil servants. It also recommended changes to processes that involved
trust, transparency, monitoring of progress, performance management, and classification.28
These recommendations were never acted upon given the elimination of NSPS.
When Congress created DHS in 2002, it gave the new department authority to replace the GS
system with a performance-based compensation system.29 NSPS aside, the DHS effort covered
the largest block of federal employees under such a system. It, too, was vigorously opposed by
employee unions and halted by a series of court rulings in 2006. DHS put the performance-
based compensation portion of the system, known as MAX HR, on hold in 2007 but continued
with the performance management, appeals, and adverse action portions.
In DHS’ fiscal year 2009 appropriation, Congress withheld funding for this new system30 and
DHS chose to cease its implementation efforts except at the Transportation Security
25
Union leaders make NSPS repeal, personnel reform major priorities. Alyssa Rosenberg. Govexec.com, Sept. 4,
2009.
26
Defense, OPM to review NSPS performance pay system. Federal Times.com. March 16, 2009.
27
DoD News Release. http://www.defense.gov/Releases/Release.aspx?ReleaseID=12679. May 15, 2009.
28
Report to the Secretary of Defense. Review of the National Security Personnel System. Report FY09-06.
29
Pub. L. 107-296, T. VIII, Subtitle E, Sec. 841, Nov. 25, 2002.
30
Pub. L. 110-329, Sep. 30, 2008.
Administration (TSA), which operated under a different statute.31 Non-TSA DHS employees
were returned to the GS system.
OPM’s strategic plan includes goals that bear directly on performance management and accountability:

Ensure the federal workforce and its leaders are fully accountable, fairly appraised, and
have the tools, systems, and resources to perform at the highest levels to achieve superior
results. Help agencies become high-performing organizations by:
• Designing performance management systems that are integrated with agency
program planning and clearly show employees how their actions drive agency
results; and
• Creating fair and credible standards for individual performance appraisal and
accountability.
Recognize, select, and sustain individuals who provide strong leadership and direction for
agencies by:
• Evaluating the agency’s effectiveness in holding leaders accountable for agency
performance; and
• Ensuring agencies make meaningful distinctions in evaluating and recognizing
different levels of management performance.32
The U.S. Government Accountability Office (GAO) has extensively studied performance-based
compensation systems, including DCIPS and NSPS.33 Its most recent examination of the former
found that although DoD had taken “steps to implement internal safeguards to ensure that
DCIPS is fair, effective, and credible…” the implementation of some safeguards could be
improved.34 As a result of discussion groups conducted with DoD employees and supervisors,
31
Aviation and Transportation Security Act, Pub. L. 107-71, Sec. 111(d), Nov. 19, 2001.
32
http://fehb.opm.gov/strategicplan/StrategicPlan_20100310.pdf
33
In addition to reviewing the performance-based compensation systems of other federal agencies, GAO has a
performance-based compensation system of its own. Under authorities provided by the GAO Personnel Act of 1980
(Pub. L. 96-191), GAO implemented a broad-band performance-based compensation system for GAO analysts and
specialists in 2006 and 2007. This system was designed to provide rewards based on knowledge, skills, and
performance, as opposed to longevity. It also provided managers with additional flexibility to assign and use staff.
See GAO Human Capital Reform Act of 2004, Pub. L. 108-271.
34
DOD Civilian Personnel: Intelligence Personnel System Incorporates Safeguards, but Opportunities Exist for
GAO found that participants expressed positive views about the concept of pay for performance
but believed that DCIPS was being implemented too quickly, with many questions unanswered.35
In its final report, GAO recommended that DoD institutionalize a process for employee
involvement in future design and implementation changes to DCIPS. Among its
recommendations:
• Issue guidance on the analysis of finalized ratings that explains how demographic
analysis for ratings will be conducted to ensure equity, fairness, and non-discrimination
in ratings;
• Finalize and execute a DCIPS evaluation plan with metrics to assess the system, to
include internal safeguards, and help ensure the department evaluates the impact of
DCIPS; and
• Expeditiously implement processes to accurately identify and measure employee
perceptions, and ensure those mechanisms include questions regarding certain
safeguards, such as the internal grievance process and employee acceptance of DCIPS.36
As noted earlier in the discussions of the DoD NSPS and DHS MAX HR experiences,
organizations that represent federal employees have been less than enthusiastic regarding
alternative pay systems. In another example, a 2009 survey conducted by Federally Employed
Women (FEW) of its members who were working under performance-based compensation
systems found that, by a two to one ratio, respondents did not support these systems.38 FEW
members did, however, cite some benefits of performance-based compensation systems,
including the requirement that employees and supervisors meet annually to discuss performance,
mutually setting objectives that allow employees to know exactly how their job fits into mission
accomplishments, and more directly rewarding employees for their work rather than their
longevity on the job.
Objections in the survey focused on implementation issues, not the principle of linking pay to the
work performed. These included a lack of training and instructions for managers, pay pool
panels with no connections to the workers whose salaries they determine, and an emphasis on
writing rather than presentation skills in supervisory evaluations of their staff.
NATIONAL DEFENSE AUTHORIZATION ACT OF 2010
The strong level of Executive and Legislative Branch support for creation of performance-based
compensation systems has weakened since 2003. Both MAX HR and NSPS were controversial
from their inception and the targets of litigation from employee unions. Reacting to the
resistance of their federal employee constituents to performance-based compensation design and
implementation, Members of Congress initiated agency inquiries and frequent committee
hearings.
As a further sign of the flagging political support for such systems, now-President Barack
Obama wrote to the President of the American Federation of Government Employees during the
closing weeks of the 2008 presidential campaign to express his priorities on federal workforce
issues:
Further, the class action lawsuits alleging race, gender, and age bias by employees
placed under pay systems similar to NSPS in other agencies should give us pause.
I cannot and will not support a pay system which discriminates against
employees, and I cannot and will not support a pay system which ultimately is
designed to suppress wages for civilian DoD employees over time.
In March 2009, DoD suspended conversion of new DoD elements into NSPS pending the
Defense Business Board review described earlier. In April, eight House chairmen and
subcommittee chairmen sent a letter to the U.S. Office of Management and Budget urging the
Obama Administration to suspend any further government-wide implementation of performance-
based compensation. Subsequently, the Conference Report for the Fiscal Year 2010 National
Defense Authorization Act (NDAA) “repeal(ed) the authority for the National Security
Personnel System (NSPS) and require(d) the transition of NSPS employees to previously
existing civilian personnel systems…”
The 2010 NDAA, signed into law in October 2009, terminated NSPS. It did not order an end to
DCIPS but did suspend certain DCIPS performance-based compensation authorities until
December 31, 2010,39 including a prohibition against setting pay using the pay pool process,
which was to have begun in January 2010. It allowed DoD to implement the performance
management provisions of DCIPS and exempted NGA, the only DoD intelligence component to
have fully implemented DCIPS at the time of the suspension.
DCIPS Interim
This “strategic pause” in DCIPS implementation and suspension of its pay authority required
OUSD(I) to develop an interim system—DCIPS Interim—to calculate employee pay and
implement the performance management elements of DCIPS not affected by the NDAA. The
result has been workforce confusion over whether perceived problems arise from DCIPS
itself or from the interim system.
The Academy’s open forums and online dialogue indicate that employees routinely confuse the
interim policies and practices with DCIPS policies and practices. Knowing exactly where
DCIPS ends and DCIPS Interim begins is almost exclusively the province of DoD intelligence
component HR professionals.
39
The 2010 NDAA suspended fixing “rates of basic pay” under DCIPS “for employees and positions within any
element of the Intelligence Community,” except for NGA. It also required “rates of basic pay” to be fixed in
accordance with provisions of law that would otherwise apply during the period beginning on the date of enactment
and ending on Dec. 31, 2010. Pub. L. 111-84, Sec. 1114, 2009.
CHAPTER 3
INTRODUCTION
The Panel has concluded that DCIPS’ design is fundamentally sound and adheres to accepted
design principles for performance-based compensation systems. Most importantly, DCIPS fully
retains the Merit Systems Principles and other basic protections afforded employees in the
federal civil service and includes a set of checks and balances to ensure fairness and equity in
performance management and pay pool decisions. It also incorporates design features derived
from lessons learned from best practices and challenges faced by the recently-terminated NSPS.
Looking beyond these fundamental attributes, the Panel believes that DCIPS’ design includes
several other strengths: the simplicity and clarity of its occupational structure, a single pay
banding system, its rigorous performance management system, separate performance
management and pay pool processes, and its planned process for ongoing system evaluation.
Although the Panel has identified a number of areas in this chapter where improvements can be
made, the Panel does not consider these to be fatal design flaws, but, rather, opportunities to
further tailor, strengthen, and refine a system that is already fundamentally sound.
In that context, the Panel believes that full acceptance of DCIPS will require examination of its
performance-based compensation policies and further tailoring of the system to the mission of the
DoD Intelligence Enterprise so that DCIPS becomes a part of its culture, rather than just another
HR experiment. Based upon the findings discussed below, the Panel offers several
recommendations, listed at the end of this chapter, to strengthen DCIPS’ design.
Table 3-1 summarizes the Panel’s findings regarding DCIPS’ alignment with the design
principles that form the assessment framework.
Design Principle                                  DCIPS’s Alignment with Design Principles
                                                  (Fully Aligned / Partially Aligned / Not Aligned)

...compensation system. At a minimum, the performance management system allows managers
and supervisors to distinguish “Outstanding,” “Fully Successful,” and “Unacceptable” performers.

5. The system identifies the balance among three aspects of equity: internal, external/market, and
organizational contribution.
In response to the NDAA’s mandate, as described earlier, this chapter addresses the
appropriateness or inappropriateness of DCIPS in light of the complexities of the workforce
affected, and its sufficiency in providing protections for diversity in the promotion and retention
of personnel. The chapter approaches these issues by considering DCIPS’ design, describing the
components of its performance-based compensation system and comparing them with guiding
principles that form the assessment framework. The assessment focuses on those aspects that are
documented in official policies (DoD Instructions) and other guidance, such as fact sheets,
memoranda, and official guidance issued by OUSD(I).
The chapter then compares DCIPS with the GS/GG system40 that has been in place in the federal
government for more than 60 years. It also identifies the similarities and differences between
DCIPS and the NSPS, which was developed for DoD’s non-intelligence workforce but then
repealed.
40
GS is the designation for the General Schedule that establishes position and pay levels in the federal government,
while GG is the designation used for GS-like positions in the Excepted Service. Salary rates for most GG positions
are identical to those of GS positions.
FRAMEWORK FOR ASSESSING DCIPS’ DESIGN
Two main sources provide guidance for assessing the design of a performance-based
compensation system. First, a 2004 Academy Panel study identified “design principles” for such
a system.41 Although the report recommended the development of a government-wide system
using broad-banding and market pay, the design principles are equally relevant to agency-
specific ones. Second, a 2006 MSPB report provided detailed guidance for federal agencies that
wish to undertake the design and implementation of a performance-based compensation system.42
These two sources, coupled with 2008 OPM guidance43 and validated by additional research of
best practices, provide a consolidated set of design principles. Table 3-2 summarizes these
principles, each of which will be used to assess DCIPS throughout this chapter.
41
National Academy of Public Administration, Recommending Performance-Based Federal Pay, May 2004.
Hereafter “2004 Academy Panel Design Principles Study.”
42
U.S. Merit Systems Protection Board, Designing an Effective Pay for Performance Compensation System, Jan.
2006. Hereafter “2006 MSPB Design Report.”
43
Office of Personnel Management, Alternative Personnel Systems Objectives-Based Assessment Framework
Handbook, Oct. 2008.
Design Feature                   Guiding Principle

Balances to Ensure Fairness      ...disputed band classification and performance decisions.

Flexibility                      The system is sufficiently flexible and responsive to changing
                                 market conditions to meet the agency’s needs.

Ongoing System Evaluation        The system’s design includes a requirement for ongoing
                                 evaluation of the system with the possibility of corrective action.
To supplement and update these criteria, the Academy conducted two colloquia attended by
Panel members, Academy Fellows, and senior experts on performance management and pay-for-
performance systems. The participants assessed DCIPS’ design and ranked the importance of
specific design elements. The results showed that the top three elements were: (1) linkage to
mission, (2) a performance management system that differentiates levels of performance, and (3)
transparency. These results were applied in the Panel’s assessment of DCIPS.
DCIPS was designed to conform to the policies of the NICCP, which was promulgated by ODNI
for the entire IC and discussed in Chapter 2. NICCP’s goal was to unify the 17 IC components
under one common framework in place of the six different personnel systems that were used. IC
agencies and Executive departments with authority to create their own compensation systems
must incorporate NICCP principles into their own systems.
The NICCP framework responds to concerns that the GS system no longer meets the needs of the
IC workforce. Like many agencies across the federal government, ODNI viewed the system as
inadequate because it fosters the perception that promotions are based on longevity, not merit;
lacks the necessary tools to hold poor performers accountable; and does not include a strong
basis for linking pay to performance.
As illustrated in Figure 3-1, the NICCP case for modernizing compensation is based on a tiered
set of objectives: strengthen and transform the IC; provide a level playing field for IC agencies
and elements; and reinforce and reward excellence.
Figure 3-1. Objectives of the NICCP Framework44
DCIPS conforms to the NICCP framework and, at the same time, tailors it to the needs of the
DoD intelligence components. For example, NICCP defines a common occupational structure
and provides a general framework for setting basic rates of pay, managing performance, and
paying for performance, but DCIPS’ specific design features include a comprehensive
performance management system and performance-based compensation system, both of which
offer greater specificity, including defining roles and responsibilities for managing and
overseeing the system. Each design feature is discussed in the following sections.
It is unclear whether federal performance management systems can achieve the goal of enhanced
performance without linking performance to compensation. Nor are there strong research results
that link performance-based compensation systems to improved individual or organizational
performance. A 1991 National Research Council report concluded that variable pay plans can
produce positive effects, but that there is insufficient evidence to determine conclusively whether
merit pay—also known as pay for performance—can enhance individual performance.45
44 ODNI, NICCP Framework.
45 National Research Council, Pay for Performance, George T. Milkovich and Alexandra K. Wigdor, Editors, with Renae F. Broderick and Anne S. Mayor (Washington, D.C.: National Academy Press), 1991.
Additionally, a recent article in the Review of Public Personnel Administration shed light on this topic in a discussion of the Performance Management and Recognition System (PMRS), an earlier attempt to implement performance-based compensation in the federal government.46 The article noted
that the PMRS generated little, if any, evidence of a positive effect on productivity, worker
satisfaction, or job turnover. It further found that the new performance-based compensation
system for the Senior Executive Service had no impact on performance.47
Ultimately, the decision to shift from the GS/GG system to a performance-based compensation
system must be made with appropriate consideration of the system’s intended goals and
objectives, the resulting challenges, and an organization’s readiness for sweeping change.
Implementing a system like DCIPS requires a major cultural shift, and agency leaders are best
positioned to determine whether their goals can be fully achieved within an existing framework
or a new system. For some organizations, the GS/GG system may prove adequate, while others
may find it necessary to design a unique system.
USD(I) has concluded that a new system—DCIPS—is needed to support the goals of the DoD
intelligence components. The focus then turns to the issues discussed above: the intended goals
and objectives, the challenges, organizational readiness, and whether aspects of the system could
be improved to ensure equity, fairness, and meaningful recognition for employees.
DCIPS was designed as a comprehensive system for the DoD intelligence components that will
affect all aspects of HR management, including performance management, compensation,
position classification, recruitment and staffing, and employee development. Although it is
envisioned as a broad, multi-faceted HR system, only a few of its elements, those pertaining to the performance-based compensation system, were fully operational and documented in approved policies at the time of this review.
Based on its evaluation against the design principles drawn from MSPB, Academy, and OPM guidance, the Panel finds that the design of DCIPS is fundamentally sound.
Use of a tailored occupational framework, a single pay band structure, a rigorous performance
management system, and separate performance management and pay pool processes all
contribute to the strength of DCIPS’ design. The sections that follow describe how the specific
elements of DCIPS align with the design criteria that form the assessment framework.
46 James S. Bowman, "The Success of Failure: The Paradox of Performance Pay," Review of Public Personnel Administration, 30 (2010), pp. 70-87.
47 Ibid., p. 73.
PRINCIPLE
The system is transparent and easy for managers and employees to understand.
The DCIPS policies and guidance that describe the overall design of the system are generally
clear and easy to understand. For the performance-based compensation system, governing
policies have been developed and supplemented with clearly written guidance that is available to
all employees, managers, and HR staff affected.
Although the manner in which these policies have been implemented (as discussed in Chapter 4)
has caused employees to question the transparency of the system, this does not alter the Panel’s
belief that the fundamental design of the system is transparent and relatively easy to understand.
It is the implementation of the policies that has led to confusion, more so than the actual content
or intent of those policies.
In addition, the lack of policies and procedures for several major elements of the system has
adversely impacted employees’ perceptions of DCIPS’ transparency. Most policies have been
drafted and are in various stages of review and approval. However, the lack of finished policies
in critical areas, especially those affecting career progression and pay administration (both of
which are linked to performance), has generated a great deal of confusion among the workforce
and has undermined the system’s transparency and credibility. For example, in the open forums
and online dialogue, employees expressed major concerns about the absence of clear policies
governing advancement from one pay band to another.
Another critical gap is the lack of a formal policy for considering an employee’s “highest
previous rate” (HPR).49 Although OUSD(I) officials indicated that they did not intentionally
eliminate the use of HPR, the unavailability of this tool has reportedly disadvantaged certain
employees who held higher salaries prior to conversion to DCIPS.
Academy interviews, focus groups, and the online dialogue indicate that some employees have
lost confidence and trust in DCIPS because they are unable to obtain answers, or consistent
48 Pay for Performance (PFP) Implementation Best Practices and Lessons Learned Research Study, Booz Allen Hamilton, June 18, 2008.
49 Highest previous rate means the highest actual rate of basic pay previously received, or the actual rate of basic pay for the highest grade and step previously held, by an individual, depending on the position in which the individual was employed. [5 CFR 531.202]
answers, to their questions and concerns. These frustrations have been heightened by the
perceived lack of knowledge demonstrated by their servicing HR staffs, who themselves have
been hampered by the lack of clear policies.
Finding 3-1
Overall, DCIPS’ design is transparent and easy to understand, but the lack of approved policies
in areas affecting career progression and pay administration is creating confusion and mistrust
among the workforce.
PRINCIPLE
The performance-based compensation system is designed to support the
organization’s mission and environment.
A successfully designed performance-based compensation system must have clear goals that are
well communicated and understood throughout the workforce. As discussed by MSPB,50
agencies seeking to implement a performance-based compensation system must establish clear,
realistic goals prior to undertaking change.
DoD policies clearly express DCIPS’ purposes and the ways in which it is intended to support
the mission, goals, and objectives of the DoD Intelligence Enterprise. However, managers and
employees have varying levels of understanding about the goals.
Senior DoD officials generally agreed that DCIPS’ overarching goal is to unify the DoD
intelligence components under a common HR management system. The official policy governing
DCIPS51 sets out more specific objectives as well.
50 2006 MSPB Design Report, p. 7.
51 DoD Civilian Personnel Management System: Volume 2001, Defense Civilian Intelligence Personnel System (DCIPS) Introduction, Dec. 29, 2008.
The Defense Intelligence Enterprise Human Capital Strategic Plan (2010–2015) discusses
DCIPS’ purpose in more detail.52 The plan states that “DCIPS will provide DoD leaders and
managers with the consistent policy framework needed to hire, develop, compensate, recognize,
reward, and retain the high-performing civilian workforce necessary to accomplish the
intelligence needs.” It further alludes to DCIPS in Objective 1.2 of its workforce goal:
“Implement and ensure consistent management and sustainment of DCIPS across the
Enterprise.”
Senior managers who participated in the focus groups described DCIPS as a way to achieve
specific organizational goals or process improvements—for example, make DoD a unified
enterprise or stop infighting among DoD intelligence components. None of these managers
described DCIPS goals in terms of how they affect mission outcomes.
Meanwhile, HR officials in the DoD intelligence components most often described DCIPS’ goals
in the context of improving and speeding up HR processes—for example, improved ability to
attract, hire, and retain quality staff. Employees who attended the open forums and participated
in the online dialogue had different views and levels of understanding about the goals. Overall,
they viewed DCIPS as a system designed to support HR functions and processes, but offered a
variety of reasons for why the system exists.
A clear, consistent message about DCIPS goals must be communicated to the workforce, and
senior management must reinforce it frequently. OUSD(I) has not accomplished this to date.53
Moving forward, DoD components will be hard pressed to ensure buy-in and measure DCIPS’
success without first ensuring common understanding of the system’s goals and demonstrating
how the system supports DoD’s intelligence mission.
The level at which performance is assessed and rewarded should reinforce the desired scope of
collaboration; this is a fundamental principle of rewarding collaboration. As noted in Figure 3-1 (Objectives of the NICCP Framework), reinforcing the IC-wide values of "Commitment, Courage,
52 Issued by the Office of the Under Secretary of Defense for Intelligence, Human Capital Management Office, undated.
53 Chapter 4 deals with these communication and strategy challenges in more detail.
and Collaboration” is a key goal of IC pay modernization. DCIPS policies and guiding
documents are clear that the system was designed to support this goal.
According to its implementation plan,54 the business case for implementing DCIPS is grounded
in the need to increase sharing and collaboration for the purpose of developing a stronger
“community perspective.” However, this goal is not adequately reinforced by the DCIPS
performance management system, which focuses on individual performance rather than team or
organizational performance.
Although the system does not preclude the assessment of group or organizational performance,
OUSD(I) has not yet developed procedures for evaluating and rewarding these types of
performance as part of the annual performance rating process. The standard element
“Engagement and Collaboration” provides a way to measure individual employee performance
in such areas as building relationships and promoting collaboration, but there are no comparable
measures for teams, groups, or the organization as a whole. Further, the DCIPS policy on
awards and recognition55 provides for team-based awards for special one-time acts, but not in
connection with the annual performance evaluation process.
Academy colloquia attendees voiced strong concerns about this aspect of the system’s design;
they suggested that the focus on individual performance pits employees against each other and is
contrary to the goal of unification. Online dialogue and open forum participants expressed
similar concerns. Some seemed satisfied with DCIPS’ use of the standard performance element
to evaluate an employee’s contribution to team performance, but more believed that DCIPS will
inhibit collaboration by encouraging individual performance at the expense of team achievement.
Further, a recently completed OUSD(I) DCIPS Survey of all DoD intelligence component
employees56 indicated that the lowest percentage of favorable responses pertained to the question
dealing with the impact of DCIPS on collaboration. On average, 10 percent of employees agreed
or strongly agreed that DCIPS will contribute to increased collaboration within their organization
or component.57
Given DCIPS’ goal to support unification of the DoD intelligence components and the IC and
the concerns raised by experts and employees, it is necessary to develop a specific methodology
to evaluate and reward group and organizational performance. OUSD(I) officials indicated that a
process for doing so will be “included in the long-term evolutionary planning for the program.”
They noted that team rewards sometimes fail to recognize different levels of individual
performance within the group. However, DCIPS was designed to provide an HR system that
supports the broader goal of integrating the IC; a process to measure and reward group
performance concurrently with individual performance is essential. As MSPB has noted,
“Rewarding only individuals when mutual support helps advance organizational goals may
54 Program Plan for DCIPS Implementation, Jan. 2008.
55 Department of Defense Instruction Number 1400.25-V2008, DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Awards and Recognition, January 15, 2010.
56 Preliminary Results of OUSD(I) Survey of DoD intelligence components: Table of Frequencies, provided by OUSD(I), Apr. 30, 2010. Hereafter "2010 DCIPS Survey Preliminary Results."
57 Ibid., Question 80.
discourage teamwork…to the organization’s detriment.”58 Until procedures have been developed
to evaluate and reward group performance, monetary and non-monetary awards59 will be limited
to one-time acts, rather than overall annual performance.
OUSD(I) should consider alternative mechanisms to evaluate and reward group and
organizational performance. One method of linking individual rewards to organizational
performance is in use at the Federal Aviation Administration (FAA), which offers its employees
two types of performance recognition. First, an organizational success increase (OSI), in the
form of an annual base pay increase, is granted to most employees so long as organizational
performance goals are met. Second, a superior contribution increase (SCI) is provided with the
OSI to a percentage of highly-ranked employees based on individual performance and
contributions.
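The following minimal Python sketch is intended only to make the structure of such an approach concrete. The percentages, function name, and eligibility inputs are illustrative assumptions, not FAA policy or parameters drawn from this report.

def annual_increase(base_salary, org_goals_met, is_superior_contributor,
                    osi_pct=0.018, sci_pct=0.008):
    """Hypothetical sketch of an OSI/SCI-style approach; values are assumed.

    An organizational success increase (OSI) goes to most employees when
    organizational goals are met; a superior contribution increase (SCI)
    is added on top of the OSI for a subset of highly ranked employees.
    """
    increase = 0.0
    if org_goals_met:
        increase += base_salary * osi_pct          # OSI: tied to organizational performance
        if is_superior_contributor:
            increase += base_salary * sci_pct      # SCI: tied to individual contribution
    return increase

# Example: a highly ranked employee in a year when organizational goals were met
print(annual_increase(90_000, org_goals_met=True, is_superior_contributor=True))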
Finding 3-2
DCIPS is aligned with the mission, goals, and objectives of the DoD intelligence components,
but the lack of a process for measuring and rewarding group or team performance is not
supportive of the broader IC goal of increased collaboration.
PRINCIPLE
The system uses a simplified classification process with streamlined pay bands.
A key strength of DCIPS’ design is its use of a simplified classification process and five pay
bands that are part of an occupational structure that consists of the following components:
• Mission Categories. Broad classifications of work that include (1) Collections and
Operations; (2) Processing and Exploitation; (3) Analysis and Production; (4) Research
and Technology; (5) Enterprise Information Technology; (6) Enterprise Management
Support; and (7) Mission Management.
58 2006 MSPB Design Report, p. 11.
59 Department of Defense Instruction Number 1400.25-V2008, DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Awards and Recognition, January 15, 2010.
60 2004 Academy Panel Design Principles Study, p. 45.
• Occupational Groups. Groups of positions that share common technical qualifications,
competency requirements, career paths, and progression patterns.
• Work Categories. Broad sets of occupational groups that are characterized by common types of work. There are three DCIPS work categories.
• Work Levels. Levels of work within each work category that reflect increasing scope, difficulty, and responsibility. There are four DCIPS work levels:
1. entry/developmental
2. full performance
3. senior
4. expert
• Pay Bands. Salary ranges that replace grades and steps and align pay with the scope and difficulty of work. There are five DCIPS pay bands, each with a defined minimum and maximum rate of pay.
Figure 3-2 shows the DCIPS pay band structure and its alignment with the work categories and
work levels.
Figure 3-2. DCIPS 2010 Pay Band Structure
Finding 3-3
DCIPS effectively employs a simplified classification process within a pay banding structure.
Although the DCIPS pay banding structure has greatly simplified the classification process,
some managers view a single structure for nonsupervisory employees and their supervisors
negatively. As the system provides no additional compensation for supervisors, it creates a type
of "salary compression" that can result in subordinate employees earning the same salary as, or a higher salary than, a supervisor in the same band. Many employees also view it as a disincentive for
taking on additional responsibilities that accompany a supervisory role. Under DCIPS, there is
no mechanism to adjust supervisor salaries to account for this situation.
DoD officials characterize this aspect positively, asserting that a “dual track” enables high
performing technical personnel to progress in salary without having to become managers. Some
stated that supervision and management are not inherently worthy of higher salary compensation,
though they viewed the role of supervisor as critical to successful implementation of the DCIPS
performance management system.
Both MSPB and the Academy acknowledge the pivotal role of supervisors in a performance-
based compensation system. For example, MSPB’s guidance indicates that, “pay for
performance demands a higher level of supervisory skill than traditional tenure-based systems”61
and places more pressure on supervisors to perform their responsibilities well and treat their
employees fairly. The Academy Panel’s guidance on performance-based compensation systems
strongly articulates the need for separate bands for supervisors and managers.62
The Panel acknowledges that most performance management responsibilities under DCIPS—
establishing performance objectives, engaging in ongoing dialogue with employees, and rating
performance—also were required under the GS/GG system. Apparently, these responsibilities were not being performed fully before DCIPS highlighted their importance.
Nevertheless, the Panel believes that it is important to DCIPS’ success to recognize and reward
the role that supervisors play in the performance management process. Absent such a tool,
DoD’s ability to attract and retain high-quality supervisors to DCIPS positions likely will be
impaired.
Other federal HR systems have used different approaches to recognize the critical role of
supervisors in performance-based pay systems. Under the Department of Commerce’s
Alternative Pay System (formerly a demonstration project), all supervisors are eligible for
salaries up to six percent higher than the maximum rate of their pay bands. This additional
compensation, which can be attained through performance pay increases granted in connection
with the regular performance appraisal cycle, provides an additional incentive for supervisors
and managers who perform well in these roles.63
Finding 3-4
DCIPS’ lack of specific salary incentives for supervisors within the pay band structure may
impede DoD’s ability to attract and retain high quality supervisors.
PRINCIPLE
A rigorous performance management system that identifies the “Outstanding,”
“Successful,” and “Unacceptable” performers is the foundation for the performance-
based compensation component of the pay system.
Academy64 and MSPB65 guidance, along with 2003 GAO Congressional testimony,66 demonstrates that a rigorous performance management system is the foundation of an effective performance-based compensation system. All three sources indicate that the system must require and
61 2006 MSPB Design Report, p. 6.
62 2004 Academy Panel Design Principles Study, p. 25.
63 "Department of Commerce Alternative Personnel System Operating Procedures," updated May 23, 2007.
64 2004 Academy Panel Design Principles Study, p. 50.
65 2006 MSPB Design Report, p. 6.
66 GAO, Results-Oriented Cultures: Modern Performance Management Systems Are Needed to Effectively Support Pay for Performance, GAO-03-612T (Washington, DC), Apr. 1, 2003.
enable managers and supervisors to communicate the agency’s goals and values to employees
and the way that performance will be measured. As the GAO testimony pointed out:
While there will be debate and disagreement about the merits of individual
reform proposals, all should be able to agree that a performance management
system with adequate safeguards, including reasonable transparency and
appropriate accountability mechanisms in place, must serve as the fundamental
underpinning of any fair, effective, and appropriate pay reform.67
DCIPS’ performance management system is another strong aspect of its design, and DoD
intelligence component managers and employees identify it as one of DCIPS’ most positive
features. Both groups indicated that requiring continuing dialogue between employees and
rating officials is a welcome change that will lead to better understanding between employees
and their supervisors and better distinctions among performance levels. They also believed that
requiring rating officials to conduct and document at least one performance discussion with
employees at the mid-point of the rating period is another strong feature supporting improved
performance and meaningful distinctions at the end of the rating cycle.
Managers and employees generally view the DCIPS performance management system
positively, but a significant number stated that it creates an administrative burden for
supervisors. Effective performance management should be viewed not as an additional duty, but
as an inherent part of a supervisor’s normal responsibilities. Yet many DoD intelligence
component managers perform technical, analytical, or operational duties, as well.
If DCIPS is to succeed, OUSD(I) and DoD intelligence component senior officials must ensure
that all supervisors receive the requisite training to implement and administer the performance
management system effectively. Additionally, they must stress the importance of
communication throughout the rating cycle so that performance management duties are spread
over its entirety. As the GAO testimony noted, an effective performance management system is
not used for episodic occurrences once or twice annually, but as a tool to help an organization
manage its workforce daily.68 Supervisors are the linchpin in the system, and it is critical to
provide them with the tools, training, and resources they need to execute their responsibilities.
Differentiating Performance
67 Ibid., p. 1.
68 Ibid., p. 1.
Finding 3-5
DCIPS’ design includes a rigorous performance management system that allows supervisors and
managers to distinguish effectively among levels of performance, but the performance elements
and standards should be reviewed to determine whether they fully support DCIPS’ goals and
objectives.
Performance Objectives
Each employee is rated on three to six performance objectives that are aligned with and cascade
from the agency’s mission, goals, and objectives. According to OUSD(I) guidance,69 the
objectives communicate major individual, team, and organizational responsibilities. Yet specific
policy guidance70 is limited to individual performance requirements and requires that objectives
focus on larger or more significant aspects of the employee’s work and specific results or
outcomes. DCIPS policy requires that individual performance objectives be based on the work of the specific position and be appropriate for the employee's pay band and occupational category.
Consistent with the NICCP, OUSD(I) policy requires that each performance objective be described so that it is specific, measurable, achievable, relevant, and time-bound (the SMART criteria). Although these criteria have been a widely accepted methodology for developing performance objectives for decades, recent research indicates that they are not universally appropriate. For example, Leadership IQ71 conducted a study in 2010 involving more than 4,000 employees from 397 organizations to determine the goal-setting processes that help employees achieve great outcomes.72 The study's findings include:
• Employees are rated on goals that are not particularly helpful. Only 15 percent of
respondents strongly agreed that their goals will help them achieve great
accomplishments, while 13 percent strongly agreed that they will help them maximize
their full potential;
• For employees to achieve great outcomes, their goals must require them to learn new
skills and/or knowledge; and
• To motivate employees to achieve great outcomes, goals must be vividly stated so that
they practically “leap off the paper.”
The Leadership IQ study recommended a new goal-setting process for organizations to inspire
their employees to greater achievements and vividly experience a sense of accomplishment when
69 Writing Effective Performance Objectives, June 2009. http://dcips.dtic.mil/
70 Department of Defense Instruction Number 1400.25-V2011, DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Performance Management, January 15, 2010.
71 Leadership IQ, headquartered in Washington, DC, with regional offices in Atlanta and Westport, Connecticut, provides best-practices research and executive education to the world's leading companies and their leaders.
72 Leadership IQ, Are Smart Goals Dumb, Apr. 2010.
they achieve their goals. Given the importance of the DoD intelligence mission, the nature and
complexity of intelligence work, and the large population of high performers in the intelligence
components, a more tailored methodology for creating individual objectives is needed to
motivate employees and meaningfully distinguish levels of performance.
Additionally, new supervisors need guidance to craft appropriate objectives tailored to the
positions for which they are responsible. OUSD(I) already has developed a useful guide to
writing performance objectives, and it is investing in an online database of “exemplar” objectives
expected to improve the consistency and appropriateness of performance objectives. These steps
should prove helpful, though additional training and guidance are needed for developing
administrative and support employee objectives.73
73 OUSD(I) has acknowledged the need to review performance standards to determine whether employees in support occupations are rated lower than those in mission-oriented ones.
Performance Elements
Each element is defined for nonsupervisory employees, with additional expectations for supervisory/managerial employees:
1. Accountability for Results
Nonsupervisory Employee: Measures the extent to which the employee takes responsibility for the work, sets and/or meets priorities, and organizes and utilizes time and resources efficiently and effectively to achieve the desired results, consistent with the organization's goals and objectives.
Supervisory/Managerial Employee: In addition to the requirements for nonsupervisory employees, supervisors are expected to use the same skills to accept responsibility for and achieve results through the actions and contributions of their subordinates and the organization as a whole.
2. Communication
Nonsupervisory Employee: Measures the extent to which an employee is able to comprehend and convey information with and from others in writing, reading, listening, and verbal and nonverbal action. Employees also are expected to use a variety of media in communicating and making presentations appropriate to the audience.
Supervisory/Managerial Employee: In addition to the expectations for nonsupervisory employees, DCIPS supervisors are expected to use effective communication skills to build cohesive work teams, develop individual skills, and improve performance.
3. Critical Thinking
Nonsupervisory Employee: Measures an employee's ability to use logic, analysis, synthesis, creativity, judgment, and systematic approaches to gather, evaluate, and use multiple sources of information to effectively inform decisions and outcomes.
Supervisory/Managerial Employee: In addition to the requirements for nonsupervisory employees, supervisors are expected to establish a work environment where employees feel free to engage in open, candid exchanges of information and diverse points of view.
4. Engagement and Collaboration
Nonsupervisory Employee: Measures the extent to which the employee is able to recognize, value, build, and leverage organizationally-appropriate, diverse collaborative networks of coworkers, peers, customers, stakeholders, and teams within an organization and/or across the DoD components with DCIPS positions and the IC.
Supervisory/Managerial Employee: In addition to the requirements for nonsupervisory employees, supervisors are expected to create an environment that promotes engagement, collaboration, integration, and the sharing of information and knowledge.
5. Personal Leadership and Integrity / Leadership and Integrity
Nonsupervisory Employee: Measures the extent to which the employee is able to demonstrate personal initiative and innovation, as well as integrity, honesty, openness, and respect for diversity in dealings with coworkers, peers, customers, stakeholders, teams, and collaborative networks across the IC. Employees are also expected to demonstrate core organizational and IC values, including selfless service, a commitment to excellence, and the courage and conviction to express their professional views.
Supervisory/Managerial Employee: Supervisors and managers are expected to exhibit the same individual personal leadership behaviors as all IC employees. In their supervisory or managerial role, they also are expected to achieve organizational goals and objectives by creating shared vision and mission within their organization; establishing a work environment that promotes equal opportunity, integrity, diversity (of both persons and points of view), critical thinking, collaboration, and information sharing; mobilizing employees, stakeholders, and networks in support of their objectives; and recognizing and rewarding individual and team excellence, enterprise focus, innovation, and collaboration.
6. Technical Expertise / Managerial Proficiency
Nonsupervisory Employee: Measures the extent to which employees acquire and apply knowledge, subject matter expertise, tradecraft, and/or technical competency necessary to achieve results.
Supervisory/Managerial Employee: Supervisors and managers are expected to possess the technical proficiency in their mission area appropriate to their role as supervisor or manager. They also are expected to leverage that proficiency to plan for, acquire, organize, integrate, develop, and prioritize human, financial, material, information, and other resources to accomplish their organization's mission and objectives. In so doing, all supervisors and managers are expected to focus on the development and productivity of their subordinates by setting clear performance expectations, providing ongoing coaching and feedback, evaluating the contributions of individual employees to organizational results, and linking performance ratings and rewards to the accomplishment of those results.
Despite apparent satisfaction with the goals of the DCIPS performance management process,
managers and employees raised concerns that the standard performance elements are difficult to
rate and introduce a high degree of subjectivity into the rating process, with an inappropriate
impact on the rating’s final outcome. For example, some employees complained that use of the
“Personal Leadership and Integrity” element is inappropriate and difficult to judge. Further, it
was not clear to some employees why the performance elements receive so much weight in the
performance rating process (40 percent).
There has been a growing trend toward introducing behavioral measures into performance
evaluations; the challenge is to strike the appropriate balance between them and objective
measures. For performance-based compensation systems, it is critical that the balance tilt more
toward clearly documented and measured aspects of performance to provide a defensible basis
for determining performance payouts. OUSD(I) will find it difficult to gain full acceptance of
the performance management system if it retains the performance elements as they are currently
structured.
Performance Standards
Some employees believe that the general standards for summary rating levels are biased toward
work that directly affects the agency’s intelligence mission. Although this is not intended, the
descriptions of Successful and higher performance imply that only work directly impacting the
intelligence mission warrants higher ratings. For example, an Outstanding rating requires that an
employee’s overall contribution result in an “extraordinary effect or impact” on mission
objectives.
The rating descriptions, shown in Table 3-3, have caused employees in the Professional and
Administrative/Technician Work categories to question whether their work can ever be rated at
the highest levels since it does not directly impact the mission, especially when these employees
are in the same pay pools with those in mission-oriented work categories, such as intelligence
analytical and operational work.
Immediate corrective action is needed to improve standards for summary rating levels. As
currently written, it is not clear that these standards are equally applicable to all employees under
DCIPS. Further, supervisors and managers will need more training and guidance on applying the
standards to ensure that all employees are afforded the same opportunity to excel.
Rating Determination
As with the objectives, a score from 1 to 5 is assigned to assess employee performance on each
element. To determine the overall rating, the rating official averages the scores for the objectives
and the elements individually, and then averages the two. The final rating is rounded to the
nearest tenth of a point and converted to an evaluation of record using the general standards
described in Table 3-3.
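To make this arithmetic concrete, the following minimal Python sketch follows the description above; the function name and sample scores are illustrative, not taken from DCIPS materials.

def overall_rating(objective_scores, element_scores):
    """Illustrative sketch of the rating arithmetic described above.

    Each objective and element is scored from 1 to 5. The objective scores
    and element scores are averaged separately, those two averages are then
    averaged, and the result is rounded to the nearest tenth of a point.
    Conversion to an evaluation of record then follows the general standards
    described in Table 3-3.
    """
    avg_objectives = sum(objective_scores) / len(objective_scores)
    avg_elements = sum(element_scores) / len(element_scores)
    return round((avg_objectives + avg_elements) / 2, 1)

# Example: four objectives and the six standard elements
print(overall_rating([4, 3, 4, 5], [3, 4, 4, 3, 4, 5]))  # -> 3.9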
Finding 3-6
As currently designed, DCIPS has the potential to result in inequitable treatment for employees
who perform work that does not directly support the DoD intelligence mission.
The Panel finds that the DCIPS performance management system permits managers and
supervisors to make performance distinctions, but would benefit from further review and
improvement. In this regard, OUSD(I) has indicated that it intends to review the system in
cooperation with ODNI. The review will focus on the performance elements, with a view toward simplifying and reducing their number, ensuring their relevance and value, and verifying that the standards measure what they are intended to measure. These steps should help to strengthen the performance
management system so that it is effective in achieving desired outcomes in a fair and equitable
manner.
EQUITY
PRINCIPLE
The system identifies the balance among the three aspects of equity: internal,
external/market, and performance/contribution.
In its 2004 report,74 an Academy Panel noted that performance-based compensation systems are
designed to conform to “equity theory”—that is, employees perform best if they know their
compensation is commensurate with the work they perform and understand how others are
compensated. Employees expect equitable treatment, and their perceptions of equity affect job
satisfaction. The generally accepted elements of equity include internal, external/market, and
contribution equity.
In the federal government, internal equity traditionally has been achieved through the
classification process, which requires jobs with similar duties and responsibilities to be assigned
the same grade, resulting in “equal pay for equal work.” Under performance-based
compensation systems, however, internal equity is redefined so that individual performance has
greater impact on compensation, linking it more directly to accomplishments and organizational
contribution. External/market equity advocates paying employees at salary levels comparable to
those available in other organizations, both inside and outside the federal government.
Contribution equity holds that employees who contribute or perform at higher levels deserve
higher salaries.
As discussed below, DCIPS’ design includes features that balance the three aspects of equity in
the performance management, pay pool, and market alignment features of the system.
Employees are more likely to accept compensation decisions if they perceive that the procedures
used to make them are fair and affect everyone the same. Under DCIPS, internal equity is
achieved by linking eligibility for salary increases and bonuses to employees’ ratings of record,
which reflect their accomplishments for the rating period and their achievement of specific
objectives supporting the organization’s mission, goals, and objectives. A Successful
performance rating entitles the employee to receive at least the “floor”75 of the annual
74 2004 Academy Panel Design Principles Study, p. 15.
75 The minimum performance increase in base salary that an employee performing at the Successful level and eligible for a performance payout may receive. USD(I) establishes the amount annually. Under DCIPS, the amount initially equals the General Pay Increase that Congress authorizes annually for federal GS employees.
performance payout, while employees rated as Minimally Successful may be eligible for a
portion of the floor. Employees rated Unacceptable are not eligible for this floor or any other
performance-based increase or bonus. Table 3-4 shows the relationship between payout
eligibility and employee rating levels.
Overall Average Rating | Evaluation of Record | Performance Payout Eligibility
4.6 – 5.0 | Outstanding (5) | Eligible for performance-based salary increase, performance bonus, and full Local Market Supplement (LMS).76
3.6 – 4.5 | Excellent (4) | Eligible for performance-based salary increase, performance bonus, and full LMS.
2.6 – 3.5 | Successful (3) | Eligible for performance-based salary increase, performance bonus, and full LMS.
2.0 – 2.5 | Minimally Successful (2) | Eligible for a portion of the floor increase; ineligible for performance-based salary increase; ineligible for bonus.
<2.0, or a rating of 1.0 on any objective | Unacceptable | Ineligible for LMS, floor increase, performance-based salary increase, and bonus.
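The eligibility rules in the table can be expressed compactly. The sketch below is illustrative only; the function name and return strings paraphrase the table rather than quote official DCIPS terminology.

def payout_eligibility(overall, objective_scores):
    """Illustrative mapping of the rating ranges and eligibility rules above."""
    if overall < 2.0 or min(objective_scores) == 1:
        return ("Unacceptable",
                "Ineligible for LMS, floor increase, performance-based salary increase, and bonus")
    if overall <= 2.5:
        return ("Minimally Successful (2)",
                "Eligible for a portion of the floor increase only; no performance-based increase or bonus")
    if overall <= 3.5:
        level = "Successful (3)"
    elif overall <= 4.5:
        level = "Excellent (4)"
    else:
        level = "Outstanding (5)"
    return (level, "Eligible for performance-based salary increase, performance bonus, and full LMS")

print(payout_eligibility(3.9, [4, 3, 4, 5]))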
DCIPS also uses mathematical algorithms to determine salary increases and bonuses in support
of internal equity. For each employee covered under DCIPS, an algorithm determines an initial
recommendation for salary increases.77 As illustrated in Figure 3-3, the algorithm uses the same
inputs for each employee: performance rating, position in the pay band, and a predetermined
percentage of base pay. It is designed to ensure that each employee’s salary increase is
computed using the “mid-point principle” so that the rate of the increase declines as the ratio of
the employee's salary to the midpoint of the band increases. Thus, the rate of salary progression slows as employees move up through their pay bands, drawing more salaries toward the middle of the band. This is similar to the longer waiting periods GS employees experience as they enter the higher steps of the 15 GS/GG grades.
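The sketch below illustrates the general shape of a mid-point calculation. It is not the official DCIPS algorithm, which is defined in DoD Instruction 1400.25, Volume 2012; the parameter names (rating_share, target_pct, band_min, band_max) and values are assumptions introduced for illustration.

def recommended_increase(base_salary, band_min, band_max, rating_share, target_pct):
    """Illustrative sketch of the "mid-point principle" described above.

    rating_share -- weight derived from the performance rating (assumed, 0 to 1)
    target_pct   -- predetermined percentage of base pay available for increases
    """
    midpoint = (band_min + band_max) / 2.0
    compa_ratio = base_salary / midpoint            # position relative to the band midpoint
    rate = target_pct * rating_share / compa_ratio  # the rate declines as the compa-ratio rises
    increase = min(base_salary * rate, band_max - base_salary)  # never exceed the band maximum
    return rate, increase

# Two employees in the same band with the same rating: the one closer to the
# band maximum receives a smaller percentage increase.
print(recommended_increase(80_000, 70_000, 110_000, rating_share=0.8, target_pct=0.03))
print(recommended_increase(105_000, 70_000, 110_000, rating_share=0.8, target_pct=0.03))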
76 The Local Market Supplement is an addition to the compensation of employees assigned to a geographic region, or to an occupation within a geographic region or range of geographic regions. It reflects the competitive requirements for the applicable labor market. On initial implementation of DCIPS, this amount generally will correspond to GS locality rates, and it is considered part of basic compensation for retirement purposes.
77 Department of Defense Instruction 1400.25, V2012, DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Performance-Based Compensation, Sep. 15, 2009.
Figure 3-3. Source: OUSD(I) Human Capital Management Office.
Pay Pools
Although the salary and bonus increase algorithms are intended to achieve internal equity, the
pay pools’ structure and composition also have an impact on equity. DCIPS policy allows DoD
intelligence components to use their own discretion in structuring pay pools based on a range of considerations.
Providing this flexibility to the components has the potential to introduce variation in the size
and composition of pay pools, which can influence an employee’s performance-based salary
increase or bonus. For example, a pay pool with many high-salaried employees will be funded at
a higher dollar amount than one with a relatively lower salary mix, thus making more funds
available to the former. Similarly, pay pools with a greater percentage of employees with high
performance ratings can affect potential payouts because the higher ratings will dilute payouts
from the available funds.78 Wide variations in pay pool size are especially evident at larger DoD
agencies such as DIA, where it was reported that the smallest pay pool had 37 employees and the
largest had 2,205 employees.31
OUSD(I) officials acknowledge that the policy on structuring pay pools provides too much
discretion to components and can result in inconsistent treatment of employees in the same pay
band who perform at the same level. Absent more controls and guidance in the design, perceptions of unfairness may become more prevalent. Additional policies and clarifying guidance are needed to ensure greater equity across pay pools in these decisions.
Allowing components to include different occupational groups in the same pay pool also raises
issues impacting equity. As noted previously, employees in administrative and support
occupations have less direct impact on the mission and may be viewed as less worthy of rewards
for their performance than other employees in mission-critical occupations. This situation could
result in disparate impact on employees due to the nature of their work, rather than the quality of
their performance.
External/Market Equity
External/market equity is necessary to ensure that employee salaries are competitive with those
outside the agency. Currently, DCIPS is designed to achieve external equity through use of the
Local Market Supplement (LMS), which initially will tie to GS/GG locality pay areas and
associated locality rates. However, DCIPS’ goal is to develop a market pricing methodology that
78 ODNI modeling generally showed that payout results were much more consistent for pay pools of 100 or more employees.
31 DIA Briefing at DCIPS Interim Conference, Southbridge, Massachusetts, Feb. 2010.
replaces the current government-wide locality pay methodology. Achieving full market
comparability is necessary if DoD intelligence components are to succeed in attracting and
retaining top talent and becoming “the employer of choice.”
The Panel is encouraged that OUSD(I) has begun to develop an approach for conducting surveys
to assess salary comparability with appropriate markets. This will be helpful to gain further
support for DCIPS. Additionally, OUSD(I) reported an ongoing review of compensation in the
continental United States, Hawaii, Alaska, and Pacific Islands to assess pay comparability.
These steps will help ensure that all three aspects of equity are fully integrated into the design.
Finding 3-7
DCIPS successfully balances internal, external/market, and contribution equity, but internal
equity could be enhanced by a more structured approach to pay pool composition to ensure that
employees with similar duties, responsibilities, and performance ratings are treated equitably.
CHECKS AND BALANCES
PRINCIPLE
The system is designed to include a set of checks and balances to ensure fairness.
DCIPS’ performance evaluation and pay pool processes include a system of checks and balances
designed to ensure fairness. These aspects of DCIPS’ design should help mitigate employees’
concerns about the potential impact of DCIPS on career and salary progression.
Two officials have key roles in ensuring fairness and equity in the performance management
process: the reviewing official and the performance management performance review authority
(PM PRA). The former reviews ratings prepared by subordinate rating officials for consistency
and compliance with policies and guidelines. If the reviewing official does not agree with the
narrative or numerical ratings, he or she is required to discuss and resolve the issue with the
rating official. If this dialogue does not end successfully, the reviewing official has the authority
to change the rating to ensure that standards and guidance are applied consistently. The PM
PRA, an official senior to the reviewing official, reviews all evaluations of record to ensure
consistency as well as legal and regulatory compliance. In the pay pool process, DoD
intelligence component heads affected by DCIPS serve as the Pay Pool Performance Review
Authority (PP PRA). They have final approval authority for pay pool recommendations and can
return payout results to the pay pool manager for remediation if they believe a situation
demands it.
Although DCIPS policies provide a mechanism to review ratings for consistency and compliance
with policies and guidelines, no official policy requires an examination of ratings across the DoD
intelligence components to identify disparate treatment. Draft DCIPS evaluation policy includes
a requirement to examine pay equity across pay pools and protected groups.80 In addition, ODNI
officials indicate that they will review DCIPS performance management and payout results for
adverse impact on protected groups and share the results with the IC Office for Equal
Opportunity and Diversity for validation. OUSD(I) has begun the process to analyze payout
79 Michael M. Harris, Brad Gilbreath, and James A. Sunday, "A Longitudinal Examination of a Merit Pay System: Relationships Among Performance Ratings, Merit Increases, and Total Pay Increases," Journal of Applied Psychology, 83 (1998), pp. 825-831.
80 The term "protected groups" is used here as defined in equal opportunity laws, including the Equal Pay Act of 1963, as amended; Title VII of the Civil Rights Act of 1964, as amended by the Equal Employment Opportunity Act of 1972 and the Pregnancy Discrimination Act of 1978; the Rehabilitation Act of 1973, as amended; the Age Discrimination in Employment Act of 1967, as amended; and the Civil Rights Act of 1991.
results, but the final DCIPS evaluation policy should include a formal mechanism to examine the
impact on employees in protected groups, consistent with MSPB guidance.81 For example, a
formal review panel could be formed to review demographic data on gender and RNO and
identify disparate treatment among certain groups. These panels may question ratings—not
overrule them—and rating patterns showing a higher average for one group than for others.
MSPB points out that the credibility of a performance-based compensation system may be
enhanced by establishing an appeals process, providing employees a way to challenge ratings or
pay decisions that they believe are unfair.82 Employees under DCIPS may seek reconsideration
of their ratings by the PM PRA. If dissatisfied with that outcome, they may request further
reconsideration by the DoD component head. No mechanism exists in DCIPS for them to
challenge individual payout decisions, but they may raise specific concerns regarding the pay
pool process under their agency grievance procedures.
In light of employees’ concerns about equity and fairness in the pay pool processes, it is
advisable to provide employees with additional avenues to challenge their ratings and pay
decisions.
There is no formal process for challenging a pay band decision under DCIPS. To ensure that
DCIPS employees have the same rights as others, a process should be established to permit
employees to challenge the decision to assign their position to a specific pay band.
Finding 3-8
DCIPS includes a set of checks and balances in the performance management and payout
processes to ensure fair treatment of all employees, but it lacks a mechanism to challenge pay
band decisions and a strong mechanism to hold managers and supervisors accountable for their
roles in ensuring fairness and equity.
81 2006 MSPB Design Report, p. 34.
82 2006 MSPB Design Report, p. 34.
DCIPS FUNDING
PRINCIPLE
Adequate funding is necessary to ensure success of a performance-based pay system.
OUSD(I) policy for funding pay pools conforms to IC-wide policy guidance for pay
modernization, which requires that newly-implemented performance-based compensation
systems remain budget neutral. Under current policy, separate budget recommendations are
established annually to fund salary and bonus pools.84 In accordance with USD(I) funding guidance, DoD intelligence components allocate money to salary increase pools by selecting a funding percentage and multiplying it by the sum of the base salaries of those employees eligible for payouts. The policy requires that salary increase budgets be no less than the total funds that
would have been available for step increases, quality step increases, and within-band promotions
had there been no conversion to DCIPS. Similarly, the bonus budgets cannot be less than the
cash awards available had DoD not converted. Pools can only be increased under special
circumstances; one common reason is outstanding organizational performance or contribution to
the component’s mission. Pay pools also may reserve a portion of their budget for unanticipated
requirements, exceptional performance, market anomalies, or other circumstances.
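A minimal sketch of this budget-neutral funding rule follows. The function and parameter names (funding_pct, legacy_increase_costs) are assumptions introduced for illustration, not official DCIPS terms.

def salary_increase_pool(eligible_base_salaries, funding_pct, legacy_increase_costs):
    """Illustrative sketch of the pool funding rule described above.

    The pool equals the chosen funding percentage applied to the sum of
    eligible base salaries, subject to a floor equal to what step increases,
    quality step increases, and within-band promotions would have cost had
    there been no conversion to DCIPS.
    """
    pool = funding_pct * sum(eligible_base_salaries)
    return max(pool, legacy_increase_costs)

# Example: 2.4 percent of a $25 million eligible payroll, with a $500,000 legacy-cost floor
print(salary_increase_pool([100_000] * 250, 0.024, 500_000))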
Although DCIPS funding conforms to IC pay modernization policy, current pay pool funding
will not prove adequate over the long term to sustain meaningful payouts for all deserving
employees. Consequently, the system likely will limit rewards for satisfactory (Successful level
rating) employees to ensure more substantial payouts for top performers. Experts who attended
the Academy’s colloquia characterize this as an unintended “win-lose” situation for most
employees. MSPB’s guidance affirms that funding performance-based compensation systems
based on money from existing sources typically results in some employees obtaining more than
they otherwise would have and others less. MSPB notes that this discrepancy seems most
problematic for the “good, solid employees” who may no longer receive regular, albeit modest,
increases to recognize their contributions.85 The alternative is to reduce awards for high
performers to spread available funds more broadly; this is not desirable either, as the premise of
performance-based compensation is that top performers should receive greater salary increases
and bonuses.
83 2006 MSPB Design Report, pp. 20-21.
84 Department of Defense Instruction Number 1400.25-V2012, DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Performance-Based Compensation, January 15, 2010.
85 2006 MSPB Design Report, pp. 20-21.
The Panel believes that adequate funding for pay pools should be analyzed further. Other
methods are available to instill confidence in DoD intelligence components and employees that
funds will be available to reward solid performance. For example, OUSD(I) could consider
tapping other sources to create a separate performance management fund from which to provide
meaningful increases. Also, MSPB guidance suggests that it is possible for agencies to pursue
other funding options, such as a working capital fund or a supplemental appropriation to support
payouts for deserving employees.86
Finding 3-9
OUSD(I)’s approach to funding salary increase and bonus pools in a budget-neutral manner
will result in redistributing available funds, but may not provide adequate funding to reward
performance achievements of all deserving employees.
PRINCIPLE
The performance-based compensation system must be sufficiently flexible and responsive
to changing labor market conditions to meet the agency’s HR needs for years to come.
It is difficult to anticipate the changes that can occur in an agency over time. Nonetheless, a new
performance-based compensation system should include a plan to adjust the system to reflect
changes in the organization and the way that the workforce is managed within it. Under DCIPS,
the LMS will provide flexibility to respond to changing local market conditions. However,
OUSD(I)’s approach does not appear to link to its Strategic Human Capital Plan and does not
provide for adjustments that account for changes in the broader labor market. A prior Academy
Panel emphasized the need for a process that enables market alignment for specific occupational
groups, as necessary.87
Finding 3-10
DCIPS does not currently include an approach for responding to the changes in the broader
labor market when such changes impact compensation equity for DCIPS employees.
86 Ibid., p. 21.
87 2004 Academy Panel Design Principles Study, p. 34.
ONGOING SYSTEM EVALUATION
PRINCIPLE
A performance-based compensation system should be evaluated regularly and
modified when necessary.
DCIPS’ overarching policy includes a provision for ongoing evaluation of the system against its
broad policy goals.89 Although the final policy has not been released, the plan is to conduct
ongoing review and modification of both DCIPS design and implementation.
To ensure credibility of the evaluation plan, it is important to establish early the metrics to be
used to assess the achievement of DCIPS’ goals and the system’s impact on the DoD intelligence
components’ missions. Additionally, it is necessary to widely communicate the evaluation
results and changes made in response to employees’ concerns.
Finding 3-11
DCIPS’ design includes a process for ongoing evaluation and modification of the system, but an
official evaluation policy is not yet in place and, thus, there is no formal requirement for analysis
of performance management and pay pool results to determine impact on women, minorities,
and other protected groups.
88 2006 MSPB Design Report, p. 33.
89 DoD Civilian Personnel Management System: Volume 2001, Defense Civilian Intelligence Personnel System (DCIPS) Introduction, Dec. 29, 2008.
HOW DCIPS COMPARES TO THE GS/GG SYSTEM
The GS/GG system is the federal government’s primary classification and pay system for white-
collar employees. Employee pay is largely determined in accordance with government-wide
rules consistent with the GS classification system that places positions in one of 15 grades based
on duties, responsibilities, and qualifications requirements. For more than 20 years, federal
agencies have grown increasingly frustrated by the “one-size-fits-all” rules and regulations of the
GS/GG system; they have either sought relief through legislation or subtly adopted practices that
are inconsistent with the laws and regulations. In recent years, more agencies have opted out of
the system through individual legislation allowing them to create their own systems, almost all of
which have included some form of broad-banding and performance-based compensation.
Notwithstanding differences between DCIPS and the GS/GG system, a major strength of DCIPS’
design is that it continues the employee protections afforded to all federal civil servants under the
GS/GG system, as required by the Merit Systems Principles90 and Prohibited Personnel
Practices.91 One Merit Systems Principle requires that all employees and applicants for
employment receive fair and equitable treatment in all aspects of HR management without
regard to political affiliation, race, color, religion, national origin, sex, marital status, age, or
handicap condition, and with proper regard for their privacy and constitutional rights. (See
Appendix C for a complete list of the principles and prohibited personnel practices.)
DCIPS also includes the expectation that all HR decisions will be made in a manner that is
efficient, effective, fair, and free from political interference. Additionally, DCIPS does not alter
policies governing retirement benefits and eligibility, health and life insurance, leave, attendance,
and other similar benefits. Beyond these core protections, DCIPS differs from the GS/GG system
in several significant ways. It creates broad pay bands in lieu of the 15 grades, introduces
performance-based compensation in place of longevity-based salary increases, and requires a
stronger, more rigorous performance management system. Table 3-5 summarizes how DCIPS’
major features compare with the GS/GG system.
90 5 U.S.C. 2301.
91 5 U.S.C. 2302(b).
HOW DCIPS COMPARES TO NSPS
As discussed in Chapter 2, NSPS was developed to replace the GS/GG system for DoD’s non-
intelligence workforce. Like DCIPS, it reflected Executive Branch concerns that the GS system
was no longer adequate to recruit, hire, and compensate the workforce needed to support DoD’s
national security mission. NSPS encountered legal challenges from employees and unions
alleging that its provisions were applied inconsistently, resulting in disparate pay outcomes for
affected employees. The FY 2010 NDAA repealed NSPS’ statutory authority and directed the
Secretary of Defense to terminate it and transition all covered employees out of it no later than
January 1, 2012.92
DCIPS and NSPS share several design characteristics.93 Both were designed to foster a strong
performance culture by creating an HR system that more directly links employee pay to
performance and contribution to the DoD mission.94 Both employed pay bands that replace the
15 GS or GG grades, with salary progression within the bands based on annual performance
assessments. They also increased communication between employees and their supervisors.
Finally, they both use pay pools funded from available resources to provide for performance-
based compensation.
NSPS policies required the integration, rather than separation, of performance management and
pay pool processes, a key area where it and DCIPS diverge. Other key differences are the areas
that have the greatest impact on employees’ compensation and their perceptions of system
fairness. Given that DCIPS’ performance evaluation and pay pool processes are separate, for
example, there is no commingling of salary and bonus pool funds. Unlike NSPS, DCIPS policy
requires that employee ratings be prepared and approved prior to the pay pool process. Further,
DCIPS does not permit pay pool officials to change ratings in the process of deciding salary or
bonus payouts. In contrast, NSPS pay pool panels had authority to change performance
management ratings during their deliberations to determine performance-based payouts and
require the supervisor to accept them, even if the supervisor disagreed.95 The Defense Business
Board Report noted this as a major area fueling employee mistrust of the system and its
processes.
OUSD(I) officials indicated that they were attentive to DoD’s challenges with NSPS and applied
those lessons learned to DCIPS’ design features. Although some online dialogue and open
forum participants expressed concern about the fairness of ratings and pay pool processes, these
concerns do not appear to be a function of DCIPS’ design, but rather a result of how supervisors
and managers are implementing the system’s provisions. Table 3-6 provides a more detailed
comparison between DCIPS and NSPS.
92 Pub. L. 111-84, Sec. 1113.
93 NSPS changed the classification, compensation, recruitment, and staffing of DoD positions, but this comparison is limited to aspects of NSPS that can be compared to DCIPS’ existing features, as officially documented in approved policies.
94 DoD 1400.25-M, SC 1940, Subchapter 1940, Performance Management, dated Dec. 1, 2008.
95 As reported in the Defense Business Board Report to the Secretary of Defense, “Review of the National Security Personnel System,” July 2009, which references the NSPS 2008 Evaluation Report, pp. 5-10.
Table 3-6. Comparison of DCIPS and NSPS Design

Merit Systems Principles
• DCIPS: Merit Systems Principles and other employee protections are retained and supported by governing policies.
• NSPS: Merit Systems Principles and other employee protections were retained and supported by governing policies.96

Mission Alignment
• DCIPS: Alignment with intelligence and organizational mission documented in policy.
• NSPS: Alignment with national security mission documented in policy.

Occupational Structure
• DCIPS: Three work categories and several occupational groups.
• NSPS: Four career groups; job titles aligned with these groups.97

Pay Structure
• DCIPS: One common pay band structure that uses five pay bands arrayed across three different work categories.
• NSPS: Multiple pay bands within four career groups and several pay schedules.98

Performance Management

Rating Cycle
• DCIPS: Fiscal Year cycle.
• NSPS: Same.

Performance Objectives
• DCIPS: Each employee generally rated on three to six performance objectives linked to the agency mission; one objective is the minimum required.
• NSPS: Each employee rated on three to five weighted job objectives.99

Performance Elements
• DCIPS: Six performance elements used and considered separately from the objectives: (1) Accountability for Results; (2) Communication; (3) Critical Thinking; (4) Engagement and Collaboration; (5) Personal Leadership and Integrity/Leadership; (6) Technical Expertise/Managerial Proficiency. Elements evaluated separately from job objectives.
• NSPS: Seven contributing factors100 used to assess the manner of performance important for the accomplishment of each objective: (1) Communication; (2) Cooperation and Teamwork; (3) Critical Thinking; (4) Customer Focus; (5) Leadership; (6) Resource Management; (7) Technical Proficiency. Contributing factors used to adjust the ratings of job objectives.

Rating Decisions
• DCIPS: Rating decisions determined by the rating official and approved by the reviewing official prior to the pay pool process.
• NSPS: Ratings determined by the Pay Pool Panel and approved by the Pay Pool Manager during pay pool deliberations.101

Reconsideration of Ratings
• DCIPS: Employees can request reconsideration of the rating by submitting a written request to the PM PRA within 10 days.
• NSPS: Within ten days of receiving a rating, an employee could request reconsideration of the rating by submitting a written request for reconsideration to the Pay Pool Manager.103 A bargaining unit employee could challenge a rating of record through a negotiated grievance procedure.104

Recognition for Organizational and/or Team Achievement
• DCIPS: Although DCIPS does not preclude recognition for team/organizational performance, there is currently no formal process in place to recognize and reward team or organizational achievement.
• NSPS: Pay Pool Manager had authority to approve specific recognition for Organizational and/or Team Achievement (OAR).105

Pay Pool Process

Structure of Pools
• DCIPS: Separate pools for salary increases and bonuses.
• NSPS: Combined salary increase and bonus pools.

Funding of Pools
• DCIPS: Salary increase budgets will not be less than that which would have been available for step increases, quality step increases, and within-band promotions had DoD not converted to DCIPS. Bonus budgets generally will not be less than the funds that would have been available for cash awards and/or component bonuses had DoD not converted to DCIPS.
• NSPS: Funding of pools through three different sources of existing funds:106 (1) funds spent on step increases, quality step increases, and promotions between GS grades that no longer exist under NSPS; (2) funds that remain available from the government-wide general pay increase (GPI) after the Secretary makes decisions to fund Rate Range Adjustments and/or Local Market Supplements; and (3) funds historically spent for performance-based cash awards. Additional funds could be added to the pools at the discretion of the component organization.

Eligibility for Payout
• DCIPS: Employees rated at Level 3 and above guaranteed the DCIPS “floor,” i.e., the full GPI. Employees rated at Level 2 are initially guaranteed 60 percent of the DCIPS “floor.”
• NSPS: Employees rated Level 2 and above guaranteed 60 percent of the GPI.107

Checks and Balances to Ensure Fairness

Review of Performance Management and Payout Decisions
• DCIPS: Separate oversight and review of performance management and payout decisions. For performance management, the reviewing official and Performance Management Performance Review Authority have oversight roles. For payout decisions, the Pay Pool Performance Review Authority reviews and approves final pay pool decisions.
• NSPS: No separation of oversight and review authority for performance management and pay pool processes.108 The Performance Review Authority, Pool Managers, and Pay Pool Panels provide review and oversight of both the performance management and pay pool processes.109

Process for Reviewing Ratings and Payouts to Assess Impact on Protected Groups
• DCIPS: Included as an integral component of the draft DCIPS Evaluation Policy.
• NSPS: Post-decisional analysis of rating results to identify barriers to equitable treatment and corrective actions.

Ongoing System Evaluation
• DCIPS: A formal evaluation policy is under development, but not yet published.
• NSPS: A formal Evaluation Plan was published on June 30, 2007.

96 DoD 1400.25-M, SC1940, Subchapter 1940: Performance Management, p. 2.
97 NSPS career groups included the Standard Career Group, Medical Career Group, Scientific and Engineering Career Group, and Investigative and Protective Services Career Group.
98 The four pay schedules are Professional/Analytical, Technician/Support, Supervisor/Manager, and Student.
99 DoD 1400.25-M, SC1940, Subchapter 1940: Performance Management, pp. 7-8.
100 Ibid, p. 8.
101 Ibid, p. 3.
102 Ibid, p. 16.
103 Ibid, p. 21.
104 Ibid, p. 23.
105 Ibid, p. 8.
106 DoD 1400.25-M, SC1930, Subchapter 1930: Compensation Architecture Pay Policy, SC1930.9.2, p. 6.
107 NSPS Frequently Asked Questions at www.cpms.osd.mil/nsps.
108 DoD 1400.25-M, SC1940, Subchapter 1940: Performance Management, p. 3.
109 Ibid, pp. 3-4.
The Panel believes that OUSD(I) has heeded the challenges and lessons learned from the NSPS
experience. Consequently, DCIPS’ performance and pay pool management policies are more
transparent and provide for more equitable treatment of all employees.
CONCLUSION
As previously noted, the Panel has concluded that DCIPS’ design is fundamentally sound and
adheres to accepted design principles for performance-based compensation systems. DCIPS
fully retains the protections afforded employees in the federal civil service and includes checks
and balances to ensure fairness and equity in performance management and pay pool decisions.
It also incorporates design features derived from lessons learned from best practices and
challenges faced by the recently-terminated NSPS.
DCIPS’ design includes several other strengths: the simplicity and clarity of its occupational
structure, a single pay banding system, its rigorous performance management system, separate
performance management and pay pool processes, and its planned process for ongoing system
evaluation. Although the Panel has identified a number of areas in this chapter where
improvements can be made, the Panel does not consider these to be fatal design flaws, but,
rather, opportunities to further tailor, strengthen, and refine a system that is already
fundamentally sound.
The Panel offers several recommendations below to further strengthen DCIPS’ design.
RECOMMENDATIONS
Recommendation 2. OUSD(I) should review and assess models for measuring and
rewarding team and organizational performance under DCIPS to ensure alignment with
the IC’s broad goals.
• Review its policies regarding pay pool composition to ensure equitable treatment of
similarly situated employees. This review should examine the policy for
determining the size of pay pools and practice of assigning employees of different
work categories to the same pay pool.
Recommendation 4. To ensure equitable treatment of all employees, OUSD(I) should
review the performance management system to:
• Clarify and strengthen its guidance for developing performance objectives to ensure
that managers and supervisors fully understand ways to develop appropriate
objectives for all employees, including those in non-mission work categories.
• Refine and modify the impact of the performance elements to ensure that they
permit meaningful and appropriate assessments of factors affecting overall
performance.
• Adjust the performance standards for summary rating levels so that they permit the
same performance assessments for all categories of work.
Recommendation 6. OUSD(I) should finalize its evaluation policy and ensure that it defines
a process for monitoring DCIPS’ impact on salary increases, bonuses, and career
progression of women, minorities, and other protected groups.
CHAPTER 4
INTRODUCTION
Unlike most performance management initiatives, the implementation of DCIPS has not been
driven by a specific “performance problem.” Rather, the goal is more structural and process
related. An underlying assumption is that disparate personnel systems pose a potential risk to the
accomplishment of the overall DoD intelligence mission. However, advocates do not go so far
as to draw a link between DCIPS and the production of better intelligence. DCIPS’ goal is to
achieve greater unity and uniformity in personnel management. As yet, there is no plan for
measuring and assessing DCIPS’ ultimate success. This poses one challenge to implementation:
The end point lacks clear definition.
The number of DoD intelligence components engaged in this sweeping change presents an
additional challenge. Introducing DCIPS in one organization is difficult. Conforming to the
DCIPS conversion schedule—which entails managing shifts in values and behaviors
simultaneously within multiple organizations with very different cultures and characteristics—is
daunting. For example, the civilian intelligence workforce within the military services—unlike
DIA, DSS, NGA, NRO, and NSA—faces unique tests:
• The affected workforce is smaller—slightly fewer than 2,800 in the Navy and
approximately 200 in the Marine Corps.
• There is predictably high turnover of the uniformed supervisors, requiring retraining of
new supervisors every two to three years.
• Supervisors of DCIPS employees often must be conversant with and able to apply
multiple personnel systems to the members of their varied workforces.
• The workforce is very geographically dispersed.
The NDAA pause is yet another challenge to implementation. DCIPS Interim, put into place
following the suspension of DCIPS pay authorities, has been a source of confusion, frustration,
and discouragement for many in the workforce, including HR implementers. For example, many
negative sentiments expressed by DoD intelligence component personnel appear to be strongly
influenced by the effects of the NDAA pause, and have little to do with DCIPS overall. Thus,
implementation has been attempted in an environment beset by internal and external challenges.
This chapter focuses on how DCIPS has been implemented and addresses the provision of the
NDAA requiring an assessment of the adequacy of the training, policy guidelines, and other
preparations afforded in connection with transitioning to that system. The OPM Alternative
Personnel Systems Objectives-Based Assessment Framework was used as the basis to assess
DCIPS’ implementation.110 Using this framework, and the elements considered essential to
effective implementation of alternative pay systems, the Panel provides a series of findings,
conclusions, and recommendations.
EVALUATION FRAMEWORK
OPM’s Alternative Personnel Systems Objectives-Based Assessment Framework has been used
to guide the Panel’s assessment of DCIPS implementation.111 The dimensions and elements that
comprise the framework (see Table 4-1) are based on lessons learned from federal government
demonstration projects involving alternative personnel systems, as well as best practices drawn
from large human capital transformation programs.
The standards used to assess DCIPS’ design in Chapter 3 focused on the presence or absence of
necessary policies and provisions. By contrast, the framework is based on indicators of how well
the DoD intelligence components prepared for implementation and are meeting the objectives.
110 OPM, Alternative Personnel Systems Objectives-Based Assessment Framework, Oct. 2008. Hereafter “the framework.”
111 Ibid.
Table 4-1. Overview of the OPM Objectives-Based Assessment Framework
Framework Component: Preparedness
• Leadership Commitment: Engagement; Accountability; Resources; Governance
• Open Communication: Information Access; Outreach; Feedback
• Training: Planning; Delivery
• Stakeholder Involvement: Inclusion
• Implementation Planning: Work Stream Planning and Coordination; HR Business Processes and Procedures; Tools and Technology Infrastructure; Structured Approach

Framework Component: Progress
• Mission Alignment: Line of Sight; Accountability
• Results-Oriented Performance Culture: Differentiating Performance; Pay for Performance; Cost Management
• Workforce Quality: Recruitment; Flexibility; Retention; Satisfaction and Commitment
• Equitable Treatment: Fairness; Transparency; Trust
• Implementation Plan Execution: Work Stream Planning and Status; Performance Management System Execution; Employee Support for APS (i.e., DCIPS)
It is not possible to evaluate DCIPS against every dimension or element in the framework at this
time given DCIPS’ relative immaturity, the varying stages of implementation across the
intelligence components, and the effects of the NDAA pause.
PREPAREDNESS
Preparedness is the extent to which OUSD(I) laid the groundwork for DCIPS’ success by
preparing employees for the change and establishing the supporting infrastructure. “Agencies
that do not place sufficient emphasis on Preparedness are likely to encounter significant
implementation problems, thereby reducing the ultimate effectiveness” of the system.112 The
dimensions of Preparedness are:
• leadership commitment
• open communication
• training
• stakeholder involvement
• implementation planning
LEADERSHIP COMMITMENT
▪ Engagement ▪ Accountability ▪ Resources ▪ Governance
Leadership commitment, a key dimension of all successful change efforts, involves engagement,
accountability, resources, and governance. A dimension of the Preparedness component in the
framework, it is considered a best practice by those who study alternative personnel systems in
the federal government.113 Agency leaders must be visibly and actively engaged in planning the
change, championing the system, and communicating to employees that the change is a mission
imperative, not simply an HR program. Following implementation, they have an ongoing
responsibility to reinforce their commitment and ensure the system’s continued success.
Commitment provides an emotional aspect that can be elusive to measure. The framework
focuses on specific behaviors that demonstrate leadership commitment, but does not address the
underlying strength of leadership conviction that supports those behaviors. According to the
framework, leadership commitment is measured by the extent to which leaders communicate
with the workforce about the system, prioritize system implementation, provide appropriate
resources, and are held accountable for system execution.
In the DCIPS context, “leaders” refer to the USD(I) and DoD intelligence component heads.
DCIPS could not have progressed as far as it has without the strong level of commitment from
the USD(I), but component heads have been less supportive. Preliminary results of a recent
OUSD(I) DCIPS survey of intelligence component employees show that leadership commitment
across the components has been uneven; agreement that senior organizational leaders are
committed to DCIPS ranges from 24 percent at the NRO to 76 percent at OUSD(I).114
Engagement
Engagement is the extent to which leaders conduct outreach to the workforce to champion the
system, provide information, and gain employee acceptance. The outreach should be strategic,
rather than tactical, in focus. The purpose is to demonstrate leadership support, emphasize
accountability for making it happen, and foster employee acceptance of DCIPS. USD(I) engagement to date has included the following steps:
• The USD(I) held a formal DCIPS kick-off event in December 2007. Attendees included
DoD intelligence component directors and Defense Intelligence Human Resources Board
(DIHRB) members;6
• The USD(I) offered periodic messages to the workforce, and OUSD(I) provided
messages for the components to adapt and use; and
• The USD(I) is prominently featured on the DCIPS website and issues periodic updates to
component directors.
Notwithstanding these steps, OUSD(I) has not monitored the frequency, content, delivery
mechanisms, or quality of messaging at the component level. In addition, there has been a lack
of constancy and consistency in those messages. Further, only four USD(I) messages to the
workforce have been posted on the DCIPS website since 2007; these messages focused on
specific advantages of DCIPS, such as human capital flexibilities, consistency in occupational
structure, and the link of individual performance to agency mission. Noticeably absent from the
communications is a strategic focus—conveying a sense of urgency, offering a convincing
argument for how DCIPS contributes to mission accomplishment, or describing what will
constitute success.
Academy focus group participants reported that some agency leaders voiced support for DCIPS
frequently and through multiple channels within their organizations. Yet they acknowledged that
the link between DCIPS and agency mission has not been well communicated to the workforce.
Others said their leadership was less supportive and communicated this clearly through their lack
of engagement.
Overall, USD(I) engagement during DCIPS implementation has been insufficiently frequent and
not fully effective in gaining widespread support. Communication from all senior leadership
levels has focused too much on tactical or management issues and has lacked key strategic
points. As a result, acceptance and commitment at multiple levels, including among members of
the senior leadership, have been lacking.
114 2010 DCIPS Survey Preliminary Results, Question 33.
6 The DIHRB is discussed in more detail in the Governance section of this chapter.
Finding 4-1
DoD intelligence component leadership engagement has been inconsistent and messages that
link DCIPS to mission have been lacking.
Accountability
Accountability refers to the extent to which agency leaders identify system implementation as an
agency priority, are involved in the system’s design and implementation, and are held
accountable for implementation. Accountability is a key success factor in change management;
sufficient measures and mechanisms of accountability must be communicated and employed in
any effort to institute meaningful change.
The Defense Intelligence Enterprise Human Capital Strategic Plan, 2010-2015 lists DCIPS
implementation as a DoD priority that will support the goal of “an integrated, interoperable,
diverse, and mission-aligned defense intelligence enterprise workforce.”115 DCIPS is described
as a priority in DoD documents, but accountability for effective implementation has not been
enforced among senior intelligence component officials. No metrics or performance objectives
for senior managers align with DCIPS implementation, and there is no evidence that OUSD(I) is
holding senior agency leaders accountable. OUSD(I) has verified the lack of formal mechanisms
and accountability metrics for tracking implementation activities within and across components.
This poses a challenge to ensuring consistent implementation.
Finding 4-2
Formal mechanisms, such as metrics or performance objectives, are lacking to hold agency
leaders and senior managers accountable for DCIPS implementation. This has contributed to
inconsistent implementation across the components.
Resources
Successful implementation requires that agency leaders create the appropriate organizational
structure, with adequate resources and authorities to implement the program. The authority,
staffing, and funding of the program management function within OUSD(I) provides one
indication of the progress to date.
OUSD(I) did not establish a program management office (PMO) with dedicated personnel,
authority, and responsibility for DCIPS design and implementation. The intent was that each
intelligence component would do so. Although the combat support agencies (DIA, DSS, NGA,
NSA, and NRO) established DCIPS PMOs, the military services did not.
115 Defense Intelligence Enterprise Human Capital Strategic Plan, 2010-2015, Office of the Under Secretary of Defense for Intelligence, Human Capital Management Office, p. 3.
The OUSD(I) budget includes a DCIPS line item; approximately three positions, with contract
support, have responsibility for DCIPS design and implementation at the OUSD(I) level.
Without a centralized PMO to direct the effort, however, oversight of component implementation
has been inadequate. OUSD(I) officials concurred that oversight has been lacking and that they
relied on each component’s self-assessment of readiness to implement. As a result, they have
not been able to verify whether adequate training has taken place or whether the components
fully understand the required change management.
In addition, some intelligence components made changes to DCIPS without OUSD(I)’s prior
approval or knowledge. As examples, DIA made significant changes to the performance
evaluation tool and NSA decided to rename DCIPS as “ACE.” Component-specific
modification undermines the goal of creating a unified personnel system across the agencies.
Finally, OUSD(I) staff responsible for implementing DCIPS have extensive backgrounds in HR,
but they have little change management experience. Similarly, the military services, which did
not establish PMOs, experienced challenges with adequate staffing, resources, and authority for
implementation.
Finding 4-3
The lack of an OUSD(I) DCIPS PMO has resulted in OUSD(I)’s inability to provide adequate
oversight of DoD intelligence component readiness and implementation.
Governance
Governance entails establishing processes to resolve conflicts and make decisions. OUSD(I) has
them in place at two levels: the Defense Intelligence Human Resources Board (DIHRB) and
DCIPS Working Group.
Established in 2006, the DIHRB is responsible for addressing and providing recommendations to
the USD(I) on human capital issues, including DCIPS. It is composed of a Defense Intelligence
Senior Executive Service or equivalent official from each intelligence component, the DoD
Office of General Counsel, and the Director of Administration and Management. The DIHRB is co-
chaired by designees of the USD(I) and the Under Secretary of Defense (Personnel & Readiness)
(USD(P&R)).116 When the DIHRB is unable to reach consensus, the USD(I) decides the matter.
The DCIPS Working Group, composed of OUSD(I) HCMO staff and HR representatives from
the intelligence components, is responsible for developing and updating personnel policies,
reviewing and commenting on the design of tools to support DCIPS, serving as a liaison between
OUSD(I) and the components, and providing recommendations to the DIHRB, the USD(I), and
the USD(P&R) on personnel business practices related to DCIPS implementation.117
116 DoD Intelligence Human Capital Management Operations, Department of Defense Instruction 3115.11, Jan. 22, 2009, pp. 5-7.
OUSD(I) has a clear decision-making process for implementation, but decisions have not always
been communicated or explained clearly to the workforce; sometimes, they have appeared
arbitrary. One example is the split of GS-13 level personnel into pay bands 3 and 4 upon
conversion to DCIPS. The DIHRB could not reach consensus on this issue, so the USD(I)
decided to place GS-13 steps 1 and 2 into band 3, and those in steps 3 and above into band 4.
The effect was to place into two new and separate pay ranges employees who sat side by side,
did the same work in similar positions, and previously were in the same salary range. Many
employees who view the decision-making process as confusing and seemingly arbitrary—and
the outcome as unfair—cited this example.
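To make the effect of this conversion rule concrete, the following minimal sketch (in Python) expresses the step-to-band mapping exactly as described above. The function name and structure are illustrative only; they are not drawn from any DCIPS policy document or conversion tool.

    def dcips_band_for_gs13(step: int) -> int:
        # Illustrative only: returns the DCIPS pay band assigned at conversion
        # for a GS-13 employee, per the USD(I) decision described above.
        if not 1 <= step <= 10:
            raise ValueError("GS steps run from 1 to 10")
        return 3 if step <= 2 else 4

    # Two colleagues doing the same work could land in different bands:
    # dcips_band_for_gs13(2) returns 3, while dcips_band_for_gs13(3) returns 4.

The sketch helps illustrate why employees viewed the outcome as arbitrary: a one-step difference in within-grade placement, rather than any difference in duties, determined the band assignment.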
Finding 4-4
Decisions have not been adequately explained to employees, leading them to distrust the
decision-making process and to view the resulting decisions as unfair.
OPEN COMMUNICATION
▪ Information Access ▪ Outreach ▪ Feedback
Open communication entails providing the workforce with access to accurate and timely
information. It also requires establishing an outreach mechanism for gathering and considering
employee feedback.
Information Access
The framework’s Information Access element refers specifically to having a website to support
broad information sharing concerning design, training, and other implementation issues.118
OUSD(I) maintains two sites for sharing program-related content: the DCIPS website and the
DCIPS Readiness Tool.
DCIPS Website. The DCIPS Website (http://dcips.dtic.mil) offers status updates, links to
information, and frequently asked questions pages aimed at addressing multiple issues. The
website is the primary resource to which the intelligence component workforce is directed for a
variety of information needs.
The website is primarily used by the military service organizations, which have accounted for
well over 60 percent of the users in the past two years.
117 Ibid, p. 7.
118 2008 OPM Framework, p. 25.
According to one published set of criteria, an effective website is:
• Credible, timely, and original in its content;
• Responsive to the user;
• Easy to read and understand;
• Interactive;
• Well organized; and
• Filling a necessary niche.119
The website fills an important niche for the intelligence components. However, it does not
contain timely information that is interactive and responsive to the user. The home page offers a
message from the USD(I) that has been updated only four times since its creation in 2007, most
recently in January 2010. Further, the message lacks many key aspects of leadership
engagement described earlier (sense of urgency, case for change, and the like) but instead
discusses NDAA’s effects. The remaining information on the site is generally one-dimensional,
allowing for information to flow outward rather than in a manner that facilitates two-way
communication.
The website was intended to be an information-sharing tool. As such, the information should be
message based, technical, relevant, and timely. Many websites now use Web 2.0 tools to facilitate
openness and to engage and inform their audiences. An example would be a prominently displayed
blog that provides updates, answers questions, and interacts with users. Allowing the blog to
receive comments would create a dialogue between users and administrators, stimulating open
communication. Additionally, a video message from the USD(I) might be effective. These
changes would support a more engaging, relevant, and multi-dimensional communication
channel.
Comments from the Academy focus groups included multiple concerns with the website. Some
users noted OUSD(I)’s overreliance on the site as its primary (and sometimes sole)
communication channel. DCIPS employees are strongly encouraged to visit the site to address
their questions or seek information, but usage data suggest that the website does not meet these
information needs.
Visits to the website peaked during 2008. Since then, most visitors view only the home page;
few click through to other pages. Tracking figures for 2010 to date indicate that the home page was
visited more than 400,000 times, but that other pages typically had only 1,000 to 4,000 hits. This
rate suggests that visitors open the home page, see little that has changed, and then leave the site.
Thus, new but less prominently placed content might be missed.
119 King, Andrew B., “What Makes a Great Web Site?” Web Development and Design Tutorials, Tips and Reviews, WebReference.com, Internet.com, Aug. 1999, accessed Apr. 21, 2010, http://www.webreference.com/greatsite.html.
Readiness Tool
The DCIPS Readiness Tool is a key communication channel and access point for program
information, guidance, and training. Yet its target audience is primarily the HR personnel
involved in implementation within the components. It contains numerous training courses and
briefings developed by both OUSD(I) and the components.
A data repository for DCIPS implementers, the Readiness Tool is most used by the HR staff and
their contract support. At present, there are a total of 428 registered users, and usage is light. A
query of monthly activity indicated fewer than 20 hits per month. March 2010 had the highest
number of hits in the 29 months since its launch.120 Given that the tool is to be a resource for
sharing guidance, communication, and training products, the level of engagement is notably
limited.
The Readiness Tool has a number of challenges. From a design standpoint, the site structure is
not intuitive; materials are not indexed and no search capability exists. Various forms of content
reside on the site, but there is no clear indication of what is there, who developed it, what its
purpose may be, or how accurate the materials are.
The tool shares many design features with the DCIPS website, including the labels of buttons
and links, though they direct the user to different content within the Readiness Tool. For
example, the Documents link on the website takes the user to published reports, while the same
button on the tool takes the user to a mix of training course materials and briefings. Such design
features confuse users.
A robust tool engages users and presents or directs them to contextually relevant material based on
their role and issue. The existing Readiness Tool lacks this dynamic interaction because it is a static,
unindexed repository of information. OUSD(I) has not played an active oversight role in
reviewing or managing the content; this adds to the difficulty of knowing whether the materials
convey accurate information.
As web-based tools are critical to supporting the information needs of the DCIPS community,
more robust, user-centered performance support tools are needed to address the range of
performance support topics and issues that DCIPS encompasses.
120 OUSD(I) Readiness Tool usage report.
Finding 4-5
The DCIPS website and Readiness Tool do not effectively provide comprehensive or timely
information to their intended audiences. As a result, use of these tools is low and information
needs are not being met.
Outreach
The Outreach element of the framework refers to adoption of a communication strategy that
works to produce successful cultural change in a particular area. In DCIPS’ context, the strategy
would be developed and pursued by managers and HR implementers in the intelligence
components. Outreach is a more tactical and repetitive form of interaction than senior leadership
engagement; its focus is on status updates and technical and operational information. Review of
this element addresses the overall strategy and approach taken for outreach, the channels of
communication, and the message content.
Prepared by OUSD(I) in 2008, the DCIPS Communications and Learning Plan outlines the
overall communication strategy. It was envisioned that the plan would be updated as program
requirements and component needs emerged, but no updates or outreach guidance have been
identified. The Readiness Tool also offers basic outreach materials for HR implementers to
modify and use within the components. Examples of outreach activity to date include the following:
• DIA reported that it conducted more than 260 communication events as of mid-2008,
including 164 “DCIPS Overview” town halls, 40 “Performance Management” town halls,
30 road shows, and several additional events for specific audiences.121
• Navy posted multiple briefings, brochures, fact sheets, and other communications on the
DCIPS Readiness Tool; most content focused on the system’s design features.
• OUSD(I) engaged in town halls, executive briefings, and surveys early in the
implementation process, and recently sponsored an additional DCIPS workforce survey.
Overall, OUSD(I) has not effectively overseen or monitored the outreach efforts and indicators
of activity that components have undertaken. The outreach approach within the components
reflects some of the same challenges noted under Leadership Engagement, namely that senior
leadership support, plans, and outreach activities were inconsistent.122 In its Senior Leadership
Guide to DCIPS, the Navy emphasized the importance of senior leaders in DCIPS’ success and
reiterated early USD(I) messaging: “DCIPS…embodies the core values of the U.S. intelligence
community—Commitment, Courage, and Collaboration.” Few other communications carried
this critical message.
121 DIA DCIPS Training Communications, Apr. 2008.
122 Component briefings at National DCIPS Conference, Southbridge, Massachusetts, Feb. 2010.
Implementation responsibility for DCIPS, including outreach, has been left primarily to the
intelligence components’ HR managers, causing concern within their community about an over-
reliance on them as change agents, especially when leadership support has been inconsistent or
lacking. Absent visible agency leadership to provide consistent, frequent messages about
DCIPS’ importance, the perception has grown that the system is an HR program, not a broader
strategic management program.
The DCIPS Communications and Learning Plan outlines a few communication “products,”
specifically the DCIPS website as the primary communication channel to the workforce and the
Readiness Tool as the main repository of outreach guidance and examples for implementers.
Most of the content of communications from OUSD(I) and the components focused on HR
issues, such as the system’s design and mechanics. No one, however, provided a strong case for
the need, urgency, or desired outcome.
The lack of outreach success indicates that the approach and actions taken to date have been
ineffective. Few focus group and open forum participants—employees and supervisors alike—
could clearly explain the strategic outcome that DCIPS was designed to achieve. Their answers
reflected an emphasis on process improvements but fell short of actual impacts and outcomes
affecting their component’s ability to achieve its mission. This is not surprising given the limited
attention that senior leadership devoted to defining the desired outcome and communicating it.
Although the link between DCIPS and mission enhancement is necessarily indirect, the
discussion of any relationship between the two was largely ignored in strategic communications.
At a more tactical level, there appears to be a high degree of frustration and confusion among the
workforce about many of DCIPS’ technical features and its status. Communications were
reactive and ever changing as ad hoc updates were issued for policy, guidance, tools, and other
program aspects. The system implementation itself was rushed, and the outreach efforts
reflected a lack of overall strategy and sufficient guidance.
As noted earlier, OUSD(I) did not provide strong oversight or guidance to the components for
outreach. It pushed information out through the Readiness Tool, but did not engage in follow up
to review component communications for accuracy and timing. The lack of strong, centralized
guidance and oversight for outreach resulted in uneven activities and inconsistent messages
among the components.
Additionally, OUSD(I) might consider issuing a style guide for outreach. A guide offers
templates, key phrasing, terms, logos, and other features that would brand DCIPS as a unified
program and allow all components to use a common voice when communicating about it. The
lack of such guidance resulted in OUSD(I) and components reverting to templates from NSPS.
Those templates failed to distinguish DCIPS from NSPS, and further perpetuated a negative
association between the two programs.
Finding 4-6
The lack of a communications plan and style guide, incorporating strategic change management
principles, has resulted in inconsistent messaging that has focused on the mechanics of DCIPS,
rather than its mission-related objectives.
Feedback
Feedback means providing a formal mechanism for employees to provide input on specific
aspects of the system, as well as a way for implementers to consider this feedback. Stakeholder
Involvement, discussed later in this chapter, means actively engaging stakeholders in system
design and implementation.
Employee feedback on DCIPS is collected at the component level through surveys, town hall and
other types of meetings, and on an individual basis. Employees also can reach OUSD(I) staff
directly through the Contact Us feature of the DCIPS website.
Focus group participants indicated that most feedback has consisted of individual complaints that
HR staff handled. “Program level” issues are forwarded to the OUSD(I) and sometimes raised in
the DCIPS Working Group. If necessary, the issue is addressed by the DIHRB. For example, a
guide to writing DCIPS individual performance objectives123 was developed in response to
employee requests for guidance, and policy changes allowed employees hired under a specific
career ladder to remain there under DCIPS.124
Thus, the intelligence components and OUSD(I) collect and consider employee feedback on an
ad hoc basis. It would be more productive to establish a formal feedback mechanism so that
employees know how to make their concerns heard, understand the process for considering
feedback, and receive information on the outcome. As Academy colloquia participants pointed
out, adjusting the system to respond to legitimate employee concerns helps build trust in the
system.
Finding 4-7
Communications with the DCIPS workforce have primarily focused on pushing information out
to the workforce. No formal mechanism exists to collect and consider employee feedback or
report outcomes to employees.
123 Guide to Writing Effective Performance Objectives describes how to write specific, measurable, achievable, relevant, and time-bound (“SMART”) objectives.
124 U.S. Government Accountability Office, DOD Civilian Personnel: Intelligence Personnel System Incorporates Safeguards, but Opportunities Exist for Improvement, Dec. 2009, p. 27. Hereafter “GAO DCIPS Review.”
TRAINING
▪ Planning ▪ Delivery
Effective training is comprehensive and delivered through multiple channels. Without it,
employees waste time and effort, make mistakes, and experience frustration. In turn, this lack of
knowledge and skill diminishes support for the system. Because DCIPS represents a major
culture shift for the intelligence components, an effective approach encompasses both the
broader knowledge and skills needed to support the change, as well as the technical features of
the tools and work processes.
Planning
Training plans provide an overall strategy and framework for developing and delivering
instruction that directly support the change being implemented. Issued in 2008, the DCIPS
Communications and Learning Plan identifies the target audiences, learning strategy, preliminary
list of training products (courses and exercises), and recommended sequence of learning events.
The document represents OUSD(I)’s strategic guidance and overall approach to training.
More detailed plans, such as a training design document, are mentioned within the
Communications and Learning Plan. A design document typically offers detailed guidance,
including learning and performance objectives, assessment strategies, high level course flow, and
design outlines. Similar to a style guide, it helps ensure that training has a level of consistency
and conveys the knowledge and skills that employees need to adopt DCIPS. However, these
design documents are not yet available.
The courses outlined in the DCIPS Communications and Learning Plan focus almost exclusively
on knowledge and skills training about the system itself. DCIPS represents a major culture
change, shifting supervisors from a system that required little management engagement to one
that places significant new demands on their time; yet the plan gives this management burden
minimal emphasis. The OPM framework holds that a solid training plan for
implementing alternative pay systems should include employee training on how to understand,
communicate, and accommodate change; communicate performance expectations; and offer
feedback.125 The DCIPS training plan falls short in these areas.
The plan outlines nine communication/learning products for employees and managers; one of the
shortest courses is a two- to four-hour workshop where managers practice communicating with
employees about their performance. Given the transformational change that DCIPS represents,
there is a notable lack of training that targets the skills that managers need to build a
performance-oriented culture. OUSD(I) has acknowledged the need for “soft skills” training,
especially for first-line supervisors, that will better support DCIPS’ overall performance
management aspects.
125 OPM Framework, p. 66.
No training needs assessments have been identified and no documentation is available that
identifies what user groups actually need, a major omission. The plan simply outlines various
course topics that address different aspects of DCIPS processes and tools; it lacks any user
considerations. Had a training needs assessment been performed, OUSD(I) and the components
would have been well positioned to identify user requirements and skills gaps (particularly
related to soft skills training), support a more informed and thorough approach to training, and
address constraints on delivery (such as bandwidth limitations for web-based training and other
technical challenges). As written, the training is DCIPS centric, not requirements centric.
Specific training is also needed for rating consistency and fairness. Fed by reports of actual
behavior, there is a perception that ratings are being forced to conform to predetermined
distributions or specific quotas. There is the further perception, supported by actual NGA data,
that administrative support staff (who primarily reside in Pay Band 2) are consistently given
lower ratings overall since their work is less directly connected to the agency mission.
The use of performance ratings is new to most supervisors, and the guidance for ensuring
objectivity and fairness must be thorough and consistent. Few supervisors have previously used
a rating system tied to performance objectives; the concepts behind the system and the actual
practices must be communicated, trained, and reinforced.
Raters must be more fully trained on how to apply a consistent approach to rating against the
individual objectives and performance elements for each job, without bias against certain
functions or forcing a distribution of ratings to a pre-set quota. Data suggest that more thorough
training is needed across the DoD intelligence components to educate raters on how to prepare
fair ratings.126
Finding 4-8
Key planning documents, such as a training design document, are lacking and training courses
have focused on DCIPS’ technical features rather than the broader behavioral changes needed
to support the transformation.
126 Academy online dialogue and open forum data.
Delivery
Effective delivery incorporates assessment strategies and provides a mechanism for participant feedback or course evaluation.
Registration and record-keeping should be seamless with training available on a just-in-time
basis, especially for skills training.127 Given the size, geographic dispersion, and complexity of
the DCIPS population, training also should be offered through various media, including
websites, electronic job aids, and reference guides.
As noted previously, many courses offered to the DoD intelligence workforce have focused on
knowledge and skills associated with using DCIPS. Given that DCIPS policy and guidance were
not stable at the outset, the training content often changed to accommodate changes in policy and
updates to automated tools. This added both expense and workforce frustration.
The DCIPS training evaluation focuses on measuring participant satisfaction with individual
training sessions and counting the number of participants trained monthly. These are common
measures, but they do not assess the more important aspects of content validity (was the content
correct and thorough), or application to the job (were they able to use what they learned).
Recent DCIPS survey questions asked whether training equipped employees with the skills
needed for implementation.128 Preliminary results suggest that more work is needed to train
employees adequately in writing SMART objectives and communicating how DCIPS will affect
them. Aside from NGA, fewer than half of the respondents from the intelligence components
agreed or strongly agreed that they were satisfied with their training. At NGA, 52 percent agreed
or strongly agreed. 129
The ratings of satisfaction or “helpfulness” regarding specific aspects of the training (delivery
method or specific topics covered) appear to be consistently lower; less than half of the
respondents from all or most components gave favorable ratings to specific training questions.
These findings suggest aspects that OUSD(I) may consider for improvement.130
Many employees, especially those in remote locations, noted technical challenges. High-speed
Internet access is not universal among DCIPS employees, so web-based training is not delivered
effectively to all of them. Further, representatives of the military services noted that they
must retrain their uniformed managers more frequently as they experience turnover
approximately every two years. This impacts the long-term management of DCIPS, as well as
the ability to refresh leaders and maintain a consistent level of knowledge and proficiency among
the uniformed supervisor cadre.
127 Skills training should be conducted just prior to its subsequent application on the job—no more than two weeks before the application of the new skills—to maximize retention.
128 2010 DCIPS Survey Preliminary Results, Question 36.
129 Ibid.
130 Ibid.
Some participants found the pay pool exercises to be helpful and hands on, while others saw them
as ineffective and were unable to transfer skills to the work environment. Overall, there has been
considerable training activity, but its actual impact has been limited.
The Panel views the design and delivery of appropriate training as critical to the successful
implementation of DCIPS, especially given the critical role of first-line supervisors in the
process. As noted previously, overall implementation lacks sufficient emphasis on the change
management activities, including training, required to win the support of the workforce,
especially managers and supervisors. Without their full understanding of what they are being
asked to do, and why, implementation cannot succeed.
Finding 4-9
DCIPS training has been insufficient and incomplete, and it has been offered too far in advance of
when employees need to use the skills being taught. Especially lacking has been training aimed
at changing behaviors and equipping managers and supervisors with the skills they need to
effectively implement and maintain DCIPS.
STAKEHOLDER INVOLVEMENT
▪ Inclusion
According to the OPM framework, consulting with key stakeholder groups on system design,
development, and implementation is critical to employee acceptance and ultimate
effectiveness.131 GAO and others consider it to be a best practice as it reduces employee anxiety
and resistance to change and fosters feelings of employee ownership and acceptance.132
131 OPM Framework, p. 26.
132 See, for example, Risher, Howard, Pay for Performance: A Guide for Federal Managers, IBM Center for the Business of Government, Nov. 2004; Booz Allen Hamilton, Pay for Performance (PFP) Implementation Best Practices and Lessons Learned Research Study, June 18, 2008; and Fernandez, Sergio and Rainey, Hal G., “Managing Successful Organizational Change in the Public Sector: An Agenda for Research and Practice,” Public Administration Review, vol. 66, no. 2, Mar./Apr. 2006.
Inclusion
Identifying groups and their concerns is a key first step in involving stakeholders. A centralized
assessment that solicits and consolidates component stakeholder input would be expected to
provide useful information about:
OUSD(I)’s outreach efforts did not account for many important features of component readiness.
Specifically, first-line supervisors are key stakeholders who face significant changes to their job
due to DCIPS. There is no indication that their needs, impact, and training requirements were
assessed sufficiently prior to system implementation.
The intelligence components reported on their readiness to OUSD(I), but these reports focused
on such items as trainings completed. They did not address the elements identified above or
other important aspects of readiness, such as the workforce’s understanding of the system or the
ability of supervisors and managers to assume the additional responsibilities required for
successful implementation.
NGA’s earlier transition to its own performance-based pay system offers an example of more
robust stakeholder involvement. Following development of the design framework, 100 NGA
employees organized into eight teams to develop new HR practices. They provided input on pay bands, performance
management, career development, pay pool administration, and other system design aspects.
Aggressive employee outreach and ongoing solicitation of feedback helped generate support for
the transition, created champions who were sources of reliable information, and helped identify
special circumstances and issues that could otherwise have impacted implementation
negatively.134
The following examples provide additional best practices for stakeholder involvement:
133 Risher, Howard and Smallwood, Andrew, “Performance-Based Pay at NGA,” The Public Manager, Summer 2009, pp. 25-26.
134 Ibid, pp. 26-27.
• The Commerce Department used focus groups to gather employee input, which informed
system modifications;
• TSA implemented an online “idea factory” to solicit employee suggestions to improve
the system. Ground rules for this online dialogue limited discussion to constructive ideas
on specific topics, such as evaluation criteria, rather than individual complaints; and
• TSA established a National Advisory Council composed of employees from around the
country, selected by their peers to serve two-year terms. The council interacts with
agency leadership and program offices on a regular basis.
OUSD(I) has engaged employees through town hall meetings and surveys while components
have held their own town halls and other events like brown bag lunches and discussion groups.
However, the purpose of these forums primarily has been to “push” information outward, rather
than obtain workforce input.
In addition to these broader mechanisms, OUSD(I) partnered with ODNI to involve SMEs in IC-
wide focus groups. For example, 147 SMEs from eight components participated in 19
workshops to develop and validate exemplar performance objectives. In another focus group, 37
SMEs from 11 components developed and validated performance standards. As a follow up to
this effort, a survey was provided to all IC agencies to validate the results, though not all
participated.
Intelligence component HR staff have been involved in DCIPS development and implementation
through annual conferences and participation in the DCIPS Working Group and subgroups on
Resources, Implementation, and Communications. The working group provides an opportunity
for them to provide input and discuss issues, but meeting minutes indicate that it primarily has
been used as a mechanism for OUSD(I) to provide information.
Academy online dialogue and focus group participants indicated that they were not adequately
included in DCIPS development and, as a result, felt that the system was imposed on them. That
most online dialogue participants used it as a forum to air their concerns—rather than offer
constructive suggestions and ideas—underscores the fact that employees believe they have not
been adequately heard. Participants in the open forums contrasted the opportunity afforded them
in that venue with the more formal, lecture-style format of town hall meetings involving agency
leadership and the workforce.
135 2009 GAO Review, p. 8.
136 Intelligence Community (IC) Pay Modernization Project Office, "Stakeholder Analysis," p. 13 (undated PowerPoint).
Given the magnitude of the change and complexity associated with DCIPS, the lack of
stakeholder participation in system design and implementation undermines the system.
Employees feel no ownership, resulting in a high level of resistance to the changes it represents.
Finding 4-10
Stakeholder involvement has not been strategic or centrally managed. Stakeholder participation
has been ad hoc, limited, and often focused on narrow technical aspects of DCIPS, resulting in
increased employee resistance to DCIPS.
IMPLEMENTATION PLANNING
▪ Work Stream Planning and Coordination ▪ HR Business Processes and Procedures
▪ Tools and Technology Infrastructure ▪ Structured Approach
DCIPS affects every aspect of personnel management for DoD’s civilian intelligence workforce.
This change requires a comprehensive planning process and development of a necessary
infrastructure to support the new system, including policies, procedures, and automated tools. It
also requires a broad change management approach with mechanisms to assess progress and
manage risk.
Work stream planning and coordination refer to a detailed implementation plan that includes
streams of work and milestones for designing and implementing a system. The four-page
January 2008 Program Plan for DCIPS Implementation includes a mission statement and
business case for change. It also describes the program’s strategy, scope, objectives, and
implementation phases. The five program phases are:
3. Implementing DCIPS;
4. Evaluating the Success of DCIPS; and
5. Life-Cycle Support.
137 GAO Review, p. 35.
The program plan does not provide sufficient detail or milestones; it refers to an “established
timeline” but does not include it. Further, its focus is on tactical strategies; references to change
management issues are vague and few in number.
In addition, the program plan was not followed. DCIPS policies and processes were to have
been developed in the first phase of implementation. In reality, the timelines for Phases 1
through 3 were compressed and overlapping, and key policies were not ready at the outset.
Finding 4-11
The Program Plan for DCIPS Implementation does not focus sufficiently on change management
and lacks milestones for measuring progress.
According to the OPM framework, business processes and procedures related to an alternative
personnel system should be documented prior to implementation.138 These processes and
procedures provide the foundation for the development of automated tools, training materials,
and other implementation activities.
A major flaw of DCIPS’ implementation is that components were transitioned to DCIPS before
processes, procedures, and roles and responsibilities were finalized. For example, the DoD
policies, procedures, and responsibilities regarding performance management and for the DCIPS
occupational structure were not finalized until August 2009—two years after implementation
began. Although interim final regulations were signed in July 2008 for performance
management, occupational structure, and pay administration—two months in advance of the first
conversion of employees into DCIPS—some components were unwilling to publish local implementing policies based on "interim final" regulations. In addition, two months is
insufficient time for employees to be informed of and trained on the policies and procedures.
Further, several important DCIPS policies and procedures have not yet been completed, as noted
in Chapter 3.
Implementing DCIPS while simultaneously developing and finalizing its policies and processes
has had widespread negative effects. The lack of firm policy and guidance has impacted
communications, training, and automated tools. Data from the Academy focus groups, online
dialogue, and open forums indicate that communications and training have included inconsistent
and contradictory information, and trainers have been unable to provide complete answers to
even basic questions.
138 OPM Framework, p. 27.
Taken together, these shortcomings have caused confusion, anxiety, and mistrust among
employees, and they have contributed to perceptions that the system is not transparent.139 In
addition, focus group participants—some of whom serve on the DCIPS Working Groups—noted
that fluid policies have challenged their ability to provide consistent, accurate guidance.
Finding 4-12
Implementing DCIPS prior to the completion of HR business policies, processes, and procedures
has caused confusing and contradictory training course content and communications messages,
frustrating the workforce.
As noted previously, a DCIPS website and various tools have been developed, including the
Performance Appraisal Application (PAA), Compensation Workbench (CWB), and DCIPS
Payout Analysis Tool (DPAT). The first is used by employees and rating officials to develop,
update, and view performance plans. The second is a spreadsheet used by pay pool panels to
carry out such tasks as generating salary increase and bonus amounts based on the DCIPS
algorithm and creating a one-page summary of payout information for each pay pool member.
The third is a spreadsheet used to analyze pay pool process results. Data from multiple CWBs
can be imported into the DPAT to generate statistics on rating distributions, salary increases and
bonuses, and pay pool funding and allocations.
Focus group, online dialogue, and open forum participants; DCIPS survey respondents; and
individual interviewees all voiced dissatisfaction with the automated tools, especially the PAA.
Among their comments: “the PAA has never worked properly” and “almost useless.” Another
noted, “Poor tool readiness (e.g., PAA/CWB) negatively affected credibility/acceptance.”
Depending upon the agency, between 20 and 45 percent of the respondents to the DCIPS survey
either disagreed or strongly disagreed that the PAA is helpful in planning or tracking
performance against objectives.141 The perception is that tools were immature and not adequately
tested prior to DCIPS implementation.
DCIPS Working Group minutes confirm that the tools were under development and tested long
after many components had transitioned to DCIPS. User guides were developed late, tools were
time-consuming and not user friendly, and training was inadequate. The PAA is not available on
the classified systems used by many intelligence component employees, and some agencies
created a separate document for some of their employees to use at the classified level. Also, the tools were modified midstream to respond to changes in policies and processes.
139 Intelligence Community (IC) Pay Modernization Project Office, Stakeholder Analysis (undated PowerPoint); Academy focus group, online dialogue, and open forum participants.
140 OPM Framework, p. 28.
141 2010 DCIPS Survey Preliminary Results, Question 59.
OUSD(I) officials acknowledge that the PAA is cumbersome and not useful as an oversight tool.
As one HR professional put it, the PAA “is the face of DCIPS” to the average employee and
problems with its usability have increased employee frustration.
Finding 4-13
DCIPS automated tools are immature and difficult to use, further frustrating employees.
Structured Approach
The OPM framework describes the Structured Approach element as the comprehensive change
management strategy that addresses “people” issues during implementation.142 A structured
approach or change management strategy takes into account anticipated employee reactions and
provides support for employees as they experience the process.
Multiple organizational cultures exist within and across the DoD intelligence components, each
of which has a unique mission and way of conducting its work. This variety represents one
aspect of workforce complexity. DCIPS requires a shift in the underlying philosophy about
managing and rewarding the workforce. It is not a set of new tools or procedures to conduct
performance reviews, but a transformation requiring a structured approach to guide
implementation.
Change management principles provide the core framework for structuring successful
implementation efforts, especially those as sweeping as DCIPS. Successful change management
efforts of this scope must address both the transformational and transactional aspects of change:
Transformational Change
Transformational change requires a clear vision of the desired new state, a guiding strategy for
achieving the goal, and strong leadership throughout the process.143 Leaders initiating the change
must visibly champion the effort and use every opportunity to communicate a simple, clear, and
compelling case. That message must include a vision of success, the benefits that the change offers, and even the risks of failure.
142 OPM framework, p. 28.
143 Stragalas, N. "Improving Change Implementation: Practical Adaptations of Kotter's Model." OD Practitioner, vol. 42(1), 31-38 (2010).
Leaders also must play a critical role in driving the change and maintaining its momentum, while
simultaneously understanding the organizational climate in which it is taking place, including
readiness or barriers to implementing the change within target organizations.144 For many
employees, DCIPS represents losing something they value: pay security and predictability.
Leadership must recognize the loss and clearly explain how the gains are worth the effort.
Readiness assessments of the target organizations help identify aspects of culture that must be
considered and provide critical input to change efforts. Understanding organizational readiness
can help leaders craft implementation strategy and communications, which in turn can help
mitigate challenges and resistance. In DCIPS’ case, DoD intelligence components performed
some readiness assessments and updates, but OUSD(I) did not provide centralized oversight.145
Transactional Change
Transactional change encompasses the various activities carried out by the implementers—in this
case, HR staff and line managers throughout the components. These aspects of change are more
tangible and operational in nature than the transformational aspects, which focus on leadership.
They include specific guidance, processes, procedures, tools, operationally-focused
communications (e.g., status of specific activities, refinements to guidance, schedules), and
incentives or accountability measures intended to reinforce adoption of the changes.
With DCIPS, supervisors must learn and demonstrate new behaviors to support a performance
orientation, such as effectively communicating with employees about their performance,
conducting periodic reviews, developing measurable objectives, and employing new automated
tools. OUSD(I) acknowledges that it rushed to implement DCIPS, and did so without key
structural components in place. The Implementation Plan mentions development of a change
management plan as a Phase 2 task, but none was developed, notwithstanding the importance of
change management as emphasized in the Communications and Learning Plan.
Overall DCIPS implementation largely ignored the transformational aspects of change. Building
on the defects in Engagement and Outreach, the lack of a change management plan highlights a
missed opportunity to develop the clear, compelling case for change and emphasize its urgency.
It also resulted in ad hoc implementation activities that focused on tactical or transactional issues
and lacked an overall approach.
DoD intelligence components conducted their own readiness assessments, but it appears the
results were not considered in overall implementation. Feedback for this study suggests that
OUSD(I) largely ignored the uniqueness of the components’ cultures. It is not surprising that the
lack of an overall change strategy led to the omission of accommodations for the components' varying states of readiness, which has posed a substantial challenge to the overall change effort.
144 Siegal, W., Church, A.H., Javitch, M., Waclawski, J., Burd, S., Bazigos, M., Yang, T.F., Anderson-Rudolph, K., and Burke, W.W. "Understanding the Management of Change: An Overview of Manager Perspectives and Assumptions in the 1990s." Journal of Organizational Change Management, vol. 9(6), 54-80 (1996).
145 Examples of these component readiness assessments are included in the DCIPS Readiness Tool under Program Management.
Finding 4-14
DCIPS implementation efforts focused largely on tactical functions, not on fundamental change
management practices that would have supported the sweeping behavioral and other changes
necessary to transition to DCIPS. Even so, many tactical aspects of DCIPS were not
implemented effectively or completely.
PROGRESS
In contrast to the Preparedness component, which addresses the readiness of the intelligence
components to implement DCIPS, the Progress component measures the degree to which they
have achieved or are achieving the broad transformation goals needed for successful
implementation. The Progress dimensions are:
• Mission alignment;
• Results-oriented performance culture;
• Workforce quality;
• Equitable treatment; and
• Implementation plan execution.
MISSION ALIGNMENT
▪ Line of Sight ▪ Accountability
Mission alignment refers to how well individual, team, and unit objectives link to organizational
mission. The elements are:
• Accountability. Ensures that linkage to mission is included in performance plan
objectives and judged based on the credibility of performance targets and employee
perceptions of accountability.
Links to mission objectives are a central design feature of DCIPS. A majority of DCIPS survey
respondents from five intelligence components indicated that they agreed or strongly agreed with
the statement, “I understand how my work relates to the goals and priorities of my organization
or component.”146 Many employees view the alignment of performance objectives to mission as
a positive aspect of DCIPS.147
Finding 4-15
Employees support the concept of aligning individual performance with organizational mission
and understand how their performance objectives align with their agency’s mission. It is too
early to assess whether employees are being held accountable for mission alignment.
DCIPS must effectively differentiate levels of performance and link rewards to performance to
be successful. It also must enable effective management of payroll and other implementation
costs.
Differentiating Performance
A formal process to review and assure the quality of performance ratings is necessary to ensure a
system that adequately recognizes different levels of performance. As described in Chapter 3,
DCIPS’ design includes such a process. In addition, employees must perceive that ratings
accurately reflect performance levels; otherwise, they will view the system as unfair and
arbitrary, and resist implementation.
DCIPS’ design has a built-in series of checks and balances, but employee experiences with the
rating review and quality assurance processes have led them to believe it is opaque and
untrustworthy. Very few respondents to the DCIPS survey agree that these processes—or
reconsideration and other grievance processes—will contribute to fairness in DCIPS.148
Employees assert that ratings assigned by supervisors change during the process without
explanation or recourse, and that it is impossible for them to determine where their rating was
changed and by whom.149 This appears to result from the way the process is implemented rather than from how DCIPS was designed.
146 2010 DCIPS Survey Preliminary Results, Question 68. Data on this question were available from five components: Navy/USMC, NSA, NGA, DIA, and OUSD(I).
147 Online dialogue, focus group, and interview participants.
148 2010 DCIPS Survey Preliminary Results, Questions 20, 21, 25, 27.
More troubling are the widespread perceptions of why ratings are changed. Dozens of
individuals, representing at least four different intelligence components, reported through the
Academy online dialogue, open forums, and interviews that rating quotas or bell curves were
enforced in their agencies. Such practices are prohibited by DoD Instruction150 and OUSD(I)
and ODNI have communicated that they are not permissible.151 Nonetheless, the perception is
that such practices occur.
DCIPS employees are being told by their supervisors, correctly or not, that their ratings have
changed due to the office being required to fit a bell curve or achieve an agency-wide bell curve.
In addition, supervisors reported that they are told to follow a bell curve when rating their
employees. Nine different “idea threads” from the online dialogue, each with multiple examples,
indicate that the practice is perceived as widespread across agencies and condoned by
management. Representative comments from supervisors:
As a supervisor I was told there were no quotas and to rate my employees how I feel
they performed. Yet there was pressure from above and “guidance” given, that if my
ratings didn’t adhere to the general quota distribution that my agency was aiming for,
my own performance rating would suffer.
Some raters, including myself, were repeatedly told that we have to adhere to an
office “bell-curve” so that ratings are equally distributed across the agency. We also
had to submit proposed ratings for employees before appraisals were prepared to
ensure that we were within our “bell-curve boundaries.”
Ratings data provided by OUSD(I)152 neither support nor dispel the claims that bell curves are enforced. Among the agencies from which data were examined, many more employees were rated as 4s than as 2s, resulting in a skewed distribution. Approximately 82 percent of employees in the six agencies received a rating of 3, and only 0.06 percent of all employees received a rating of 5.
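For context, distribution data of this kind can be compared against a hypothetical forced bell curve with a standard goodness-of-fit test. The sketch below is purely an illustration: the observed counts and the target shares are invented for the example and are not the OUSD(I) data, and—as the report notes—such a comparison alone can neither prove nor disprove that ratings were forced.

```python
# Illustration only: the observed counts and bell-curve target below are
# invented and do not reproduce the OUSD(I) ratings data discussed in the text.
from scipy.stats import chisquare

observed = [50, 400, 8200, 1340, 10]          # assumed counts for ratings 1..5
total = sum(observed)

# A hypothetical "forced" bell-curve target, e.g. 5/20/50/20/5 percent.
target_shares = [0.05, 0.20, 0.50, 0.20, 0.05]
expected = [share * total for share in target_shares]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.3g}")
# A very small p-value indicates the observed ratings are unlikely to follow
# the assumed bell-curve target; a large p-value would be consistent with it.
```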
Some employees have asserted that manipulation of ratings exists and is proof that DCIPS has been designed to keep payroll costs down. Others have blamed perceived rating quotas for increasing competition among employees and inhibiting collaboration. In general, such perceptions undermine DCIPS' credibility and integrity. That these beliefs appear widespread and strongly held presents a difficult challenge to overcome.
149 Academy online dialogue and open forum participants.
150 DCIPS Performance Management Instruction (DoDI 1400.25-V2011).
151 See DCIPS, Prohibition of Forced Distribution of Ratings Fact Sheet, Aug. 2009, http://dcips.dtic.mil/documents/Prohibition_of_Forced_Distribution.pdf; and Memorandum for Heads of Intelligence Community, All Intelligence Community Employees, Implementing Performance-Based Pay in the Intelligence Community, June 12, 2009, http://www.dami.army.pentagon.mil/site/dcips/documents/Memos/DNI/DNI%20Letter%20to%20%20IC%20Workforce%20regarding%20DCIPS%20status.pdf.
152 Ratings data were available for six components: DIA, Navy, NGA, NSA, OUSD(I), and USMC.
Online dialogue and open forum participants frequently mentioned that some supervisors are
inflating ratings to help ensure their employees will get larger payouts, while other employees
are being disadvantaged because their supervisors are “being honest.” In some cases, employees
reported that they have been told to rate themselves, and that supervisors do not deal properly
with those who inflate their own ratings.
Overall, employees have lost confidence in DCIPS performance ratings for various reasons,
including the belief that there is little relationship between ratings and performance. Only
slightly more than half of DCIPS survey respondents who rated this item believed their
supervisors rated them fairly.153 This perception of unfairness affects morale and severely
undermines the system. As ODNI and OUSD(I) staff repeatedly stated, if the performance
management part of DCIPS operates properly, the rest of the system will fall into place.
Conversely, if that aspect is flawed, the entire system may fail. Consequently, it is important to
provide better communications, training, and oversight to address issues related to the
misapplication of official guidance on the rating process.
Finding 4-16
Perceptions that ratings are being manipulated in the DoD intelligence components have
undermined the integrity and credibility of DCIPS and led employees to believe that ratings do
not accurately reflect performance.
One of DCIPS' main purposes is to reward good performance, so there must be a strong link between ratings and salary increases and bonuses. This link is measured in terms of the average increase and bonus amount by performance level and pay band. Employees must believe that salary increase and bonus amounts are based on performance levels and that subjective considerations and favoritism are kept to a minimum.
153 2010 DCIPS Survey Preliminary Results, Question 63. Data for this question were available from only five components.
154 2010 DCIPS Survey Preliminary Results, Question 10.
How pay pools are constructed and which pay pool an employee is placed in also will affect
payouts. All else being equal, employees could receive different payouts depending on the
characteristics of their pay pool.
Finding 4-17
Several factors negatively affect employee views of the link between pay and performance,
including perceptions regarding ratings manipulation, perceived biases against support
occupations and those in the field, and differences in payouts across pay pools.
Cost Management
Cost management is the extent to which decision makers have access to reliable estimates of
costs associated with program design and implementation, and the degree to which costs are
budgeted.
OUSD(I), responsible for developing guidance to track implementation costs, reports that DCIPS
implementation has required $60 million thus far.155 More specific budget information is
classified, making it impossible to analyze the data in an unclassified context. Thus, this analysis
focuses primarily on cost management processes that are in place.
Estimating costs accurately is another aspect of cost management. OUSD(I) officials indicated
that mock payout data will be analyzed to determine how much it actually will cost to run the
pay pools, which is currently unknown.
Intelligence components were directed to estimate payroll costs based on the previous year’s
budget and were responsible for estimating other implementation costs. Most underestimated
costs for different reasons. For example, some agencies did not budget enough for training, and
OUSD(I) and ODNI had to make up the difference. Because NSA delayed conversion, it does
not now have the estimated $30 million necessary to convert its workforce to DCIPS, though
solutions for this budget shortfall are being explored.
OUSD(I) has taken appropriate measures and established effective mechanisms to track
implementation costs accurately and comprehensively. Although guidance and mechanisms
allow components to estimate, track, and manage costs, minor problems have impacted estimates
and budgeting. Once DCIPS is fully implemented, the components will have to pay for the system from their own budgets, so there should be some improvement in estimating and managing costs at that level.
155 This is only the amount allocated by OUSD(I) for DCIPS design and implementation. Each component was responsible for funding its own PMO, workforce conversion, and such implementation activities as training.
156 OUSD(I), Resource Management Sub-Group Kickoff Meeting, Oct. 12, 2007 (PowerPoint).
Finding 4-18
OUSD(I) has established mechanisms for managing implementation costs and has appropriately
built on lessons learned from NSPS. However, improvements in cost estimation and
management are needed at the component level.
WORKFORCE QUALITY
▪ Recruitment ▪ Flexibility ▪ Retention ▪ Satisfaction and Commitment
EQUITABLE TREATMENT
▪ Fairness ▪ Transparency ▪ Trust
Equitable treatment refers to employees’ perceptions of how they are being treated in a new
culture. The elements of equitable treatment are fairness, transparency, and trust.46 According to
the OPM framework, these cultural aspects of implementation “have a significant impact on the
degree of success” for an alternative personnel system.47 Due to the early stage of DCIPS’
implementation, analysis of these elements is limited to employee perceptions.
Fairness
Design and implementation can impact employee perceptions of the fairness of agency practices
in the context of adopting DCIPS. As discussed under the Differentiating Performance element, the belief that DCIPS does not accurately rate performance can have serious ramifications for employee perceptions of fairness. Other decisions and practices viewed as unfair include the
GS-13 split between pay bands 3 and 4 upon conversion, described earlier, and the differences in
payouts of different pay pools. The former was a one-time conversion issue and will not happen
again. However, the effects on the employees placed in pay band 3 will be long term.
46 OPM framework, p. 32.
47 Ibid.
DCIPS survey results indicate that most intelligence component employees do not believe that
the checks and balances in DCIPS’ design—including the PM PRA, PP PRA, their agency’s
grievance system, and equal employment opportunity complaint process—will contribute
significantly to system fairness.157 Similarly, only a small percentage of employees believe that
the pay pool decision tool and panels contribute to fairness; percentages range from 11 to 26
percent in six of the agencies.158 It is unclear whether employees understand how these processes
contribute to DCIPS’ fairness. If they do not, communications and training could address the
problem. If they do, the processes themselves need to be reevaluated.
Transparency
For DCIPS to be transparent, stakeholders must have access to and understand processes and
procedures related to the system’s performance-based compensation aspects. Some focus group
and online dialogue participants reported that DCIPS improves transparency by requiring and
improving the rigor of documentation of performance evaluations, ratings, and decisions. Yet
lack of transparency was a recurring theme in the online dialogue. The causes are many and
have been discussed earlier: the lack of complete and clear policies and guidance; the opaque
rating review process; inadequate training; and ineffective communications. Employee
perceptions of transparency will affect their views of fairness and trust.
Trust
This element relates to the impact of DCIPS on the level of trust that employees have in their
supervisors. If the issue of enforced ratings distributions is not addressed, trust in supervisors
will erode. As one online dialogue participant said, “How can we ever trust management when
they do this to us.” It is clear from the dialogue and open forums that some segment of the
workforce lacks trust in its supervisors, or at least its second level of management. However, it
is too early to determine whether distrust has been exacerbated by DCIPS or will ultimately
decrease if DCIPS is implemented as designed.
Finding 4-19
Employee perceptions of fairness, transparency, and trust are negatively affected by the
widespread belief that ratings and pay do not accurately reflect performance, the opaque ratings
review process, inadequate training, and ineffective communications.
157 2010 DCIPS Survey Preliminary Results, Questions 20, 21, 25, 27.
158 2010 DCIPS Survey Preliminary Results, Questions 76, 81. The six agencies that responded to these questions were Army, Air Force, Navy/USMC, DSS, NSA, and NGA.
IMPLEMENTATION PLAN EXECUTION
▪ Work Stream Planning and Status ▪ Performance Management System Execution
▪ Employee Support for DCIPS
Work stream planning and status assess how actual implementation compares with the plan. Since the DCIPS Implementation Plan lacks specific tasks and milestones, it is impossible to assess implementation against the plan.
As explained earlier, the Academy Panel and study team made substantial efforts to assess
employee support for DCIPS. However, the number of participants was relatively small—
approximately 900 in open forums, 60 in focus groups, and 1,800 in the online dialogue—given
a workforce of more than 50,000 individuals. In addition, experience indicates that these data
collection methods are more likely to attract individuals with specific issues and concerns, rather
than strong advocates.
The recent OUSD(I) DCIPS survey had a fairly high response rate and likely provides a more
robust depiction of employee sentiment.159 The results indicate that employees have mixed
feelings about performance-based compensation. For example, between 73 and 91 percent
(depending upon component) supported the statement that individual performance should be
considered when granting pay increases, with the largest increases going to the highest
performers.160 However, far fewer respondents, ranging from 10 to 34 percent depending upon
the component, thought their own performance would be more effectively recognized under
DCIPS.161 The fact that performance-based compensation as a concept is viewed so positively
across all components, but that concerns remain about how it will be applied under DCIPS,
further suggests the implementation efforts and outreach were lacking.
159 2010 DCIPS Survey response rates are as follows: Air Force/NRO: 49 percent; Army: 46 percent; DIA: 37 percent; DSS: 52 percent; Navy/USMC: 38 percent; NGA: 17 percent; NSA: 33 percent; OUSD(I): 39 percent.
160 2010 DCIPS Survey Preliminary Results, Question 10.
161 2010 DCIPS Survey Preliminary Results, Question 11.
CONCLUSIONS
The Panel finds that OUSD(I) and the DoD intelligence components have expended significant time and effort on DCIPS implementation, but that all aspects of implementation related to the Preparedness component of the OPM Assessment Framework are nonetheless significantly flawed and that it is too early to assess the Progress component. It is
important to note that this situation is not the sole responsibility of the HR professionals in the
DoD intelligence components tasked with implementing DCIPS. These individuals were given a
charge that would have been very difficult to achieve under the best of circumstances and have
endeavored to do the best job possible.
The Panel further concludes that DCIPS implementation was rushed, and an overall change
management strategy was not established to guide the transformational and tactical dimensions
of implementation. These critical omissions have created a host of challenges that must be
addressed, including a major effort to rebuild employee trust. Given the nature and scope of the
challenges, DCIPS leadership must fill many key gaps in leadership and strategy prior to
engaging in further implementation activities. To prepare a stronger foundation going forward,
leadership must fully support and appropriately allocate additional time and resources to
developing:
As noted in the training discussion, a critical missing component is intensive training for first-
line supervisors on all system aspects, including basic managerial behaviors and communications
that underpin DCIPS and every performance management system. The lack of adequate
managerial training is a chronic weakness across the federal government, but it is magnified with
DCIPS since it requires new and different behaviors from supervisors, many of whom have had
limited demands placed on them for developing personnel management skills.
162 Interviews and the examples of NGA and MITRE support this statement.
including the effects of earlier implementation efforts, the NDAA, and the Panel’s
recommendations.
Recommendation 11. OUSD(I) should move swiftly to finalize DCIPS governing policies,
make them available to the workforce, and communicate them widely to improve
transparency and ease of understanding.
Recommendation 13. The USD(I) should be more visibly engaged, set key implementation
objectives for DoD intelligence component leaders, and meet with them regularly to hold
them accountable for meeting those objectives.
Recommendation 14. OUSD(I) should develop a detailed communications plan and style
guide as part of its overall change management efforts. This plan should address strategic
communications about the overall DCIPS system and implementation, as well as an
approach for tactical communications about status, updates, and other fluid aspects of
implementation.
Recommendation 15. As part of the overall change management effort, OUSD(I) should
develop a thorough training plan and specific instructions aimed at first-line supervisors
and managers to equip them with the personnel management skills needed to fully
implement and maintain DCIPS.
Recommendation 17. OUSD(I) should establish a program management office, with the
requisite staffing, resources, and authority to design and implement a comprehensive
change management strategy and provide adequate oversight of DoD intelligence
component implementation.
Recommendation 18. OUSD(I) should make the DCIPS Readiness Tool and website more
user-friendly and interactive in order to meet the information resource needs of their
intended audiences through timely, accurate, and updated information.
Recommendation 19. OUSD(I) should employ best practices for stakeholder involvement
and develop guidance for gathering and considering continual employee feedback.
CHAPTER 5
INTRODUCTION
This chapter responds to the NDAA’s provision for a review of DCIPS’ impact on career
progression and its sufficiency in providing diversity protections for promotion and retention.
The Panel notes that its findings, conclusions, and recommendations are limited given that
DCIPS has not been fully implemented in any DoD intelligence components other than NGA.
Only NGA has experienced the full range of system elements: performance management, pay
banding, new personnel policies, salary pay pools, and bonus pay pools. As a result, only two
sources are available to extrapolate DCIPS’ impacts:
1. Employee perceptions and experiences, which illustrate their experience to date. These perceptions must be dealt with in further implementation; however, they are of limited value in judging the real impact of DCIPS; and
2. NGA’s experience with DCIPS and a performance-based compensation system, which is
also of limited value as explained further below.
Senior managers and HR professionals have shared their perspectives and experiences through the Academy focus groups, and intelligence component employees have provided their feedback in the online dialogue and agencies' open forums. These data cannot be presumed to be entirely
representative of the views of the larger workforce, but consistent themes have emerged
regarding DCIPS’ impact on individuals and areas where remedial action is needed. Additional
perspectives are provided through the recently completed OUSD(I) DCIPS survey.
As noted previously, NGA has a decade's worth of data from a pre-DCIPS performance-based compensation system and one full performance management and compensation cycle under DCIPS. These data should provide a reasonable basis from which to
infer, albeit indirectly, DCIPS’ impact on other DoD components.
EMPLOYEE PERCEPTIONS
HR Professionals
In February 2010, the OUSD(I) HCMO sponsored the National DCIPS Conference, designed to
provide information to intelligence component HR officials about DCIPS Interim, the system
devised to deal with the NDAA-mandated strategic pause. This three-day program provided the
Academy study team with insights on DCIPS’ impacts, based on the experiences and opinions of
personnel in attendance.
The participants seemed generally supportive of DCIPS, but were struggling with the
complexities of the new interim process, frustrated by the NDAA pause, and uncertain whether
their efforts to implement DCIPS would lose momentum. There was general agreement on:
• The positive impact that the DCIPS performance management system was having on
DoD intelligence components by requiring managers to engage in performance
discussions with their subordinates;
• The multiple levels of review built into the process, creating a more consistent and fair
means for evaluating performance; and
• The opportunity for personnel with analytic and operational skills to progress to higher
salary levels without having to assume management responsibilities.
The HR professionals described negative impacts as well. Some questioned DCIPS design and implementation choices. Still others challenged the fundamental wisdom of trying to implement a performance-based compensation system in the federal government.
Senior Managers
• The transparency of DCIPS compared with its predecessor systems. One participant said,
“you can’t hide in DCIPS” because all DoD intelligence components have consistent
guidance and use the same bases for classifying jobs, establishing individual performance
measures, and evaluating employee performance;
• A strong performance management system with emphasis on building sound evaluation
metrics and a consistent approach to evaluations and rewards;
• A link among evaluations, pay, and mission outcomes;
• The prospect of placing the entire IC on the same footing—in other words, a common HR
system and rules for pay and performance; and
• Flexibility in setting pay for new hires through pay banding.
• The time that supervisors, particularly those in front-line positions, needed to execute
their performance management responsibilities. This was attributed partly to new
evaluation requirements, but most concerns focused on overly complex and cumbersome
administrative tools and processes. Several said this added burden required some
supervisors with technical responsibilities to put them aside, with potential adverse
effects on mission performance;
• Disincentives to become a supervisor because DCIPS imposes additional work
requirements with no increase in compensation;
• The extent and nature of employees’ negative reaction to DCIPS. One manager said,
“We are intelligence officers and we were surprised at how the workforce reacted. We
aren’t supposed to be surprised”; and
• Concern that administrative positions in pay bands 1 and 2—which have a higher
percentage of women and minorities than pay bands 3, 4, and 5—tend to receive lower
average performance ratings than those in higher bands because the work is not so clearly
linked to the mission.
• A belief that DCIPS reduces promotion opportunities and career progression. Both
DCIPS’ complexities and the elimination of the GS grades and steps have resulted in the
impression that career and salary progression are now harder to achieve. By design,
DCIPS provides for fewer promotions and it is entirely possible for employees to spend
most or all of their careers in a single pay band. Although salary progression may equal
or exceed what the GS system provides for most employees, the reduced number of
“milestone events,” such as promotions to the next grade, seems to promote a negative
view;
• Concern that DCIPS inhibits collaboration among employees. Although the performance
elements on which all employees will be rated include an element on cooperation and
collaboration, the emphasis on individual achievement and reward is seen as working
against collaborative efforts: “DCIPS forces employees into contests for claiming credit.
This is not good for teambuilding or productivity”;
• A perception that morale is suffering. Said one, “The appraisal system associated with
DCIPS is not a good motivator, and can be demoralizing.” This perception has been
compounded by the confusion resulting from the interim policies and procedures
necessitated by the NDAA pause; and
• The amount of time spent on performance management is seen as excessive. Both
managers and subordinate staff made this observation. One remarked, “As a first line
supervisor, I had to conduct write-ups on 22 civilians—both performance objectives (4
on each employee) and elements (6 on each employee). Doing a total of 220 write-ups in
two weeks was a nightmare!! Plus, I had to do my own assessment. I simply don’t
understand what this is supposed to accomplish.”
The first theme pertains to the impact of the new performance management system. There are concerns
about the amount of time it requires, the adequacy of the specific performance elements used,
and other system features. Nonetheless, there seems to be an understanding that it is the right
thing to do and that it is intended to be an important vehicle for driving performance.
A second related theme is the importance and potential positive impact of linking performance at
all levels to agency mission. Although there are concerns about how this may work in practice,
there is little disagreement about whether it should be done.
The third theme is the advantage of transparency and consistency that DCIPS is intended to
provide to intelligence components. These features are seen as helping to reduce job
classification disparities among agencies, providing a similar basis for assessing performance,
and providing a platform for future cooperation and collaboration. This is mitigated somewhat
by opinions that DCIPS’ predecessor system was in some ways more transparent, particularly
with regard to employee evaluations.
Fourth, there were strong statements that implementation is having a major negative impact on
the most critical level of management for this kind of transformation: front-line supervisors.
New performance management requirements have a disproportionate impact on this group. They
also expose weaknesses in the training provided in preparation for implementation and the
potential management skills deficits in this cadre of leaders.
Negative comments were especially strong concerning the alleged "forced distribution" of ratings, i.e., the belief that ratings have been or will be forced into a normalized bell curve distribution, regardless of actual results based on a straightforward assessment of employee performance against established objectives. Many believe that there are limits on the percentage of employees who may receive above-average ratings, or that ratings are constrained to save money by limiting the number of employees who receive increases and bonuses.
The fifth DCIPS theme is the tension produced by a pay system focused heavily on individual
achievement yet applied to organizations that rely on employee coordination and collaboration to
produce mission-critical products. Component employees at all levels report that the focus on
individual performance alone produces negative consequences for collaboration and cooperation.
Mock pay pool exercises are used to determine meaningful distinctions in performance and
generate lessons learned for improving processes, ensuring consistency, and promoting fairness
in payout decisions. Their results are not recorded for compensation purposes, but they can help
refine business rules and processes for actual pay pool meetings at the end of the performance
year. Mock pay pools are mandatory under DCIPS in the first year that pay pools are conducted
for any intelligence component.
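To make the mechanics of such an exercise concrete, the following is a minimal sketch of a generic share-based pay pool allocation of the kind a mock exercise might rehearse. It is not the actual DCIPS payout algorithm; the share weights, budget percentage, and employee data are assumptions for illustration only.

```python
# Generic share-based pay pool sketch -- NOT the DCIPS payout algorithm.
# Share weights and the bonus budget percentage below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    salary: float
    rating: int  # evaluation of record, 1 (lowest) to 5 (highest)

# Assumed mapping of ratings to payout shares; ratings below 3 earn no share.
SHARES = {1: 0, 2: 0, 3: 1, 4: 2, 5: 3}

def mock_payout(pool, bonus_budget_pct=1.8):
    """Distribute an assumed bonus budget in proportion to salary-weighted shares."""
    budget = bonus_budget_pct / 100 * sum(e.salary for e in pool)
    weights = {e.name: SHARES[e.rating] * e.salary for e in pool}
    total_weight = sum(weights.values())
    if total_weight == 0:
        return {e.name: 0.0 for e in pool}
    return {name: budget * w / total_weight for name, w in weights.items()}

pool = [
    Employee("A", 90_000, 3),
    Employee("B", 110_000, 4),
    Employee("C", 100_000, 5),
]
for name, bonus in mock_payout(pool).items():
    print(f"{name}: ${bonus:,.0f}")
```

Running a pool like this with no money at stake is essentially what a mock exercise does: it surfaces the business-rule questions (share values, weighting, rounding, funding level) before real payouts depend on them.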
OUSD(I) conducted an analysis of DCIPS employees who were evaluated and had bonuses
determined following the FY2009 performance cycle.163 NGA was the only component whose
employees received performance ratings, salary increases, and bonuses under the methodology. Intelligence employees in DIA, NSA, the Navy, Marine Corps, and OUSD(I) received performance ratings and bonuses under DCIPS, but were precluded from receiving salary increases during the NDAA-imposed interim period. For these agencies, this was a mock pay pool exercise and analysis. Air Force, Army, and DSS did not participate and were not included in the analysis.
163 Defense Civilian Intelligence Personnel System, 2009 Performance Evaluation and Payout Analysis. Hereafter
Approximately 97.3 percent of DCIPS employees received a performance evaluation for the
2009 cycle, and 98.2 percent of those were eligible to receive a bonus.164 Approximately 99.5
percent of rated employees received an evaluation of record of Successful or higher, meaning
that almost all employees were eligible to receive performance bonuses and performance-based
pay increases. In the mock pay pools, about 44 percent of employees would have received both
a performance salary increase and performance bonus.165 The remaining population was split
between those who would have received a bonus but no salary increase (less than 1 percent) and
vice versa (55 percent). Ratings rose with pay band, and supervisors in Pay Bands 3, 4, and 5 received higher ratings than non-supervisors.
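A payout analysis of this kind reduces to tabulating, for each rated employee, whether they would have received a salary increase, a bonus, both, or neither. The sketch below shows one way such a breakdown might be computed from payout records; the record fields and sample values are assumptions for illustration, not the OUSD(I) data.

```python
# Illustration only: fields and sample records are assumed, not OUSD(I) data.
from collections import Counter

records = [
    {"salary_increase": 1200.0, "bonus": 800.0},
    {"salary_increase": 950.0, "bonus": 0.0},
    {"salary_increase": 0.0, "bonus": 600.0},
]

def outcome(rec):
    """Classify a payout record into one of four outcome categories."""
    got_increase = rec["salary_increase"] > 0
    got_bonus = rec["bonus"] > 0
    if got_increase and got_bonus:
        return "both"
    if got_increase:
        return "increase only"
    if got_bonus:
        return "bonus only"
    return "neither"

counts = Counter(outcome(r) for r in records)
total = sum(counts.values())
for category, n in counts.items():
    print(f"{category}: {100 * n / total:.1f}%")
```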
The analysis also showed that differences across pay pools complicated the investigation of
performance ratings due to differences in work demands, the mix of jobs and experience, and the
application of the common performance indicators and benchmarks in a local context. As with
the separate analysis conducted by NGA, discussed below, this analysis could not determine
whether variances in ratings assigned to employees in certain protected classes reflected
legitimate performance differences, and thus this issue requires further review.
The preliminary results of the OUSD(I) DCIPS Survey present a mixed picture of DCIPS.
Although employees overwhelmingly (81 percent) agreed or strongly agreed with the concept of
performance-based compensation in principle,166 far fewer agreed that their performance would
be more effectively recognized under DCIPS than under GS (22 percent) or that career
advancement opportunities would be greater (15 percent).167
There is much greater agreement that DCIPS allows employees to understand their performance
objectives and how their work relates to organizational goals and priorities (78 percent).168
Employees also agreed that their supervisors know and understand what they do (69 percent) and
take an interest in their success (65 percent).169 Fewer agreed that supervisors provide helpful
explanations of the bases for ratings under DCIPS (53 percent), that ratings were fair (54
percent), or that feedback received from supervisors helps them achieve their performance
objectives (36 percent).170
Relatively few employees agreed that DCIPS provides employees with adequate protections
against unfair treatment (18 percent), or that the DCIPS performance management system
contributes to improved accomplishment of the work unit’s objectives (24 percent).171 Only 16
percent agreed that DCIPS will improve performance within the organization or component over
time while very few (10 percent) agreed that it will increase collaboration within their
organizations.172
Although 45 percent of employees agreed that they understand the process by which DCIPS
performance payout decisions are made, only 14 percent agreed that the use of pay pool panels
and the pay decision tool contribute to increased fairness of pay decisions.173 Few agreed that
their individual base pay increase was appropriate based on payouts made to others in the pay
pool (18 percent) or the organization (16 percent).174
Based on these survey results, it is clear that intelligence component employees accept the
proposition that they should be rewarded commensurate with their performance. Yet there is
widespread doubt that DCIPS, as implemented, will achieve that end. The extent to which these
survey results are influenced by the NDAA pause and DCIPS Interim—rather than DCIPS itself—is unclear.
NGA has more experience with performance-based compensation systems than any other DoD
intelligence component. In developing DCIPS, OUSD(I) benchmarked and adopted the basic
principles that NGA developed over the prior decade. In turn, NGA adopted the new design
features built into DCIPS and became the first agency to put the new system into place in 2008.
In terms of actual impact data, NGA’s experiences and data elements are key points of reference.
A core DCIPS premise is that it will strengthen the long-term ability of the DoD intelligence
components to achieve their missions. However, NGA has not collected data that could directly
connect its personnel system to organizational performance. Its officials point to indirect
measures indicating that performance measurement and employee perceptions have improved
under the performance-based system; they suggest this will result in improved organizational
performance over time. One example they cite is the significant reduction in the percentage of annual ratings of Excellent or Outstanding following DCIPS implementation, illustrated in Figure 5-1.
170 Ibid. Questions 61, 63, 64.
171 Ibid. Questions 19, 69.
172 Ibid. Question 80.
173 Ibid. Questions 75, 76, 81.
174 Ibid. Questions 77, 78.
Figure 5-1. DCIPS Interim Array of Ratings
[Figure: distribution of NGA ratings under DCIPS Interim. Annotations note that 82.5 percent of employees were rated Excellent in the last pre-DCIPS cycle and that 0.1 percent were rated Unacceptable under DCIPS.]
Figure 5-1 indicates that the share of employees rated Excellent fell from 82.5 percent before NGA began applying DCIPS to 31.6 percent afterward, and that the share rated Successful rose from 13.8 percent to 65.7 percent. Clearly, NGA's DCIPS evaluation system has
moved the spread of ratings toward the Successful part of the ratings curve. NGA believes this is
reflective of actual employee accomplishments.
At the same time, this shift in ratings does not mean that NGA employees fared less well in
terms of compensation. The salary increase percentage remained the same as the previous year
(2.37 percent), the bonus budget rose from 1.55 percent to 1.8 percent, and the percentage of the
workforce receiving a bonus rose from 44 percent to 48.4 percent. The average bonus amount also increased, from $2,933 to $3,212.
One potential negative impact of the ratings distribution is the reinforcement of employee
perceptions that DCIPS will include forced distributions into a bell curve pattern regardless of
actual performance. The figures also raise the question of whether the performance evaluation portion of the pre-DCIPS performance-based compensation system had the real capability to make performance distinctions when, after almost a decade of performance-based compensation, so many NGA employees were rated Excellent in 2008.
NGA officials also noted the positive impact of performance-based compensation as it relates to
organizational culture change. They cite the results of a comparison of data from the 2008
Annual Employee Climate Survey and Federal Human Capital Survey, both conducted before
NGA adopted DCIPS. In response to the statement, “In my work unit, differences in
performance are recognized in a meaningful way,” the results were the following:
These data indicate a significant difference in employee perceptions among employees at NGA,
those at other IC agencies, and the rest of the federal workforce. NGA believes this likely
reflects its long-term experience with a performance-based management system.
However, this positive impact is offset somewhat by NGA employee responses to another
statement in the same survey: “In my work unit, steps are taken to deal with a poor performer
who cannot or will not improve:”
Thus, it appears that NGA may have done a better than average job in identifying differences in
performance, but it has not done as well as other agencies in effectively dealing with poor
performers.
Five key Federal Human Capital survey indices, shown in Table 5-1, indicate that NGA ranks
somewhat better than other federal agencies in a variety of areas, but is no better than average
when compared with other IC agencies.175
175 Leadership and Knowledge Management; Results Oriented Performance Culture; Talent Management; Job Satisfaction; and IC Transformation (IC Agencies Only). Data provided by USD(I) Human Capital Management Office.
Table 5-1. NGA Indices and Comparative Results
Another significant area for examination is whether NGA’s history with performance-based
compensation has had a positive or negative impact on the careers of women and minorities.
This area is identified by the ODNI’s Strategic Human Capital Plan as one of five major
challenges in building a strong HR program.176
176 Insufficient Diversity. Although the U.S. civilian labor force was becoming more diverse, the IC reportedly was not keeping pace. From the ODNI Strategic Human Capital Plan.
“Black/African American” and “Two or More Races” and the “Targeted Disability” (less than
one percent of the rated workforce) “merit expert analysis and attention.”
As a group, the race/ethnicity indicator variables explained only 0.64 percent of the
variance in performance ratings. However, the regression confirmed results obtained
earlier in the report: individual rating differences are related to differences in median
ratings by pay pool, supervisors and higher paid individuals tend to receive higher
ratings, and employees in the Analysis & Production mission category tend to receive
higher ratings than those in “support” categories, where racial/ethnic groups tend to be
over-represented compared to their proportion of the overall population.177 Further,
statistically significant differences in ratings and performance payouts among protected
groups are not equally evident across the entire NGA workforce; they tend to be clustered
in pay bands 3 and 4.
The regression points to the conclusion then that while median pay pool rating is the key
indicator of a given individual’s rating (and hence, performance payout), there is also an
important interaction of several other factors, all relatively weak in and of themselves,
but somewhat more powerful in collectively explaining the overall variance in
performance ratings.
The NGA analysis reviewed prior year data and concluded that these results have “existed at
NGA for at least the last few years before DCIPS implementation. [Thus], it is difficult to infer
from the historic data that this is a new result under DCIPS.” In essence, there are unexplained
variances in ratings assigned to employees in certain protected classes. NGA plans to study
these to determine whether they reflect legitimate performance differences.
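As a rough illustration of the kind of variance-decomposition regression described above, the sketch below fits an ordinary least-squares model of individual ratings on a pay pool's median rating plus indicator variables, and reports how much additional variance the indicator explains. The variable set and the synthetic data generation are assumptions for illustration; this is not the NGA analysis or its data.

```python
# Illustrative sketch of a variance-decomposition regression -- not the NGA
# analysis. All data below are synthetic and the variable set is assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

pool_median = rng.choice([2.8, 3.0, 3.2, 3.4], size=n)   # median rating of employee's pay pool
supervisor = rng.integers(0, 2, size=n)                   # 1 = supervisor
group_indicator = rng.integers(0, 2, size=n)              # hypothetical demographic indicator
noise = rng.normal(0, 0.3, size=n)

# Construct ratings so the pool median dominates and the indicator adds very little.
rating = pool_median + 0.1 * supervisor + 0.02 * group_indicator + noise

def r_squared(X, y):
    """R-squared from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

base = r_squared(np.column_stack([pool_median, supervisor]), rating)
full = r_squared(np.column_stack([pool_median, supervisor, group_indicator]), rating)
print(f"Variance explained without indicator: {100 * base:.2f}%")
print(f"Additional variance explained by indicator: {100 * (full - base):.2f}%")
```

The point of such a decomposition is the one the NGA analysis draws: a single strong predictor (pay pool median) can account for most of the explained variance while several weak factors each contribute only fractions of a percent.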
CONCLUSIONS
OUSD(I) and the intelligence components, including NGA, are trying to introduce fundamental
change to the way employees are evaluated, compensated, and progress through their careers.
The creation and introduction of DCIPS have been approached with great seriousness, hard
work, and creativity. The Academy Panel has been impressed both with the DCIPS system and
the people who work within it.
As noted, determining DCIPS’ impact is not possible at this time given the intelligence
components’ limited experience with the system. Supervisor and employee perceptions and the
impact of DCIPS Interim provide a somewhat negative picture. And, the NGA experience does
not provide clear evidence of potential impacts.
The Panel finds that there is nothing inherent in the design of DCIPS that would lead to negative impacts on career progression or diversity, but that it is too soon to determine the actual impacts of implementation. Nonetheless, OUSD(I) should address employee concerns prior to undertaking any further implementation efforts.
177 This finding is similar to the findings resulting from the mock pay pools conducted by the other (non-NGA) DoD intelligence components.
RECOMMENDATIONS
This report identifies unanswered questions and areas requiring further review prior to moving
ahead with implementation. With that in mind, the Panel makes the following recommendations.
Recommendation 23. OUSD(I) should determine the reasons that ratings tend to increase
at each higher pay band.
Recommendation 25. OUSD(I) should identify ways to compensate for employee attitudes
about the loss of “milestone events” when transferring from a grade-based system to a pay-
banded system.
CHAPTER 6
The DCIPS effort is designed to unify the DoD intelligence components under a single HR
management system, further enhance the high quality of their workforce, and strengthen their
ability to perform a vital mission. At the most fundamental level, however, it is about assisting
in protecting U.S. national security interests. The importance of a robust personnel system to the
ability of these organizations to defend the nation against terrorist attack, was highlighted by Lee
Hamilton, former Congressman, Chairman of the House Permanent Select Committee on
Intelligence and Vice Chairman of the 9/11 Commission, when he met with the Academy Panel
to discuss DCIPS:
Because the intelligence mission is essential to the national security of the United States, the
Panel agrees that DCIPS must be capable of attracting, retaining, and motivating the best people
to contribute their best efforts. Based upon this review, the Panel understands the intended
national security significance of DCIPS and believes the effort should proceed, but with
conditions.
The Panel applauds the effort that the USD(I) has made to bring the DoD intelligence
components closer together through the adoption of DCIPS. However, it is critical that this
effort to alter a fundamental element of the culture of those components be managed very
carefully. The attention of the workforce cannot be diverted from the performance of its mission
to the composition of its compensation.
The Panel concludes that the design of DCIPS is fundamentally sound. Nonetheless, several
major areas for further improvement are identified in this report. The implementation of the
DCIPS design has been flawed for a number of reasons, and a significant number of
recommendations for change and enhancement in that regard are also identified. Finally, it is too
early to judge the actual impact of DCIPS, but the flaws in its implementation and the effects of
the NDAA pause have resulted in a confused and skeptical workforce that may not be adequately
prepared for this system.
Based on its findings and conclusions, the Panel recommends that DoD continue with
implementation of DCIPS by phasing in its performance-based compensation elements at
the remaining DoD intelligence components based on readiness-based assessments. Given
the intended link between DCIPS and mission enhancement, OUSD(I) should pursue this
approach with urgency, taking into account recommendations provided in Chapters 3, 4,
and 5 of this report.
• Establish a Program Office within OUSD(I) that has overall responsibility to:
• Develop mandatory, specific, and robust training regimens for DoD intelligence
component supervisors and managers regarding their responsibilities under the
DCIPS performance management process. Further, adopt Performance Objectives
or Elements that make these supervisors and managers accountable for consistent
and effective execution of those responsibilities, including diversity management
that has meaningful development and advancement of a diverse workforce as its
goal.
All of these activities should be conducted in consultation and coordination with the
Undersecretary of Defense for Personnel and Readiness.
OUSD(I) has advised the Panel that one or more DoD intelligence components, in addition
to the National Geospatial Intelligence Agency (NGA), will be ready to implement
performance-based compensation by 2011 and to execute full base and bonus payouts
under DCIPS no later than January 2012. It also advises that the other components will be
able to follow a similar phased schedule by approximately January 2012. These time
frames should be the goals of the phased approach, but be subject to revision based on
OUSD(I)’s evaluation of the readiness of the components and DCIPS to proceed to the next
phase.
All DoD intelligence components should continue with DCIPS performance management
and bonus payouts as they did this year, subject to refinements and improvements
resulting from OUSD(I) implementation actions. NGA, which already has fully
implemented DCIPS, should be excluded from the readiness-assessment-based schedule,
but be subject to additional training and other process improvements recommended in this
report and resulting from OUSD(I) implementation actions.
USD(I), OUSD(I), the DoD intelligence components, and the ODNI are working to introduce an
important “new order of things.” The design and implementation of DCIPS have been
approached with great seriousness, hard work, and creativity, and the Panel believes that the
system has the potential to meet its intended goals.
Prior to full adoption of the system, however, OUSD(I) must invest the time and energy needed
to complete the implementation of DCIPS as designed and undertake a full-scale change
management effort to reestablish workforce trust and support. The recommendations in this
report are intended to assist in that effort.
CHAPTER 7
PANEL RECOMMENDATIONS
Based on its findings and conclusions, the Panel makes the following recommendations.
• Establish a Program Office within OUSD(I) that has overall responsibility to:
• Develop mandatory, specific, and robust training regimens for DoD intelligence
component supervisors and managers regarding their responsibilities under the
DCIPS performance management process. Further, adopt Performance Objectives
or Elements that make these supervisors and managers accountable for consistent
and effective execution of those responsibilities, including diversity management
that has meaningful development and advancement of a diverse workforce as its
goal.
All of these activities should be conducted in consultation and coordination with the
Undersecretary of Defense for Personnel and Readiness.
OUSD(I) has advised the Panel that one or more DoD intelligence components, in addition
to the National Geospatial Intelligence Agency (NGA), will be ready to implement
performance-based compensation by 2011 and to execute full base and bonus payouts
under DCIPS no later than January 2012. It also advises that the other components will be
able to follow a similar phased schedule by approximately January 2012. These time
frames should be the goals of the phased approach, but be subject to revision based on
OUSD(I)’s evaluation of the readiness of the components and DCIPS to proceed to the next
phase.
All DoD intelligence components should continue with DCIPS performance management
and bonus payouts as they did this year, subject to refinements and improvements
resulting from OUSD(I) implementation actions. NGA, which already has fully
implemented DCIPS, should be excluded from the readiness-assessment-based schedule,
but be subject to additional training and other process improvements recommended in this
report and resulting from OUSD(I) implementation actions.
DCIPS’ Design
Recommendation 2. OUSD(I) should review and assess models for measuring and
rewarding team and organizational performance under DCIPS to ensure alignment with
the IC’s broad goals.
• Review its policies regarding pay pool composition to ensure equitable treatment of
similarly situated employees. This review should examine the policy for
determining the size of pay pools and the practice of assigning employees of different
work categories to the same pay pool.
• Clarify and strengthen its guidance for developing performance objectives to ensure
that managers and supervisors fully understand ways to develop appropriate
objectives for all employees, including those in non-mission work categories.
• Refine and modify the impact of the performance elements to ensure that they
permit meaningful and appropriate assessments of factors affecting overall
performance.
• Adjust the performance standards for summary rating levels so that they permit the
same performance assessments for all categories of work.
Recommendation 6. OUSD(I) should finalize its evaluation policy and ensure that it defines
a process for monitoring DCIPS’ impact on salary increases, bonuses, and career
progression of women, minorities, and other protected groups.
DCIPS’ Implementation
Recommendation 11. OUSD(I) should move swiftly to finalize DCIPS governing policies,
make them available to the workforce, and communicate them widely to improve
transparency and ease of understanding.
Recommendation 13. The USD(I) should be more visibly engaged, set key implementation
objectives for DoD intelligence component leaders, and meet with them regularly to hold
them accountable for meeting those objectives.
Recommendation 14. OUSD(I) should develop a detailed communications plan and style
guide as part of its overall change management efforts. This plan should address strategic
communications about the overall DCIPS system and implementation, as well as an
approach for tactical communications about status, updates, and other fluid aspects of
implementation.
Recommendation 15. As part of the overall change management effort, OUSD(I) should
develop a thorough training plan and specific instructions aimed at first-line supervisors
and managers to equip them with the personnel management skills needed to fully
implement and maintain DCIPS.
Recommendation 17. OUSD(I) should establish a program management office, with the
requisite staffing, resources, and authority to design and implement a comprehensive
change management strategy and provide adequate oversight of DoD intelligence
component implementation.
Recommendation 18. OUSD(I) should make the DCIPS Readiness Tool and website more
user-friendly and interactive in order to meet the information resource needs of their
intended audiences through timely, accurate, and updated information.
Recommendation 19. OUSD(I) should employ best practices for stakeholder involvement
and develop guidance for gathering and considering continual employee feedback.
DCIPS’ Impact
Recommendation 23. OUSD(I) should determine the reasons that ratings tend to increase
at each higher pay band.
Recommendation 25. OUSD(I) should identify ways to compensate for employee attitudes
about the loss of “milestone events” when transferring from a grade-based system to a pay-
banded system.
APPENDIX A
PANEL AND STAFF
PANEL
Dr. Edwin Dorn, Chair∗—Professor and former Dean, Lyndon B. Johnson School of Public
Affairs, University of Texas. Former Under Secretary of Defense for Personnel and Readiness,
and Assistant Secretary of Defense for Personnel and Readiness, U.S. Department of Defense;
Senior Staff Member, Center for Public Policy Education, The Brookings Institution; Deputy
Director of Research, Joint Center for Political Studies; Director of Executive Operations, Office
of Elementary and Secondary Education, U.S. Department of Education; Special Assistant to the
Secretary, U.S. Department of Health, Education and Welfare.
Martin C. Faga*—Former position with the President's Foreign Intelligence Advisory Board;
Former positions with The MITRE Corporation: President and Chief Executive Officer,
Executive Vice President and Director, Department of Defense Federally Funded Research and
Development Center; Senior Vice President and General Manager, Center for Integrated
Intelligence Systems; Member, Technical Staff. Former Assistant Secretary of the Air Force for
Space; Director, National Reconnaissance Office, U.S. Department of Defense; Professional
Staff Member, House Permanent Select Committee on Intelligence, U.S. House of Representatives.
∗ Academy Fellow
Transportation Services and Vice President, External Affairs. Former Vice President/Chief of
Staff, Mid-Atlantic Region, Citicorp Mortgage, Inc.; Special Assistant to the President/Deputy
Assistant to the President, Office of Intergovernmental Affairs, Executive Office of the
President; Deputy Assistant Secretary/Executive Director of Governmental Affairs, U.S.
Department of Transportation.
Leo Hazlewood—Corporate Vice President and Director for Intelligence Programs in the Space,
Intelligence and Information Sector, Science Applications International Corporation (SAIC);
Senior Vice President and General Manager of the Mission Integration Business Unit in SAIC's
Intelligence, Security and Technology Group; Director of Shared Services in SAIC's new Shared
Services Center in Oak Ridge, TN; Former positions with the CIA: Deputy Director of the
National Geospatial-Intelligence Agency; Comptroller; Director, National Photographic
Interpretation Center (NPIC); Executive Director; and Deputy Director for Administration.
PROJECT STAFF
Lena E. Trudeau, Vice President—Ms. Trudeau leads the National Academy’s service delivery
organization, providing executive oversight for all studies in which the organization is engaged.
In addition, Ms. Trudeau is a founder of the Collaboration Project, an independent forum of
leaders committed to leveraging web 2.0 and the benefits of collaborative technology to solve
government's complex problems. Ms. Trudeau’s previous roles include: Program Area Director,
National Academy of Public Administration; Vice President, Consulting Services, The Ambit
Group; Marketing Manager, Americas Public Sector, Nokia Enterprise Solutions; Principal
Consultant, Touchstone Consulting Group; Consultant, Adventis Inc.; Associate, Mitchell
Madison Group.
Committee on U.S. National Security and Military/Commercial Concerns with the People's
Republic of China, United States House of Representatives; Deputy Inspector General for
Investigations, Office of Inspector General, Central Intelligence Agency; Deputy Counsel for
Intelligence Policy, Office of Intelligence Policy, U.S. Department of Justice; Assistant General
Counsel, CIA Office of General Counsel; Associate Attorney, Private Practice; Intelligence
Analyst/Career Trainee, Central Intelligence Agency.
Leslie E. Overmyer-Day, Senior Advisor—Former positions include Director, The Ambit Group;
Senior Research Analyst at AmerInd, Inc.; Senior Research Scientist, American Society for
Training and Development. Principal researcher on numerous organizational and human capital
analyses. Ph.D. and M.A. in Industrial/Organizational Psychology, George Mason University,
Bachelor of Science, Pennsylvania State University.
Maria Rapuano, Senior Advisor—Former Project Director, Alliance for Healthy Homes.
Former positions include: Overseas Development Council and State Services Organization.
Board Member, Trust for Lead Poisoning Prevention. B.A. in Government, College of William
and Mary and M.A. in International Affairs, The American University.
William McCarty, Senior Advisor—Thirty-three years of experience in the Federal Government,
including 20 years supporting the U.S. Intelligence Community in Senior Executive and
management positions: Deputy Director of Acquisition and Contracts, National
Geospatial-Intelligence Agency (NGA); and, at the Central Intelligence Agency, Chief,
Acquisition Policy and Legislative Issues Team; Chief of Staff, Office of Finance and Logistics;
and Chief of Staff, Facilities Management Group.
Tara Newman, Research Associate—Project staff member for Academy studies: Defense
Civilian Intelligence Personnel System (DCIPS) Review; Immediate Past Research Associate for
Academy FBI Transformation Budget Process Review Project; Intern at New York University's
Medical Center Office of Development; Master of Public Administration, American University;
B.A. in English, B.S. in Marketing, University of Tampa.
Martha S. Ditmeyer, Senior Program Associate—Staff member providing technical support for
a wide range of Academy studies. Former staff positions at the Massachusetts Institute of
Technology, Cambridge, MA, and the Communication Satellite Corporation, Washington, D.C.,
and Geneva, Switzerland.
APPENDIX B
PARTICIPATING INDIVIDUALS AND ORGANIZATIONS178
Interviewees
Lt. General James Clapper—Under Secretary of Defense for Intelligence, Office of the Under
Secretary of Defense (Intelligence)
Tom Ferguson—Deputy Under Secretary of Defense for Intelligence, Office of the Under
Secretary of Defense (Intelligence)
Elizabeth Hoag—Deputy Director for Human Resources, Office of the Under Secretary of
Defense (Intelligence), Human Capital Management Office
178 Titles were current at time of participation.
Paula Roberts—Chief Human Capital Officer, Office of the Director of National Intelligence
James M. Seacord—Deputy Director for Readiness, Office of the Under Secretary of Defense
(Intelligence), Human Capital Management Office
Brigadier General Vincent Stewart—Director of Intelligence, United States Marine Corps
Intelligence
Hon. Lee Hamilton*—President and Director, Woodrow Wilson International Center for
Scholars
∗ Academy Fellow
Colloquia Participants
Congressional Committees
Army Intelligence
James Gunlicks, Vice Director of Army
Stephanie Samergedes, Deputy G2, AMC
Yolanda Watson, Chief, Intelligence Personnel Management Office
Navy Intelligence
Ken Carlgren, Management Analyst
Bob Gerrity, Supervisory Intelligence Specialist
Phyllis Wright, Program Manager
Kathy Griffin, DCNO for Information Dominance, Total Force Management Division
APPENDIX C
BIBLIOGRAPHY
9/11 Commission Report. “Final report of the National Commission on Terrorist Attacks Upon
the United States.” New York: WW Norton. (2004).
Air Force Research Manual, Laboratory Personnel Demonstration Project, 1 July 2008.
Booz Allen Hamilton, Pay for Performance (PFP) Implementation Best Practices and Lessons
Learned Research Study, 18 June 2008.
Crum, John. “Statement to the Task Group.” Office of Policy Evaluation, U.S. Merit Systems
Protection Board. Testimony to the Defense Business Board Review of NSPS. 25 June 2009.
“DCIPS 101.” DoD Worldwide HR Conference. DCIPS PowerPoint briefing. July 2009.
Defense Intelligence Enterprise Human Capital Strategic Plan 2010-2015, Office of the Under
Secretary of Defense for Intelligence, Human Capital Management Office, p. 3.
DoD Civilian Personnel Management System: Volume 2001, Defense Civilian Intelligence
Personnel System (DCIPS) Introduction, 29 December 2008.
DoD Civilian Personnel Management System: Volume 2008, Defense Civilian Intelligence
Personnel System (DCIPS) Awards and Recognition. 15 January 2010.
DoD Civilian Personnel Management System: Volume 2012 Defense Civilian Intelligence
Personnel System (DCIPS) Performance-Based Compensation, Number 1400.25, 15 January
2010.
“DoD, OPM Announce Defense Business Board NSPS Review.” Department of Defense: News
Releases. 2009. Department of Defense. Web. 15 May 2009.
<http://www.defense.gov/Releases/Release.aspx?ReleaseID=12679>
Federally Employed Women. “Pay for Performance: Position Paper from Federally Employed
Women.” FEW.org. (2010): Web. 7 January 2010.
Fernandez, Sergio and Rainey, Hal G. “Managing Successful Organizational Change in the
Public Sector: An Agenda for Research and Practice.” Public Administration Review,
March/April 2006 (vol. 66, no. 2).
Furst, S.A. and Cable, D.M. “Employee resistance to organizational change: Managerial
influence tactics and leader-member exchange.” Journal of Applied Psychology, (2008). vol
93(2), 453-462. Print.
Griffin, M.A., Parker, S.K. & Mason, C.M. “Leader vision and the development of adaptive and
proactive performance: A longitudinal study.” Journal of Applied Psychology, (2010). vol
95(1), 174-182.
Herold, D.M., Fedor, D.B., Caldwell, S. & Liu, Y. “The effects of transformational and change
leadership on employees’ commitment to a change: A multilevel study.” Journal of Applied
Psychology, (2008). vol 93(2) 346-357.
Howell, W.C. and Dipboye, R.L. Essentials of Industrial Organizational Psychology, third
edition. Chicago: The Dorsey Press, pp. 228-229. (1986).
Hyde, A.C. “Pay for Performance.” The Public Manager. 2008: Web. February 2010.
Intelligence Community (IC) Pay Modernization Project Office, “Stakeholder Analysis,” p. 13.
[undated PowerPoint]
Intelligence Reform and Terrorism Prevention Act of 2004. Public Law 108–458. (2004). Print.
Kerr, J., & Slocum, J.W. “Managing corporate culture through reward systems.” Academy of
Management Executive, vol 1(2), 99-107. (1987).
King, Andrew B. “What Makes a Great Web Site?” Web Development and Design Tutorials,
Tips and Reviews - WebReference.com. Internet.com, Aug. 1999. Web. 21 Apr. 2010.
<http://www.webreference.com/greatsite.html>.
Kotter, J.P. & Cohen, D.S. “The Heart of Change: Real-Life stories of How People Change Their
Organizations.” Boston: Harvard University Press. 2002.
Langbein, L. “The impact of love and money on quitting in the federal government: implications
for pay for performance.” Conference paper presented at the Midwestern Political Science
Association. (2009).
Losey, Stephen. “Defense, OPM to review NSPS performance pay system.” Federal Times.com.
(2009): Web. March 2010.
Murphy, Mark. “Are Smart Goals Dumb.” Leadership IQ. (2010): Web. March 2010.
National Defense Authorization Act for Fiscal Year 1997. Public Law 104-201 (1996.) Print.
National Defense Authorization Act for Fiscal Year 2010, Pub. L. No. 111-84 (2009). Print.
“Report of the Joint Inquiry into the Terrorist Attacks of September 11, 2001.” P.33. (2002).
Risher, Howard. “Pay for Performance: A Guide for Federal Managers.” IBM Center for the
Business of Government. November 2004.
Risher, Howard and Smallwood, Andrew, “Performance-Based Pay at NGA,” The Public
Manager, Summer 2009.
Rosenberg, Alyssa. “Union leaders make NSPS repeal, personnel reform major priorities.”
Government Executive. (2009): Web. March 2010.
Rutzick, Karen. “Performance Pay: A History” Government Executive. (2005): Web. February
2010. <http://www.govexec.com/story_page.cfm?articleid=31908>
Siegal, W., Church, A.H., Javitch, M., Waclawski, J., Burd, S., Bazigos, M., Yang, T.F.,
Anderson-Rudolph, K., and Burke, W.W. “Understanding the Management of Change: An
Overview of Managers’ Perspectives and Assumptions in the 1990s.” Journal of
Organizational Change Management, (1996). vol 9(6), 54-80.
Springer, Linda. Testimony of former OPM Director before the House Subcommittee on Federal
Workforce, Postal Service, and the District of Columbia. 31 July 2007.
United States. The Commission on the Intelligence Capabilities of the United States Regarding
Weapons of Mass Destruction. Report to the President. Washington: The Commission, 2005.
Print.
U.S. Merit Systems Protection Board, Designing an Effective Pay for Performance
Compensation System, (January 2006).
APPENDIX D
MERIT SYSTEMS PRINCIPLES
AND PROHIBITED PERSONNEL PRACTICES
Section 2301 of title 5 of the U.S. Code applies to executive agencies and requires federal
personnel management to be implemented consistent with the following merit systems
principles.
(2) All employees and applicants for employment should receive fair and equitable
treatment in all aspects of personnel management without regard to political
affiliation, race, color, religion, national origin, sex, marital status, age, or
handicapping condition, and with proper regard for their privacy and constitutional
rights.
(3) Equal pay should be provided for work of equal value, with appropriate consideration
of both national and local rates paid by employers in the private sector, and
appropriate incentives and recognition should be provided for excellence in
performance.
(4) All employees should maintain high standards of integrity, conduct, and concern for
the public interest.
(7) Employees should be provided effective education and training in cases in which such
education and training would result in better organizational and individual
performance.
(8) Employees should be—
(A) protected against arbitrary action, personal favoritism, or coercion for partisan
political purposes, and
(B) prohibited from using their official authority or influence for the purpose of
interfering with or affecting the result of an election or a nomination for election.
(9) Employees should be protected against reprisal for the lawful disclosure of
information which the employees reasonably believe evidences--
The Prohibited Personnel Practices are derived from the Merit System Principles. 5 USC
2302(b) says that any employee who has the authority to take, direct others to take, recommend,
or approve any personnel action shall not, with respect to that authority, commit any of the 12
Prohibited Personnel Practices.
(1) Discriminate for or against any employee or applicant on the basis of race, color,
religion, sex, national origin, age, handicapping condition, marital status, or
political affiliation.
(2) Solicit or consider any personnel recommendation that is not based on personal
knowledge or records of job-related factors such as performance, ability, aptitude,
general qualifications, character, loyalty, or suitability.
(3) Coerce the political activity of any person or take reprisal action for the refusal of
any person to engage in political activity.
(5) Influence any person to withdraw from competition for any position for the
purpose of improving or injuring the prospects of another applicant.
(6) Grant any preference or advantage not authorized by law, rule, or regulation to
any employee or applicant for the purpose of improving or injuring the prospects of
another applicant.
(12) Violate any law, rule, or regulation which implements or directly concerns the
merit system principles.
APPENDIX E
DCIPS PERFORMANCE APPRAISAL FORM
[Form extract. The DCIPS Performance Appraisal Form reproduced in this appendix captures, among other fields: the pay pool ID; the rating official’s and reviewing official’s printed names, signatures, and dates (YYYYMMDD), along with the communication method used (face-to-face, telephone, other); rows for up to ten performance objectives and the average performance objective rating; and Section 3, Performance Evaluation of Record, recording the average performance element rating, the average performance objective rating, the overall rating, and the evaluation of record.]
APPENDIX F
OPEN FORUM LOCATIONS, DATES, AND ATTENDANCE
Date    Agency    Location    Estimated Attendance
APPENDIX G
MITRE’S PERFORMANCE-BASED COMPENSATION SYSTEM
Since 1996, MITRE has used a merit-based compensation system that ties individual
performance to pay. The system is viewed as a success for the following reasons:
• Since the system was implemented, MITRE has scored in the top 25 percent of
Fortune 500 companies on a survey that asks employees if they are paid fairly; and
• MITRE’s attrition rate is very low.180
“Near Broad Bands.” MITRE groups its employees into six basic job levels, similar to pay
bands, with each job level having a 100 percent spread from top to bottom. Pay bands have four
quartiles, or “sections.” Movement within the bands is based on performance, but MITRE’s
system does not rely on the principle of moving most employees toward the middle of the band.
Rather, pay corresponds to the employee’s value to the organization and how rapidly employees
are building capabilities to better serve the organization.
Rating Process. MITRE rates its employees on a three-level scale. Top performers are rated at
Level 1; successful performers are rated at Level 2; and employees who need improvement are
rated at Level 3. Within this three-level rating scale, most employees are rated at Level 2, with
the possibility of “refining” the rating, e.g., 2+ or 2-. As under DCIPS, individual employee
objectives are set at the beginning of the year so that employees know what is expected of them.
When ratings are completed at the end of the year, a group of managers across the organization
come together to normalize the ratings and identify the top 10 percent of performers, as well as
the next top 10 percent, so that the top 20 percent are agreed upon by all senior managers. It is
agreed by managers, and understood by all employees, that this top 20 percent of performers will
receive noticeably higher pay increases.
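The normalization step can be illustrated with a short sketch. The code below is an assumption about the mechanics, not MITRE’s actual procedure or tooling: it simply ranks manager-proposed scores across the organization and labels the top 10 percent and the next 10 percent, the group agreed to receive the noticeably higher increases.

# Hypothetical sketch of cross-organization rating normalization.
# Scores, names, and tier labels are illustrative assumptions, not MITRE's process.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    raw_score: float      # manager-proposed score; higher is better

def flag_top_performers(employees: list[Employee]) -> dict[str, str]:
    """Rank everyone, then label the top 10% and the next 10% (the "top 20%")."""
    ranked = sorted(employees, key=lambda e: e.raw_score, reverse=True)
    n = len(ranked)
    tier_size = max(1, round(0.10 * n))
    labels: dict[str, str] = {}
    for i, emp in enumerate(ranked):
        if i < tier_size:
            labels[emp.name] = "top 10%"
        elif i < 2 * tier_size:
            labels[emp.name] = "next 10%"
        else:
            labels[emp.name] = "level 2"      # successful performer
    return labels

staff = [Employee("A", 4.6), Employee("B", 3.9), Employee("C", 4.2),
         Employee("D", 3.1), Employee("E", 4.8), Employee("F", 3.5),
         Employee("G", 4.0), Employee("H", 2.9), Employee("I", 3.7),
         Employee("J", 4.4)]
print(flag_top_performers(staff))             # one employee lands in each top tier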
Salary Review/Merit Increase. At MITRE, as is true under DCIPS, the performance rating
process is completed before payouts are considered. After the ratings are finalized, managers use
the “Merit Matrix” to guide the decision-making process to determine merit increases. The
Matrix lists a range of percentages for each section (quartile) of the band to be used as a
guideline. The Matrix promotes the concept that larger pay increases should be given to the best
performers, but payouts are funded at higher levels for employees who are “low to market.” This
means that the budget is larger to fund pay increases for those employees who are new to the
organization and who are more likely to be recruited by other organizations. Transparency is a
key feature of the system, and the employee’s “personal budget” and the Merit Matrix are
available to all employees as soon as managers have communicated increases to their employees.
Managers find the range of guideline percentages by locating the intersection of the employee’s
fiscal year-end Guide Position (the salary’s position in the upcoming year’s pay guide) and the
rating. Although there is no specific incentive for managers or supervisors, it is understood that
managers will command higher salaries over time. Table G-1 provides a sample Merit Matrix.
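Because Table G-1 itself is not reproduced here, the sketch below illustrates only the lookup mechanics described above, with invented placeholder percentages. The matrix values, the use of band quartiles as a stand-in for the fiscal year-end Guide Position, and the function names are all assumptions.

# Hypothetical Merit Matrix lookup; every number here is an invented placeholder,
# not a value from MITRE's Table G-1.

# Guideline increase ranges (%) indexed by (rating, band section/quartile).
# Stronger ratings and lower ("low to market") sections get larger guideline ranges.
MERIT_MATRIX = {
    ("1",  1): (6.0, 8.0), ("1",  2): (5.0, 7.0), ("1",  3): (4.0, 6.0), ("1",  4): (3.0, 5.0),
    ("2+", 1): (5.0, 6.5), ("2+", 2): (4.0, 5.5), ("2+", 3): (3.0, 4.5), ("2+", 4): (2.5, 3.5),
    ("2",  1): (3.5, 5.0), ("2",  2): (3.0, 4.0), ("2",  3): (2.5, 3.5), ("2",  4): (2.0, 3.0),
    ("2-", 1): (2.0, 3.0), ("2-", 2): (1.5, 2.5), ("2-", 3): (1.0, 2.0), ("2-", 4): (0.5, 1.5),
    ("3",  1): (0.0, 1.0), ("3",  2): (0.0, 1.0), ("3",  3): (0.0, 0.5), ("3",  4): (0.0, 0.5),
}

def band_section(salary: float, band_min: float, band_max: float) -> int:
    """Return the quartile (1-4) of a salary within a band whose top is
    100 percent above its bottom, as described for MITRE's job levels."""
    span = band_max - band_min
    position = min(max(salary - band_min, 0.0), span) / span
    return min(int(position * 4) + 1, 4)

def guideline_range(rating: str, salary: float,
                    band_min: float, band_max: float) -> tuple[float, float]:
    """Intersection of the rating and the band section in the (hypothetical) matrix."""
    return MERIT_MATRIX[(rating, band_section(salary, band_min, band_max))]

# Example: a Level 2+ performer in the second quartile of a band running
# from $60,000 to $120,000 (a 100 percent spread from bottom to top).
low, high = guideline_range("2+", salary=82_000, band_min=60_000, band_max=120_000)
print(f"Guideline merit increase: {low}%-{high}%")

Under a scheme of this kind, as in the description above, a stronger rating or a “low to market” position lower in the band maps to a larger guideline range for the merit increase.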
Full implementation of the merit pay aspects of its HR system reportedly required about five
years.181 MITRE’s experience supports the conclusion that a performance-based compensation
system like DCIPS cannot be successfully implemented in one or two years. OUSD(I) will
likely need at least five years to adjust the design so that employees and managers understand
and accept how DCIPS provides equity for all affected employees. MITRE’s experience also
further confirms that transparency in the payout process is a critical element of success.
179 MITRE Information Infrastructure.
180 As reported in an interview with MITRE’s compensation expert.
181 Ibid.