EAAF - Enterprise Architecture Assessment Framework v3.1
June 2009
Table of Contents
1 INTRODUCTION
2 PERFORMANCE IMPROVEMENT LIFECYCLE
2.1 ARCHITECT
2.2 INVEST
2.3 IMPLEMENT
2.4 MEASURE, ASSESS AND IMPROVE
2.5 AGENCY SUBMISSION DATA QUALITY
3 FEDERAL ENTERPRISE ARCHITECTURE OVERVIEW
3.1 FEA REFERENCE MODELS
3.2 SEGMENT ARCHITECTURE
3.3 FEDERAL TRANSITION FRAMEWORK
4 FRAMEWORK STRUCTURE
4.1 CHANGES IN THE 3.1 FRAMEWORK
4.2 ASSESSMENT CRITERIA OVERVIEW
5 AGENCY EA ASSESSMENT SUBMISSION AND SCORING PROCESS
5.1 EAAF VERSION 3.1 IMPLEMENTATION TIMING
5.2 AGENCY EA ASSESSMENT & REVIEW
6 ASSESSMENT FRAMEWORK 3.1 CRITERIA
6.1 COMPLETION CAPABILITY AREA
6.1.1 Target Enterprise Architecture and Enterprise Transition Plan
6.1.2 Architectural Prioritization
6.1.3 Scope of Completion
6.1.4 Internet Protocol Version 6 (IPv6)
6.2 USE CAPABILITY AREA
6.2.1 Performance Improvement Integration
6.2.2 CPIC Integration
6.2.3 FEA Reference Model and Exhibit 53 Data Quality
6.2.4 Collaboration and Reuse
6.2.5 EA Governance, Program Management, Change Management, and Deployment
6.3 RESULTS CAPABILITY AREA
6.3.1 Mission Performance
6.3.2 Cost Savings and Cost Avoidance
6.3.3 IT Infrastructure Portfolio Quality
6.3.4 Measuring EA Program Value
APPENDIX A: ARTIFACT DESCRIPTIONS
APPENDIX B: STRATEGY FOR MEASURING DATA QUALITY
APPENDIX C: AGENCIES INCLUDED IN THE EA ASSESSMENT PROCESS
1 Introduction
The Federal Government is focused on delivering solutions and achieving results grounded in the principles of transparency and open government. In the course of managing the President's budget, with over $70 billion in annual spending [1], there is an inherent responsibility to manage information technology (IT) investments wisely. This investment, and in particular the $23.7 billion in Development, Modernization, and Enhancement (DME, BY2010) funding, represents a key resource for improving agency performance, closing performance gaps, and achieving government-wide transformation.
EAAF Version 3.1 features the use of key performance indicators (KPIs) to measure the effectiveness of EA relative to the three EA capability areas of Completion, Use, and Results. It also moves agency EA submission to a template-based model aimed at improving reporting and assessment via an automated process and delivery mechanism. Artifacts will be posted on the MAX collaboration environment.
EAAF Version 3.1 also changes the assessment and reporting process. Instead of a single annual assessment, Version 3.1 moves to posting relevant artifacts for the Completion, Use, and Results capability areas in order to better align the use of EA with agency planning, investment management, and budget formulation and decision-making processes relevant to the annual budget cycle.
[1] $75,829M total, $23,686M in DME. This represents the IT crosscut across the President's BY10 Budget. Please see http://www.whitehouse.gov/omb/egov/vue-it/index.html for more information.
The EAAF supports policy implementation and assessment for the EA and related requirements set forth in OMB Circulars A-11 and A-130. EAAF Version 3.1 is closely aligned with methodologies, reporting templates, and tools such as the Federal Transition Framework (FTF), the Federal Segment Architecture Methodology (FSAM), and OMB's IT Spending Dashboard. [2]
Under EAAF v3.1, six key success factors for agencies will be their ability to:
• Align with agency performance improvement to quantitatively plan for and support measurable delivery of agency performance improvement.
• Collaborate with other agencies to deliver common architectures for shared cross-boundary mission, business, and technical requirements.
• Contribute to the definition and implementation of the target Federal Enterprise
Architecture.
• Leverage bureau and program architecture activity to build out the agency EA
and ensure that agency-proposed IT spending is well-architected, implementing
the target agency and Federal Enterprise Architecture, and demonstrably driving
agency performance improvement.
• Integrate with agency IT Governance to ensure effective use of the agency EA to
support delivery of agency performance improvement.
• Through the above, establish buy-in with mission and business owners, and
complete the evolution to results-focused architecture.
OMB is committed to working with agencies through the annual assessment and
quarterly reporting process to successfully implement the EAAF v3.1. For more
information on the quarterly reporting process, see Section 5 below.
[2] Additional information on these tools and methodologies can be found at www.egov.gov. Note: the IT Spending Dashboard was previously referred to as VUE-IT.
2 Performance Improvement Lifecycle
The focus of this document, and the discussion in this chapter, is information and IT-enabled performance improvement.
Agency EA programs are one of several practice areas that must be effectively executed to achieve improvements in agency mission performance and other measurement areas. [3] EA helps to organize and clarify the relationships among agency strategic goals, investments, business solutions, and measurable performance improvements - it is but one link in a chain of integrated practice areas. To achieve target performance improvements, other practice areas - such as strategic planning, capital planning and investment control (CPIC), and program and project management - must be strong and fully integrated with an agency EA practice.
The Performance Improvement Lifecycle defines a simple value chain linking enterprise
architecture with IT investment management and program and project execution.
Figure 2-1 below illustrates the logical integration and sequencing of key architecture,
investment and implementation activities, as well as feedback from program
assessment and performance measurement.
[3] Other stakeholders, many of whom are the actual drivers and owners for program success, include the Chief Financial Officer and Budget Officers, Chief Performance Officers, Chief Acquisition Officers, Congress, agency leadership, business owners, program managers, and the public.
2.1 ARCHITECT
Enterprise architecture describes the current (baseline) and future (target) states of the
agency, and the plan to transition from the current to the future state, with a focus on
agency strategy, program performance improvements and information technology
investments. Agency EAs are organized by segments – core mission areas (e.g., homeland security, health), business services (e.g., financial management, human resources), and enterprise services (e.g., information sharing). Segments are defined
using the Federal Enterprise Architecture (FEA) reference models, described in
subsequent chapters.
The purpose of the target enterprise architecture is to develop a set of blueprints, using the FEA reference models, that when implemented can effectively achieve the strategic goals of an agency or agencies. The enterprise transition plan (ETP) identifies a desired set of business and IT capabilities, and highlights the performance milestones that need to be met along the path to achieving the target enterprise architecture. It also defines logical dependencies between major activities (i.e., program/project, investment) and helps to define the relative priority and sequencing of those activities. This can be represented at the enterprise level or across segments within the EA.
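To illustrate how such logical dependencies drive sequencing, the following sketch orders a handful of transition activities with a topological sort; the activity names and dependency graph are invented for illustration and are not drawn from any agency ETP.

```python
# Minimal sketch: sequencing ETP activities from their logical dependencies.
# Activity names and the dependency graph are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+

dependencies = {
    "Deploy case management system": {"Consolidate data centers"},
    "Retire legacy tracking system": {"Deploy case management system"},
    "Consolidate data centers": set(),
}

# static_order() yields activities so that every dependency comes first.
print(list(TopologicalSorter(dependencies).static_order()))
# ['Consolidate data centers', 'Deploy case management system',
#  'Retire legacy tracking system']
```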
To achieve the target performance improvements, the agency EA should fully integrate
with the capital planning and investment control (CPIC) step, as well as the agency
system (solution) development life cycle (SDLC). OMB Circular A-130 states:
“Agencies must establish and maintain a capital planning and investment control
process that links mission needs, information, and information technology in an effective
and efficient manner. The process will guide both strategic and operational IRM, IT
planning, and the enterprise architecture by integrating the agency's IRM plans,
strategic and performance plans, financial management plans and the agency's budget
formulation and execution processes…”
The FEA Practice Guidance [4] provides more information on techniques and best practices for EA practice integration.
2.2 INVEST
Performance improvement opportunities identified during the "Architect" process are ideally addressed through an agency portfolio of IT investments. [5] This step defines the
implementation and funding strategy for individual initiatives identified in the Enterprise
Transition Plan (ETP) and described in the segment architectures. Program
management plans are created to implement the individual solutions identified in the
implementation and funding strategy.
[4] http://www.whitehouse.gov/omb/assets/fea_docs/FEA_Practice_Guidance_Nov_2007.pdf
[5] It is recognized that, more often than not, funding is provided by Congress in advance of program design or the development of an architecture.
During this step of the Performance Improvement Lifecycle, agencies should carefully evaluate and adjust their prioritization to ensure investments are aligned, via high-priority segments, to agency strategic goals and objectives. Further, the prioritization should be refined to reflect additional opportunities for cost savings and cost avoidance, as well as other approaches to improve agency performance. Agencies should also incorporate high-priority national objectives identified as part of the FTF within their EA and investment portfolios.
The FEA Practice Guidance provides more information on techniques and best
practices to align agency enterprise architecture and investments.
2.3 IMPLEMENT
Projects are executed and tracked throughout the system development life cycle (SDLC). Achievement of the program/project plan within acceptable variance for schedule and budget is measured and reported through the Earned Value Management (EVM) process. Performance is measured to determine how well the implemented solutions achieve the desired (process) outputs and mission outcomes, and the results provide feedback into the enterprise and segment architecture development processes as well as the cyclical strategic planning process.
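As a rough illustration of the EVM variance check mentioned above, the sketch below computes the standard earned value indices from planned value, earned value, and actual cost; the 10% variance threshold is an assumed example, not a figure taken from this framework.

```python
# Illustrative sketch of standard EVM indices; the threshold is an assumption.
def evm_indices(pv: float, ev: float, ac: float) -> dict:
    """pv = planned value, ev = earned value, ac = actual cost (dollars)."""
    return {
        "cost_variance": ev - ac,      # > 0 means under budget
        "schedule_variance": ev - pv,  # > 0 means ahead of schedule
        "cpi": ev / ac,                # Cost Performance Index
        "spi": ev / pv,                # Schedule Performance Index
    }

def within_variance(pv: float, ev: float, ac: float,
                    threshold: float = 0.10) -> bool:
    """True when cost and schedule indices stay within +/- threshold of 1.0."""
    m = evm_indices(pv, ev, ac)
    return abs(m["cpi"] - 1.0) <= threshold and abs(m["spi"] - 1.0) <= threshold

# $4.0M planned, $3.6M earned, $4.2M spent: CPI ~ 0.86, outside a 10% band.
print(within_variance(4.0e6, 3.6e6, 4.2e6))  # False
```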
The FEA Practice Guidance [6] provides more information on techniques and best practices for aligning the agency ETP with performance measures and outcomes.
[6] http://www.whitehouse.gov/omb/assets/fea_docs/FEA_Practice_Guidance_Nov_2007.pdf
2.5 AGENCY SUBMISSION DATA QUALITY
Agency submission data helps OMB decision-makers select IT initiatives and investments that promise to deliver the highest value and performance impact for the Federal Government within a constrained budgetary environment. In order to make informed decisions, OMB is dependent upon agencies to provide high-quality data submissions. EAAF Version 3.1 outlines expectations for high-quality submissions through transparency on KPIs and associated algorithms and heuristics.
Appendix B describes OMB’s strategy for using the KPIs defined within the EAAF
Version 3.1 to enforce high standards of data quality for agency EA and IT investment
portfolio submissions.
3 Federal Enterprise Architecture Overview
OMB Circular A-11 [8], sections 53 and 300, require Federal agencies to align their IT investments to the FEA reference models and segment architecture. EAAF Version 3.1 is designed to assess agency responses to this policy and gauge the extent to which agencies are using their EA and ETP to implement cross-agency initiatives and achieve measurable performance improvements.
3.1 FEA REFERENCE MODELS
[Figure: characteristics of the FEA - a business-driven approach; component-based architecture; inputs, outputs, and outcomes]
[7] http://www.whitehouse.gov/omb/e-gov/fea/
[8] http://www.whitehouse.gov/omb/circulars/a11/current_year/a11_toc.html
The Data Reference Model (DRM) enables information sharing and reuse across the
Federal Government through the standard description and discovery of common data
and the promotion of uniform data management practices. This model provides
guidance on the implementation of consistent processes to enable data sharing through
Federal Government-wide agreements.
3.2 SEGMENT ARCHITECTURE
Agencies should use their strategic goals and objectives, EA, and ETP as the basis for
identifying and prioritizing enterprise segments. The process to identify and prioritize
enterprise segments should reflect the following key characteristics:
• Use performance gaps, identified by the agency’s strategic plan, IG or GAO reports,
and/or performance improvement assessments, as the driver for segment
identification and prioritization;
• Identify new requirements and opportunities within the agency strategic plan and
use these new requirements to expand existing segments or develop new
segments;
• Integrate cross-agency initiatives using the FTF described below; and
• Measure the value of and results from enterprise architecture to stakeholders.
Cross-agency teams, chartered by the Federal CIO Council, are working with OMB to develop step-by-step guidance documents serving as a road map for architects developing segment architectures.
3.3 FEDERAL TRANSITION FRAMEWORK
Agencies should use their Enterprise Transition Plan (ETP) and segment architectures to align and integrate appropriate cross-agency initiatives from the FTF with their enterprise architecture. [10] Relevant cross-agency initiatives are reflected in agency IT
investment portfolios (Exhibit 53) and business cases (Exhibit 300s). Segment
architectures provide the integration point between cross-agency initiatives,
performance improvement goals, and agency improvement commitments, as illustrated
below in Figure 3-2. The FEA Practice Guidance and Federal Segment Architecture
Methodology (FSAM) provide additional information on segment architecture and the
ETP.
[9] http://www.whitehouse.gov/omb/e-gov/fea/
[10] The burden of proof lies with the agency whenever that agency includes architectural segments that are identical or similar to initiatives included in the FTF. For example, since there is a government-wide financial management initiative identified in the FTF, any agency that proposes a new financial management effort that does not reuse, share, or comport with the FTF initiative must provide written justification for why that agency's requirements are so divergent from the FTF financial management initiative as to warrant separate development and funding. For the most part, if a service component exists within the FTF, agencies are required to consider reusing or sharing services, and to replicate the architectural segment from the FTF, including the Lines of Business.
[Figure 3-2: Segment architecture as the integration point - programs and projects within a line of business roll up to baseline and target segment architectures, map to Exhibit 300 business cases and Exhibit 53 line items, and tie to a performance improvement summary with interim targets]
As part of the architectural planning process, architects, in conjunction with segment and investment owners, should evaluate opportunities to incorporate FTF initiatives to deliver measurable performance benefits. Benefits should be quantified in terms of component reuse, improved collaboration, information sharing, cost savings, cost avoidance, and mission performance improvements.
In the event an FTF initiative cannot be integrated with the agency enterprise architecture, architects should provide feedback to the FTF initiative's Managing Partner and OMB on the aspects of the initiative not satisfying agency requirements. This feedback will allow initiative owners and managing partners to reengineer the scope of FTF initiatives to bridge these gaps, thereby expanding the potential audience and cross-applicability of the initiative.
4 Framework Structure
EAAF Version 3.1 moves to a template-based submission process – one template for each agency-defined segment architecture, identifying enterprise segments and aspects of the target enterprise architecture.
Agency artifacts will be posted on the MAX collaboration sites. The MAX environment
is intended to promote information sharing and transparency among agencies,
particularly those with shared mission areas, business services, or enterprise services.
MAX allows agencies to work together to identify, diffuse, and adopt best practices, and
improve the quality and use of agency EAs throughout the year.
[11] The complete URL is http://www.cio.gov/index.cfm?function=showdocs&structure=Information%20Technology&category=Enterprise%20Architecture
Agencies may decide to develop additional artifacts or elaborate upon them further than described here. Appendix A provides a more detailed description of the artifacts.
Additionally, for each assessment criterion, a rationale and a mandate are provided.
The rationale explains why OMB considers it important to collect information about each
criterion and the mandate links the assessment criterion to law and/or policy, as
applicable.
The FEA Practice Guidance provides more information on techniques and best practices for EA practice integration. All documents listed as mandates are available for download from the following pages on the OMB E-Government website:
• Legislation: http://www.whitehouse.gov/omb/e-gov/
• OMB Memoranda: http://www.whitehouse.gov/omb/memoranda_default/
• Federal Enterprise Architecture: http://www.whitehouse.gov/omb/e-gov/fea/
• Federal Transition Framework: http://www.whitehouse.gov/omb/e-gov/fea/
• IT Spending Dashboard: http://www.whitehouse.gov/omb/e-gov/
• Federal Segment Architecture Methodology (FSAM): http://www.fsam.gov
• EA Segment Report: http://www.whitehouse.gov/omb/e-gov/fea/
5 Agency EA Assessment Submission and Scoring Process
The submission and scoring process is discussed below. The list of agencies to be assessed using this Framework is included in Appendix C.
5.1 EAAF VERSION 3.1 IMPLEMENTATION TIMING
EAAF Version 3.1 will be phased in over the next two EA budget preparation cycles, with full implementation and accountability required for the budget year (BY) 2012 cycle (submissions starting in Q3 FY10). KPI levels are provided in the EAAF v3.1 criteria portion (Chapter 6), and interim KPI thresholds for the BY 2011 cycle are identified via footnotes in Chapter 6.
5.2 AGENCY EA ASSESSMENT & REVIEW
OMB will provide feedback following the review and assessment of each agency EA submission. The following figure depicts the timeline for EA reporting activities over a fiscal year:
[Figure: EA reporting timeline (Q2 through Q1, January through December) - EA Segment Reports are submitted quarterly; agency EA self-assessments and EA submission updates are made three times per year, covering the Completion, Use, and Results capability areas in turn]
Agencies will receive an average assessment score in each capability area, calculated by summing the scores for all criteria within the capability area and dividing by the number of criteria. Scores will be rounded to the nearest tenth. The results of the overall EA assessment will be provided to the agency.
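A minimal sketch of this calculation follows; the criterion names and scores are hypothetical.

```python
# Sketch of the capability-area scoring described above: average all criterion
# scores in the area, rounded to the nearest tenth. Values are hypothetical.
def capability_score(criterion_scores: list) -> float:
    return round(sum(criterion_scores) / len(criterion_scores), 1)

completion = {
    "Target EA and Enterprise Transition Plan": 3,
    "Architectural Prioritization": 4,
    "Scope of Completion": 3,
    "Internet Protocol Version 6 (IPv6)": 5,
}
print(capability_score(list(completion.values())))  # 3.8
```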
6 Assessment Framework 3.1 Criteria
6.1 COMPLETION CAPABILITY AREA
• Outcomes:
o Identifies specific reporting the agency needs to provide to OMB to support
data-driven analysis and decision-making around EA and IT portfolio
management.
o Describes the future capabilities (via enterprise transition plan and target
segment architectures) to enable the agency to achieve its performance
goals.
o Identifies the magnitude of the gap between the baseline and target architectures and possible improvement strategies to realize the target state.
o Effectively integrates relevant cross-agency initiatives into the agency’s target
architecture and enterprise transition plan, including all applicable FTF
initiatives.
o Produces segment architectures describing agency lines of business to be
used to assist agency managers in decision-making tasks.
o Identifies duplication and opportunities for consolidation and reuse of
information and technology within and across agencies.
o Provides a framework and a functional view of an agency’s lines of business
(LoBs), including its internal operations/processes.
• Notes:
o The Completion capability area assesses agency maturity in developing
baseline and target architectures in terms of the five FEA reference models:
performance, business, data, service component, and technology. However,
this should not be construed as a requirement for agencies to restructure their
EA frameworks into five corresponding layers or views. OMB does not
require agencies to adopt one specific EA framework, unless specified in OMB budget guidance. In their submissions to OMB, agencies are simply asked to express their architecture content in terms of the five FEA reference models. [12]
6.1.1 Target Enterprise Architecture and Enterprise Transition Plan

Level 1 Practices
Activities:
• The agency must have a target enterprise architecture that is a consolidated representation of all agency segments.
• The agency must submit its segment architectures as EA Segment Reports.
• The agency must submit an enterprise transition plan. There is no indication of reuse.
Artifacts:
• Target EA, Enterprise Transition Plan, EA Segment Report
[12] The FEA reference models (PRM, BRM, SRM, DRM, and TRM) are typically used as a "common language" to articulate target capabilities - although many agencies can and do customize these models to meet their evolving needs.
Level 2 Practices
Activities:
• The target enterprise architecture must address all FTF cross-agency initiative areas within scope for the agency (i.e., comply with all statutory and policy requirements promulgated by the initiatives).
• EA Segment Report transition milestones [13] demonstrate reuse within the agency.
• EA Segment Report transition milestones are evident in the Enterprise Transition Plan.
Artifacts:
• Target EA, Enterprise Transition Plan, EA Segment Report, Exh. 53
Level 3 Practices
Activities:
• EA Segment Report transition milestones [14] demonstrate reuse and/or information sharing with appropriate initiatives within the FTF catalog.
• Plans exist to mature agency segment architectures.
Artifacts:
• Target EA, Enterprise Transition Plan, EA Segment Report, Exh. 53
Level 4 Practices
Activities:
• EA Segment Report transition milestones [15] demonstrate reuse and/or information sharing with other government agencies.
• EA Segment Report transition milestones clearly demonstrate line-of-sight to agency performance goals and commitments (as identified in the EA Segment Report v1.2 Performance Section).
• The agency has defined segment architectures for its major mission areas and cross-cutting services.
Artifacts:
• Target EA, Enterprise Transition Plan, EA Segment Report, Exh. 53
Level 5 Practices
Activities:
• All of the agency's segment architectures are in the in-progress or completed maturity stages.
• EA Segment Report transition milestones [16] demonstrate reuse and/or information sharing among sub-units of the agency and/or other agencies.
• EA Segment Report transition milestones clearly demonstrate line-of-sight to agency performance goals (as identified in the EA Segment Report v1.2 Performance Section).
Artifacts:
• Target EA, Enterprise Transition Plan, EA Segment Report, Exh. 53
[13] EA Segment Report transition milestones are identified in the EASR v1.2 Segment Transition Planning Section.
[14] Ibid.
[15] Ibid.
[16] Ibid.
6.1.2 Architectural Prioritization

Level 1 Practices
Activities:
• The agency must have a process in place to prioritize and initiate the development of segment architectures.
• The prioritization process contains prioritization criteria including mission performance and cost efficiency opportunities.
• The agency's prioritization process must yield proposed high-priority segments approved by the agency CIO.
• The agency registers its segment(s) with OMB.
Artifacts:
• Segment architecture prioritization process, identified high-priority segment approved by CIO, EA Segment Report
Level 2 Practices
Activities:
• The agency's prioritization process has matured and contains quantitative prioritization criteria including each segment's financial spending data, existing performance plans, and performance assessments such as the Performance and Accountability Report.
Artifacts:
• Segment architecture prioritization process, identified high-priority segment approved by CIO, EA Segment Report
Level 3 Practices
Activities:
• The agency's prioritization process must include the identification of mission performance gaps tied to specific segments.
• The identified performance gaps should be factored into segment prioritization along with the performance and financial spending data available for segments.
• Additionally, the prioritization process should include consideration of IT security opportunities.
• The agency must show evidence of segment business owner(s) signoff.
Artifacts:
• Segment architecture prioritization process, identified high-priority segment approved by CIO, EA Segment Report
6.1.3 Scope of Completion

Level 1 Practices
Activities:
• All agency IT investments must have one and only one associated segment architecture identified on the agency Exhibit 53, except for limited instances. [17]
• These segment architectures should come from the list of agency segment architectures provided by the agency to OMB.
• These segments do not have to be fully built out.
Artifacts:
• Exhibit 53, EA Segment Report, and agency-provided segment architecture codes
Level 2 Practices
Activities:
• All of the agency's major IT investments, and non-majors with DME spending, must be associated with a segment architecture.
• At least 70% [18] of agency Exhibit 53 DME spending must be represented in in-progress or completed segment architecture, and represented on the Enterprise Transition Plan.
• At least 10% [19] of the DME funding amount of the entire agency Exhibit 53 must be aligned to completed segment architecture(s).
• The agency provides a full accounting of the usage status and rationale for non-use of Federal Transition Framework initiatives for all segments.
Artifacts:
• EA Segment Report and Exhibit 53
Level 3 Practices
Activities:
• At least 80% [20] of agency Exhibit 53 DME spending must be represented in in-progress or completed segment architecture, and represented on the Enterprise Transition Plan.
• At least 40% [21] of the DME funding amount of the entire agency Exhibit 53 must be aligned to completed segment architecture(s).
• The agency can demonstrate the planned usage of at least one Federal Transition Framework initiative within a segment reported to OMB.
Artifacts:
• EA Segment Report and Exhibit 53
Level 4 Practices
Activities:
• At least 90% [22] of full agency IT portfolio (Exhibit 53) spending must be represented in in-progress or completed segment architecture, and represented on the Enterprise Transition Plan.
[17] The exception to the "one and only one" rule is for investments, such as an ERP system, where the investment is the key data system for multiple segments, as stated in the EA Segment Report v1.2. This should not be used where an investment provides a service that is utilized or consumed by multiple business areas.
[18] For the FY11 submission cycle (due Q3 FY09), the level 2 KPI is 50%.
[19] For the FY11 submission cycle (due Q3 FY09), the level 2 KPI is 5%.
[20] For the FY11 submission cycle (due Q3 FY09), the level 3 KPI is 60%.
[21] For the FY11 submission cycle (due Q3 FY09), the level 3 KPI is 20%.
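To make the percentage tests above concrete, the sketch below computes the two DME-alignment figures over a toy portfolio; the records and field names are assumptions for illustration, not the actual Exhibit 53 schema.

```python
# Hypothetical sketch: DME-alignment KPIs over a toy Exhibit 53 portfolio.
# Field names and records are illustrative, not the real Exhibit 53 schema.
investments = [
    {"dme_millions": 12.0, "segment_status": "completed"},
    {"dme_millions": 30.0, "segment_status": "in-progress"},
    {"dme_millions": 8.0,  "segment_status": "planned"},
]

total = sum(i["dme_millions"] for i in investments)
in_progress_or_done = sum(i["dme_millions"] for i in investments
                          if i["segment_status"] in ("in-progress", "completed"))
completed_only = sum(i["dme_millions"] for i in investments
                     if i["segment_status"] == "completed")

print(f"DME in in-progress/completed segments: "
      f"{100 * in_progress_or_done / total:.0f}%")  # 84%
print(f"DME in completed segments: {100 * completed_only / total:.0f}%")  # 24%
```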
6.1.4 Internet Protocol Version 6 (IPv6)

Level 1 Practices
Activities:
• The agency has performed a cost and risk impact analysis for migrating to IPv6.
• The agency has also completed a second inventory of IP-aware devices.
Artifacts:
• IPv6 impact analysis document using guidance in Attachment B of OMB M-05-22; second IP-aware device inventory (Attachment A)
Level 2 Practices
Activities:
• The agency has met all of its IPv6 transition milestones and is on schedule to complete the transition per OMB M-05-22.
Artifacts:
• IPv6 transition milestones (included in the enterprise transition plan) through the completion date, showing projected and actual completion dates; evidence of milestone completion (the agency should determine the artifact(s) constituting evidence of completion for each milestone); documentation of successful execution of deployment test criteria
[22] For the FY11 submission cycle (due Q3 FY09), the level 4 KPI is 70%.
[23] For the FY11 submission cycle (due Q3 FY09), the level 4 KPI is 50%.
[24] For the FY11 submission cycle (due Q3 FY09), the level 5 KPI is 80%.
[25] For the FY11 submission cycle (due Q3 FY09), the level 5 KPI is 70%.
6.2 USE CAPABILITY AREA
• Outcomes:
o Establishes strategic objectives and programs the agency needs to meet
citizens’ needs.
o Demonstrates the relationship between EA, strategic planning, and capital
planning processes.
o Provides the ability to make better management decisions, and as necessary,
the ability to assess and re-assess the path forward.
6.2.1 Performance Improvement Integration

Level 1 Practices
Activities:
• At least one major IT investment in the agency portfolio should be aligned to a program that undergoes periodic performance improvement evaluations.
• This specific IT investment must have an Exhibit 300 business case and must be on the agency's enterprise transition plan.
• Alignment is measured using IT investment/program alignment information reported in the Exhibit 300, Part I, Section A, question 14b.
Artifacts:
• Enterprise Transition Plan, Exhibit 300s, program improvement assessment data [26]
Level 2 Practices
Activities:
• The agency must demonstrate alignment between approved/submitted segment architectures and at least one program that undergoes periodic performance improvement evaluations per segment.
• Alignment is measured through IT investment/program alignment information reported in the Exhibit 300, Part I, Section A, question 14b, compared to segment alignment reported in the agency Exhibit 53.
[26] This report is collected as part of the PART process. OMB will correlate the PART program data with the EA data and the IT portfolio data.
[27] For the FY11 submission cycle (due Q4 FY09), the level 3 KPI is 50%.
[28] For the FY11 submission cycle (due Q4 FY09), the level 4 KPI is 60%.
[29] For the FY11 submission cycle (due Q4 FY09), the level 5 KPI is 70%.
6.2.2 CPIC Integration

Level 1 Practices
Activities:
• All major IT investments in the agency Exhibit 53 must be represented on the agency enterprise transition plan.
• At least 40% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate investment type of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifacts:
• Enterprise Transition Plan, Exhibit 53, and Exhibit 300s [30]
Level 2 Practices
Activities:
• All major IT investments and at least 50% [31] (in dollars) of non-major investments in the agency Exhibit 53 must be represented on the agency enterprise transition plan.
• At least 50% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate investment type of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifacts:
• Enterprise Transition Plan, Exhibit 53, and Exhibit 300s
Level 3 Practices
Activities:
• All major IT investments and at least 50% [32] (in dollars) of non-major investments with DME spending in the agency Exhibit 53 must be represented on the agency enterprise transition plan.
• There must be at least 50% [33] agreement between milestones in the enterprise transition plan and milestones reported in Part II, Section C of the Exhibit 300 business cases for major IT investments.
• At least 70% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate investment type of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
[30] This data is collected as part of the OMB Circular A-11 process. OMB will correlate the EA data with the IT portfolio data.
[31] For the FY11 submission cycle (due Q4 FY09), the level 2 KPI is 30%.
[32] For the FY11 submission cycle (due Q4 FY09), the level 3 KPI is 30%.
[33] For the FY11 submission cycle (due Q4 FY09), the level 3 KPI is 30%.
[34] For the FY11 submission cycle (due Q4 FY09), the level 4 KPI is 30%.
[35] For the FY11 submission cycle (due Q4 FY09), the level 4 KPI is 70%.
[36] For the FY11 submission cycle (due Q4 FY09), the level 5 KPI is 70%.
6.2.3 FEA Reference Model and Exhibit 53 Data Quality

Level 1 Practices
Activities:
• The agency must map 100% of the IT investments in its IT portfolio to a BRM sub-function or SRM service component.
• At least 75% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate "part" of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifact:
• Exhibit 53
Level 2 Practices
Activities:
• The agency must map 100% of the IT investments in its IT portfolio to a BRM sub-function or SRM service component.
• At least 60% of the IT investments must be accurately mapped given the title and description of the IT investment and the description of the mapped BRM sub-function or SRM service component.
• At least 80% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate "part" of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifact:
• Exhibit 53
Level 3 Practices
Activities:
• The agency must map 100% of the IT investments in its IT portfolio to a BRM sub-function or SRM service component.
• At least 70% of the IT investments must be accurately mapped given the title and description of the IT investment and the description of the mapped BRM sub-function or SRM service component.
• At least 85% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate "part" of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifact:
• Exhibit 53
Level 4 Practices
Activities:
• The agency must map 100% of the IT investments in its IT portfolio to a BRM sub-function or SRM service component.
• At least 80% of the IT investments must be accurately mapped given the title and description of the IT investment and the description of the mapped BRM sub-function or SRM service component.
• At least 90% of the IT investments in the agency Exhibit 53 have been mapped to the most appropriate "part" of the Exhibit 53 using definitions found in OMB Circular A-11, section 53.
Artifact:
• Exhibit 53
Level 5 Practices
Activities:
• The agency must map 100% of the IT investments in its IT portfolio to a BRM sub-function or SRM service component.
6.2.4 Collaboration and Reuse

Level 1 Practices
Activities:
• The agency must show evidence of implementation of required interoperability standards documented in the FTF catalog for cross-agency initiatives.
• This evidence comes in the form of specifications in the TRM table in Part I, Section F of the Exhibit 300s for IT investments within scope of the various cross-agency initiatives.
• At least 80% of investments reported in agency Exhibit 300s include valid UPI codes for reused SRM service components and report accurate SRM service component funding percentages.
• At least 80% of SRM service components identified in Table 4 of the agency Exhibit 300s are mapped to an appropriate TRM service standard and include detailed and accurate service specifications.
Artifacts:
• EA Segment Report, Exhibit 53, and Exhibit 300s
Level 2 Practices
Activities:
• The agency must show evidence of compliance with E-Gov initiatives.
Artifacts:
• EA Segment Report, Exhibit 53, and Exhibit 300s
6.2.5 EA Governance, Program Management, Change Management, and Deployment

Level 1 Practices
Activities:
• The agency has developed a vision and strategy for EA.
• The agency has begun to identify EA tasks and resource requirements. The agency has appointed a chief architect, has senior-level sponsorship of its EA program, and has funded an EA program.
• The agency has developed an EA policy to ensure agency-wide commitment to EA.
• The policy clearly assigns responsibility to develop, implement, and maintain the EA.
Artifacts:
• EA Program Plan, EA Policy
Level 2 Practices
Activities:
• The agency has established an EA governance committee or other group for directing, overseeing, or approving EA activities.
• Internal and external stakeholders are identified based on their involvement in EA-related activities and needed information.
• The agency has selected an EA framework.
• The agency has deployed an EA tool/repository to manage EA artifacts and models.
• The tool/repository supports the agency's EA framework.
• Useable EA content from the tool/repository is communicated to stakeholders.
Level 5 Practices
Activities:
• The EA governance committee ensures EA compliance throughout the agency. If non-compliance is identified, the committee is responsible for developing a plan to resolve the issue.
• Alignment to the EA standards is a common practice throughout the agency.
• The compliance process is reviewed and updated when deficiencies or enhancements to the process are identified.
• The agency's head, or a designated operations executive, has approved the EA governance plan in writing.
• The EA repository and its interfaces are used by participants or support staff for the CPIC, SDLC, and strategic planning processes.
• Current EA information is readily available to participants in these processes, as well as the broader agency user community.
• Users are informed of changes, as necessary.
Artifacts:
• EA Governance Plan, EA governance committee meeting minutes, governance plan approval, EA communications plan, and training plan and materials
6.3 RESULTS CAPABILITY AREA
• Outcomes:
o Demonstrates the relationship of IT investments to the agency's ability to
achieve mission and program performance objectives.
o Captures how well the agency or specific processes within an agency are
serving citizens.
o Identifies the relationships between agency inputs and outcomes.
o Demonstrates agency progress towards goals, closing performance gaps,
and achieving critical results.
6.3.1 Mission Performance

Level 1 Practices
Activities:
• The agency is not able to demonstrate that EA activities have resulted in program performance improvements.
• Specifically, the average major IT investment in the agency's portfolio is either a) not aligned to a mission program, or b) supporting mission programs that are not demonstrating results.
Artifacts:
• Mission program performance data, Exhibit 300s
Level 2 Practices
Activities:
• The agency IT investment portfolio shows strong alignment to mission programs, but the supported mission programs are, on average, not demonstrating results or are ineffective.
Artifacts:
• Mission program performance data, Exhibit 300s
Level 3 Practices
Activities:
• The agency IT investment portfolio shows strong alignment to mission programs and the supported mission programs are, on average, providing adequate results.
Artifacts:
• Mission program performance data, Exhibit 300s
Level 4 Practices
Activities:
• The agency IT investment portfolio shows strong alignment to mission programs and the supported mission programs are, on average, providing moderately effective results.
Artifacts:
• Mission program performance data, Exhibit 300s

Level 5 Practices
Activities:
• The agency IT investment portfolio shows strong alignment to mission programs and the supported mission programs are, on average, providing effective results.
Artifacts:
• Mission program performance data, Exhibit 300s
6.3.2 Cost Savings and Cost Avoidance

Level 1 Practices
Activities:
• The agency is not able to demonstrate that the EA program has resulted in cost savings or cost avoidance.
• Every investment in the agency Exhibit 53 includes a prior-year UPI code.
• At least 80% of the IT investments in the agency Exhibit 53 have been mapped to an accurate UPI code for the previous year using definitions found in OMB Circular A-11, section 53.
Artifacts:
• EA Segment Report, program improvement assessment data, Exhibit 53, and Exhibit 300s
Level 2 Practices
Activities:
• The agency must have a process for, and report on, cost savings and cost avoidance.
• Every investment in the agency Exhibit 53 includes a prior-year UPI code.
• At least 85% of the IT investments in the agency Exhibit 53 have been mapped to an accurate UPI code for the previous year using definitions found in OMB Circular A-11, section 53.
Artifacts:
• EA Segment Report, program improvement assessment data, Exhibit 53, and Exhibit 300s
Level 3 Practices
Activities:
6.3.3 IT Infrastructure Portfolio Quality

Level 1 Practices
Activities:
• The agency's IT infrastructure portfolio is outside the committed service performance levels or exceeds cost levels by a factor of 10% or more.
Artifacts:
• IT infrastructure EA Segment Report, Exhibit 53 [37], IT infrastructure agency 5-year plans
Level 2 Practices
Activities:
• The agency's IT infrastructure portfolio is outside the committed service performance levels or exceeds cost levels by a factor of less than 10%.
Artifacts:
• IT infrastructure EA Segment Report, Exhibit 53 [38], IT infrastructure agency 5-year plans
Level 3 Practices
Activities:
• The agency's IT infrastructure portfolio is outside the committed service performance levels or exceeds cost levels by a factor of less than 5%.
Artifacts:
• IT infrastructure EA Segment Report, Exhibit 53 [39], IT infrastructure agency 5-year plans
Level 4 Practices
Activities:
• The agency's IT infrastructure portfolio exceeds the committed service performance levels.
[37] This data is collected as part of the OMB Circular A-11 process. OMB will correlate the EA data with the IT investment portfolio data.
[38] Ibid.
[39] Ibid.
6.3.4 Measuring EA Program Value

Level 1 Practices
Activities:
• The agency has identified stakeholders and goals for EA value measurement.
Artifact:
• Agency EA Value Measurement Plan

Level 2 Practices
Activities:
• The agency must meet the criteria for the previous level.
[40] This data is collected as part of the OMB Circular A-11 process. OMB will correlate the EA data with the IT investment portfolio data.
[41] Ibid.
Appendix B: Strategy for Measuring Data Quality

In a data-driven environment, the quality of the data determines whether the right decisions are made; poor-quality data leads to inadequate decisions. To make the right decisions, OMB is dependent upon agencies to provide high-quality data submissions. Quality encompasses both the utility of the information (i.e., the usefulness of the information to its intended users) and the objectivity of the data (i.e., whether the data are presented in an accurate, clear, complete, and unbiased manner, and the accuracy, reliability, and bias of the underlying data source).
This data quality effort can be viewed within the larger context of OMB's focus on information quality, both for information disseminated to the public and for information used internally to make important investment decisions. Furthermore, this effort embraces the principles upon which OMB's Government-wide Information Quality Guidelines [42] are based. Specifically, it recognizes that high quality comes at a cost and that agencies should weigh the costs and benefits of higher information quality. The principle of balancing the investment in quality commensurate with its use is generally applicable to all data the Federal Government generates.

[42] 67 FR 8452-8460.
One of the roles of the OMB Enterprise Architecture Assessment Framework (EAAF) Version 3.1 is to ensure high-quality agency information technology portfolio data submissions, especially pertaining to data collected via the OMB Circular A-11 processes (e.g., Exhibits 53 and 300). This appendix describes OMB's strategy for using the KPIs defined within the EAAF Version 3.1 to enforce high standards of data quality for agency EA and IT investment portfolio submissions, thereby improving the quality of downstream analytics performed on these data sets.

Each section below discusses how this version of the EAAF is used as a tool to help OMB improve data quality in each respective area.
Accurate FEA reference model mappings allow OMB to create a horizontal (functional) view of the Federal IT investment portfolio. This allows OMB to identify opportunities for cross-agency collaboration and reuse. In the past, OMB has used this analytic technique to identify candidates for E-Government initiatives such as the E-Gov Lines of Business.
When agencies provide inaccurate mappings, this inhibits the ability of OMB to perform
quality analysis. Accordingly, the “FEA Reference Model Mapping” KPI has been
crafted to perform the following quality checks and adjust the agency score accordingly:
• Every IT investment in the portfolio must have a valid primary mapping. For example, a Mode of Delivery sub-function cannot be a primary mapping for an IT investment;
• Primary mappings must be consistent with sub-function/service component definitions found in the Consolidated Reference Model document. [43] OMB uses various analytic techniques for checking this; and
• Mappings must be consistent with other reported data. For example, IT investments reported as financial management systems on an Exhibit 300 business case (Part I, Section A, Item 19) should be aligned to a sub-function in the Financial Management FEA BRM LoB.
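A sketch of how checks of this kind might be automated is shown below; the investment record, field names, and rule set are illustrative assumptions, not OMB's actual validation logic.

```python
# Hypothetical sketch of the mapping quality checks described above.
# Field names and rules are illustrative, not OMB's actual logic.
INVALID_PRIMARY_AREAS = {"Mode of Delivery"}  # cannot serve as a primary mapping

def check_mapping(investment: dict) -> list:
    """Return data-quality findings for one IT investment record."""
    findings = []
    if not investment.get("primary_mapping"):
        findings.append("no valid primary mapping")
    elif investment.get("brm_area") in INVALID_PRIMARY_AREAS:
        findings.append("Mode of Delivery sub-function used as primary mapping")
    # Consistency with other reported data (Exhibit 300 Part I, Section A, Item 19).
    if investment.get("financial_system") and \
            investment.get("brm_lob") != "Financial Management":
        findings.append("financial system not aligned to the Financial Management LoB")
    return findings

record = {"primary_mapping": "402-123", "brm_area": "Mode of Delivery",
          "financial_system": True, "brm_lob": "Human Resources Management"}
print(check_mapping(record))
```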
When agencies provide inaccurate mappings, this inhibits the ability of OMB to perform
quality analysis. Accordingly, the “Scope of Completion” KPI has been crafted to
perform the following quality checks and adjust the agency score accordingly:
• Every investment must have a valid segment architecture mapping. In other
words, each investment must have a mapping and this mapping must link to a
segment architecture code provided by the agency to OMB prior to budget
submission. The exception to the rule of mapping to one segment is in the case
of an ERP system which directly supports multiple segments (per EA Segment
Report v1.2).
• Mappings must be consistent with segment architecture definitions and scope
agreed upon with OMB. In other words, the segment to which the investment
belongs should be a good “fit” given the title and description of the investment
(e.g., it makes sense for an accounting system to belong to the financial
management segment).
• Segment architecture mappings should be consistent with the primary FEA reference model mapping, where applicable. For example, an investment mapped to a Financial Management BRM sub-function would be expected to align to the financial management segment.
[43] http://www.whitehouse.gov/omb/e-gov/fea/
IT investments should be placed in the most appropriate part of the Exhibit 53 using definitions found in OMB Circular A-11, section 53. Investments placed in an inappropriate part have a detrimental impact on portfolio analysis performed by OMB. For example, an IT infrastructure investment placed in Part 3 would be an incorrect categorization. Accordingly, the "Exhibit 53 Part Mapping" KPI has been crafted to perform analogous quality checks and to adjust the agency score accordingly.
Accordingly, the "Cost Savings and Cost Avoidance" KPI has been crafted to perform similar quality checks and to adjust the agency score accordingly. OMB will use various analytic techniques to check data quality in this area. If OMB finds an inordinate number of IT investments not accurately disclosing the previous year's UPI code, this will negatively impact the agency score on some KPIs.
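The sketch below shows one way such a prior-year continuity check could look; the UPI values and field names are invented for illustration.

```python
# Hypothetical sketch: flag investments whose disclosed prior-year UPI code
# does not appear in last year's Exhibit 53. UPI values are invented.
prior_year_upis = {"005-03-01-14-01-1040-00", "005-03-01-14-01-1055-00"}

current_portfolio = [
    {"upi": "005-03-01-14-02-1040-00", "prior_upi": "005-03-01-14-01-1040-00"},
    {"upi": "005-03-01-14-02-9999-00", "prior_upi": "005-03-01-14-01-7777-00"},
]

for inv in current_portfolio:
    if inv["prior_upi"] not in prior_year_upis:
        print(f"{inv['upi']}: prior-year UPI {inv['prior_upi']} "
              f"not found in last year's Exhibit 53")
```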
The biggest issue in this area concerns the correct use of the "-24" investment category, which should be used for approved E-Gov initiatives only. Any misuse of this code will negatively impact the agency score on the "Collaboration and Reuse" KPI, which reflects this quality check.
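A sketch of this check follows, assuming the investment category is the final two-digit field of the UPI and that a list of approved E-Gov initiatives is available; both are assumptions for illustration.

```python
# Hypothetical sketch: the "-24" category should appear only on investments
# tied to approved E-Gov initiatives. UPI parsing and the approved list are
# illustrative assumptions.
APPROVED_EGOV_INITIATIVES = {"Grants.gov", "E-Rulemaking"}

def misuses_24(upi: str, initiative) -> bool:
    """True if the UPI carries the '-24' category without an approved initiative."""
    category = upi.rsplit("-", 1)[-1]  # assume the category is the last UPI field
    return category == "24" and initiative not in APPROVED_EGOV_INITIATIVES

print(misuses_24("005-03-01-14-02-1040-24", None))          # True: flagged
print(misuses_24("005-03-01-14-02-1040-24", "Grants.gov"))  # False
```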
OMB also analyzes the SRM service components and TRM service standards reported in Table 4 of agency Exhibit 300s with a mind toward reuse (e.g., identifying SmartBUY opportunities). To help OMB perform effective analysis, agencies should ensure the following:
• Each SRM service component in Table 4 should have an appropriate TRM service standard associated with it; and
• To the maximum extent possible, detailed and accurate service specifications should be provided.
The score of the “Collaboration and Reuse” KPI will reflect this quality check.