EDMC DCAM Overview v2.2.3 (5/1/2024)

The Data Management Capability Assessment Model (DCAM®) is a framework developed by the EDM Council to establish and sustain effective data management initiatives in organizations. It encompasses seven core components and one optional component, addressing strategies, structures, technology, and operational practices necessary for successful data management. The framework emphasizes the importance of managing data as meaning and establishing a data control environment to ensure data quality and alignment across the organization.


The DCAM® Framework
OVERVIEW

Published by: EDM Council, Inc.

This Data Management Capability Assessment Model (DCAM®) (“Model”) overview is being provided to the
Recipient, (“Recipient”) by the EDM Council, Inc. (“EDM Council”). The Model and all related materials are the
sole property of EDM Council, and all rights, titles, and interests therein are vested in EDM Council. The Model, or
any portion thereof, may not be copied by any Recipient and may not be distributed to, or made available for
use by, any party other than Recipient, unless, in each case, Recipient has obtained the prior written authorization
of EDM Council. Except as provided above, the Model, or any portion thereof, may not be used in any way by
Recipient, its officers, employees or agents or by any other party without the prior written consent of EDM
Council. The Model may only be used by Recipient for external purposes or external assessments if it has entered
into a separate licensing agreement with EDM Council governing the terms for such use. By reviewing or using
the Model, or any portion thereof, the Recipient (and each person reviewing or using the Model) agrees to the
terms set forth above. Any copying or use of the Model except as set forth above is strictly prohibited.

DCAM® is a registered trademark of the EDM Council, Inc., and may not be used or copied without the prior
written authorization of the EDM Council, Inc.
© 2024 EDM Council, Inc. All Rights Reserved.
DCAM Framework: Contents

Foreword
The Data Management Capability Assessment Model (DCAM®) is a structured resource that defines and
describes the capabilities needed to establish and sustain a successful data management (DM) initiative in any
organization. The Enterprise Data Management Council created the Model based on the practical experiences
and hard-won lessons of many of the world's leading organizations. The result is the synthesis of a broad range
of DM best practices from across the full spectrum of interconnected business processes. The DCAM addresses
the strategies, organization-wide structures, technology, and operational practices needed to drive DM
successfully. It addresses the tenets of DM based on an understanding of business value combined with the hard
reality of implementation.

To manage data in today's organizations, we must start by recognizing that proper DM is about managing data
as meaning. For many organizations this is a relatively new concept and one that is not yet well understood.
Managing data according to its meaning is the process of defining each piece of data by what it represents or
describes in the real world. This process results in a direct, readily comprehensible label for that data. By adding descriptive
metadata, the precise nuanced connection between each piece of data and the real world is established. Data
exists everywhere within an organization and must be managed consistently within a well-defined control
framework. The DCAM defines the framework and capabilities required to make DM a critical part of an
organization's everyday operational fabric.
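The idea of managing data as meaning can be made concrete with a simple record that binds a system-level label to its real-world definition and descriptive metadata. This is a minimal illustrative sketch; the `DataElement` structure and all field names are assumptions, not something the DCAM itself prescribes:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """A data element tied to the real-world thing it represents (illustrative)."""
    name: str          # label used in systems, e.g. "cust_dob"
    business_term: str # the shared business meaning
    definition: str    # precise real-world definition
    metadata: dict = field(default_factory=dict)  # descriptive metadata

elem = DataElement(
    name="cust_dob",
    business_term="Customer Date of Birth",
    definition="The date on which the customer, a natural person, was born.",
    metadata={"type": "date", "domain": "Customer", "source": "CRM"},
)
print(elem.business_term)  # Customer Date of Birth
```

In practice this information lives in a business glossary or metadata repository rather than in application code; the point is that each element carries its meaning with it.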

The challenges of properly managing data are significant. In most organizations, there are numerous legacy data
repositories and an overabundance of functions to unravel. There are social and political barriers to overcome.
There are real technical challenges and execution gaps to address. Data ownership and accountability are hard
to establish. Historically, funding often has been project-based, making DM an intermittent priority. Data's now
critical place in the organization requires a commitment to robust, ongoing funding. Organizations have an
additional challenge to build the strong executive support needed to ensure that the organization stays the
course in the face of short-term measurement criteria, operational disruption, and conflicting stakeholder
priorities.

We understand this reality because we've been there, and we have the scars to show for it. Data is foundational.
It is the lifeblood of the organization. The bad data tax is a significant expenditure for many organizations
though it may remain hidden in accepted inefficiencies and stunted results. Unraveling data silos through the
creation of harmonized data is a prerequisite for eliminating redundancy, reducing reconciliation, and
automating business processes across the organization.

Managing this kind of fully interconnected data is essential if we are to gain insight from analytics, feed our
models with confidence, enhance our service to clients and capitalize on new, but often fleeting, business
opportunities. DCAM provides the guidance needed to assess the current-state of any organization's DM and
define the objectives and framework for the target-state of the DM initiative.

The DCAM is organized into seven core components and one optional component:

Core:

• 1.0 Data Management Strategy & Business Case
• 2.0 Data Management Program & Funding Model
• 3.0 Business & Data Architecture
• 4.0 Data & Technology Architecture
• 5.0 Data Quality Management

• 6.0 Data Governance
• 7.0 Data Control Environment

The core components are organized into 31 capabilities and 106 sub-capabilities.

Optional:

• 8.0 Analytics Management

This optional component covers Analytics Management and is relevant where the scope of the DM program or
the Chief Data Officer's responsibilities cover the Analytics functions of the organization. This component is
organized into seven capabilities and 30 sub-capabilities.


Contents
The DCAM® Framework
Foreword
Introduction
1.0 Data Management Strategy & Business Case
2.0 Data Management Program & Funding Model
3.0 Business & Data Architecture
4.0 Data & Technology Architecture
5.0 Data Quality Management
6.0 Data Governance
7.0 Data Control Environment
8.0 Analytics Management

Introduction
The Data Management Capability Assessment Model (DCAM®) defines the scope of capabilities required to
establish, enable and sustain a mature data management (DM) discipline. It addresses the strategies,
organizational structures, technology and operational best practices needed to successfully drive DM. It
addresses the tenets of DM based on an understanding of business value combined with the reality of
operational implementation.

Overview
The concept of data as a foundational aspect of business operations has arrived. Data is now widely
understood as a core input to the full range of business and organizational processes. An organization that
is effective in its use of data is one that implements and manages a data control environment. The data control
environment yields a wide range of bottom-line benefits to the organization, including reduced operational
costs, automated manual processes, consolidated redundant systems, minimized reconciliation, and enhanced
business opportunities. An organization implements a data control environment to ensure trust and confidence
in the data they are relying on for business processing and decision-making. The data control environment
eliminates the need for manual reconciliation or reliance on data transformation processes.

The data control environment concept ensures the data is precisely defined, described using metadata, aligned
with meaning, and managed across the full data lifecycle. The key to first establishing a data control
environment, however, is the achievement of unambiguous shared meaning across the organization along with
the governance required to ensure precise definitions. Data must be consistently defined by the real thing it
represents, such as products, clients, customers, legal entities, transactions, events, and much more. All other
processes are built upon this foundation.

The opposite of a data control environment is a fragmented data environment. The fragmentation results in
application development that produces ad hoc data models. The fragmentation exacerbates the problem of
common terms that have different meanings, common meanings that use different terms, and vague definitions
that don't capture critical nuances. For many organizations, this challenge can be debilitating because there are
thousands of data elements, delivered by hundreds of internal and external sources, all stored in dozens of
unconnected databases. This fragmentation results in a continual challenge of mapping, cross-referencing, and
manual reconciliation. The data control environment is dependent on every data attribute being understood, at
its atomic level, as a fact that is aligned with specific, durable business meaning without duplication or
ambiguity. Managing data as meaning is the key to the alignment of data repositories, harmonization of business
glossaries, and ensuring that application data dictionaries are comparable.

Achieving alignment of business meaning, including the process of how terms are created and maintained, can
be a daunting task. It is not uncommon to experience resistance from internal business and technology users,
particularly when there are multiple existing systems linked to critical business applications. The best strategy
for reconciliation in a fragmented environment is to harmonize based on legal, contractual, or business
meanings rather than trying to get every system to adopt the same naming convention. Nomenclature
represents the structure of data, and unraveling data structures and data models is expensive and
unnecessary. It is better to focus on precisely defining business concepts, documenting transformation processes,
and capturing real-world data relationships. Once established, existing systems, glossaries, dictionaries,
repositories, etc. can be cross-referenced to the common meaning.
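The cross-referencing strategy described above can be sketched as a mapping from each system's local field names to one shared business meaning, leaving the systems' own naming conventions untouched. The system names, field names, and glossary entry here are purely hypothetical:

```python
# Shared business glossary: one durable meaning per concept
glossary = {
    "LegalEntityName": "The official registered name of a legal entity.",
}

# Each legacy system keeps its own naming; we cross-reference rather than rename
cross_reference = {
    ("crm", "client_name"): "LegalEntityName",
    ("trading", "cpty_legal_nm"): "LegalEntityName",
    ("risk", "entity_name"): "LegalEntityName",
}

def meaning_of(system: str, field_name: str) -> str:
    """Resolve a system's local field to its shared business definition."""
    term = cross_reference[(system, field_name)]
    return glossary[term]

# Three differently named fields resolve to the same meaning
print(meaning_of("trading", "cpty_legal_nm"))
```

The design choice mirrors the text: harmonize on meaning, not on naming, so existing data models do not have to be unraveled.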

Data as meaning must be managed along with its defining metadata to ensure consistency and comparability
across the organization. Data meaning and metadata management must be understood as the core of your
content infrastructure and the baseline for process automation, application integration, and alignment across
linked processes. Some common types of metadata include business, operational, technical, descriptive,
structural, and administrative metadata.

The implementation and management of a data control environment are governed by data policies, standards,
processes, and procedures. These are the essential mechanisms for establishing a sustainable DM initiative and
for ensuring compliance with a data control environment in the face of organizational complexity. Managing
meaning is the key to effective DM. Meaning is achieved through the adoption of semantic standards. Standards
are governed by policy. The policy is established by executive management, supported by data owners and
enforced by Internal Audit.

Challenges in Creating a Data Control Environment

Diagram 0.1: Data Control Environment Challenges


Achieving a data control environment requires the organization to overcome the following challenges.

• Understand the existing legacy data environments, including an inventory of data, point-to-point links,
inconsistent definitions, etc.
• Simplify, organize, and categorize the disparate environment into defined data domains, with clearly
identified data elements and documented data flows.
• Align data elements to unambiguous shared meaning across the organization through the
implementation of controls, policy, and governance.
• Measure and track data to ensure quality and consistency with minimal reconciliation.
• Align technology to ensure the principles and best practices that have been established are enabled
across the organization's technology infrastructure.

It is this journey that must be taken to arrive at a data control environment needed to ensure the highest quality
of data is delivered to critical processes throughout the organization.

Many organizations have made the mistake of trying to solve the DM problem by starting with technology. The
DCAM, as a best practice, advocates that the starting point is with the business process that defines the
requirements for data. Then the business process and data can be automated by technology.

Data Management Operating Levels


The DCAM Framework defines the components and capabilities that are required to achieve a data control
environment in the organization. However, the Framework is not prescriptive to how the capabilities are
executed at the various operating levels of the organization. An organization will need to customize their
operating model to account for the size, complexity, geography, and culture of the organization.

Diagram 0.2: Data Management Operating Levels

When using the DCAM Framework to assess the organization’s capabilities, it is most informative to evaluate
each level as defined in the DM target operating model of the organization.


Data Management Stakeholders


The DM Stakeholder Accountability Matrix illustrated below is a construct for identifying the stakeholder types
across the data ecosystem and the high-level roles and the relationships between the stakeholders.
Understanding the stakeholder types is an important foundation for applying the DCAM Framework in a DM
target operating model for the organization. It also defines the audience that is targeted for using the DCAM as
the capability assessment tool for the organization. Within the Matrix, data architecture is the pivotal bridge in
the relationship between the business and technology stakeholders.

Diagram 0.3: Data Management Stakeholder Accountability Matrix

DCAM: A Framework for Sustainable Data Management


A complex set of DM capabilities is required to achieve a data control environment. The Data Management
Capability Assessment Model is a framework for executing a robust, sustainable DM function. It is also an
essential tool for ongoing assessment and benchmarking of an organization's DM capabilities.


Diagram 0.4: DCAM Framework

The DCAM Framework consists of seven core components and one optional component. The first component,
Data Management Strategy & Business Case, and the second component, Data Management Program &
Funding Model, are foundational to the other five core components.

The next four core components of the DCAM framework, Business & Data Architecture, Data & Technology
Architecture, Data Quality Management, and Data Governance are the execution components.

The final core component is the collaboration activity, the Data Control Environment. It is here that the
execution components are put into operation by the data producer to bring a defined set of data into control
and make it available to data consumers, either in real time or at a period end.

The seven core Components include 31 Capabilities and a total of 106 Sub-capabilities. The definition and scope
of each component are presented below.

The eighth component, Analytics Management, is optional and is relevant where the scope of the DM program
or the Chief Data Officer's responsibilities cover the Analytics functions of the organization. This component has
seven capabilities and 30 sub-capabilities.

DCAM: The Scope of the Eight Components


1.0 Data Management Strategy & Business Case

The Data Management Strategy (DMS) & Business Case component is a set of capabilities to define, prioritize,
organize, fund, and govern DM and how it is embedded into the operations of the organization in alignment
with the objectives and priorities of both the enterprise and operating units. The DM business case is the
justification for creating and funding a DM initiative. The business case articulates the major data and data
related issues facing an organization or operational unit and describes the expected outcomes and benefits that
can be achieved through the implementation of a successful DM initiative.

• Establish a DMS function within the Office of Data Management (ODM).
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for the DMS function.
• Align the DMS with the business strategy, objectives, and priorities, including prioritization of data based
on its criticality to the business.
• Define the rationale and business case for the management of data as an asset through the
organization-wide DM initiative.
• Ensure the DMS is aligned with the organization-wide Enterprise Data Management Principles.
• Articulate the DM target and current-state. Then, using DCAM as an assessment tool for gap analysis
and prioritized gap closure, create a cohesive execution plan.
• Define the high-level execution roadmap.
• Define strategy execution risks and mitigations.
• Define DM performance metrics.
• Document the DMS with a compelling presentation of the value of an organization-wide DM initiative.
• Ensure that the DMS governance is integrated into the Data Governance (DG) structure.

2.0 Data Management Program & Funding Model

The Data Management Program (DMP) & Funding Model component is a set of capabilities to manage the
Office of Data Management (ODM). These organizational structures include resource requirements and a full
range of Program Management Office (PMO) activities such as the execution of program management,
stakeholder management, funding management, communications, training, and performance measurement. The DM
funding model within the DMP is designed to provide the mechanism to ensure the allocation of sufficient
capital needed for implementation of the Program. It also defines and describes the methodologies used to
measure both the costs and the organization-wide benefits derived from the DM initiative.

• Establish a DMP function to implement the Program Management Office (PMO) capabilities within the
ODM.
• Facilitate the design and implementation of sustainable business-as-usual DM processes and tools across
the components and their capabilities.
• Establish roles and responsibilities related to the DM capabilities aligned with an organizational
structure and execute in an ODM.
• Define Funding Model, secure and monitor funding, and institute cost and benefits tracking aligned to
the Business Case.
• Establish the DM execution roadmap with supporting project plans to build upon the high-level Data
Management Strategy (DMS) roadmap.
• Engage each stakeholder across the data ecosystem as appropriate to their roles in resource alignment,
funding, communications, training, and skill development.
• Manage the DM initiative by monitoring and socializing DM performance metrics.
• Ensure that the DMP governance is integrated into Data Governance (DG).

3.0 Business & Data Architecture

The Business and Data Architecture (DA) component is a set of capabilities to ensure integration between the
business process requirements and the execution of the DA function. The business architecture function defines
the business process. DA defines data models such as taxonomies and ontologies, as well as data domains,
metadata, and business-critical data to execute processes across the data control environment. The DA function
ensures the control of data content, that the meaning of data is precise and unambiguous and that the use of
data is consistent and transparent.

• Establish a DA function within the Office of Data Management (ODM).
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for DA, including the required integration with business architecture.
• Identify and establish data domains, authoritative sources, and provisioning points.
• Identify and inventory the data to support the business requirements, including all necessary metadata,
including a glossary, dictionary, classification, lineage, etc.
• Define and assign business definitions linked to the data inventory.
• Ensure that the DA governance is integrated into Data Governance (DG) and aligned to both business
and technology governance activities.

4.0 Data & Technology Architecture

The Data & Technology Architecture (TA) component is a set of capabilities to align the architectural
requirements of the business, data, and technology across the organization to support the desired business
process outcomes. These outcomes include DM processes and the required technology infrastructure.

• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for the integration of business and technology architecture with data
architecture.
• Ensure DM function alignment with business, data, and technology architecture and strategy.
• Ensure that the DM governance is aligned to both business and technology governance activities.


5.0 Data Quality Management

The Data Quality Management (DQM) component is a set of capabilities to define data profiling, DQ
measurement, defect management, root cause analysis, and data remediation. These capabilities allow the
organization to execute processes across the data control environment, ensuring that data is fit for its intended
purpose.

• Establish a DQM function within the Office of Data Management (ODM).
• Work with data management (DM) Program Management Office (PMO) to design and implement
sustainable business-as-usual processes and tools for DQM.
• Execute DQM processes against business-critical data. DQM processes include profiling & grading,
measurement, defect management, root cause fix, and remediation.
• Establish DQ metrics and reporting routines.
• Ensure that the DQM governance is integrated into Data Governance (DG).
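The profiling and measurement capabilities listed above can be illustrated with a small rule check over sample records. The data, the attribute, and the rules are hypothetical; real DQM tooling would generalize this across all business-critical data:

```python
from datetime import date

records = [
    {"id": 1, "dob": date(1980, 5, 1)},
    {"id": 2, "dob": None},              # completeness defect: value missing
    {"id": 3, "dob": date(2190, 1, 1)},  # validity defect: date in the future
]

def profile_dob(rows):
    """Measure completeness and validity of the dob attribute."""
    total = len(rows)
    complete = [r for r in rows if r["dob"] is not None]
    valid = [r for r in complete if r["dob"] <= date.today()]
    return {
        "completeness": len(complete) / total,
        "validity": len(valid) / total,
        "defects": [r["id"] for r in rows if r not in valid],
    }

metrics = profile_dob(records)
print(metrics)  # defect ids [2, 3] feed defect management and root cause analysis
```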

6.0 Data Governance

The Data Governance (DG) component is a set of capabilities to codify the structure, lines of authority, roles &
responsibilities, escalation protocol, policy & standards, compliance, and routines to execute processes across
the data control environment. The component ensures authoritative decision making at all levels of the
organization.

• Establish a data governance function within the Office of Data Management (ODM).
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for data governance.
• Define clear roles, responsibilities, and accountabilities for DM resources, including those mandated by
DM policy.
• Define and operate the data governance structure with clear lines of authority, responsibility for
decision making, engaged stakeholders, adequate oversight, issue escalation paths, and tracking of
remediation activity.
• Develop and oversee adherence to comprehensive and achievable DM policies, standards, and
procedures, including leading the response to audits.
• Ensure the data governance function is aligned with other relevant control function policies, procedures,
standards, and governance requirements from information security, privacy, technology architecture,
etc.

7.0 Data Control Environment

The Data Control Environment (DCE) component is a set of capabilities that together form the fully operational
data control environment. Data operations, supply-chain management, cross-control function alignment, and
collaborative technology architecture must operate cohesively to ensure the objectives of the DM initiative are
realized across the organization.

• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and routines to enable a successful data control environment.
• Bring together the DM components as a coherent, end-to-end data ecosystem.
• Follow current DM best practices by routinely reviewing and auditing the capabilities and their
processes.
• Ensure all facets of DM for business-critical data such as data lifecycle, end-to-end data lineage, and
data aggregations are fully operational.


• Ensure DM is aligned with other control function policies, procedures, standards, and governance.
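The end-to-end lineage requirement above can be sketched as a check that every business-critical element traces upstream, hop by hop, to an authoritative source. The dataset and system names are illustrative assumptions:

```python
# Each edge records where a dataset's data comes from (downstream -> upstream)
lineage = {
    "risk_report.exposure": "warehouse.exposure",
    "warehouse.exposure": "trading.positions",
}
authoritative_sources = {"trading.positions"}

def traces_to_authoritative(element: str) -> bool:
    """Walk upstream until an authoritative source (or a dead end) is reached."""
    seen = set()
    while element not in authoritative_sources:
        if element in seen or element not in lineage:
            return False  # cycle or undocumented upstream: a lineage gap
        seen.add(element)
        element = lineage[element]
    return True

print(traces_to_authoritative("risk_report.exposure"))  # True
print(traces_to_authoritative("orphan.field"))          # False
```

An element that fails this check signals exactly the kind of gap the Data Control Environment is meant to surface.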

8.0 Analytics Management

The Analytics Management (AM) component is a set of capabilities used to structure and manage the Analytics
activities of an organization. The capabilities align AM with DM in support of business priorities. They address
the culture, skills, platform, and governance required to enable the organization to obtain business value from
analytics.

Scope
• Define the Analytics strategy.
• Establish the AM function.
• Ensure that analytics activities are driven by the Business Strategy and supported by the DM strategy.
• Ensure clear accountability for the analytics created and for their uses throughout the organization.
• Work with DM to align Analytics with Data Architecture (DA) and Data Quality Management (DQM).
• Establish an analytics platform that provides flexibility and controls to meet the needs of the different
stakeholder roles in the Analytics operating model.
• Apply effective governance over the data analysis lifecycle. Governance includes tollgates for model
reviews, testing, approvals, documentation, release plans, and regular review processes.
• Ensure that Analytics follows established guidelines for privacy, data ethics, model bias, and model
explainability requirements and constraints.
• Manage the cultural change and education activities required to support the Analytics strategy.

DCAM Use Cases


DCAM has multiple uses within an organization.

• As a well-defined control framework
• As an assessment tool
• As an industry benchmark

DCAM as a Framework
When an organization adopts the standard DCAM framework, it introduces a consistent way of understanding
and describing DM. DCAM is a framework of the capabilities required for a comprehensive DM initiative,
presented as a best-practice paradigm. DCAM helps to accelerate the development of the DM initiative and
make it operational. The DCAM Framework:

• Provides a common and measurable DM framework
• Establishes a common language for DM
• Translates industry expertise into operational standards
• Documents DM capability requirements
• Proposes evidence-based artifacts

DCAM as an Assessment Tool


Using DCAM effectively as an assessment tool requires defining the assessment objectives and strategy,
planning the assessment, and adequately training the participants to establish a baseline understanding of the
DCAM Framework.


The assessment results translate the practice of DM into a quantifiable science. The benefits afforded an
organization from the assessment outcomes include:

• Baseline measurement of the DM capabilities in the organization compared to an industry standard.
• Quantifiable measurement of the progress the organization has made to operationalize the required DM
capabilities.
• Identification of DM capability gaps to inform a prioritized roadmap for future development aligned to
the organization’s business requirements for data and DM.
• Focused attention to the funding requirements of the DM initiative.
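An assessment of this kind ultimately rolls sub-capability scores up to the capability and component level, with gaps against a target informing the roadmap. A minimal sketch of that roll-up, using hypothetical capability names and scores (the actual DCAM scoring method is defined in the full Model):

```python
from statistics import mean

# Hypothetical sub-capability scores, grouped by capability, for one component
scores = {
    "1.1 Data strategy is defined": [4, 3, 5],
    "1.2 Business case is established": [2, 3],
}

# Roll up: capability score = mean of its sub-capabilities,
# component score = mean of its capabilities
capability_scores = {cap: mean(subs) for cap, subs in scores.items()}
component_score = mean(capability_scores.values())

# Gap analysis against an assumed target-state score
target = 4.0
gaps = {cap: target - s for cap, s in capability_scores.items() if s < target}

print(round(component_score, 2))  # 3.25
print(gaps)                       # only capability 1.2 falls short of target
```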

DCAM as an Industry Benchmark


The EDM Council conducted DM industry benchmark studies in 2015, 2017, and 2020. The benchmark is based
on the capabilities defined in DCAM and thus can be used in comparative analysis by organizations conducting a
DCAM assessment.

The industry benchmark is not restricted to organizations that are using DCAM. Input from a broader range of
DM industry practitioners affords an enhanced perspective on the state of the DM industry.

1.0 Data Management Strategy & Business Case
Introduction
The Data Management Strategy & Business Case determines how data management (DM) is defined,
organized, funded, governed, and embedded into the operations of the organization. It defines the long-term
vision, including a description of the stakeholders or stakeholder functions that must be aligned. The Data
Management Strategy demonstrates the business value that the program will seek to achieve. It becomes the
blueprint for the organization to evaluate, define, plan, measure, and execute a successful and mature DM initiative.

The purpose of developing a DM strategy and business case is to articulate the rationale for the DM initiative.
The strategy defines why the initiative is needed, as well as the goals, and expected benefits. The strategy also
describes how to mobilize the organization in order to implement a successful DM initiative. The DM business
case provides the rationale for the investment in the DM initiative. DM is no different than any other established
business process. It needs to be justified, funded, measured and evaluated. It provides clarity of purpose,
enabling agreement and support of initiative objectives from senior executives as well as program stakeholders.

Definition
The Data Management Strategy (DMS) & Business Case component is a set of capabilities to define, prioritize,
organize, fund and govern DM and how it is embedded into the operations of the organization in alignment with
the objectives and priorities of both the enterprise and operating units. The DM business case is the justification
for creating and funding a DM initiative. The business case articulates the major data and data related issues
facing an organization or operational unit and describes the expected outcomes and benefits that can be
achieved through the implementation of a successful DM initiative.

Scope
• Establish a DMS function within the Office of Data Management (ODM).
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for the DMS function.
• Align the DMS with the business strategy, objectives, and priorities, including prioritization of data based
on its criticality to the business.
• Define the rationale and business case for management of data as an asset through the organization-
wide DM initiative.
• Ensure the DMS is aligned with the organization-wide Enterprise Data Management Principles.
• Articulate the DM target-state and current-state. Then, using DCAM as an assessment tool for gap
analysis and prioritized gap closure, create a cohesive execution plan.
• Define the high-level execution roadmap.
• Define strategy execution risks and mitigations.
• Define DM performance metrics.
• Document the DMS with a compelling presentation of the value of an organization-wide DM initiative.
• Ensure that the DMS governance is integrated into the Data Governance (DG) structure.
DCAM Framework: 1.0 Data Management Strategy & Business Case

Value Proposition
Organizations that have enterprise and operating unit executives who understand, support, and offer direction
for the organization-wide DM initiative lower organizational risk and get better acceptance of the DM Initiative
at all levels of staff. Staff engagement in the sustainable management of data for the short and long-term
success of the organization is essential. Organizations that implement effective DM get a return on investment
from several areas:

• Efficiency and effectiveness of data issue resolution, compliance, and auditable demonstration
• Improved enterprise risk management
• Efficiency in business process optimization
• Innovation and differentiation for customers

Overview
The DMS component is one of the three foundational components of the DCAM Framework. The DMS is what
integrates the strategies of each of the other components of the Framework into an overall strategy for the
execution of the DM initiative. It is important to note that the DMS is a function, within the Office of Data
Management in which strategic planning capabilities and skills reside. (You may refer to Overview in the Data
Management Program section of this guide for detail on the structure of the Office of Data Management).

Using the DCAM Framework provides a structure for the DM initiative that includes the core principles of DM. It
also helps stakeholders understand the value of DM as it relates to their operating units and strategic initiatives.

The strategy should align with the organization’s articulated target operating model for executing DM with a
roadmap and timeline to achieve the target. It is important for a strategy to compare the target-state to the
current-state in order to show the organizational, functional, operational, and technological gaps and
inefficiencies. The Strategy can then define, prioritize, and schedule gap closure. DCAM used as a capability
assessment tool is fundamental to the analysis of gaps in each operating level and organizational unit.
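The gap analysis described here lends itself to simple quantification. The sketch below is illustrative only: the component names, capability names, weights, and the 1–6 maturity scale are assumptions for the example, not DCAM prescriptions. It shows one way capability gaps between current-state and target-state scores might be prioritized for a roadmap.

```python
# Illustrative only: one way to quantify capability gaps from a
# DCAM-style assessment. Scores assume a 1-6 maturity scale; the
# weights and names below are hypothetical.
from dataclasses import dataclass

@dataclass
class CapabilityScore:
    component: str
    capability: str
    current: int    # assessed current-state maturity
    target: int     # agreed target-state maturity
    weight: float   # business criticality of the capability

    @property
    def gap(self) -> int:
        # Only shortfalls against the target count as gaps.
        return max(self.target - self.current, 0)

    @property
    def priority(self) -> float:
        # Larger gaps on more critical capabilities rank higher.
        return self.gap * self.weight

scores = [
    CapabilityScore("1.0 DM Strategy", "Business case approved", 2, 5, 0.9),
    CapabilityScore("5.0 DQ Management", "DQ metrics reported", 3, 4, 1.0),
    CapabilityScore("3.0 Data Architecture", "Domains identified", 4, 5, 0.6),
]

roadmap = sorted(scores, key=lambda s: s.priority, reverse=True)
for s in roadmap:
    print(f"{s.component}: {s.capability} (gap {s.gap}, priority {s.priority:.1f})")
```

Prioritized output of this kind can feed the high-level execution roadmap and the gap-closure schedule for each organizational unit.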

The DMS integrates the Framework components at each operating level throughout the organization. For detail
on the levels and context at which DM operates, refer to Data Management Operating Levels in the introductory
section of this guide.

A strategy must be documented for each organizational unit at the various operating levels of the organization
and work in concert with the other organizational units across the organization. Within a single organizational
unit, each DCAM component has a unique input to the strategy that is then integrated with the other
component input and prioritized in the final strategy for that organizational unit. These inputs must align with
the DM target operating model for the organization. The target operating model defines the expected
component and capability requirements for the operating level of the organizational unit.

Because not all organizational units will be at the same maturity in the design and execution of their DM
initiative, these strategies are specific to their business objectives, priorities, and identified DM
inefficiencies and gaps.

The importance of the physical documentation should not be underestimated because the document is the
primary internal marketing tool to drive understanding and support from all stakeholders at all levels of the
organization.

The DMS includes the business case that describes how value will be realized from the data assets of an
organization, through the collaboration of business, data, and technology.


Diagram 1.1: Data Asset Value Model

The DM business case is the cost-benefit realization of the set of activities and deliverables expected from the
DM initiative. It answers the question: why is the firm focusing on data management? A clear answer to this
question helps achieve alignment across the stakeholders. The business case helps management understand the costs,
benefits, and risks associated with the evolution of the DM initiative. It is essential to link the business case with
realistic strategic and tactical measurement criteria and align them with the long-term sequence plan for the DM
initiative. This enables the organization to understand the total costs associated with implementation as well as
maintenance of the DM initiative and helps ensure that it is sufficiently funded to meet both near and long-term
objectives.
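As a purely hypothetical illustration of linking the business case to measurement criteria, the figures below sketch a multi-year cost/benefit view. All amounts are invented for the example; they are not benchmarks.

```python
# Hypothetical multi-year cost/benefit view for a DM business case.
# All figures are invented for illustration only.
years = [1, 2, 3, 4]
implementation_cost = {1: 900_000, 2: 600_000, 3: 200_000, 4: 0}
run_cost            = {1: 100_000, 2: 250_000, 3: 300_000, 4: 300_000}
realized_benefit    = {1: 150_000, 2: 500_000, 3: 900_000, 4: 1_200_000}

net_by_year = {y: realized_benefit[y] - implementation_cost[y] - run_cost[y]
               for y in years}

running, cumulative = 0, []
for y in years:
    running += net_by_year[y]
    cumulative.append((y, running))
# The cumulative view shows when the initiative is expected to break
# even, which supports realistic near- and long-term funding decisions.
```

Separating implementation cost from ongoing run cost, as above, reflects the guide's point that the business case must cover both implementation and maintenance of the DM initiative.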

The DM business case articulates the benefits of DM, in alignment with the objectives defined, communicated
and agreed upon in the DMS. It discusses the defensive benefits of the initiative, including operational cost
reduction, improved regulatory reporting, streamlined risk management, controlled data governance, and
improved data quality (DQ). It also highlights the offensive benefits of the initiative, which include advanced
analytics, improved customer service, innovative product development, increased revenues, and improved
market penetration.

In some cases, the best way to build support for the business case is through a demonstrative proof of concept
or pilot project. In these instances, a specific pain point or high-profile business objective would be selected and
used to demonstrate the benefits of implementing effective DM. If this approach is used, it is important to select
a project that is achievable and can provide quick wins. This approach builds confidence among stakeholders on
the foundational benefits of DM to ensure sustainability. Regardless of whether you define the business case
with or without a proof of concept, all activities must align to the strategic business objectives of the
organization.

The DM strategy and the business case are not static and must be able to evolve as the priorities and needs of
the organization change. The most effective and successful DM strategies are living artifacts that are visibly
endorsed by executive management and are supported by mandatory organizational policy.

2.0 Data Management Program & Funding Model
Introduction
The Data Management Program is an organizational function dedicated to the management of data as an asset
throughout an organization. It illustrates how the management of data quality (DQ), its definition and its
content support strategic, business and operational objectives. It also reinforces the necessity of orchestration,
active collaboration and alignment among diverse stakeholders in order to instill confidence in data as a trusted
factor of input into business and operational processes.

The purpose of a data management (DM) program is to organize and embed the DM concepts into the
operational framework of an organization on a sustainable basis. The creation and implementation of the DM
program elevates the importance of DM and integrates it as a core aspect of organizational operations. It
establishes DM as a sustainable activity by ensuring sustainable funding. It reinforces the importance of
managing data across the organization via education, training, and communication.

The Data Management Funding Model describes the overall framework and high-level engagement of senior
management used to ensure that the objectives and processes of DM become a sustainably funded activity
within the organization.

Definition
The Data Management Program (DMP) & Funding Model component is a set of capabilities to manage the
Office of Data Management (ODM). These capabilities cover the organizational structures and resource
requirements, along with the full range of Program Management Office (PMO) activities, such as program
execution, stakeholder management, funding management, communications, training, and performance
measurement. The DM funding
model within the DMP is designed to provide the mechanism to ensure the allocation of sufficient capital
needed for implementation of the Program. It also defines and describes the methodologies used to measure
both the costs and the organization-wide benefits derived from the DM initiative.

Scope
• Establish a DMP function to implement the Program Management Office (PMO) capabilities within the
ODM.
• Facilitate the design and implementation of sustainable business-as-usual DM processes and tools across
the components and their capabilities.
• Establish roles and responsibilities related to the DM capabilities aligned with an organizational
structure and execute in an ODM.
• Define Funding Model, secure and monitor funding, and institute cost and benefits tracking aligned to
the Business Case.
• Establish the DM execution roadmap with supporting project plans to build upon the high-level Data
Management Strategy (DMS) roadmap.
• Engage each stakeholder across the data ecosystem as appropriate to their roles in resource alignment,
funding, communications, training and skill development.
DCAM Framework: 2.0 Data Management Program & Funding Model

• Manage the DM initiative by monitoring and socializing DM performance metrics.
• Ensure that the DMP governance is integrated into Data Governance (DG).

Value Proposition
Organizations that ensure the right people are involved in the DM initiative demonstrate the ability to eliminate
the replication and misuse of data and improve their ability to integrate data based on enterprise standards.

Organizations that define and follow set processes and standard operating procedures, including requesting,
sharing, defining, producing, and using data, demonstrate the ability to ensure the sharing of data aligns with
enterprise standards.

A DM Funding Model with an appropriate, sustainable funding framework is required to ensure the success of
the objectives and processes of the DM initiative.

Organizations that effectively implement the DMP and achieve an accountable funding model get a return on
investment from several areas:

• Establishment of an organization-wide data culture
• Efficiencies in operations with repeatable and sustainable processes
• Productivity from resource and skill alignment
• Continuity of funding over time
• Sustainability of the DM initiative

Overview
The DMP is a foundational component of the DCAM Framework. It should be established as a
formal, independent and sustainable part of the organization, within the ODM. The DMP needs to establish the
lines of responsibility and accountability within the ODM organizational structure and beyond into federated
organizational structures (e.g. operating units, regions, etc.). The DMP must ensure the ODM has access to the
appropriate staff resources and functional capabilities in order to deliver the data needed to support
organizational objectives. An effective DMP has the strong support of executive management, appropriate
governance authority to ensure the implementation of a data control environment and a well-structured model
of how stakeholders will engage in data-related issues. An effectively designed DMP that is flexible enough to
adapt to changing circumstances will help embed the importance of DM into the culture of the
organization. A key goal is to instill a sense of collective respect for the role of data among all stakeholders.

It is important to note that the DMP is an organization, or function, within the ODM in which typical
Program/Project Management Office capabilities and skills reside. The DMP function organizes DM activity
throughout the organization. (Refer to Data Management Operating Levels in the introductory section of this
guide for detail on the levels and context at which DM operates.) The DMP function will operate in a similar way
to a typical program or project, but the DMP must be an ongoing, sustainable business-as-usual organization-
wide commitment that organizes DM for the long term.


Diagram 2.1: DMP in ODM

Diagram 2.2: Enterprise DMP with Operating Unit DMPs

The DM funding model defines the mechanism used to generate and maintain the capital needed for the DM
initiative throughout its lifecycle. It establishes the methodology used for cost allocation among business lines
and can be used to help align stakeholders on funding-related issues. In mature organizations, the funding
model reflects the individual requirements of the various DM initiative components of the organization and is
integrated with governance to ensure that appropriate oversight and accountability is applied to DM. Verifiable
metrics are essential and must be aligned with tangible business objectives. A well-structured funding model can
help avoid recurring debates over business priorities, mitigate internal competition and facilitate open
discussions among stakeholders.
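One common allocation methodology is a usage-based chargeback across business lines. The sketch below is a simplified example: the unit names, the allocation driver (critical data elements consumed), and the cost figure are all hypothetical assumptions, not a prescribed model.

```python
# Hypothetical usage-based cost allocation for a DM funding model.
# Unit names, driver counts, and the cost figure are invented examples.
total_dm_cost = 1_200_000  # annual DM initiative cost to be allocated

# Allocation driver: e.g. critical data elements consumed per unit
usage = {"Retail Banking": 450, "Wealth Management": 300, "Risk & Finance": 250}

total_usage = sum(usage.values())
allocation = {unit: round(total_dm_cost * n / total_usage, 2)
              for unit, n in usage.items()}
# A transparent, driver-based methodology like this helps align
# stakeholders and defuse recurring debates over funding shares.
```

The choice of driver (data volume, critical elements, headcount, revenue share) is itself a governance decision and should be agreed with the business lines being charged.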


Strong consideration should be given to allocating initial funding to the DM initiative as an enterprise-level
expenditure rather than funding it through individual organizational units. A grassroots, unit-by-unit funding
approach can become mired in competition among organizational units, is often aligned with a tactical view of
DM, and frequently reinforces short-term evaluation cycles. An organization can expect its funding model to
evolve along with the maturity of its DM initiative.

There is no single model for funding DM. The specific model implemented will depend on the dynamics and
operational culture of the individual firm. Some organizations will fund centrally, others will fund through the
organizational units, while still others may take a hybrid approach. There are pros and cons to all of these
approaches. Whichever is selected, the fundamental aspects of the funding model, such as investment
criteria/priorities, budget management, delivered-versus-expected benefits, allocation methodology and capital
needed for ongoing management of the initiative should always be included. Most importantly, the funding
model must reflect a multi-year journey, incorporating both initial implementation costs, as well as sustainable
ongoing funding. DM must become a day-to-day operation and must be funded accordingly to ensure it
becomes part of the fabric of an organization’s operation.

3.0 Business & Data Architecture
Introduction
The path to integrated architecture across the organization begins with business architecture and how it defines
requirements for data architecture.

Business Architecture is the strategy, design and execution of the capabilities needed to
support the organization’s business functions.

Data Architecture is the strategy and execution of how data is designed (identified and
described) to support the business objectives.

Business architecture defines the processes required to meet the objectives of the business. The processes have
requirements for data and data management (DM). Those requirements must be defined as input and output of
the business process.

Data architecture speaks to the design, definition, management and control of information content. Data
architecture identifies data domains, documents metadata, defines critical data elements, establishes
taxonomies and ontologies that are critical to ensuring that the meaning of data is precise and unambiguous,
and that the usage of data is consistent and transparent.

A data architecture function establishes consistency in definition and use of data throughout an organization.
Adhering to a prescribed data architecture forces business and technology resources to take the necessary steps
to define and document data meaning, define the appropriate use of the data, and to ensure that proper
governance is in place to manage data as meaning on a sustainable basis.

Definition
The Business and Data Architecture (DA) component is a set of capabilities to ensure integration between the
business process requirements and the execution of the DA function. The business process is defined by the
business architecture function. DA defines data models such as taxonomies and ontologies as well as data
domains, metadata, and business critical data to execute processes across the data control environment. The DA
function ensures the control of data content, that the meaning of data is precise and unambiguous and that the
use of data is consistent and transparent.

Scope
• Establish a DA function within the Office of Data Management (ODM).
• Work with the DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for DA, including the required integration with business architecture.
• Identify and establish data domains, authoritative sources, and provisioning points.

• Identify and inventory the data to support the business requirements, including all necessary metadata
such as glossary, dictionary, classification, and lineage.
• Define and assign business definitions, linked to the data inventory.
• Ensure that the DA governance is integrated into Data Governance (DG) and aligned to both business
and technology governance activities.

Value Proposition
Organizations that identify, record and make available information about the internal constituencies that define,
produce, and use specific data demonstrate efficient and effective coordination, cooperation, and
communications around this data.

Organizations that document information about highly valued data elements demonstrate improved
understanding and business use of these data.

Organizations that effectively implement DA to understand their data and data ecosystem get a return on
investment from several areas:

• Operational excellence in business processes creates efficiencies and lowers operating cost
• Creation of straight-through, fit-for-purpose data, reduced data debt and remediation costs and
increased value derived from advanced analytics
• Greater understanding of your data leads to data simplification and reduces the cost of DM and
maintenance
• Understanding your data also reduces operational, financial and reputational risks associated with using
the wrong data for analytics, decision-making, and regulatory reporting

Overview
The DA component establishes the unambiguous definition and use of data throughout an organization.
Adhering to a prescribed data architecture forces business and technology to take the necessary steps to define
and document data meaning, define the appropriate use of the data, and to ensure that proper governance is in
place to manage data as meaning on a sustainable basis.

Data exists throughout an organization across all facets of business operations. The design of an organization's
data architecture is based on a comprehensive understanding of business requirements and their impact on
what data is needed. Unraveling the business process informs how data should be identified, defined, modeled
and related. Technology architecture then dictates how the data architecture design is organized and placed into
physical repositories in order to provide optimized access, security, efficient storage management and speed of
processing.

To establish a successful DA function, the following sequence of activities is required.

First, the following two activities will allow an organization to understand what data is needed to satisfy the
business requirements.

• Identification of Logical Data Domains
Logical data domains are the logical groupings of data, not the databases themselves, that are needed
to satisfy the business requirements.
• Identification of Physical Repositories
Underlying the logical data domains are multitudes of physical, often overlapping repositories of data
that will map into the logical data domains. Identification of these underlying physical repositories is a
critical step towards minimizing the complexity of legacy environments, reducing replication, better
understanding data lineage, assigning data ownership, and assessing data quality (DQ).
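To make the domain-to-repository mapping concrete, the sketch below shows how cataloguing which physical repositories hold data for each logical domain exposes replication. The repository and domain names are hypothetical examples.

```python
# Illustrative sketch: mapping physical repositories into logical data
# domains to expose overlap and replication. All names are hypothetical.
from collections import defaultdict

# physical repository -> logical data domains it holds data for
repo_to_domains = {
    "crm_db":      ["Customer"],
    "billing_db":  ["Customer", "Product"],
    "mdm_hub":     ["Customer", "Product"],
    "trade_store": ["Transaction"],
}

domain_to_repos = defaultdict(list)
for repo, domains in repo_to_domains.items():
    for d in domains:
        domain_to_repos[d].append(repo)

# Domains served by more than one repository are candidates for
# authoritative-source designation and replication reduction.
replicated = {d: repos for d, repos in domain_to_repos.items()
              if len(repos) > 1}
```

An inventory of this shape also supports the scope items above: identifying authoritative sources, provisioning points, and lineage for each domain.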

Once the data domains and their underlying physical sources of data have been identified, precise business
definitions using common semantic language for the identified data must be assigned and agreed upon by
stakeholders. DA is about managing the meaning of data. The importance of assigning precise definitions in the
context of business reality, the creation of a shared data dictionary and getting the agreement from both data
producers and data consumers cannot be minimized. Without this common understanding of data attributes,
aligned to business meaning, data architecture will struggle to succeed. The risk of inappropriate use of data will
increase and the ability to share data across an organization with confidence will be hindered.

The next step in addressing data architecture is to define data taxonomies and business ontologies. Data
taxonomies define how data entities are structurally aligned and related. For each officially designated data
domain that is identified, inventoried and deemed critical, a taxonomy must be defined and maintained. The
taxonomy is then mandated for all systems using this data as input into their business processes. With critical
business function taxonomies defined and in place, the organization needs to model the relationships between
taxonomies into a business ontology. Ontologies are the relationships and knowledge of multiple related
taxonomies across data domains.

A comprehensive data architecture process may include the following; however, this level of complexity is not
always required.

• The business function defines the data in a business model based on the requirements for data as an
input and output of the business process.
• These business function data models are then consolidated into an organization-wide business data
model.
• For each data domain, a taxonomy is defined, maintained, and mandated for all systems using this data
as input to the business process. Data taxonomies define how data entities are structurally aligned and
related.
• With business function taxonomies defined and in place, the organization models the relationships
between taxonomies into a business ontology. These ontologies represent the relationships and
knowledge of multiple related taxonomies across data domains.
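The taxonomy and ontology concepts above can be sketched as simple data structures: a taxonomy as a parent-child hierarchy within one domain, and an ontology as named relationships spanning taxonomies. The domains, terms, and relationships below are hypothetical examples, not prescribed models.

```python
# Minimal sketch with hypothetical names: taxonomies as child -> parent
# hierarchies within a domain; an ontology as relationships across them.
party_taxonomy = {
    "Retail Customer": "Customer",
    "Corporate Customer": "Customer",
    "Customer": "Party",
    "Supplier": "Party",
}

product_taxonomy = {
    "Checking Account": "Deposit Product",
    "Deposit Product": "Product",
}

# Ontology: relationships that span taxonomies across data domains
ontology = [
    ("Customer", "holds", "Deposit Product"),
    ("Supplier", "provides", "Product"),
]

def ancestors(taxonomy, term):
    """Walk up the hierarchy from a term to the root concept."""
    chain = []
    while term in taxonomy:
        term = taxonomy[term]
        chain.append(term)
    return chain
```

Because every system using a domain's data is mandated to use its taxonomy, a shared structure like this keeps structural alignment consistent across consuming systems.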

Taxonomies and ontologies define and relate the content of data to enable the organization to realize the
maximum value of its data in a consistent and controlled manner. Once the content is defined, it needs to be
precisely described using metadata. Types of metadata may include, but are not limited to, business,
operational, technical, descriptive, structural, and administrative metadata.
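As one possible illustration, a metadata record for a single critical data element might group these metadata types as follows. The field names and values are assumptions for the example, not a DCAM-mandated schema.

```python
# Hedged illustration: one possible shape for a metadata record
# describing a critical data element. Field names are assumptions.
critical_data_element = {
    "name": "customer_id",
    "business": {                      # business metadata
        "definition": "Unique identifier assigned at customer onboarding",
        "data_domain": "Customer",
        "owner": "Head of Customer Data",
    },
    "technical": {                     # technical metadata
        "type": "string",
        "format": "CUST-########",
        "nullable": False,
    },
    "operational": {                   # operational metadata
        "authoritative_source": "mdm_hub",
        "lineage": ["crm_db", "mdm_hub"],
    },
    "administrative": {                # administrative metadata
        "classification": "Confidential",
        "retention_years": 7,
    },
}
```

Linking records like this to the data inventory gives both producers and consumers one agreed description of each element's meaning, source, and appropriate use.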


4.0 Data & Technology Architecture


Introduction
The requirements for data, as well as for data management (DM), are interpreted through data architecture. This
interpretation defines the requirements to design the physical data consumed, produced and provisioned by the
business process. Data architecture is a bridge between the requirements for data of the business process and
the physical execution of that data in technology infrastructure.

Technology Architecture is the strategy and execution of how the physical infrastructure is
designed to support the business and data needs of the organization.

Technology architecture refers to the strategy, design and implementation of the technology infrastructure
which supports the defined business and data architecture. Technology architecture defines the platforms and
the tools and how they need to be designed for maximum efficiency to support the requirements of the
business process, data and DM. The purpose of technology architecture is to support the business process and
define how data is physically acquired, moved, persisted and distributed in a streamlined and efficient manner.
Physical data proximity, bandwidth, processing time, backup, recovery and archiving are some of the important
elements of a mature technology architecture.

The efficient and effective movement of data is critical to business operations. Technology architecture
determines how data, tools and platforms operate in collaboration to satisfy business requirements. The proper
alignment of these components dictates application efficiency and system processing speed. This enables
organizations to control costs and achieve infrastructure scalability and elasticity which are characteristic of an
organization that is designed for long-term implementation success.

The roadmaps to achieve architecture integration across business, data and technology need to be aligned on a
path to the target-state. The roadmaps define the governance and controls that are needed to ensure
compliance across the organization.

Definition
The Data & Technology Architecture (TA) component is a set of capabilities to align the architectural
requirements of the business, data, and technology across the organization to support the desired business
process outcomes. These outcomes include the DM processes and required technology infrastructure.

Scope
• Work with the DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for the integration of business and technology architecture with data
architecture.
• Ensure DM function alignment with business, data and technology architecture and strategy.
• Ensure that the DM governance is aligned to both business and technology governance activities.

Value Proposition
Organizations that effectively deliver technology architecture aligned to business and data architecture get a
return on investment from several areas:


• A simplified environment allows an organization to be nimble to market, agile in response to regulatory
changes, and faster to innovate
• Tool selection and implementation are simplified and aligned to business process and data
requirements, which reduces complexity and cost
• A data storage strategy is developed consistent with business objectives while controlling cost and risk
• Operational risk architecture ensures the continuous flow of data to critical business functions in the
event of an outage incident

Overview
Information architecture is the combination of data architecture and technology architecture. Neither business
architecture nor data architecture should dictate technology. The technology infrastructure of an organization
is the responsibility of the technology function. However, the requirements of business architecture and data
architecture inform technology. Data architecture captures the information requirements of the business
process and translates them into the what, where and when of data: what data is needed; where is it to be
delivered, and by when. Technology architecture is the enabler and defines the plan and roadmap for
implementation.

There are four areas of technology architecture that are critical to a successful DM initiative.

• Database Platforms: Technology architecture defines acceptable data platforms for enterprise use.
Enterprise-class database platforms, appliance technologies, distributed computing, and in-memory
solutions all need to be defined, communicated and governed by technology architecture.
• Tools: Often one of the biggest expenses and source of inconsistent handling of data is the proliferation
of multiple, disparate DM technology tools within an organization. Technology architecture must define
the allowable tool stacks – what Business Intelligence (BI) tools, extract, transform & load (ETL) tools,
and various discovery tools are permitted for use within the organization.
• Storage Strategy: Technology architecture must define how the organization will store and maintain its
data. Several aspects of the target-state storage strategy are the determination of how to maintain data
and control data storage. This includes decisions about the use of internal versus external cloud
technology, how data will be archived and retained, and how data will be defensibly removed and
destroyed from the organization’s infrastructure.
• Operational Risk Planning: A sound technology architecture addresses operational risk, business
continuity, and disaster recovery strategies. Data is the lifeblood of an organization and needs proper
planning to ensure that data flows to all parts of the organization even in the face of events that
interrupt business continuity.

Finally, all the above aspects of a sound technology architecture must be supported by a strong technology
governance operating model integrated with the governance of business and data architecture. Policies must be
in place, agreed to by business, data and technology stakeholders, supported by executive management, and
subject to internal audit scrutiny and adherence. Without integrated governance over business, data and
technology architecture, infrastructure and tools will grow independently of each other. This uncontrolled
infrastructure will result in inefficiencies and security issues putting data quality (DQ) and the organization at
risk.
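A simple automated check can support this kind of governance. The sketch below, with illustrative tool names and categories, flags tools in use that fall outside the allowable tool stacks defined by technology architecture.

```python
# Illustrative governance check: verify that tools in use fall within
# the allowable tool stacks defined by technology architecture.
# Tool names and categories are examples only.
ALLOWED_STACKS = {
    "BI":  {"Tableau", "Power BI"},
    "ETL": {"Informatica", "dbt"},
}

# (category, tool) pairs discovered in the environment
in_use = [("BI", "Tableau"), ("ETL", "HomeGrownLoader"), ("BI", "Power BI")]

violations = [(cat, tool) for cat, tool in in_use
              if tool not in ALLOWED_STACKS.get(cat, set())]
# Each violation feeds the architecture-governance exception process
# rather than being silently tolerated.
```

Routine checks like this give internal audit something concrete to scrutinize, and help prevent the uncontrolled tool proliferation described above.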


5.0 Data Quality Management


Introduction
The Data Quality Management function defines the goals, approaches and plans of action that ensure data
content is of sufficient quality to support defined business and strategic objectives of the organization. The
function should be developed in alignment with business objectives, measured against defined data quality (DQ)
dimensions and based on an analysis of the current state of DQ. Data Quality Management is a series of
processes across the full data supply chain to ensure that the data provisioned meets the needs of its intended
consumers.

DQ requires an understanding of how data is sourced, defined, transformed, provisioned and consumed. DQ is
not a process itself but describes the degree to which data is fit-for-purpose for a given business process or
operation.

Definition
The Data Quality Management (DQM) component is a set of capabilities to define data profiling, DQ
measurement, defect management, root cause analysis and data remediation. These capabilities allow the
organization to execute processes across the data control environment ensuring that data is fit for its intended
purpose.

Scope
• Establish a DQM function within the Office of Data Management (ODM).
• Work with data management (DM) Program Management Office (PMO) to design and implement
sustainable business-as-usual processes and tools for DQM.
• Execute DQM processes against business-critical data. DQM processes include profiling & grading,
measurement, defect management, root cause analysis and remediation.
• Establish DQ metrics and reporting routines.
• Ensure that DQM governance is integrated into the Data Governance (DG) function.

Value Proposition
Organizations that build, formalize and assign DQ responsibilities into daily routine and methodology achieve a
sustainable organization-wide data culture.

Organizations that effectively implement Data Quality Management and achieve the appropriate level of DQ
across the data ecosystem get a return on investment from several areas:

• Better risk management
• Enhanced analytics
• Better client service and product innovation
• Improved operational efficiencies

Overview
DQ is a broad conceptual term that needs to be understood in the context of how data is intended to be used.
Perfect data is not always a viable objective. The quality of the data needs to be defined in terms that are
relevant to the data consumers to ensure that it is fit for its intended purpose. The overall goal of DM is to
ensure that data consumers have confidence in the data they receive. These consumers are using this data to
support their business functions. For them to make accurate decisions, the data must reflect the facts it is
designed to represent without the need for reconciliation or manual transformation.

The organization needs to develop a DQM strategy and establish the overall plans for managing the integrity and
relevance of its data. One of the essential objectives is to create a shared culture of DQ stemming from
executive management and integrated throughout the operations of the organization. To achieve this cultural
shift, the organization must agree on both requirements and the measurement of DQ that can be applied across
multiple business functions and applications. This will enable business sponsors, data producers, data consumers
and technology stakeholders to link DQ management processes with objectives.

DQ can be segmented into dimensions:

• Accuracy: the relationship of the content with original intent
• Completeness: the availability of required data attributes
• Coverage: the availability of required data records
• Conformity: alignment of data content with required standards
• Consistency: the degree to which data values agree across systems, data sets and representations
• Timeliness: the currency of content representation as well as whether the data is available/can be used
when needed
• Uniqueness: the degree to which no record or attribute is recorded more than once
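
These dimensions become actionable when each is expressed as a measurable check. A minimal sketch in
Python follows; the records, field names, and rules are illustrative assumptions, not part of the DCAM
specification:

```python
import re

# Illustrative records; field names and rules are hypothetical.
records = [
    {"id": "A1", "name": "Acme Corp", "country": "US"},
    {"id": "A2", "name": None,        "country": "USA"},
    {"id": "A1", "name": "Acme Corp", "country": "US"},
]

def completeness(rows, attr):
    """Share of records where the attribute is populated."""
    return sum(r[attr] is not None for r in rows) / len(rows)

def uniqueness(rows, key):
    """Share of records whose key value appears exactly once."""
    keys = [r[key] for r in rows]
    return sum(keys.count(k) == 1 for k in keys) / len(rows)

def conformity(rows, attr, pattern):
    """Share of populated values matching the required format."""
    vals = [r[attr] for r in rows if r[attr] is not None]
    return sum(bool(re.fullmatch(pattern, v)) for v in vals) / len(vals)

scores = {
    "completeness(name)":        completeness(records, "name"),
    "uniqueness(id)":            uniqueness(records, "id"),
    "conformity(country, ISO-2)": conformity(records, "country", r"[A-Z]{2}"),
}
for dimension, score in scores.items():
    print(f"{dimension}: {score:.2f}")
```

Scores such as these can then be compared against the tolerances and thresholds defined by the DQ
requirements.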

The identification and prioritization of data quality dimensions foster effective communication about DQ
expectations and are an essential prerequisite of the DM initiative.

Creating a profile of the current state of DQ is an important aspect of the overall DQM function. A new profile
should be created periodically and whenever data is transformed. The goal is to assess patterns in the data as well as to
identify anomalies and commonalities as a baseline of what is currently stored in databases and how actual
values may differ from expected values. Once the data profile is established, the organization needs to evaluate
the data against the quality tolerances and thresholds defined by the DQ requirements. The evaluation also
examines business requirements to validate that the data is fit-for-purpose.

The purpose of this evaluation process is to measure the quality of the most important business attributes of the
existing data and to determine what content needs remediation. A responsibility of the data producer and data
consumer is to identify the data that is critical to the data consumer’s business process. Prioritizing the data
based on criticality then informs the DQM function which attributes require a heightened level of control and
quality review. The designation of criticality requires that the highest level of accuracy and DQ treatment is
applied. The assessment process identifies the data that needs to be cleansed to meet data consumer
requirements. Data cleansing should be performed against a predefined set of business rules to identify defects
that can be linked to operational processes.
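
As a sketch of this idea, defects can be detected by running records through a set of named business rules,
with each defect tagged by the rule that caught it so it can be traced back to an operational process. The
rule names and fields below are hypothetical:

```python
# Hypothetical business rules; each defect is tagged with the rule that
# caught it so it can be linked to the operational process at fault.
rules = [
    ("missing_legal_name", lambda r: r.get("name") in (None, "")),
    ("negative_notional",  lambda r: r.get("notional", 0) < 0),
]

def find_defects(rows):
    """Return one defect entry per (record, violated rule) pair."""
    defects = []
    for i, row in enumerate(rows):
        for rule_name, violated in rules:
            if violated(row):
                defects.append({"record": i, "rule": rule_name})
    return defects

trades = [
    {"name": "Acme Corp", "notional": 1_000_000},
    {"name": "", "notional": -50_000},
]
# The defect log feeds defect management and root cause analysis.
print(find_defects(trades))
```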

Data cleansing should be performed as close to the point of capture as possible. There should be clear
accountability and a defined strategy for data cleansing to ensure that cleansing rules are known and to avoid
duplicate cleansing processes at multiple points in the data lifecycle. The overall goal is to clean data once at the
point of data capture, based on verifiable documentation and business rules, and to fix at the root cause the
processes that allowed defective data into the system. Data corrections must be communicated to, and
aligned with, all downstream repositories and upstream systems. It is important to have a consistent and
documented process for issue escalation and change verification for both data producers and data vendors.

It is also important to ensure that data meets quality standards throughout the lifecycle so that it can be
integrated into operational data stores. This aspect of the DQ management process is about the identification of
data that is missing, determination of data that needs to be enriched and the validation of data against internal
standards to prevent data errors before data is propagated into production environments.

For DQ to be sustained, a strong governance structure with the highest level of organizational support from
senior executive management must be in place. This structure supports the DQM activities and ensures compliance with DQ
processes. DQ processes need to be documented, operationalized and routinely validated via DM reviews and
formal audit processes.

DQ cannot be achieved through central control. Organization-wide DQ requires the commitment and
participation of a broad set of stakeholders. DQ is the result of a series of business processes creating a data
supply chain. Therefore, stakeholders along that chain must be in place, authorized and held responsible for the
quality of data as it flows through their respective areas. DQ requires coordinated organizational support. DQM
processes and objectives must be part of the operational culture of an organization for it to be sustained and
successful.

6.0 Data Governance


Introduction
The data governance function is the backbone of a successful data management (DM) initiative. Data governance is
the process of setting standards, defining rules, establishing policy and implementing oversight. It is these steps
that ensure adherence to DM best practices. Governance formalizes and empowers the DM initiative to ensure
propagation and sustainability throughout the organization.

The purpose of data governance is to formalize DM as an established business function. Data governance
establishes the rules of engagement, drives the prioritization of funding and enforces compliance. Data
governance delineates the guidelines for data movement. These movement guidelines prescribe how data will
be acquired, persisted, distributed, appropriately used, archived and/or defensibly destroyed. Data governance
formalizes oversight by establishing control guidelines, approval processes and evaluation of adherence to
policies and procedures. It identifies stakeholders and empowers them. Data governance ensures that DM
principles are fully detailed and adoption is achieved. Business, data and technology functions are held
responsible for the maintenance, quality and proper use of data throughout the organization as part of the data
governance function.

Definition
The Data Governance (DG) component is a set of capabilities to codify the structure, lines of authority, roles &
responsibilities, escalation protocol, policy & standards, compliance, and routines to execute processes across
the data control environment. This ensures authoritative decision making at all levels of the organization.

Scope
• Establish a data governance function within the Office of Data Management (ODM).
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and tools for data governance.
• Define clear roles, responsibilities, and accountabilities for DM resources including those mandated by
DM policy.
• Define and operate the data governance structure with clear lines of authority, responsibility for
decision making, engaged stakeholders, adequate oversight, issue escalation paths and tracking of
remediation activity.
• Develop and oversee adherence to comprehensive and achievable DM policies, standards and
procedures, including leading the response to audits.
• Ensure data governance function is aligned with other relevant control function policies, procedures,
standards, and governance requirements from information security, privacy, technology architecture
etc.

Value Proposition
Organizations that build, effectively communicate, and enforce DM policies assure themselves of lower levels of
enterprise risk when it comes to DM and data compliance assessments.

Overview
Governance is the key to successful DM. It establishes lines of authority and ensures that the principles of DM
can and will be implemented. It establishes the mechanisms for stakeholder collaboration and defines the
organizational structure by which the DM initiative will be governed. The governance infrastructure determines
where the initiative resides in the corporate hierarchy, helps manage stakeholder expectations, aligns policies
and standards to the organization’s mission and values, ensures the adoption of policies and standards,
articulates the mechanism for conflict resolution, ensures adequate funding and sets the methodology for
measuring DM progress.

Governance over the DM initiative is multidimensional and includes activities related to each of the DCAM
components. And while the most appropriate structure for any individual organization will vary, a clear mission
with links to tangible business objectives, as well as a mechanism for realignment, are essential for long-term
success. For example, domain councils might exist to oversee the intersection of business, data and technology
functions. Governing boards might be created to establish business data priorities and resolve conflicts. Tactical
groups might exist to manage workflow, perform data reconciliation, address quality of critical data elements,
perform business analysis and provide triage to resolve issues with defective data or outcomes that violate the
organization’s ethical standards. All these aspects need to be linked into an overall framework if governance is
going to embed DM concepts into the culture of the organization successfully.

The data governance framework establishes the mechanism by which authoritative decisions for data and DM
are made across each of the DCAM components. To implement governance, the organization must ensure that
the deployment plan will be effective within their business environment. The governance structure should
define the in-scope data that is required by the business objectives and establish the DM initiative strategy and
approach. It should grant authority and required funding to the ODM, establish policies and standards, and
make authoritative decisions about DM and data. The governance structure should provide an issues management
and escalation procedure. After the initial implementation, the governance framework itself needs to be
evaluated, measured and adjusted to reflect business reality and to ensure that it is fully integrated into
business-as-usual processes.

7.0 Data Control Environment
Introduction
The Data Control Environment refers to the state of operation in which the data assets of an organization are
holistically managed throughout the organization. There are three elements of a successful data control
environment.

1. The data management (DM) objectives and capabilities described within this document have been
embraced and adopted throughout the organization.
2. The data lifecycle is fully supported by all stakeholders. These stakeholders ensure understanding,
awareness and control of data throughout the data supply chain–from source to consumption to
disposition.
3. DM is part of the organization’s data ecosystem. It is integrated and coordinated with all other control
functions organization-wide.

The purpose of the data control environment is to coordinate the people, process and technology of DM into a
cohesive operational model. The data control environment defines the mechanisms used to capture data
requirements, unravel data flows and linked processes and determine how data is to be delivered to the data
consumer. The data control environment supports the data lifecycle. It ensures that proper resources and
controls are in place as data moves throughout its journey. Also, the data control environment ensures
collaboration and alignment to cross-organizational control functions. Areas such as Information Security, Data
Privacy and Change Management must operate in sync with DM to ensure data is properly managed across all
business functions.

To the extent that the data control environment is not achieved, the organization is exposed to data risk. Data risk should
be managed in alignment with the overall risk management framework of the organization. Data risk scope
includes areas such as data architecture risks, metadata risks, data quality risks, data governance risks and
Master Data risks.

Definition
The Data Control Environment (DCE) component is a set of capabilities that together form the fully operational
data control environment. Data operations, supply-chain management, cross-control function alignment and
collaborative technology architecture must operate cohesively to ensure the objectives of the DM initiative are
realized across the organization.

Scope
• Work with DM Program Management Office (PMO) to design and implement sustainable business-as-
usual processes and routines to enable a successful data control environment.
• Bring together the DM components as a coherent, end-to-end data ecosystem.
• Follow current DM best practices by routinely reviewing and auditing the capabilities and their
processes.
• Ensure all facets of DM for business-critical data such as data lifecycle, end-to-end data lineage and data
aggregations are fully operational.
• Ensure DM is aligned with other control function policies, procedures, standards, and governance.

Value Proposition
Organizations that improve their ability to reconcile requirements for data and accurately share data across the
organization’s ecosystem are able to better respond to changes in business process, regulatory, and audit
requirements.

Organizations that deliver data through a controlled environment respond more rapidly to market opportunities
and provide innovation to customers.

Overview
The DCE is where the execution components of Business & Data Architecture, Data & Technology Architecture,
Data Quality Management and Data Governance are made operational in the data supply chain by the data
producer. This operationalization brings a defined set of data into control and makes it available to data
consumers at a point in time, either real-time or period end.

Diagram 7.1: Data Supply Chain

One of the first functions within the DCE is the orchestration of the DM component disciplines. These disciplines
must be aligned to effectively manage data across the organization. The DCE forces the alignment of all the
capabilities discussed in this model into a consistent operational flow. Each capability must be properly
resourced and prioritized as well as supported by business, data and technology senior management.

The successful coordination of these components is a determining factor in the success of the DM initiative. It is
the responsibility of the DM organization and the senior data officer at each level of the organization to
structure and coordinate the DM operating model. The operating model properly defines data meaning, ensures data quality
(DQ), and delivers data in a timely and efficient manner. Evidence of the processes must be compiled through
demonstration of organizational structures, charters, policies, and senior management directives.

Data is a core factor of input into business functions and operational processes. The data lifecycle tracks the
progress of data from source to storage, to maintenance, to distribution and to consumption. From this point
the data may be reused, sent to the archive and finally to defensible destruction. The mechanisms used to
identify, align, and validate the data as factors of input into business functions are derived by reverse
engineering existing business processes into their individual data elements and by unraveling the data assembly
processes used to create the required data sets.

This reverse engineering process to define data requirements needs to be managed with precision. Only
precision will avoid confusion and miscommunication between what the business users truly need for their
business function and what technology professionals need for technical implementation. Data requirements
should be modeled, aligned with business meaning, prioritized in terms of how critical they are to the business
process and verified by all stakeholders. These steps ensure that essential concepts are not lost in translation.
This is particularly critical for data that is shared among multiple data consumers and for core data attributes
that are used as a baseline for onward expression in operational calculations or business formulas.

For complex applications and for all data aggregation-related processes, it is essential to understand and
document how the data moves from system-to-system; how the data is transformed or mapped; and how the
data is aligned to business definition and standard meaning. Gaining agreement on this data lineage process is
fundamental for ensuring that the results of decentralized or linked processing can be trusted to be consistent
and comparable.
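
One lightweight way to document such movement is a lineage graph that records, for each data set, its
sources and the transformation applied. A hedged sketch follows; the data set and transformation names are
invented for illustration:

```python
# A minimal lineage record: each entry maps a target data set to the
# sources and transformation that produce it. Names are illustrative.
lineage = {
    "risk_report":   {"sources": ["positions_agg"], "transform": "aggregate by desk"},
    "positions_agg": {"sources": ["trades_clean"],  "transform": "sum notional"},
    "trades_clean":  {"sources": ["trades_raw"],    "transform": "dedupe, standardize codes"},
}

def upstream(dataset):
    """Walk the lineage graph back to the original source data sets."""
    sources = []
    frontier = [dataset]
    while frontier:
        current = frontier.pop()
        node = lineage.get(current)
        if node is None:
            sources.append(current)   # a true source: nothing further upstream
            continue
        frontier.extend(node["sources"])
    return sources

print(upstream("risk_report"))
```

Agreeing on and maintaining a record like this is what allows the results of decentralized processing to be
traced back to authoritative sources.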

The final aspect of an effective DCE is the integration of DM into the data ecosystem of an organization. The
data ecosystem is a concept that describes how data is managed collaboratively with all cross-organization
control functions. Control functions such as information security, storage management, legal and compliance,
privacy, and vendor management all have responsibilities on how data is managed. It is imperative that the
policies of DM are integrated and aligned with the policies of the cross-organization control functions to ensure
data is being managed consistently and holistically organization-wide.

Finally, a DCE ensures technology’s alignment with DM policies and best practices. DM capabilities such as
architecture, governance, and DQ should be integrated into the organization's Software Development Lifecycle
(SDLC) processes to ensure that DM considerations are being adequately addressed at the appropriate stages of
the development cycle. Nothing should operate in a silo. Operating within an ecosystem recognizes
interdependencies and ensures collaboration.
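
As one illustration of embedding DM into the SDLC, a build pipeline might gate a release on a data contract
check. The contract, field names, and simplified type rules below are assumptions made for the sketch:

```python
# Hypothetical data contract checked as an SDLC gate: a release is
# blocked if a delivered record does not match the agreed schema.
# Type checking is deliberately simplified for illustration.
contract = {"trade_id": str, "notional": float, "currency": str}

def violates_contract(row):
    """Return the names of fields that are missing or mistyped."""
    return [field for field, required_type in contract.items()
            if field not in row or not isinstance(row[field], required_type)]

candidate = {"trade_id": "T-1", "notional": "1000", "currency": "USD"}
problems = violates_contract(candidate)
if problems:
    print("release blocked, contract violations:", problems)
```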

8.0 Analytics Management


Introduction
The first seven components of DCAM define the capabilities required for best practice Data Management (DM).
These DCAM best practices apply regardless of the purpose to which the data is subsequently put. Within these
components, the issues inherent in Artificial Intelligence/Machine Learning (AI/ML) and in an organization’s
Code of Data Ethics are particularly relevant where Analytics consumes the data. Consumption of data for
analytical purposes is increasingly important for many organizations. Analytics is dependent on high-quality,
well-understood data. Analytics functions are, in general, dependent on data produced by areas outside
Analytics or that is sourced from outside the organization.

The purpose of Analytics Management (AM) is to formalize how the Analytics activities of an organization are
structured, executed, and managed and to ensure they are aligned with the DM activities. The degree to which
Analytics teams are either centralized or distributed in an organization will depend on the structure and culture
of the organization. However, synergies, efficiencies, and effectiveness will be maximized if the teams operate
within a well-understood framework as part of a coherent Analytics strategy.

Definition
The AM component is a set of capabilities used to structure and manage the Analytics activities of an
organization. The capabilities align AM with DM in support of business priorities. They address the culture, skills,
platform, and governance required to enable the organization to obtain business value from analytics.

Scope
• Define the Analytics strategy.
• Establish the AM function.
• Ensure that Analytics activities are driven by the Business Strategy and supported by the DM strategy.
• Ensure clear accountability for the analytics created and for their uses throughout the organization.
• Work with DM to align Analytics with Data Architecture (DA) and Data Quality Management (DQM).
• Establish an analytics platform that provides flexibility and controls to meet the needs of the different
stakeholder roles in the Analytics operating model.
• Apply effective governance over the data analysis lifecycle. Governance includes tollgates for model
reviews, testing, approvals, documentation, release plans, and regular review processes.
• Ensure that Analytics follows established guidelines for privacy, data ethics, model bias, and model
explainability requirements and constraints.
• Manage the cultural change and education activities required to support the Analytics strategy.

Value Proposition
Organizations that manage their Analytics functions and resources effectively maximize the benefits of
combining advanced algorithms with high-quality data.

Overview
In simplest terms, analytics support decision making by analyzing data. They are not new. Statistical analysis of
manually collected data pre-dates the age of computing. Advances in technology have increased the speed,
variety, and sophistication of analysis, as well as the quantity and variety of data that can be analyzed. Analytics-
based, real-time, sophisticated decision making is accelerating and has become a key business differentiator.

Advances in technology and data include:

• Lower data storage costs, increasing the quantity of data that can be made available to analytics.
• New sources of data such as sensors, telematics, and satellite imagery, enabling new data sets and new
combinations of data sets to be analyzed.
• NoSQL and graph database technologies, enabling greater varieties and quantities of both structured
and unstructured data to be accessed and processed efficiently.
• Falling costs of processing and the ease with which cloud computing capacity can be scaled up and
down, increasing the affordability of more sophisticated analytics techniques.
• Advanced data visualization simplifying how people explore and interpret very large volumes of
analytics results.
• The evolution of ML, enabling models to make decisions with minimal or no human involvement.

Data scientists have the skills to understand a business need, bring together data sets, and apply advanced
analytics techniques to address that need. In many organizations, they may be aligned more directly with the
business units they support than with the technology teams that provide more traditional business intelligence
analytics. In both circumstances, the organization must have a clear operating model that provides consistency
and efficiency in how the activities are performed. Creators of this model must take care to retain the business
alignment and agility that the organization demands.

The ever-increasing volume and variety of data available to an organization means the universe of issues that
advanced analytics can be tasked with is boundless. The organization must have a clear strategy for how
analytics will be used to support the organization’s business objectives. The operating model should ensure this Analytics
strategy can be delivered, and a funding model must be established to sustain this effort.

In the absence of effective data management, a significant amount of an Analytics practitioner’s time is spent
manipulating, cleansing, and transforming data in preparation for the analytics. Analytics must be aligned with
the DM initiative. In particular, DA should ensure that data is understood and that authoritative sources are
used where these are available. DQM will provide measures of data quality that Analytics should reference, and
DQM should be used to manage data quality issues uncovered by Analytics practitioners as they prepare data
for their models.

Not all Analytics activities will involve the creation of models that, once successfully tested, will run as
production processes or services. Some analytics will be one-time exercises to investigate a historical issue or
answer a specific question of the moment. There may also be experiments with new data sources or new
analytical techniques. The platform that supports Analytics must be designed to accommodate these different
types of activity and the specific needs of different stakeholders.

Most organizations will have restrictions on access to production data, whether driven by commercial sensitivity
or personal privacy regulations. Analytics practitioners will often need to work with data that has been
obfuscated in some way. Requirements for this must be understood and must be facilitated by the analytics
platform.
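
One common obfuscation approach, shown here as a non-authoritative sketch, is keyed pseudonymization:
identifying fields are replaced with an HMAC so records can still be joined within the analytics platform
without exposing raw values. The key name and fields are illustrative:

```python
import hashlib
import hmac

# Illustrative placeholder: in practice the key lives in production
# key management, never in the analytics environment.
SECRET_KEY = b"kept-in-production-kms"

def pseudonymize(value):
    """Deterministic keyed hash: the same input always yields the same
    token, so joins across data sets still work after masking."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

customer = {"customer_id": "C-12345", "email": "jane@example.com", "balance": 1200.5}
masked = {k: pseudonymize(v) if k in ("customer_id", "email") else v
          for k, v in customer.items()}
print(masked)
```

Because the hash is deterministic, an analytics practitioner can link records for the same customer without
ever seeing the underlying identifier.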

Model governance and model transparency are critical to the controlled, auditable development and
deployment of advanced analytics. Compliance with privacy regulations is vital when models are released as
production assets or services. However, with advanced analytics, ML and AI, other governance and control
issues also come into play. The need to explain decisions made by models may be a regulatory requirement. As
with most considerations that need to be confirmed at the point of release into production, requirements for
explainability need to be understood as early as possible in the model development. This need for early
understanding also applies to model bias, particularly if this bias could result in prejudice against groups of
individuals. In this case, the bias in the data sets that are used to train models needs to be controlled. These
controls may need to be extended to data that the model encounters over time in production. Finally, the
models, and the business issues they are addressing, need to be governed against the Code of Data Ethics of the
organization. It is essential to determine if the way the model is using the data is ethical and appropriate. This
determination then must guide the organization as it decides whether and how it should act on the decisions
and recommendations the model makes.
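
The governance checks described above can be operationalized as a release tollgate. A minimal sketch
follows, assuming hypothetical check names and evidence identifiers:

```python
# A hypothetical release tollgate: a model is promoted to production only
# when every governance check has documented evidence attached.
REQUIRED_CHECKS = ["explainability_review", "bias_assessment",
                   "privacy_signoff", "ethics_approval"]

def release_decision(evidence):
    """Return (approved, missing) for a model's governance evidence."""
    missing = [check for check in REQUIRED_CHECKS if not evidence.get(check)]
    return (len(missing) == 0, missing)

model_evidence = {"explainability_review": "doc-114",
                  "bias_assessment": "doc-115",
                  "privacy_signoff": None,
                  "ethics_approval": "doc-117"}
approved, missing = release_decision(model_evidence)
print("approved" if approved else f"blocked, missing: {missing}")
```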

Overcoming the challenges an organization will face in delivering its Analytics strategy will require cultural and
behavioral changes. Analytics practitioners will require specific analytic skills. However, an awareness of both
the value that analytics can bring and the potential for undesirable impacts created through analytics will be
needed throughout the organization. Awareness of model bias and data ethics considerations will be required
by those creating models as well as those approving them.
