Data Governance Final v2
Control index:
- Org. Structure: DG.1.1–DG.1.7, DG.2.1–DG.2.16
- Metadata Standards Conformance: MD.1.1–MD.1.4
- Metadata Management Programme: MD.2.1–MD.2.7
- Metadata Architecture: MD.3.1
- Target DA: DA.3.2–DA.3.4
- DA Roadmap: DA.4.1–DA.4.4
- Information Security Standards: DSP.1.1–DSP.1.6
- Data Privacy Policy: DSP.2.1
- DS Roadmap: DS.3.1–DS.3.2
The Entity shall establish an organizational structure to support the Data Management Programme.
The Entity shall convene the Data Governance Board to manage delegated authority and responsibility within the Entity. The Board
will be the final arbiter within the Entity for all matters relating to data management.
The Entity shall identify and appoint Data Architects to support the Data Manager.
The Entity shall identify and appoint Data Stewards to support the Data Manager in both the business and technical areas of the
organisation.
The Entity shall identify and appoint Data Owners (who are responsible for a particular dataset) to support the Data Stewards. Data
Owners will be drawn from both the business and technical areas of the organisation.
The Entity shall regularly undertake monitoring and compliance checking to ensure that information systems and data related
processes are implemented in accordance with established policy, standards and best practices.
The Entity’s Data Management Policy shall address the scope of its data management systems, roles, responsibilities, management
commitment, coordination among organisational functions, and compliance obligations.
The policy shall contain a definition of data management; its overall objectives and scope, and the importance of data management as
a pillar of upholding high standards of data quality.
The policy shall be applicable to all business functions of the organisation and should be supplemented by supporting instructions and
guidance where appropriate for specific areas of activity.
The Entity shall establish its Data Management Policy (through implementing this control), describing how data will be managed
across the Entity.
In support of the Data Management Policy, the Entity shall establish policies for public consumption where there are external
stakeholders.
The policy should clearly express management's commitment to data management principles and highlight its alignment with
government strategy.
The policy should emphasize management expectations for data handling and the importance of maintaining high data quality
throughout the organization.
The Entity must include governance metrics and process checkpoints in its policy to measure the ongoing effectiveness of data management across its systems and processes.
The policy should outline a system for users to report data issues and include an escalation plan for their resolution.
The policy must detail the change management process, specifically how it relates to the Data Management Program and its
initiatives.
The policy shall be reviewed at least annually, overseen by the Data Governance Board, to maintain its relevance and effectiveness. More frequent reviews may be needed in response to major business or regulatory changes.
The Entity shall ensure that all policy developments are aligned with all relevant legislation.
The Entity shall collect and maintain evidence of compliance with their policies, and with the Control Specifications within these
standards.
The policy must be quantifiable and tied to the Control Standards in this document, with the Entity able to show how each control
supports a specific policy requirement.
The Entity must collect written confirmations from all personnel and stakeholders, including internal and external parties and
contractors, demonstrating their understanding and commitment to adhere to the Policy, with these signed records kept on file for
future reference.
The Entity is required to establish and maintain specific, measurable, and scheduled goals as part of its Data Management
Programme. These goals should align with the Entity's business strategy, risk management, compliance with data management
policies and legal requirements, and the promotion of a data-aware organizational culture.
The Entity must specify robust version control for all Data Management Programme documents in the plan.
The Entity's Data Management Programme must be approved by the accountable executive responsible for the associated
operational risks.
To support its Data Management Programme, the Entity will develop subsidiary plans for specific capabilities, like Data Governance,
Organizational Awareness and Training, Disaster Recovery, Document and Content Management, Data Architecture Management,
Inter-Entity Data Integration, and Reference and Master Data Management. These plans can either stand alone or be included as
appendices in the Entity's Data Management Programme Plan.
The Entity must follow the Government Data Management Model's principles and structure (Owned, Described, Quality, Access,
Implemented) in its Data Management Programme. These principles should be incorporated into subsidiary plans and integrated into
the business processes implemented throughout the Data Management Programme rollout.
The Entity’s Data Governance Board should approve all changes to the Data Management Programme (e.g. Plan or Policy).
The Entity shall integrate its existing change management processes into each of the data management domains, or create a new
change management process if none already exists.
The Entity should establish a baseline for its Data Management Programme Plan, with proposed changes to the plan being analysed
for impact.
Changes to the Data Management Programme Plan should be coordinated with the organisation-wide Change Management
capabilities of the Entity to ensure ongoing alignment between Data Management and other organisational initiatives.
When these Standards necessitate changes to existing business processes, the Entity should conduct an impact assessment to identify
stakeholders and processes affected, ensuring coordinated communication of the change.
The Entity shall develop and maintain change management processes both for the Data Management Programme as a whole and for the domain-level processes developed within it.
The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.
The Entity shall perform an audit of its capabilities and/or current state for each data domain.
The Entity shall develop, report against and analyse key performance indicators relating to its Data Management Programme.
Data Management performance data shall be verified by a competent and independent party that is not directly connected with the
work that is the subject of measurement.
Data management performance reporting will evaluate technology and business process compliance, data
architecture maintenance, data model completeness, system-level data models, master profiles, data quality
milestones, master/reference data management achievements, and information/document lifecycles.
Performance data will inform ongoing changes, which the Data Governance Board will assess for cost, benefit, and status.
The Entity will follow Abu Dhabi Government Metadata Standards, including eGIF, SCAD, and geographical
metadata.
The Entity must follow Abu Dhabi Government Metadata Standards for eGIF, SCAD, and GIS.
The Entity must follow ISO/IEC 11179 Part 4 for precise data definitions in its business glossary, data dictionary,
and other data management domains.
The Entity must adopt ISO/IEC 11179 Part 5 for naming and identification principles to provide meaningful
names and IDs (e.g., Emirates ID) for persons or certain data contexts.
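As a sketch of the ISO/IEC 11179 Part 5 convention, in which a data element name combines an object class term, an optional qualifier, a property term, and a representation term, the helper below is hypothetical and the component ordering is one common reading of the standard:

```python
# Hypothetical helper illustrating ISO/IEC 11179 Part 5 style naming:
# a data element name is assembled from an object class term, an optional
# qualifier, a property term, and a representation term.
def element_name(object_class, property_term, representation, qualifier=""):
    """Join naming components into a single UpperCamelCase element name."""
    parts = [object_class, qualifier, property_term, representation]
    return "".join(p.title().replace(" ", "") for p in parts if p)

print(element_name("Person", "Birth", "Date"))  # PersonBirthDate
print(element_name("Person", "Identifier", "Number", qualifier="Emirates"))
```

Names built this way stay consistent across the business glossary and data dictionary, which is the practical benefit the standard is after.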
The Entity shall establish a metadata programme that includes metadata assessment, stakeholder interviews, requirements gathering, metadata architecture development, data stewardship, and a metadata management implementation strategy.
To customise metadata for its operations, the Entity will employ Abu Dhabi government and international standards including eGIF, SCAD, Geospatial, and ADMS. Metadata elements, refinements, and encoding schemes compliant with the Abu Dhabi Government eGIF Metadata Standard will be included.
The Entity shall capture metadata both manually and automatically. In line with the programme's timeline, automated scanning techniques will be used to ensure metadata correctness. To ensure metadata quality, Data Stewards will curate captured metadata and enrich it with business and technical metadata (Ref: MD.4.4).
The Data Governance Board will settle metadata definition and quality disputes not resolved by Data Stewards,
particularly where information spans departments.
Through the Data Catalogue, the Entity must make all metadata, data dictionary, business vocabulary, and
modelling and architectural deliverables available to users.
Based on user role, the Data Catalogue will index, search, and retrieve metadata.
The Entity will version-manage metadata Elements, Refinements, and Encoding Schemes, with metadata values indicating the version under which they were captured.
The Enterprise Data Architecture will include the Entity's metadata architecture, which must be documented per
DA Standards.
The Entity shall evaluate metadata architecture options for compliance with central standards and present its rationale to the Data Governance Board. Options include a single repository, decentralised components with a single access point, or hybrid components with central metadata management.
Based on Data Quality standards, the Entity will establish quality measurements for metadata names and definitions, potentially using subjective business experience, user surveys, and other techniques to assess metadata collection, discovery, and utilisation.
The Entity will monitor and report on metadata quality based on established measurements.
The Entity will assess metadata coverage across its business functions, including metadata definition, capture, and
usage coverage, with a focus on cross-business function metadata.
The Entity will align its Data Catalog with mandatory standards to enable interoperability. Mandatory standards include
Abu Dhabi Government eGIF Schema Generation, DCAT (Data Catalog Vocabulary), and XSD (XML Schema
Definition) for dataset structure description.
The Entity should align its Data Catalog with recommended standards like ADMS for asset description and RDF for
semantic relationships. In cases where alignment is not possible due to vendor limitations, the Entity must document
and justify the non-alignment to the Data Governance Board.
The Entity is required to create a Data Catalog with key features, including a metadata repository, a publishing portal
for controlled access, a workflow management tool, a business glossary, a data dictionary, a data model repository,
and version control.
The Entity will align its Data Catalog requirements with emerging government-wide standards.
The Entity shall develop its Data Catalog according to principles like usability, common usage, representation,
accuracy, sufficiency, economy, consistency, and integration. Alignment will be demonstrated through the Governance
Checkpoint Process, with conflicts resolved through practical solutions approved by the Data Governance Board.
The Data Catalog will serve as a central access point for all users, both internal and external, seeking information
about the Entity's data assets, even though these assets are stored in various systems. It will offer a unified resource
for finding and understanding any data asset.
The Entity will select datasets for the Data Catalog, including transactional, reference, master data, statistical, and
geospatial datasets. Factors like user numbers, reusability, and data complexity will be considered.
The Entity should create semantic data models for captured data, defining data relationships with a machine-readable
vocabulary.
The Entity shall define metadata for data capture, following metadata standards and emphasizing reusability. This includes Elements, Refinements, Encoding Schemes, and standard names and definitions. Compliance with
Abu Dhabi Government eGIF and other relevant domain standards' metadata requirements will be part of the Data
Catalog Population Roadmap.
The Entity will document the approach for capturing and populating metadata in the Data Catalog, with details for each
dataset in the Data Catalog Population Roadmap. Metadata will cover ownership, security classification, data quality,
validity periods, and version information.
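The metadata coverage described above (ownership, security classification, data quality, validity periods, version) can be sketched as a catalogue entry record; the field names below are illustrative assumptions, not the normative eGIF element set:

```python
from dataclasses import dataclass

# Hypothetical Data Catalogue entry; field names are illustrative only.
@dataclass
class CatalogueEntry:
    dataset_id: str
    owner: str                    # accountable Data Owner
    security_classification: str  # e.g. "Public", "Restricted"
    quality_score: float          # latest measured data quality (0.0-1.0)
    valid_from: str               # validity period start (ISO 8601)
    valid_to: str                 # validity period end (ISO 8601)
    version: str
    refresh_period_days: int      # minimum metadata refresh period

entry = CatalogueEntry(
    dataset_id="DS-001", owner="Civil Registry",
    security_classification="Restricted", quality_score=0.97,
    valid_from="2024-01-01", valid_to="2024-12-31",
    version="1.2", refresh_period_days=90,
)
print(entry.dataset_id, entry.version)
```

Recording a refresh period per entry is what lets the Data Catalogue Population Roadmap enforce minimum metadata refresh cycles.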
The Entity will maintain metadata in the Data Catalog using the Governance Checkpoint Process. Additionally, the
Data Catalog Population Roadmap will define minimum metadata refresh periods for each dataset captured.
The Entity will categorize data assets within a hierarchy as follows: Metadata, Reference Data, Master Data,
Transactional Data, and Audit and Log Data. Data classes higher in the hierarchy are more critical, as lower-class
data relies on them. Higher-level data is relatively stable and has a longer lifespan, whereas lower-level data is more
dynamic and has a shorter useful life. The volume of data decreases as you move up the hierarchy, with the lower
classes having more data that is subject to frequent changes.
The Entity will create and publish a data sharing licensing model accessible via the data catalog.
The Entity will run an awareness program to promote Data Catalog information to stakeholders, emphasizing data
reuse benefits and available datasets.
Data reuse considerations will be integrated into the System Development Lifecycle (SDLC) of information systems,
with monitoring through the Governance Checkpoint Process for Data Governance Board approval.
The Entity will encourage innovative data use submissions, resulting from Data Catalog usage, from various functions; these will be evaluated for merit by the Data Governance Board and promoted by the Data Manager.
The Entity will allow dataset consumers to register their data usage in the Data Catalog, ensuring they are informed of
dataset changes. Consumers can be individuals, application system representatives, or business function
representatives.
Registered consumers of datasets will be classified as Formal or Informal. Formal consumers have service level
agreements with data producers, while Informal consumers rely on published licenses and policies.
The Entity will monitor and report on Data Catalog effectiveness using metrics such as dataset coverage, registered
consumers, and metadata completeness. An annual report on data coverage effectiveness will be presented to the
Data Governance Board.
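Metrics such as metadata completeness can be computed directly from catalogue entries; a minimal sketch with assumed field names:

```python
# Hypothetical catalogue entries; None means the metadata element
# has not been populated for that dataset.
entries = [
    {"owner": "Registry", "classification": "Public", "quality_score": 0.9},
    {"owner": "Registry", "classification": None, "quality_score": None},
]

def metadata_completeness(entries):
    """Fraction of populated metadata fields across all entries."""
    filled = sum(v is not None for e in entries for v in e.values())
    total = sum(len(e) for e in entries)
    return filled / total

print(metadata_completeness(entries))  # 4 of 6 fields populated
```

A figure like this, tracked over time, is the kind of evidence the annual effectiveness report to the Data Governance Board can draw on.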
The Entity will ensure that the Data Governance Board reviews data models in the software development lifecycle as part
of the Governance Checkpoint Process. Data models are crucial deliverables for systems built, purchased, or
commissioned to support business and technology needs.
The Entity will implement data modeling tools with specific capabilities, including UML 2.x-compliant models, support
for UML model interchange using the XMI Interchange Format, modeling of structured and unstructured datasets,
Common Warehouse Metamodel (CWM) support for data warehouse systems, metadata association for reusability,
and model versioning with traceability. Existing toolsets will be certified to meet these requirements, or the Entity will
initiate an effort to fill any gaps.
The Entity will provide training and education programs for developing data models to enhance awareness and value
for both business and technical users, tailored to their engagement levels with information systems.
The Entity will develop data models at conceptual, logical, and physical levels, involving input from various
stakeholders. The journey to document the as-is data model typically includes creating conceptual data models,
logical data models, and physical data models to represent the high-level ideas, system-independent views, and
system-specific implementations of data structures, respectively. Enterprise modeling will emphasize steps 1 and 2,
while information system modeling will focus on steps 2 and 3.
The Entity shall model unstructured data that is associated with structured data based on business terms and logical
concepts. This modeling can involve capturing concepts expressed in documents linked to records, such as medical
or education reports.
Semi-structured or unstructured data, including free text, images, audio, and video, must be modeled to document the
mandatory requirements, metadata describing concepts within unstructured data, and associated structured identifying
data. For instance, modeling may entail specifying requirements for citizen ID photos, including image characteristics
and associated structured data like Emirates ID and date.
The Entity will employ conversion techniques to transform semi-structured and unstructured data into structured
formats, allowing for the formal documentation and modeling of such data.
When converting unstructured data into structured forms, the Entity will align its processes with the Unstructured
Information Management Architecture (UIMA), enabling analysis of unstructured artifacts and the development and
modeling of artifact metadata. Governance of unstructured content lifecycles will be established through suitable
workflows (refer to DCM.2).
The Entity shall create Data Flow Diagrams and Entity Relationship Diagrams for unstructured data. Data Flow
Diagrams will illustrate the flow of unstructured information, along with associated metadata and identifying data,
between systems. Entity Relationship Diagrams will depict relationships between unstructured information concepts
and structured identifying data, as well as relationships between different unstructured information concepts.
The Entity will develop data models at the Conceptual, Logical, and Physical levels with interconnections to enable the
mapping of physical information systems to logical models and higher conceptual understanding. These data modeling
artifacts will be integral to the Entity's mandatory system design and architecture documentation.
Data modeling artifacts, including Entity Relationship Diagrams and Data Flow Diagrams, will be produced uniformly
for both structured and unstructured data.
The Entity will make data models available for reference and re-use within the organization. Data Architects will
evaluate pre-existing data models, align or re-use them for new information systems where feasible. Any exceptions to
this practice will require justification in the system design, with approval from the Data Governance Board.
UML diagrams will serve as the primary modeling notation throughout the software development lifecycle. Any
deviations from this standard will be documented and submitted for authorization by the Data Governance Board. The
primary use of UML diagrams will involve structural diagrams, including Class Diagrams, Entity Relationship
Diagrams, Component Diagrams, and Deployment Diagrams.
To enhance communication of data model concepts with business stakeholders, the Entity will employ tools better
suited for this purpose, such as text-based documents, presentation slides, and spreadsheets. The Data Governance
Board will contribute to the development of guidance to ensure effective communication with departments and
stakeholders.
Entity-Relationship diagrams and Class Diagrams will be used to document data object structure and relationships
across conceptual, logical, and physical levels.
Data Flow Diagrams will be employed to model data movement within and between systems, with a particular focus
on data forming part of the Entity's master profiles. This includes identifying and capturing points of data capture,
actions that transform or aggregate data, data export points (automatic or manual), and service endpoints emitting
master and common profiles.
In the case of very large models that can be challenging to read (e.g., models with over 200 tables or descriptive
artifacts), they should be subdivided into smaller, subject area-based models and aggregated into higher-level models
to maintain clarity. The primary purpose of data models is to aid understanding.
Data models will provide clear differentiation between aspects that are currently implemented and those that are not
yet implemented.
Data modeling artifacts shall form an integral part of the Entity's mandatory system design and architecture
documentation.
When designing new conceptual data models, the Entity shall ensure that data objects are represented by nouns, and
data relationships are represented by verbs.
For new logical data models, the Entity shall adhere to rules that include using appropriate data types for attributes
within tables, taking into account performance, storage, and data requirements. For instance, the Entity should
consider more suitable data types before using String or other variable character data types.
When designing new physical data models, the Entity shall follow specific rules, including the use of numeric primary keys, particularly in reference data tables. Reference data tables will have, at a minimum, a numeric primary key and a code value represented as a string. Physical data types with length or precision specifiers should have appropriate lengths or precisions specified and not rely on default values.
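The physical-model rules above (numeric primary keys, reference tables carrying a numeric key plus a string code, explicit lengths rather than defaults) can be sketched in a relational schema. This uses SQLite via Python purely for illustration; table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Reference data table: numeric primary key plus a string code value,
# as the physical modelling rules require.
cur.execute("""
    CREATE TABLE ref_marital_status (
        marital_status_id INTEGER PRIMARY KEY,
        code              VARCHAR(10)  NOT NULL,  -- explicit length, no default
        description       VARCHAR(100) NOT NULL
    )
""")

# Transactional table keyed numerically, referencing the reference table.
cur.execute("""
    CREATE TABLE person (
        person_id         INTEGER PRIMARY KEY,
        full_name         VARCHAR(200) NOT NULL,
        marital_status_id INTEGER NOT NULL
            REFERENCES ref_marital_status(marital_status_id)
    )
""")

cur.execute("INSERT INTO ref_marital_status VALUES (1, 'M', 'Married')")
cur.execute("INSERT INTO person VALUES (1, 'Example Name', 1)")
row = cur.execute("""
    SELECT p.full_name, r.code FROM person p
    JOIN ref_marital_status r USING (marital_status_id)
""").fetchone()
print(row)  # ('Example Name', 'M')
```

Keeping the surrogate key numeric while exposing the code as a separate string column lets codes change without rippling through foreign keys.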
The data model should indicate master/slave/federation rules when the Entity identifies data duplication across the
organization or when datasets owned by another Entity are used by an information system. These rules determine
how the datasets are managed, identifying which datasets are master, slave, or federated across systems.
Business terms for data objects, attributes, relationships, and values with contextual business meaning shall be
captured and defined. Business definitions will ensure consistency in the use and meaning of data objects and
relationships across the Entity. Business definitions will be stored in the business glossary section of the Entity's Data
Catalogue.
Technical definitions for terms within the business glossary will be produced to aid data integration and development
projects that span multiple systems. These technical definitions will consider logical and physical models and may
include technical validations like state diagrams, flow charts, and regular expressions. Technical definitions will be
maintained within the data dictionary of the Data Catalogue.
Minimum data model metadata that the Entity shall maintain includes Model Identifier, Responsibility Assignment,
Published Status, and Change History. Additional metadata will be determined based on the Entity's requirements and
evaluated by the Data Governance Board.
The Entity shall maintain traceability links for different views of the same subject area using annotations that indicate
other existing views. Lower-level identifiers will be used as part of the Reference Number element of the model
identifier to pre-assign numbers to different subject areas. Other metadata for data models will be decided based on
the Entity's requirements, evaluated by the Data Governance Board, and issued to staff.
Data models shall be stored in a version-controlled repository. Options include version control repositories built into data modeling tooling, external version control repositories or document management systems that support versioning, or, as an interim solution, a file system structure.
The Entity will develop an enterprise-wide data model that represents an organization-wide view of all data central to
the Entity's core business functions. The enterprise data model is a key aspect of the baseline and target enterprise
data architectures.
The Data Governance Board will maintain oversight and approval of enterprise data models and socialize the
enterprise data model through working groups to facilitate sharing with other Entities.
When developing new data models for system implementations, the Entity shall ensure alignment with the Entity's
Enterprise Data Model. Conceptual, logical, and physical data models will demonstrate alignment with master profiles
and common profiles in the government Data Catalogue.
The Entity will align its Enterprise Data Model with government-wide data models as they emerge.
The Entity shall develop conceptual data models to support the architecture, development, and operational processes
for its data. Conceptual data models will be required as part of the system development lifecycle and provided to the
Data Governance Board through the Governance Checkpoint Process.
Techniques to develop conceptual data models include interviewing stakeholders, identifying candidate data profiles,
and combining candidate data profiles into master data profiles, transactional data profiles, and reference data
profiles. Conceptual data modeling shall be performed at a system or enterprise level, depending on the data view
needed.
Conceptual data models shall be used for documentation to support development of logical data models, change
requests, impact assessments, and gap analyses between baseline and target state requirements.
The Entity shall identify and model all master profiles and relationships between them. Master profiles represent core
data for the Entity's line of business. For example, a 'Citizen' profile may include family relationships, contact details,
and name change history.
Master profiles shall be documented at conceptual and logical levels and form part of the Entity's enterprise data
model. Each system containing master profile data shall have its data modeled at conceptual, logical, and physical
levels.
Entity master profiles shall be made available to ADSIC upon request to facilitate government-wide common profile
development, and the Entity shall align local profiles with government-wide common profiles when appropriate.
The Entity shall develop logical data models with relationship rules and a denormalization process.
Logical data models support development, change requests, and impact assessments. They are shared with the Data
Governance Board.
The Entity shall develop physical data models based on logical data models to provide detailed technical specifications.
Physical data models enable technical implementation and operational functions, e.g., SQL queries.
Data architecture in TOGAF includes component models, data profiles, lifecycle models, security, quality, and change
processes.
Data architecture deliverables cover various domains, including metadata, data quality, security, and management
systems.
Baseline data architectures are developed for information systems under the Entity's control, including maintenance
and updates.
Development of baseline data architecture considers business and technical requirements, data architecture themes,
and constraints.
System-level baseline data architecture is used for system changes and reviews, guided by the Data Governance
Board.
Baseline data architectures are continuously maintained and versioned.
The Entity produces a target enterprise data architecture, informed by the baseline architecture but not dependent on
it.
Baseline data architectures provide a foundation for developing and validating target data architectures.
The Entity produces target data architectures as information systems go through change cycles, as required by the Governance Checkpoint Process.
Target data architectures, at system or enterprise levels, address gaps, encourage data integration, remove
duplication, align with standards, and promote reuse.
The target data architecture influences technology and data requirements for system changes alongside business and
quality requirements.
The Entity identifies gaps between baseline and target enterprise data architectures, covering business, technical, and
capability aspects.
Gap analysis results in a roadmap to move from baseline to target enterprise data architecture, periodically reviewed
by the Data Governance Board.
The roadmap includes timelines, budgets, and priorities for component and system changes, remaining flexible to
address business priorities.
The Entity follows the roadmap during system development and maintenance, ensuring alignment with the enterprise
target data architecture. Annual reports assess the roadmap's effectiveness by identifying gaps between the starting
and ending baseline enterprise data architectures.
The Entity establishes data quality definitions covering various data types, which are stored in the business glossary
and data dictionary.
Data quality definitions are linked to business processes to evaluate the impact of data quality on operations, ensuring
accuracy in processes like citizen contact.
Data quality definitions encompass key aspects such as validity, timeliness, integrity, accuracy, and reliability,
determining acceptable criteria and assessing business benefits.
Metadata is aligned with data quality definitions to populate the Data Catalogue, including quantitative and qualitative
measures.
The Entity creates a data quality checklist tailored to its datasets to facilitate data audits in line with established data
quality definitions.
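The quality dimensions above can be sketched as a small rule-based checklist whose pass rates feed the Data Catalogue as quantitative measures. The rule names, thresholds, field names, and sample records below are illustrative assumptions, not prescribed by the Standards.

```python
from datetime import date

# Illustrative data quality rules, one per quality dimension.
def check_validity(record):
    return bool(record.get("email")) and "@" in record["email"]

def check_completeness(record):
    return all(record.get(f) not in (None, "") for f in ("name", "email"))

def check_timeliness(record, max_age_days=365):
    return (date.today() - record["last_updated"]).days <= max_age_days

CHECKLIST = {
    "validity": check_validity,
    "completeness": check_completeness,
    "timeliness": check_timeliness,
}

def audit(records):
    # Pass rate per quality dimension, suitable for storage as metadata.
    return {name: sum(rule(r) for r in records) / len(records)
            for name, rule in CHECKLIST.items()}

records = [
    {"name": "A", "email": "a@example.com", "last_updated": date.today()},
    {"name": "B", "email": "", "last_updated": date.today()},
]
print(audit(records))
```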
Data Quality Management Plan: The Entity will create a plan for auditing, monitoring, and maintaining data quality. This
plan covers the quality of master profiles, dataset quality, and addressing issues reported by users. It involves defining
roles, using data profiling and cleansing tools, and producing data quality metadata and requirements. The Data
Governance Board will oversee this plan, including one-off and incremental audits.
Data Quality Requirements: The Entity ensures that all new information systems and changes include specific data
quality requirements. These requirements, documented using data quality metadata definitions, serve as the basis for
internal data quality SLAs and external contractual agreements.
Incorporating Data Quality Audits: The Entity integrates data quality audits into the Data Governance Checkpoint
Process. This includes audits during system changes, plans for improving data quality, and documenting
requirements. The Data Governance Board defines when and where these audits are required, considering factors like
system integrity, accuracy, and reliability at various checkpoints in the data lifecycle.
Master Profile Audits: The Entity will audit its master profiles, as per Data Modeling standards, every three months
across all data sources. If data quality misaligns across sources or with defined standards, discrepancies will be
identified and root causes determined. Corrective action, if necessary, will be decided by the Data Governance Board.
Audit Intervals for Non-Common Profiles: The Entity will establish suitable audit intervals for data types not covered by
common profiles defined in DM2. The Data Governance Board will decide on corrective actions once the cause of
discrepancies is understood.
Third Party Data Quality Checks: The Entity will perform spot checks on third-party data to ensure compliance with
data supplier service level agreements. If no agreements exist, the Entity will develop its own data quality
requirements and share them with the data supplier.
Data Profiling Tools: The Entity will systematically use data profiling tools with various analysis capabilities, including
structured data column analysis, data structure-independent integrity analysis, pattern identification, reporting, and
change detection.
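A minimal sketch of the column analysis and pattern identification capabilities described above follows; the profile fields and sample values are illustrative, and a production tool would offer far richer analyses.

```python
import re
from collections import Counter

def profile_column(values):
    # Basic column profile: counts, null rate, cardinality, and value patterns
    # (digits collapsed to '9', letters to 'A', as many profiling tools do).
    non_null = [v for v in values if v not in (None, "")]
    patterns = Counter(
        re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", v)) for v in non_null
    )
    return {
        "count": len(values),
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(2),  # e.g. phone-number formats
    }

phones = ["050-1234567", "050-7654321", "0501234567", None]
print(profile_column(phones))
```

Pattern summaries of this kind make format inconsistencies (here, a phone number missing its separator) visible without inspecting individual values.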
Metadata Storage: Data quality measures obtained during audits will be stored as metadata in the Data Catalogue.
Data Cleansing Initiative: The Entity will identify gaps between data quality definitions and measured data quality. It will
execute a data cleansing initiative to improve data quality, led by the Data Governance Board. Strategies may be
system-specific, data-type specific, or prioritized by business benefit, utilizing various tools and expertise.
Target Data Architectures: The Entity will ensure that target data architectures focus on improving data quality across
information systems and services. Master profiles will be a priority, extending to other data types as defined by the
Data Governance Board.
Data Cleansing Process: The end-to-end data cleansing process is outlined as follows:
The Entity must follow the latest approved Information Security Standards in Abu Dhabi Government, prioritizing them
over Data Management Standards in case of conflicts. The Data Governance Board is responsible for recording and
resolving conflicts.
The Entity's data architecture, information systems, and components should align with the approved Information
Security Standards in Abu Dhabi Government. The Data Governance Board will confirm this alignment through the
Governance Checkpoint Process.
When releasing data as Open Data, the Entity must demonstrate compliance with both the approved Information
Security Standards and Data Management Standards. The Data Governance Board will approve the data's
publication.
The Entity must classify systems to identify those at risk of privacy breaches according to the Entity's privacy policy
(see DSP.2).
The Entity must ensure compliance with Payment Card Industry (PCI) Security Standards for information systems
handling credit card data through the Governance Checkpoint Process.
The Entity must ensure that cloud suppliers adhere to the ISO/IEC 27017 Cloud Security Standard and the ISO/IEC 27018 Standard for the Handling of Personally Identifiable Information, both ratified by the International Organization for Standardization (ISO).
The Entity must create a privacy policy in line with government privacy laws, incorporating guidance from these Standards, particularly regarding its line-of-business data, together with any other relevant information and guidance.
The privacy policy should include a public privacy statement, clearly outlining stakeholders' privacy rights and the
Entity's privacy obligations. It should remain aligned with cross-government policies.
In consultation with legal experts, the Entity should grant individuals from whom data is collected the rights to view their data, correct inaccuracies, and request its removal when no longer relevant.
The Entity should clarify the purpose and use of personal data at the point of collection and offer stakeholders a
mechanism to opt out of non-core activities.
The Entity must create and maintain privacy metadata for its master profiles, clearly identifying attributes containing
private data. This metadata should be stored in the Data Catalogue.
The Entity's Open Data policy should align with the Data Privacy policy, ensuring no data that breaches individual privacy is made public. Special attention should be given to preventing the 'mosaic effect', whereby data combined from multiple sources can be used to identify individuals.
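One illustrative safeguard against the mosaic effect is a k-anonymity check before publication: no combination of quasi-identifiers in a released dataset should be rarer than k records. The quasi-identifiers, the value of k, and the sample rows below are assumptions for demonstration.

```python
from collections import Counter

def k_anonymous(records, quasi_identifiers, k=3):
    # Group records by their quasi-identifier combination; every group
    # must contain at least k records for the extract to pass.
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

rows = [
    {"district": "D1", "birth_year": 1990},
    {"district": "D1", "birth_year": 1990},
    {"district": "D2", "birth_year": 1985},
]
# The lone D2/1985 record could be re-identified when joined with other
# open datasets, so this extract fails the check.
print(k_anonymous(rows, ["district", "birth_year"], k=2))
```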
Develop an awareness program for the data privacy policy, disseminating it to all users of private data to remind them of
their responsibilities regarding data privacy.
The Entity must adhere to the principles of 'Privacy by Design,' which include being proactive, making privacy the default
setting, embedding privacy into designs, accommodating all legitimate interests, ensuring end-to-end security, providing
visibility and transparency, and respecting user privacy. This approach helps the Entity detect privacy issues early and
reduce privacy risks and costs.
The Entity should develop training and awareness materials on the principles and objectives of 'Privacy by Design' for
technical and business users responsible for designing information systems and processes.
The Entity needs to identify and address any deficiencies in its existing data sources regarding compliance with the
principles of 'Privacy by Design.' The requirements from the gap analysis should inform the Entity's target data
architecture at the enterprise level and within specific information systems as needed.
Data governance checkpoints should be used to confirm alignment with the principles of 'Privacy by Design' when:
The Entity must establish a privacy management workflow for identifying, logging, investigating, and resolving data
privacy-related issues in line with its privacy policy. This workflow should encompass issues reported by both internal
users and external stakeholders, involving steps for evidence collection, post-incident analysis, reporting, and
corrective actions. It is used to monitor policy implementation effectiveness, and the Entity should report privacy-
related metrics to cross-government working groups.
The Entity should provide individuals with a route to update or correct their private data, and these updates should be
part of a data quality audit.
The Entity, following its privacy policy, must respond promptly to requests for data disclosure from individuals, with
response times established by the Data Governance Board. Requests should be monitored to ensure timely action.
The Entity shall evaluate requests for data removal in accordance with its privacy policy, balancing business needs
with individual privacy. Requests should be handled internally, with an appeal process available to individuals if
needed, potentially involving cross-Entity collaboration. The Data Manager makes the final decision.
The Entity should take steps to prevent data loss and privacy breaches, considering appropriate architectural
components to enhance information system protection. These components may include data-loss prevention tools,
database activity monitoring, and data discovery tools. The Data Governance Board assesses data loss risks for each
system and incorporates technical components into the target data architectures as needed.
Data security and privacy requirements that apply to production information systems must also be observed in test, development, and training environments. When a subset of production data is used, data masking technologies should be applied to protect sensitive information; masking techniques include transforming, obfuscating, and randomizing data. Consideration should be given to preserving the characteristics of real 'live' data in test and training environments, and data quality audits are necessary when data masking is applied.
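Two common masking techniques, deterministic obfuscation (which preserves joins across masked tables) and format-preserving randomization (which preserves the characteristics of live data), can be sketched as follows. The field values and salt are illustrative, and this is not a substitute for a vetted masking tool.

```python
import hashlib
import random

def obfuscate(value, salt="dev-env"):
    # Deterministic: the same input always yields the same token, so
    # referential integrity between masked tables is preserved.
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def randomize_digits(value, rng=random.Random(0)):
    # Keeps the format (length, separators) while destroying the real value.
    return "".join(str(rng.randrange(10)) if c.isdigit() else c for c in value)

print(obfuscate("Fatima Al Ali"))        # stable token, not reversible in practice
print(randomize_digits("050-1234567"))   # same shape as a real phone number
```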
The Entity must engage an infrastructure audit team knowledgeable about platform utilization metrics
and the prevalent hardware and software configurations across the Entity.
The Entity shall perform a comprehensive audit of physical inventory, encompassing Data Centres and other sites. The audit should record key fields for each system, including location, service/application name, server details, hardware, software, and other relevant attributes.
The Entity is required to conduct a logical audit of network inventory to reconcile with the physical
inventory. Tools like Spiceworks, SolarWinds, HP Open Computers and Software Inventory Next
Generation, or a Configuration Management Database (CMDB) instance should be used for this
purpose. Any discrepancies between the two audits should be reconciled through a remediation plan.
ADCMA has completed the physical inventory audit and provided evidence.
ADCMA is transitioning from SolarWinds to ManageEngine OpManager for its operations.
Engage an Infrastructure Architecture team to determine the target architecture for Data Centres.
Ensure the target architecture includes flexible infrastructure capabilities like IaaS and PaaS models.
Choose an appropriate cloud deployment model: Private, Community, Public, or Hybrid, while avoiding public cloud for
Abu Dhabi Government Data.
Determine Data Centre Tier: Decide on a Data Centre Tier based on availability and infrastructure criteria.
Compliance with Tiers: Ensure that Data Centre Standards align with the selected Tier and Cloud Deployment Model.
Consider All Options: Explore various data center strategies, including government solutions.
Cost-Benefit Analysis: Evaluate the financial aspects of data center and cloud investments.
Data Centre Transformation: Plan a transition program considering capacity, budget, and sharing resources.
Transformation Program Review: Seek ADSIC approval for the Data Centre Transformation Program.
Execute Transformation Plan: Implement the approved Data Centre Transformation Plan.
Establish a Cloud Centre of Excellence: Create a team with various cloud management roles.
Keep Data Centre Plan Updated: Maintain an up-to-date development plan with periodic reviews.
Define RPO and RTO: Specify Recovery Point Objectives (RPO) and Recovery Time Objectives (RTO) for backup plans.
Offsite Backup: Store backup copies securely offsite with monitoring and fire protection.
BCDR Plan Implementation: Implement a Business Continuity and Disaster Recovery plan.
BCDR Strategy: Define a strategy for protecting, stabilizing, and recovering critical activities.
BCDR Plan Contents: Include roles, activation process, mitigation actions, communication, recovery, media, and
stand-down plans.
Regular BCDR Drills: Plan and execute annual BCDR drills and quarterly paper scenario exercises.
The Entity should establish a policy and standards for managing all recorded information across its lifecycle, ensuring
high quality, security, and availability.
All data within the Entity should be authentic, reliable, complete, unaltered, and usable, with clear chain of custody and
metadata.
The Entity should identify data owners, establish creation and disposal requirements, determine sharing requirements,
and train staff in data management.
The Entity must maintain an inventory of data in a Data Catalogue, provide an annual report to the Data Governance
Board, and address areas of non-compliance.
Data held by the Entity should follow the Information Lifecycle process, including creation, retention, maintenance,
use, retirement, and disposal, with a focus on security and efficiency.
The Entity shall implement a Strategic Integration Platform to facilitate data transfer, transformation, access auditing,
performance monitoring, security controls, and transaction management. It should be part of the target enterprise data
architecture.
The Entity's strategic integration platform should align with the metadata requirements of the Abu Dhabi Government
Interoperability Framework.
The Entity shall develop and publish a policy for usage of its strategic integration platform, covering internal, trusted
third-party, and external data sharing.
Consideration should be given to migrating existing data feeds into and out of information systems through the
Strategic Integration Platform. The Data Governance Board should assess the business value and reusability of each
data feed.
External integration with data from other Entities should be made through the ADSIC Enterprise Service Bus (ESB).
The Entity should not engage in peer-to-peer data transfers. Datasets available through the ESB should be published
in the Entity's Data Catalogue.
Data exchange through the Strategic Integration Platform should be secure and audited, complying with information
exchange requirements of the approved Information Security Standards.
The Entity should consider appropriate data exchange methods when integrating data between applications and
systems, including file-based, message-based, and database-to-database exchanges.
The Entity should plan to migrate peer-to-peer application data sharing to the Strategic Integration Platform in its target
data architecture, enabling data reusability. Justification is required for cases where migration is not possible due to
proprietary software.
The integration platform should have the capability to broker interactions across different integration patterns, such as
file-based and message-based data exchanges.
Data architectural consideration should be given to the data formats allowed by each data service integrated, with
XML and JSON formats preferred for data transfer between Entities. Industry or proprietary formats are allowed with
justification and should be documented in the Data Catalogue.
Data Transfer Protocols: Choose suitable protocols for connecting systems to the Integration Platform, such as FTP, HTTP, and SOAP, and document them in the Data Catalogue.
Prefer One-Way Integration: Favor one-way data sharing methods like Publish/Subscribe, Request/Response, or
Broadcast.
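The preferred one-way Publish/Subscribe pattern can be sketched as a minimal in-process broker: the publisher pushes data in one direction and needs no knowledge of, or response from, its subscribers. Topic and message names below are illustrative.

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based broker illustrating one-way Publish/Subscribe."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # One-way: the publisher does not wait for or inspect any result.
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()
received = []
broker.subscribe("citizen.updated", received.append)
broker.publish("citizen.updated", {"id": 42, "field": "address"})
print(received)
```

A real integration platform adds durable queues, delivery guarantees, and security controls on top of this basic decoupling.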
Justify Two-Way Integration: Provide reasons for using complex two-way data sharing and address concerns like
transaction management and data concurrency.
Data Integration Design: Plan for detecting data delivery failures, ensuring repeatable and idempotent data retries,
maintaining statelessness, and ensuring high availability. The Data Governance Board will review these design
considerations.
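The repeatable, idempotent retry requirement above can be sketched by giving each message a stable key, so that redelivery after a suspected failure cannot apply the same update twice. The store and message shapes below are assumptions for illustration.

```python
class IdempotentConsumer:
    """Applies each message at most once, keyed by a stable message key."""

    def __init__(self):
        self.applied = {}  # message_key -> payload

    def handle(self, message):
        key = message["key"]
        if key in self.applied:
            # Duplicate delivery (e.g. a retry after a timeout): safely ignored.
            return "skipped"
        self.applied[key] = message["payload"]
        return "applied"

consumer = IdempotentConsumer()
msg = {"key": "order-7/v1", "payload": {"status": "approved"}}
print(consumer.handle(msg))  # first delivery is applied
print(consumer.handle(msg))  # retry is skipped; state is unchanged
```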
Service Level Agreements (SLAs): Define agreements covering data quality, volume, service availability, data structure,
change control, exception handling, and SLA monitoring frequency.
Internal SLAs: Create agreements for data sharing within the Entity, and resolve disputes through the Data Governance Board.
Binding SLAs for External Entities: Establish binding service-level agreements with other government entities via the
ADSIC ESB. In case of non-compliance, follow the exception escalation process and engage cooperatively to
investigate issues with the service level agreement.
Open Data Review: The Entity must systematically evaluate data sources and prioritize making them open unless
there are security, privacy, or data quality concerns. The Data Governance Board reviews and sets criteria for closing
data sources.
Record Keeping: The Entity should maintain systematic records of data sources, clearly indicating their open or
closed status, and provide plain language definitions in the Data Catalogue.
Open Data Access: All data deemed open in the Open Data Review should be available through the Open Data
Portal in machine-readable and human-readable forms.
Data Authenticity: Data should be made available as close to the source as possible, with minimal manipulation.
Privacy and security concerns should be addressed with minimal data changes.
Open Data Plan: Develop a plan for releasing open data, including data review, quality checks, and any required
privacy or security adjustments.
Prioritizing Open Data: Prioritize data release by addressing security, privacy, business needs, and data quality.
Systematic Planning: The Open Data Plan should systematically address all datasets identified in the Open Data
Review.
Plan Monitoring: Regularly monitor progress against the Open Data Plan and review it quarterly.
Publication on Data Portal: Publish open data on the Abu Dhabi Government Open Data Portal.
Data Quality Maintenance: Continuously review open data to ensure it meets quality standards and address security
and privacy concerns.
Dealing with Issues: If open data fails quality or faces security/privacy concerns, suspend its publication, conduct a
new review, and address the issues before relisting.
Usage Tracking: Capture usage trends and statistics on data access and report to the Government Data Governance
Committee.
Annual Awareness Campaign: Conduct annual campaigns to inform stakeholders about open data, its quality, and
any security/privacy measures. Inform and educate internal and external stakeholders and the public.
Transparency in Withholding Data: If a dataset isn't published, use the awareness campaign to explain the reasons,
provide a publication timeline, or clarify if the dataset will remain unpublished.
Plan and publish a schedule for identifying reference data in information systems, including resource
allocation and reviews.
Establish a team responsible for managing reference data, including discovery, alignment, and change
management.
Identify and document reference data used in information systems, specifying values and definitions.
Ensure reference data values are codified, unique, and not case-sensitive, with associated
descriptions.
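The codification controls above can be sketched as a simple validation routine over a reference dataset, flagging case-insensitive duplicates and missing descriptions. The sample country codes are illustrative.

```python
def validate_reference_data(entries):
    # entries: iterable of (code, description) pairs.
    issues = []
    seen = set()
    for code, description in entries:
        folded = code.casefold()
        if folded in seen:
            issues.append(f"duplicate (case-insensitive): {code}")
        seen.add(folded)
        if not description:
            issues.append(f"missing description: {code}")
    return issues

entries = [("AE", "United Arab Emirates"), ("ae", "UAE"), ("SA", "")]
print(validate_reference_data(entries))
```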
Align reference data with relevant standards, creating a "master reference data" dataset.
Regularly review the master reference data to accommodate new systems or changes.
Align reference data used in systems with the master reference data or provide mapping schemas.
Develop processes for actively managing reference data values, including requests and evaluations.
Define the Reference Data Change process, including requests, evaluations, and updates.
Capture and record requests, consultations, and decisions related to reference data changes.
Implement reference data export features for monitoring alignment with the master reference data.
Plan and publish a schedule for identifying master data in information systems, with resource allocation
and reviews.
Establish a team responsible for managing master data, including discovery, alignment, and cleansing.
Identify and define master data profiles, including semantic definitions and lifecycle details.
Implement controls to limit the use of non-primary master data records when duplicates exist.
Match and link equivalent master data records within information systems to identify duplicates.
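Match-and-link can be sketched with a simple string-similarity score that flags candidate duplicate records for data-steward review. The 0.85 threshold and sample records are assumptions, and a production implementation would use a dedicated matching tool with blocking, multiple attributes, and survivorship rules.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    # Case-insensitive similarity ratio between 0.0 and 1.0.
    return SequenceMatcher(None, a.casefold(), b.casefold()).ratio()

def candidate_duplicates(records, threshold=0.85):
    # Compare every pair of records; flag pairs whose names are near-identical.
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i]["name"], records[j]["name"]) >= threshold:
                pairs.append((records[i]["id"], records[j]["id"]))
    return pairs

records = [
    {"id": 1, "name": "Mohamed Al Mansoori"},
    {"id": 2, "name": "Mohammed Al Mansouri"},  # likely the same person
    {"id": 3, "name": "Sara Khan"},
]
print(candidate_duplicates(records))
```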
Assess master data profiles for tangible benefits in merging duplicated records.
Execute master data initiatives to cleanse and deduplicate records when compelling benefits are
identified.
Match and link equivalent master data records across all Entity-owned systems and government-wide
systems.
Develop and publish KPIs and metrics to measure the numbers of master data records across
systems.
Identify master data records without equivalent links for data stewardship activities.
Develop and execute processes to actively manage master data records, prioritizing issues based on
importance and urgency.
Define the Master Data Change process, including identifying the primary information system,
maintaining master data records, and handling external data sources and publications.
Ensure that the process execution is documented and recorded for changes, consultations, and
decisions.
Implement processes to audit the population of master data across all information systems, including
measuring latency and data value alignment.
Implement master data export features for monitoring alignment with the primary master data dataset.
Implement a Master Data Management platform with various features, including workflow
management, multiple versions, import and export support, and data security measures.
Implement system processes to detect and identify new or unrecognized master data values for audit
and process review.
Establish quality standards for document and content management, including language style guides,
naming conventions, review processes, and version management.
Define requirements for document and content management, covering document standards, metadata,
retrieval procedures, retention policies, and more.
Ensure documents are authentic, reliable, complete, unaltered, and usable, with proper metadata.
Implement document systems and processes, including file plans, repositories, training, and
performance measurement.
During decommissioning of document systems, no new documents can be created, but existing
documents must remain accessible or be converted to a new system.
Determine retention policies based on business need, regulations, accountability, risks, privacy, and
stakeholder interests.
Establish a document classification scheme for consistent naming, security, access control, and
retention policies.
Ensure correct retirement and disposal techniques are employed, with options like physical destruction
or archiving.
Clearly document and regularly review the document lifecycle and associated processes.
Monitor and ensure compliance with document management processes, retention policies, and user
satisfaction.
Establish and maintain a training and awareness program for document and content management.
Choose a software solution that enables various aspects of document and records management,
including classification, metadata, versioning, retention policies, access control, audit trails, and ease
of use.
Consider international standards when selecting a software platform for document management.
Business Vision for Initiatives: Data warehouse and analytics initiatives should be driven by a clear
business vision. The Data Governance Board plays a key role in overseeing these initiatives.
Service Level Agreements (SLAs): SLAs should be developed to regulate data usage within the data
warehouse. They should include parameters such as data availability, data load latency, data retention,
and data quality.
Monitoring and Reporting: The entity should monitor the effectiveness of data warehouse initiatives
and report findings to the Data Governance Board. This should include technical alignment with the
architectural roadmap, implementation experiences, lessons learned, and business successes.
SLAs with External Data Suppliers: SLAs should be agreed upon with external data suppliers to
ensure confidence in externally sourced data. This includes defining ownership, issue resolution
workflows, data refresh cycles, and data quality requirements.
Data Staging Environment: A data staging environment should be used to collect, cleanse, match,
and merge source system data before adding it to the data warehouse. This can be a separate store or
part of an ETL tool.
Integration with Other Data Management Domains: Data warehouse initiatives should consider
other data management domains, including metadata, data catalog, data modeling, data architecture,
data quality, data security, data storage, data integration, and more.
Enriching Data with External Sources: The entity should explore sourcing and using external data to
enrich its own data for better business intelligence.
Use of COTS or Open Source Tools: Commercial Off The Shelf (COTS) or Open Source tools should
be preferred over internally developed tools, with justification required for internal development.
Usability and Complexity in Architectural Designs: Data warehouse designs should favor usability
but also consider implementation complexity. An incremental, business-focused approach is
recommended.
Data Warehouse Table Types: Different table types (staging, dimension, fact) should be used when
modeling the data warehouse, and data modeling should enhance understanding by stakeholders.
Use of Surrogate Keys: Dimension tables should have synthetic or surrogate primary keys to support
performance optimization.
Data Warehouse Schema Design: Simplest schema types like star schemas are preferred, and
deviations from star schemas should be justified.
Conforming Dimensions: Dimensions should be conformed for reuse across multiple fact tables to
support a gradual development of multiple data marts.
Sources for Data Calculations: Sources for data calculations should be present and maintained in
the data warehouse, with audited workflows for management.
Performance Metrics: Performance metrics should be developed to control data quality, volume, and
timeliness within the data warehouse.
Federated Data Warehouse: Data marts should be consolidated into a federated data warehouse with
common tooling and technology across all data marts.
Consolidation of Data Marts: The entity should include the timeline for consolidating data marts into a
federated data warehouse on the data architecture roadmap.
Reuse of Dimensions: Dimensions should be normalized and reused across data marts for efficient
data processing.
Maturity and Competency: The entity should identify effective data marts to develop maturity and
competency across various data marts.
Operational Data Store (ODS): An ODS should act as a data source for the enterprise data
warehouse.
Separation Between ODS and Data Warehouse: A clear separation between data for an ODS and
data in a data warehouse should be maintained.
Use of ODS for Current Data: The ODS should integrate, analyze, and report on current data when it
meets business requirements.
Use of Realistic Data: Realistic data should be used during the design and development of business
intelligence solutions, and reference to the data dictionary and business glossary is recommended.
Production of KPIs and Dashboards: Business intelligence tools should be used to produce KPIs,
dashboards, and scorecards that reflect the entity's business objectives.
Publication of Statistical Data: The entity should publish statistical data in line with the Statistics
Centre Abu Dhabi (SCAD) requirements and establish SLAs for data provided by SCAD.
Data Analysis Capabilities: The entity should develop data analysis capabilities suitable for its data
types and evaluate training opportunities.
Big Data Analysis: The entity should explore the use of 'Big Data' analysis techniques for high-
volume, high-velocity, or high-variety data.
Event Stream-Based Analytical Processing: Event stream-based analytical processing should be
implemented for high-velocity data analysis, and justifications for its implementation should be
evaluated.
The Entity shall establish an organizational structure to support the Data Management Program.
The organization shall be positioned in the Entity with sufficient authority such that it is empowered to
do its job effectively.
The organization will take responsibility and accountability for Data Management.
The organization will be based on the Roles and Responsibilities described in this control. An
illustrative example of an appropriate RACI matrix is provided in the appendix.
The Entity shall convene the Data Governance Board to manage delegated authority and responsibility
within the Entity.
The Board will be the final arbiter within the Entity for all matters relating to data management.
This Board should have representatives from each area affected by data management initiatives, with
the Data Manager responsible for the execution of the Board's actions through the program
management function of the Entity.
The Data Governance Board shall meet regularly (weekly, initially) to provide independent oversight
and support for the Data Management initiatives being undertaken by the Entity.
The Entity shall appoint a Data Manager.
The Data Manager shall have delegated authority from the Data Governance Board.
The Data Manager shall ensure compliance with governance, policy, and standards.
The Data Manager shall ensure the coordinated training and awareness programs are executed within
the Entity.
The Data Manager shall share best practices with other Entities.
The Entity shall identify and appoint Data Architects to support the Data Manager.
The Data Architects shall work with the Data Manager and the Data Governance Board to ensure the
implementation of the Data Management Standards in all designs across the Entity.
The Data Architects shall establish a clearly defined target state for all data sources.
The Data Architects shall establish a clearly defined roadmap to achieve the target state for all data
sources.
The Data Architects shall be responsible for developing and maintaining a formal description of the
data and data structures within the Entity, including data designs and design artifacts, dataset
metadata definitions, and data flows throughout the Entity.
The Entity shall identify and appoint Data Stewards to support the Data Manager in both the business
and technical areas of the organization.
The Data Stewards will take responsibility for the lifecycle of the data as it passes through information
systems and ownership boundaries.
The Data Stewards will take responsibility for the quality of the data under their stewardship and
cleanse the data as necessary.
The Entity shall identify and appoint Data Owners (who are responsible for a particular dataset) to
support the Data Stewards.
Data Owners will be drawn from both the business and technical areas of the organization.
The Data Owners will take responsibility for a particular dataset throughout the lifecycle across
systems.
The Data Owners will ensure the quality standards for their dataset are met.
The Data Owners will liaise between the business and technical stakeholders to ensure that their
dataset is maintained to the highest standards possible.
The Entity shall regularly undertake monitoring and compliance checking to ensure that information
systems and data-related processes are implemented in accordance with established policy,
standards, and best practices.
Such reviews should include coverage of the performance of the domain processes and user
satisfaction.
The Entity’s Data Management Policy shall address the scope of its data management systems, roles,
responsibilities, management commitment, coordination among organizational functions, and
compliance obligations.
The policy document shall be approved by the Entity's Data Management Board, Data Manager, and
the Entity's executive management.
The policy shall be published and communicated to all employees and relevant stakeholders.
The policy shall contain a definition of data management, its overall objectives and scope, and the
importance of data management as a pillar of upholding high standards of data quality.
The policy shall be applicable to all business functions of the organization and should be
supplemented by supporting instructions and guidance where appropriate for specific areas of activity.
The Entity shall establish its Data Management Policy, describing how data will be managed across
the Entity.
The Data Management Policy shall be supported by the production of an internal Document Retention
Policy, describing the Entity’s policy for retaining, archiving, and destroying documents (See Document
and Content controls).
In support of the Data Management Policy, the Entity shall establish policies for public consumption
where there are external stakeholders.
The Entity shall make publicly available policies including the Privacy Policy and Open Data Policy.
The policy shall include a clear statement of management intent, showing support for the principles of
data management and reinforcing its importance in alignment with government strategy.
The policy shall underline management expectations of teams and individuals when handling data and
highlight the importance of maintaining high levels of data quality at all points within the organization’s
operations.
The policy shall include governance metrics and process checkpoints, describing how the Entity will
measure the effectiveness of data management throughout its information systems and processes on a
continuous basis.
Measures and metrics should be maintained continuously, tracked to reveal trends, and available for
audit purposes at all times.
The policy shall describe the mechanism allowing business and technical users to raise data-related
issues, including a clear escalation plan to ensure such issues are appropriately handled and resolved.
The policy shall describe the change management process and how it applies to the Data
Management Program and its initiatives.
The Entity shall ensure that all policy developments are aligned with all relevant legislation.
The Entity shall collect and maintain evidence of compliance with their policies and with the Control
Specifications within these standards.
The policy shall be quantifiable and traceable back to the Control Standards of this document; the
Entity should be able to demonstrate how each control will contribute to achieving a given policy
requirement.
Ensure that audit findings are thoroughly analyzed to confirm potential risks associated with
unaddressed issues.
Ensure that audit results are classified and protected at a level equivalent to the highest security
classification of the data sources being audited.
Efficiently coordinate Data Management audit activities with other audits within the organization to
achieve effective performance and compliance reporting while minimizing disruptions.
The organization must keep its Data Management Program Plan and Policy updated in response to
audit findings in each data domain.
Develop outcome-based performance metrics to evaluate the effectiveness and efficiency of the Data
Management Program. The Data Governance Board should oversee the definition of these metrics,
their alignment with the Program Plan, and data performance reporting to stakeholders.
The Data Governance Board's role includes setting performance metrics, analyzing data from various
domains, and reporting Data Management Program performance to relevant stakeholders at specified
intervals and in agreed formats.
Ensure that the organization's Data Management performance metrics align with the Abu Dhabi
Government Data Management Programme's indicators, enabling timely and accurate status reporting
to relevant stakeholders.
Data Management performance reporting should encompass several aspects, including compliant
technology and business processes, data architecture maintenance, completeness of data models,
data quality, and information lifecycle management.
Implement mechanisms for continuous improvement based on performance data analysis, and closely
monitor the cost, benefit, and status of proposed and implemented improvements.
The organization must adhere to applicable Abu Dhabi Government Metadata Standards, such as
eGIF, SCAD standards, and geospatial metadata standards.
Ensure that metadata management tools comply with the ISO/IEC 11179 Metadata Registry Standards.
Comply with ISO/IEC 11179 Part 4, 'Formulation of Data Definitions', for defining data, which sets out
steps for developing unambiguous data definitions.
Follow the principles documented in ISO/IEC 11179 Part 5, 'Naming and identification principles', when
developing meaningful names and identifiers.
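The naming-principles control above can be supported by an automated check. The sketch below applies a few illustrative conventions in the spirit of ISO/IEC 11179 Part 5; the specific rules and the list of representation terms are assumptions for this example, not the standard's normative text.

```python
import re

# Illustrative naming rules in the spirit of ISO/IEC 11179 Part 5:
# lower_snake_case, an object-class term plus a property term, and a
# recognised representation-term suffix. These rules are assumptions
# for this sketch, not the standard's normative requirements.
REPRESENTATION_TERMS = {"code", "name", "date", "amount", "count", "text"}

def check_element_name(name: str) -> list[str]:
    """Return a list of rule violations for a data element name."""
    issues = []
    if not re.fullmatch(r"[a-z][a-z0-9]*(_[a-z0-9]+)*", name):
        issues.append("name is not lower_snake_case")
    parts = name.split("_")
    if len(parts) < 2:
        issues.append("name should combine an object-class and property term")
    if parts[-1] not in REPRESENTATION_TERMS:
        issues.append("missing representation term suffix")
    return issues
```

For example, `check_element_name("customer_birth_date")` passes all three checks, while an opaque abbreviation such as `"CustDOB"` fails them.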
Execute a metadata initiative to gather, store, and use metadata effectively, covering assessment of
existing metadata sources, requirements gathering, metadata architecture, data stewardship, and a
rollout plan.
Utilize Abu Dhabi government and international standards when developing metadata, including eGIF,
SCAD, Geospatial, and ADMS standards, ensuring that they align with the Entity's operational context.
Manage metadata using a combination of automated scanning and manual techniques, ensuring data
accuracy per a defined schedule.
In cases of metadata conflicts that cannot be resolved by Data Stewards, the Data Governance Board
is responsible for arbitration.
Ensure that all metadata is accessible through the Data Catalogue, serving as the user access point
for metadata, data dictionary, business glossary, and modeling and architectural deliverables.
The Data Catalogue should support indexing, search, and retrieval of metadata relevant to the user's
role.
Define measures for the quality of metadata names and definitions, including subjective business
experience and user surveys to assess metadata effectiveness.
Monitor and report on metadata quality, ensuring that metadata values identify the version they were
captured against.
Monitor metadata coverage across the organization's business functions, assessing metadata
definition, capture, and usage across departments.
Monitor the effectiveness of metadata stewardship through workflow monitoring, issue tracking,
training, and awareness programs.
Align the Data Catalogue with mandatory standards like Abu Dhabi Government eGIF Schema
Generation, DCAT, and XSD to facilitate interoperability.
Also, consider aligning the Data Catalogue with recommended standards such as ADMS and RDF,
providing justifications for non-alignment to the Data Governance Board where necessary.
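To illustrate the DCAT alignment described above, a minimal dataset description using DCAT vocabulary terms might look like the following JSON-LD fragment. The dataset title, publisher, and download URL are hypothetical examples, not prescribed values.

```python
import json

# A minimal DCAT-style dataset description, serialised as JSON-LD.
# Title, publisher, and URL below are hypothetical placeholders.
dataset_record = {
    "@context": {"dcat": "http://www.w3.org/ns/dcat#",
                 "dct": "http://purl.org/dc/terms/"},
    "@type": "dcat:Dataset",
    "dct:title": "Example Road Traffic Counts",
    "dct:publisher": "Example Entity",
    "dcat:keyword": ["transport", "traffic"],
    "dcat:distribution": [{
        "@type": "dcat:Distribution",
        "dcat:mediaType": "text/csv",
        "dcat:downloadURL": "https://example.org/data/traffic.csv",
    }],
}

print(json.dumps(dataset_record, indent=2))
```

Records of this shape can be validated against the relevant eGIF/XSD schemas before publication to the Data Catalogue.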
Develop a Data Catalogue with key features, including a metadata repository, publishing portal,
workflow management tool, business glossary, data dictionary, data model repository, and version
control.
Align Data Catalogue requirements with government-wide data catalogue requirements as they evolve.
Design the Data Catalogue with usability in mind, using a standard vocabulary that represents real-
world concepts accurately.
Provide the Data Catalogue as the central access point for all data assets, even though the actual data
may reside in various systems.
Identify datasets suitable for inclusion in the Data Catalogue, including transactional data, reference
datasets, master data profiles, statistical data, and geospatial data.
Discover datasets using a combination of human interactions and technical tools for scanning data
sources.
Prioritize datasets for inclusion in the Data Catalogue based on past demand, with business-level
metadata typically taking precedence.
Develop and store data models for captured datasets at both the business and technical levels.
Consider developing semantic data models that describe data relationships using a defined
vocabulary.
Define appropriate metadata for data capture, including ownership, security classification, data quality,
and version information.
Capture and populate metadata for each dataset, ensuring comprehensive information is included.
Establish a licensing model for data sharing and make it accessible through the Data Catalogue.
Create and execute an awareness program to promote data availability to business and technical
stakeholders, emphasizing data reusability.
Ensure that the System Development Lifecycle (SDLC) includes considerations for reusing datasets
from the Data Catalogue.
Encourage submissions for innovative data usage from various functions, evaluating their merit
through the Data Governance Board.
Allow consumers of datasets to register their usage, providing them with information about dataset
changes and updates.
Classify registered consumers as Formal or Informal, depending on the presence of service level
agreements.
Monitor and report Data Catalogue effectiveness using metrics like dataset coverage, registered
consumers, and completeness of metadata.
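The effectiveness metrics named above (dataset coverage and metadata completeness) can be computed directly from catalogue entries. The sketch below assumes an illustrative set of required metadata fields; the actual field list would come from the Entity's metadata standard.

```python
# Sketch of catalogue-effectiveness metrics. The required metadata
# fields listed here are illustrative assumptions.
REQUIRED_FIELDS = {"owner", "security_classification", "description",
                   "update_frequency"}

def completeness(entry: dict) -> float:
    """Fraction of required metadata fields populated for one entry."""
    filled = sum(1 for f in REQUIRED_FIELDS if entry.get(f))
    return filled / len(REQUIRED_FIELDS)

def catalogue_metrics(entries: list[dict], known_datasets: int) -> dict:
    """Coverage (catalogued vs. known datasets) and average completeness."""
    return {
        "coverage": len(entries) / known_datasets,
        "avg_completeness": sum(map(completeness, entries)) / len(entries),
    }
```

Tracking these figures over time gives the trend data that the monitoring controls above call for.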
Review data models in the software development lifecycle as part of the Governance Checkpoint
Process.
Data models are critical for system development and should align with business and technology
requirements.
Implement data modeling tools with UML 2.x compliance, XMI interchange support, metadata
association, versioning, and traceability.
Provide training for data modeling, tailored to user roles (e.g., business vs. DB admins).
Develop data models (conceptual, logical, physical) to document Entity's data structure.
Model unstructured data linked to structured data through business terms and concepts.
Model semi-structured/unstructured data with mandatory requirements and metadata.
Align unstructured data conversion with Unstructured Information Management Architecture (UIMA).
Create Data Flow Diagrams and Entity Relationship Diagrams for unstructured data.
Develop Data Models at the Conceptual, Logical, and Physical levels with references.
Publish data models for reference and re-use; justify deviations to Data Governance Board.
Divide large models into smaller, subject area-based models for clarity.
Clearly indicate current and unimplemented aspects in data models.
Apply naming conventions for data objects and relationships in data modeling.
Ensure Data Governance Board oversight and socialization of enterprise data models.
Use conceptual data models for documentation, change requests, and analysis.
Develop logical data models for data attributes and relationship rules.
Ensure alignment with Enterprise Data Model.
Use logical data models for documentation, change requests, and analysis.
Develop physical data models based on logical models.
Use physical data models for technical implementation and operational functions.
Reverse engineer data models from existing systems and link to logical models.
Produce Data Architecture deliverables for all Data Management Programme domains.
Provide data quality definitions in various categories.
Define measures for data quality (validity, timeliness, integrity, accuracy, reliability).
Use data profiling tools for systematic data audits with structured data column analysis, integrity
analysis, pattern recognition, and reporting.
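The column-analysis and pattern-recognition functions of a profiling tool can be sketched minimally as follows; this is an illustrative simplification, not a substitute for a full profiling product.

```python
import re
from collections import Counter

def profile_column(values: list) -> dict:
    """Basic column profile: row count, null rate, distinct count, and
    character-class patterns (A = letter, 9 = digit), of the kind a
    data profiling tool reports per column."""
    non_null = [v for v in values if v not in (None, "")]
    patterns = Counter(
        re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", str(v))) for v in non_null
    )
    return {
        "rows": len(values),
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }
```

Outlying patterns in `top_patterns` (for example, one record whose identifier does not match the dominant `AA-99` shape) are exactly the gaps the following control asks the Entity to identify and cleanse.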
Identify gaps between data quality definitions and measured quality, and execute data cleansing
initiatives.
Ensure target data architectures improve data quality with monitoring and cleansing components.
Apply Information Security Standards, resolving conflicts through the Data Governance Board.
Certify alignment with Information Security Standards for various information systems and
components.
Ensure Open Data policy aligns with Data Privacy policy to avoid privacy breaches.
Evaluate Data Removal Requests: The Entity will thoroughly review and consider data removal
requests concerning an individual's information, striking a balance between business needs and
privacy. The process will include an appeals mechanism and involve cross-Entity collaboration, if
necessary, with the Data Manager having the final say.
Strengthen Data Protection Measures: To safeguard against data loss and privacy breaches, the
Entity will deploy a range of security components. These include data-loss prevention tools, database
activity monitoring, and data discovery tools. Systems will be evaluated based on business value and
data sensitivity to assess the risk of data loss.
Secure Data in Testing Environments: Data security and privacy standards will extend to test,
development, and training environments. When utilizing subsets of production data, data masking
techniques will be employed to ensure data integrity while aligning with quality standards.
Engage Infrastructure Audit Team: An infrastructure audit team, well-versed in platform utilization
metrics and hardware configurations, will be brought in to comprehensively audit the Entity's
infrastructure.
Audit Physical Inventory: A comprehensive audit will be conducted for all systems within Data
Centers and other sites. The audit will involve recording specific details such as hardware models,
power requirements, and current statuses.
Conduct Logical Network Audit: The Entity will perform a logical audit of the network inventory to
ensure it aligns with the physical audit. Any discrepancies identified will be resolved through
remediation plans.
Audit Infrastructure Utilization: Infrastructure utilization audits will be carried out on all information
systems and servers to assess actual loads during usage scenarios, both at peak and baseline levels.
Determine Infrastructure Capacity: Based on the audits, the Entity will determine the current
infrastructure's capacity and utilization trends, providing insights into consolidation ratios and future
capacity requirements.
Categorize Systems by Criticality: Information systems will be categorized based on criticality levels,
allowing for classification as Core Infrastructure, Critical, High, Medium, or Low. This classification will
align with the Entity's business priorities.
Classify Systems for Portability: The categorization of systems into Legacy, Virtualize-able, or
Cloud-able will aid in assessing their suitability for migration to target architectures.
Create Migration List: A migration list will be developed, considering portability and criticality, to
identify systems earmarked for migration.
Adopt a Flexible Target Architecture: The target architecture will reflect the latest flexible
infrastructure capabilities, such as Infrastructure-as-a-Service (IaaS) or Platform-as-a-Service (PaaS),
ensuring alignment with the Entity's requirements.
Choose a Cloud Deployment Model: A suitable cloud deployment model—Private Cloud, Community
Cloud, Public Cloud, or Hybrid Cloud—will be selected based on alignment with the Entity's
requirements and the Abu Dhabi Government's data center capabilities.
Refer to TIA-942 Standards: The Entity will adhere to TIA-942 standards for its infrastructure, including
access, power, cooling, and networking, ensuring compliance.
Set Data Center Standards: Establishment of data center standards, aligning with the determined Tier
level and chosen Cloud deployment model, is crucial to meet specific criteria.
Explore All Options: The Entity will thoroughly explore various data center strategies before making
final commitments, considering emerging solutions in the Abu Dhabi Government's data center
landscape.
Consider Cost-Sharing and Resilience: To optimize costs and enhance resilience, the Entity will
explore opportunities to share data center capacity and resources with other government entities.
Plan a Data Center Transformation Program: The Entity will plan a structured Data Center
Transformation program to transition from the current state to the target architecture within the
timeframe of the Abu Dhabi Government Data Management Programme.
Submit the Program for Review: The Data Center Transformation program will be submitted to
ADSIC for review and approval, ensuring alignment with government policies and standards.
Execute the Data Center Transformation Plan: Upon approval by ADSIC, the Entity will execute the
approved Data Center Transformation Plan, ensuring compliance with the planned changes.
Establish a 'Cloud' Center of Excellence Team: A dedicated 'Cloud' Center of Excellence team will
be formed to manage various aspects of the cloud infrastructure and administration.
Continuously Monitor Capacity and Utilization: Regular monitoring of capacity and utilization will
ensure optimal performance and allow the Entity to address any potential issues proactively.
Regularly Audit Capacity and Utilization: Periodic audits will follow the methodology outlined in DS1
to ensure the efficient functioning of the infrastructure.
Keep a Data Center Development Plan Up-to-Date: Maintaining an up-to-date development plan will
ensure that the data center's growth aligns with evolving requirements, reviewed annually and
refreshed every three years.
Implement a Backup Plan: A comprehensive backup plan, compliant with approved Information
Security Standards, will be implemented to ensure data integrity and availability in the event of loss or
system failure.
Define Recovery Point Objectives (RPO) and Recovery Time Objectives (RTO): Specific RPO and
RTO for each system covered by the backup plan will be determined and approved by the Data
Governance Board to align with business needs.
Conduct Regular Backup Availability Tests: Regular tests of the availability and effectiveness of the
backup and recovery procedures will be conducted, validating RPO and RTO objectives, backup
schedules, and restoration processes.
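A backup availability test of the kind described above can include an automated RPO check: the age of the most recent successful backup must not exceed the RPO approved for that system. In this sketch the system names and RPO values are hypothetical, and the timestamps would come from the backup system's logs.

```python
from datetime import datetime, timedelta

# Hypothetical per-system RPOs; real values are approved by the
# Data Governance Board.
RPO = {"billing": timedelta(hours=4), "archive": timedelta(hours=24)}

def rpo_breaches(last_backup: dict, now: datetime) -> list[str]:
    """Return the systems whose latest backup is older than their RPO."""
    return [sys for sys, ts in last_backup.items() if now - ts > RPO[sys]]
```

Any system returned by `rpo_breaches` fails the availability test and should trigger a review of its backup schedule.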
Prefer Remote Disk Backup for Offsite Storage: Remote disk backup will be favored for offsite
storage, ensuring secure and environmentally protected backup copies.
Conduct Cost/Benefit Analysis for Backup Processes: A thorough evaluation of the costs and
benefits of different backup processes will be undertaken, emphasizing modern solutions for backup
efficiency.
Implement a Business Continuity and Disaster Recovery (BCDR) Plan: A comprehensive BCDR
plan will be developed and implemented in compliance with approved Information Security Standards,
ensuring effective response to potential disasters.
Determine Appropriate BCDR Strategy: The Entity will assess and determine the most suitable
Business Continuity and Disaster Recovery (BCDR) strategy, taking into account critical activities,
incident mitigation, and vendor evaluations.
BCDR Drills and Scenario Exercises: The Entity will plan and execute regular BCDR drills and paper
scenario exercises, ensuring that teams are well-prepared for various disaster scenarios.
Comprehensive BCDR Plan Contents: The Entity's BCDR plan will encompass defined roles and
responsibilities, incident response processes, actions for mitigating consequences, communication
plans, and clear recovery priorities.
Effective Information Lifecycle Management: The Entity will establish a policy and standards for
managing recorded information throughout its lifecycle, ensuring that data is authentic, reliable,
complete, unaltered, and usable.
Data Management Ownership: Data ownership and responsibilities will be clearly assigned, including
data class creation, disposal requirements, information-sharing, and security protocols.
Data Inventory Maintenance: The Entity will maintain an up-to-date Data Catalogue to facilitate
reporting on the status of the Data Inventory, departmental compliance with Data Management
Standards, and areas of risk and recommendations.
Information Lifecycle Stages: Data will be managed throughout the Information Lifecycle stages,
including creation, retention, maintenance, use, retirement, and disposal, ensuring data quality,
accuracy, and security.
Tight Governance and Monitoring: Stringent governance and monitoring will be maintained, with
data classification, retention, and disposal adhering to policies and procedures to prevent unauthorized
changes and ensure data security.
Strategic Integration Platform: The Entity will implement a Strategic Integration Platform as part of its
target enterprise data architecture, enabling data transfer, transformation, access auditing,
performance monitoring, security controls, and transaction management for efficient data integration.
Strategic Integration Platform and eGIF Metadata Alignment The Entity ensures that its
strategic integration platform aligns with the metadata requirements of the Abu Dhabi
Government Interoperability Framework (eGIF).
Policy for Usage of Strategic Integration Platform The Entity develops and publishes a
policy for using its strategic integration platform, covering internal, trusted third-party, and
external data sharing, encouraging cross-functional data sharing and considering service
level agreements.
Migration of Data Feeds Consideration is given to migrating existing data feeds into the
Strategic Integration Platform, evaluating business value and reusability.
External Integration Through ADSIC ESB Data integration with other Entities occurs
through the ADSIC Enterprise Service Bus (ESB) to avoid peer-to-peer data transfers.
Secure and Audited Data Exchange Data is exchanged securely and in compliance with
information exchange requirements outlined in Abu Dhabi Government's Information
Security Standards.
Data Exchange Methods The Entity considers appropriate data exchange methods,
including file-based, message-based, and database-to-database exchange methods.
Migration of Peer-to-Peer Data Sharing The Entity plans to migrate peer-to-peer application
data sharing to the Strategic Integration Platform for data reusability.
Integration Patterns Capability The integration platform allows different integration patterns,
including file-based and message-based exchanges.
Data Format Consideration Data architectural considerations are given to data formats
used in data services, with a preference for XML and JSON.
Data Transfer Protocols Data architectural considerations are given to data transfer
protocols, including FTP, HTTP, SOAP, ODBC, and JDBC.
One-Way Integration Patterns The Entity favors one-way integration patterns, including
publish/subscribe, request/response, and broadcast.
Data Integration Designs Consideration is given to data integration designs for detecting
delivery failure, repeatable retries, statelessness, and high availability.
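The delivery-failure detection and repeatable-retry design considerations above can be sketched as a retry wrapper with exponential backoff. The function below is a minimal illustration; it assumes the underlying send operation is idempotent (for example, keyed by a message identifier) so that retries are safe.

```python
import time

def deliver_with_retries(send, payload, attempts=3, base_delay=0.1):
    """Attempt delivery with exponential backoff between retries.
    The final failure is re-raised so that delivery failure is detected
    rather than silently lost. `send` is assumed idempotent."""
    for attempt in range(attempts):
        try:
            return send(payload)
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

Statelessness and high availability are then properties of the `send` endpoint itself; this wrapper only addresses failure detection and repeatable retries.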
Service Level Agreements Service level agreements (SLAs) are established, covering data
quality, data volume, availability, data variety, change control, exception escalation, and
SLA monitoring frequency.
Internal Service Level Agreements Internal SLAs are produced for data sharing within the
Entity, with dispute resolution through the Data Governance Board.
Binding Service-Level Agreements Binding SLAs are produced for data sharing between
Abu Dhabi Government Entities through the ADSIC ESB, with escalation procedures in
case of non-compliance.
Open Data Review The Entity systematically reviews all data sources, considering 'Open
By Default,' and criteria for closing sources are defined.
Open Data Records Systematic records are maintained, indicating the open or closed
status of data sources in the Entity's Data Catalogue.
Open Data Publication Open data is made available through the Open Data Portal in
machine-readable and, where practicable, human-readable forms.
Data Manipulation and Privacy Data is made available in its closest-to-source form, with
minimal manipulation, aggregation, redaction, or anonymization, taking into account privacy
and security concerns.
Open Data Plan An Open Data Plan is developed based on the Open Data Review,
ensuring data quality, addressing security and privacy, business priorities, and demand.
Open Data Plan Prioritization The Open Data Plan prioritizes data release based on various
criteria, including security, business priorities, demand, and data quality.
Systematic Dataset Address The Open Data Plan systematically addresses all datasets
identified in the Open Data Review.
Open Data Plan Monitoring Progress against the Open Data Plan is monitored and
reviewed quarterly.
Open Data Publication The Entity publishes its Open Data on the Abu Dhabi Government
Open Data Portal.
Continuous Data Quality and Privacy Review Open Data is regularly reviewed for
continuous data quality and privacy compliance.
Handling Open Data Failures In case of data quality or security concerns, the Entity
suspends Open Data publication, reviews the dataset, and executes a mitigation plan.
Data Usage Monitoring Usage trends and statistics regarding data access are captured and
reported to the Government Data Governance Committee.
Non-Published Datasets Explanation In cases of datasets not published, the Entity uses the
awareness campaign to explain the reasons, potential future publication, or the dataset's
non-publication status.
Reference Data Management Plan The Entity plans activities to identify reference data
used in its information systems, covering resources, scoping, and regular reviews.
Reference Data Management Team A dedicated team is established for reference data
management, including discovery, alignment, change management, and coordination.
Definition of Reference Data Reference data used in information systems is identified and
defined, including values and semantic definitions.
Codification of Reference Data All reference data values are codified as unique, non-case-
sensitive, contiguous values with associated descriptions and metadata.
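The codification rules above (unique, non-case-sensitive, contiguous values with descriptions) lend themselves to an automated validation pass, sketched below. The rule set shown is a minimal interpretation of the control.

```python
def validate_reference_codes(entries: list[dict]) -> list[str]:
    """Check that reference data codes are contiguous (no whitespace),
    unique when compared case-insensitively, and carry a description."""
    issues, seen = [], set()
    for e in entries:
        code = e["code"]
        if any(ch.isspace() for ch in code):
            issues.append(f"{code!r}: contains whitespace")
        if code.casefold() in seen:
            issues.append(f"{code!r}: duplicate (case-insensitive)")
        seen.add(code.casefold())
        if not e.get("description"):
            issues.append(f"{code!r}: missing description")
    return issues
```

Running this check when exporting reference data from an information system supports the alignment monitoring against the 'master reference data' dataset described later.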
Alignment with Standards Reference data values are aligned with government and local
standards, forming the 'master reference data' dataset.
Regular Reviews Regular reviews of the 'master reference data' dataset are conducted to
incorporate new information systems and changes.
Alignment or Mapping Reference data used in information systems is aligned with the
'master reference data' dataset or mapped, accounting for bi-directional transformations.
Multilingual Reference Data Reference data values are described in Arabic and English.
Active Management Processes are developed to actively manage reference data values,
allowing requests, evaluations, and applications.
Reference Data Change Process The Reference Data Change process shall be defined.
Process Execution and Evidence The Entity shall ensure that processes are executed in a manner
that allows for the capture and recording of requests, consultations, and decisions.
Reference Data Auditing The Entity shall establish processes to audit the population of reference
data across all its information systems to ensure data integrity.
Reference Data Export for Alignment The Entity shall implement features to export reference data
from information systems for comparison with the 'master reference data' dataset, facilitating alignment
monitoring and analysis.
Reference Data Management Platform The Entity shall implement a comprehensive Reference Data
Management platform with features such as workflow management, support for multiple versions of
reference data, API integration, and more, to effectively manage reference data.
Detection of Unrecognized Reference Data The Entity shall create system processes to detect and
identify the use of new or unrecognised reference data values, triggering audit and process reviews.
Master Data Management Plan The Entity shall plan and publish a schedule for identifying and
managing master data across its information systems, encompassing resource allocation, data
discovery, ongoing stewardship, and more.
Master Data Governance Team The Entity shall establish a dedicated team responsible for managing
all master data, including ownership, accountability, and consultation for significant dataset changes.
Master Data Definition and Lifecycle The Entity shall define master data elements, their semantic
definitions, and their lifecycles in both business and technical contexts.
Unique Codification of Master Data The Entity shall ensure that all master data records are uniquely
identified and codified with contiguous non-whitespace values, and such code values shall be non-
case-sensitive.
Metrics for Duplicate Master Data Records The Entity shall develop and publish key performance
indicators and metrics for measuring the number of duplicated master data records within each
information system.
Controls for Non-Primary Master Data Records The Entity shall implement systematic controls to
limit the use of non-primary master data records within information systems where feasible.
Matching and Linking of Master Data Records The Entity shall match and link equivalent master
data records within each information system to identify duplicate records.
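Matching and linking of equivalent records is typically done by normalising candidate fields into a match key and grouping on it. The sketch below uses name and birth year as illustrative match fields; a real initiative would substitute the match rules approved by the master data governance team.

```python
import re
from collections import defaultdict

def normalise(record: dict) -> tuple:
    """Build a match key: lowercased name with punctuation collapsed to
    spaces, plus birth year. The fields used here are illustrative."""
    name = re.sub(r"[^a-z0-9]+", " ", record["name"].lower())
    return (" ".join(name.split()), record.get("birth_year"))

def link_duplicates(records: list[dict]) -> list[list[dict]]:
    """Group records sharing a match key; groups of two or more records
    are candidate duplicates for stewardship review."""
    groups = defaultdict(list)
    for r in records:
        groups[normalise(r)].append(r)
    return [g for g in groups.values() if len(g) > 1]
```

Candidate groups feed the duplicate-record metrics and the benefit analysis for merging described in the following controls.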
Benefit Analysis for Data Merging For master data profiles that could benefit from merging
duplicated records, the Entity shall conduct a benefit analysis, considering data that references these
records and ensuring data references are updated accordingly.
Master Data Cleansing Initiative When a compelling benefit case or government-wide mandate
exists, the Entity shall schedule and execute a master data initiative to cleanse master data and
associated data to eliminate duplicate entries.
Cross-System Data Matching and Linking The Entity shall match and link equivalent master data
records across all information systems, including those operated by third parties, with special attention
to primary systems.
Metrics for Master Data Records The Entity shall develop and publish metrics to measure the
number of master data records and their equivalents across different information systems.
Unlinked Master Data Focus The Entity should be capable of identifying master data records that
have not been linked to any equivalent records, focusing on data stewardship and conducting regular
reviews.
Safeguarding Reference Data Values The Entity shall implement appropriate system safeguards to
monitor the use of reference data values to ensure their compliance with approved reference data.
Regular System Reviews Regular reviews, as outlined in the Master Data Initiatives plan, shall be
conducted to address new information systems and assess changes not identified through operational
processes.
Multi-Lingual Master Data The Entity shall ensure that all master data values can be described in
more than one language to accommodate linguistic diversity.
Active Master Data Management The Entity shall establish processes for actively managing master
data records, prioritizing issues based on importance and urgency.
Master Data Change Process The Entity shall define the process for maintaining master data
records, including identification of primary systems, maintenance processes, and incorporation of
external data.
Process Execution and Evidence The Entity shall ensure that process execution is well-documented,
including the capture and recording of changes, consultations, and decisions.
Master Data Auditing The Entity shall implement processes to audit the population of master data
across its information systems and develop metrics to measure data updates and alignment.
Master Data Export for Alignment The Entity shall create features for exporting master data from
information systems to compare with the 'primary master data' dataset, enabling alignment monitoring
and initial analysis.
Master Data Management Platform The Entity shall implement a comprehensive Master Data
Management platform with various features, including workflow management, support for multiple
versions of master data, API integration, and more.
Detection of Unrecognized Master Data The Entity shall establish processes to detect and identify
new or unrecognised master data values, ensuring they align with the change management process.
Quality Standards for Documents and Content The Entity shall define quality standards for
managing documents and content, encompassing language style guides, naming conventions, review
processes, and version management.
Document Management Requirements The Entity shall define requirements for managing
documents and content, covering document standards, metadata, retrieval procedures, and legal
considerations.
Document Integrity and Reliability The Entity shall ensure documents are authentic, reliable,
complete, unaltered, and usable, with proper records of changes and access.
Document System Implementation When implementing document systems, the Entity shall establish
file plans, repositories, training, data transfer, standards, retention timelines, and strategic integration.
Decommissioning Document Systems When decommissioning document systems, the Entity shall
discontinue new document creation, ensure accessibility for existing documents, and consider
migration to new systems.
Retention Policy Determination The Entity shall determine appropriate retention policies for
document types based on business needs, regulatory requirements, privacy concerns, and risk
assessments.
Document Classification Scheme The Entity shall establish a document classification scheme for
consistent naming, efficient retrieval, security provisions, access control, and retention policies.
Document Retirement and Disposal The Entity shall employ proper techniques for retiring and
disposing of documents, considering physical destruction, offline retention, archive handover, or
transfer of responsibility.
Document Lifecycle Management The Entity shall document and regularly review the lifecycle and
processes associated with documents and content.
Monitoring and Compliance The Entity shall conduct regular monitoring and compliance checks to
ensure that document systems and processes adhere to established policies and standards.
Training and Awareness The Entity shall maintain an ongoing training and awareness program for
document and content management, covering training requirements, policies, legal frameworks, and
system usage.
Document Management Solution Criteria The solution chosen by the Entity should meet criteria
enabling effective document management, including classification, metadata management, versioning,
search, access control, and auditing.
International Standards for Software Selection When selecting a software platform for document
management, the Entity may refer to related international standards for guidance.
Business Vision-Driven Initiatives The Entity ensures data warehouse and business intelligence
initiatives align with a clear business vision. The Data Governance Board plays a crucial role in
overseeing these initiatives.
Service Level Agreements (SLAs) for Data Warehouse The Entity develops SLAs based on
business needs, defining SLAs for data availability, load latency, retention, and data quality.
Effective Data Warehouse Monitoring The Entity monitors and reports on initiative effectiveness,
sharing findings with the Data Governance Board for knowledge sharing across government entities.
SLAs with External Data Suppliers The Entity agrees on SLAs with external data suppliers to gain
confidence in externally supplied data, covering aspects like ownership, issue resolution, data refresh
cycles, and quality standards.
Data Staging Environment The Entity employs data staging for source data cleansing and merging,
utilizing various data-staging options.
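The staging pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the source names, field names, and cleansing rules are invented for the example, not taken from ADCMA's actual systems):

```python
# Illustrative staging step: cleanse two hypothetical source extracts,
# then merge them on a shared business key before loading downstream.

def cleanse(record):
    """Trim whitespace and normalise the (hypothetical) email field."""
    out = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    if "email" in out:
        out["email"] = out["email"].lower()
    return out

def merge_sources(primary, secondary, key="customer_id"):
    """Merge two staged extracts; primary-source values win on conflict."""
    staged = {r[key]: cleanse(r) for r in secondary}
    for r in primary:
        staged.setdefault(r[key], {}).update(cleanse(r))
    return list(staged.values())

# Two hypothetical extracts sharing the business key "customer_id".
crm = [{"customer_id": "C1", "email": " Alice@Example.com "}]
erp = [{"customer_id": "C1", "city": "Abu Dhabi"},
       {"customer_id": "C2", "city": "Dubai"}]
merged = merge_sources(crm, erp)
```

In practice this cleanse-then-merge step runs in a dedicated staging area so that source systems are never modified and failed loads can be rerun safely.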
Data Warehouse Initiative Considerations The Entity ensures alignment with various data
management domains, covering metadata, data catalog, modeling, architecture, quality, security,
storage, integration, master, reference, and open data.
Enrichment with External Data The Entity explores the use of external data sources to enhance
business intelligence.
Preference for COTS and Open Source The Entity prioritizes Commercial Off-The-Shelf (COTS) and
Open Source tools over internally developed ones, providing justification when necessary.
Usability-Oriented Architectural Designs The Entity favors data warehouse architectural designs
that prioritize usability while considering implementation complexity. An incremental, business-focused
approach is recommended.
Data Warehouse Table Types The Entity explains the different types of data warehouse tables,
including data staging, dimension, and fact tables.
Synthetic Primary Keys for Dimension Tables The Entity uses synthetic or surrogate primary keys
for dimension tables to optimize performance.
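The surrogate-key approach can be sketched as follows. Table and column names here are illustrative assumptions only; the point is that the dimension assigns its own compact integer key instead of reusing the natural key from the source system:

```python
# Illustrative dimension load: assign synthetic (surrogate) integer keys
# rather than reusing natural/business keys from source systems.
import itertools

class DimensionTable:
    def __init__(self):
        self._next_key = itertools.count(1)  # surrogate keys start at 1
        self.rows = {}          # surrogate key -> attributes
        self._by_natural = {}   # natural key -> surrogate key

    def upsert(self, natural_key, attributes):
        """Return the surrogate key, assigning a new one for unseen natural keys."""
        sk = self._by_natural.get(natural_key)
        if sk is None:
            sk = next(self._next_key)
            self._by_natural[natural_key] = sk
        self.rows[sk] = {"natural_key": natural_key, **attributes}
        return sk

# Hypothetical product dimension keyed by a source-system SKU.
dim_product = DimensionTable()
sk1 = dim_product.upsert("SKU-001", {"name": "Widget"})
sk2 = dim_product.upsert("SKU-002", {"name": "Gadget"})
sk1_again = dim_product.upsert("SKU-001", {"name": "Widget v2"})  # same surrogate key
```

Fact tables then store only the small integer surrogate key, which keeps joins fast and insulates the warehouse from changes to source-system identifiers.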
Schema Design Preference The Entity prefers star schemas for simplicity but justifies deviations
when necessary.
Conformed Dimensions for Reuse The Entity promotes conformed dimensions for reuse and
explains the concept and benefits.
Sources for Data Calculations The Entity ensures sources for data calculations are present and
managed within the data warehouse.
Performance Metrics for Data Quality The Entity develops performance metrics to control data
quality, volume, and timeliness.
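Such metrics might be computed along the following lines. The field names and the one-day timeliness window are hypothetical examples, not prescribed values:

```python
# Illustrative data quality metrics: completeness and timeliness,
# each expressed as a share of records in [0.0, 1.0].
from datetime import datetime, timedelta

def completeness(records, field):
    """Share of records with a non-empty value for the given field."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def timeliness(records, ts_field, max_age, now=None):
    """Share of records loaded within the allowed age window."""
    now = now or datetime.now()
    if not records:
        return 0.0
    fresh = sum(1 for r in records if now - r[ts_field] <= max_age)
    return fresh / len(records)

# Hypothetical sample: one complete, fresh record and one stale, empty one.
now = datetime(2023, 6, 1)
rows = [
    {"email": "a@x.ae", "loaded_at": now - timedelta(hours=2)},
    {"email": "",       "loaded_at": now - timedelta(days=3)},
]
c = completeness(rows, "email")
t = timeliness(rows, "loaded_at", timedelta(days=1), now=now)
```

Reported over time, ratios like these give the Data Governance Board a quantifiable view of whether load quality and latency stay within agreed SLAs.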
Federated Data Warehouse The Entity normalizes data warehouse technology and explains the
concept of a federated data warehouse.
Consolidating Data Marts The Entity includes the consolidation of data marts in the data architecture
roadmap.
Dimension Normalization and Reuse The Entity normalizes and reuses dimensions for data
processing and reporting.
Maturity Development for Data Marts The Entity identifies effective data marts for organizational
maturity and competency.
Operational Data Store as Data Source The Entity uses the operational data store (ODS) as a data
source for the enterprise data warehouse.
Separation of ODS and Data Warehouse The Entity explains the distinction between ODS and data
warehouse data, highlighting their purposes and usage.
ODS Functionality and Integration The Entity utilizes ODS functionality for integrating, analyzing,
and reporting on current data.
Realistic Data for Business Intelligence The Entity emphasizes using realistic data for clarity, with
reference to the data dictionary and business glossary.
Classify Business Intelligence Initiatives The Entity classifies BI initiatives according to types,
explaining tactical, strategic, and operational BI.
Integration with Enterprise Reporting The Entity integrates BI reporting with enterprise reporting and
discusses the differentiation between enterprise reporting and application reporting.
Non-Authoritative VGI Compliance The Entity complies with government directives on non-
authoritative Volunteered Geographical Information (VGI).
KPIs, Dashboards, and Scorecards The Entity develops and uses BI tooling for key performance
indicators, dashboards, and scorecards.
Statistical Data Publication The Entity publishes statistical data in line with Statistics Centre Abu
Dhabi (SCAD) requirements, establishing SLAs for SCAD-provided statistical data.
Comments / Compliance Status
Not applicable
As of this assessment, there are no privacy policies developed for Information Security Standards
which align with ADG DM Policy specifications, covering structured data (data stored in tables),
data collected from external entities, websites (internal/external), video data, sensor data,
surveillance data, and metadata.
Partially implemented
As of this assessment:
DLP (Data Loss Prevention) is not implemented.
The following products are installed for Network Endpoint Security:
- Trend Micro Apex One (Endpoint)
DAM tools: a SIEM solution is currently used to gather database access/modification data and
provide alerts for exception activities. The SIEM is collecting audit logs from the database.
Not implemented
Data Discovery:
There is no Data Discovery tool implemented.
Not implemented
The ADCMA team has not documented any open data identification and prioritization.
Not implemented
The ADCMA team has not documented any open data identification and prioritization.
Not implemented
Not implemented
Not implemented
Not implemented
Not implemented
Not implemented
Not implemented
Not implemented
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
Not applicable
As of this assessment:
1. There is no organisational document management standard available which mandates quality
standards for documents and content.
With the ECM programme planned in Q2 2023, it is recommended that ADCMA establish requirements
for the programme to define organisational document and content management standards covering
guidelines for document writing, a uniform document experience (look and feel), document naming
conventions, and document editorial processes.
Not applicable
Not applicable
Not applicable
Not implemented
Not applicable
Not applicable
Planned
Not implemented
Not implemented
Not implemented
Planned
Recommendation
The Data Governance Chair and Manager must use the DG structure from DLV 9,
refine policies as needed, and seek approval from the Data Governance Board.
Before implementing the DG structure and operating model, identify the Data
Manager role. According to the DG Operating Model, nominate the Data Manager
as part of the IT support services Data Governance core team.
The Data Assessment Program will shape Data Governance, Operating Model, and
Roles & Responsibilities, including Data Architects. ADCMA will identify the Data
Manager before implementing DG structure, model, and R&R.
The Data Assessment Program will establish Data Governance, define the
Operating Model, and Roles & Responsibilities, including Data Stewards for
relevant Data Domains. ADCMA will identify the Data Manager before
operationalizing the DG structure and model.
The Data Assessment Program will outline Data Governance, establish the
Operating Model, and define Roles & Responsibilities, including the role of the
Data Manager. ADCMA will identify the Data Manager before putting the DG
structure, model, and R&R into operation.
Once the Data Governance Board is in place, regularly monitor and enforce
compliance with data domain control policies, standards, and best practices across
Information Systems.
The Data Management program will propose a policy that outlines system control
and roles in alignment with ADG Policies.
The Data Assessment Program will define the Data Governance blueprint,
Operating Model, Roles & Responsibilities, and the policy. The policy will outline
data management, its objectives, scope, and importance in maintaining high data
quality standards. ADCMA aims to operationalize the DG structure, operating
model, and R&R.
The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model and Roles & Responsibilities. The policy will define data
management, its objectives, scope, and its critical role in maintaining high data
quality standards.
The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model and Roles & Responsibilities. The policy will define data
management, its objectives, scope, and underscore its essential role in upholding
high data quality standards.
The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model, and Roles & Responsibilities, including the role of the Data
Manager. The policy will define data management, its objectives, scope, and
underscore its pivotal role in upholding high data quality standards.
The Data Assessment Program will shape the Data Governance blueprint,
Operating Model, and Roles & Responsibilities, including the Data Manager's role.
The policy will define data management, its goals, scope, and underscore its pivotal
role in upholding high data quality standards.
The data management policy will encompass data lifecycle management for
specific data domains, such as Data Quality, Data Security & Privacy, Master &
Reference Data Management, Data Retention & Disposition, Data Access
Management, Data Catalog Management, and data classification.
The proposed data management policy will align with the defined data
management principles as per the ADG-DMS and ADG Policies.
The Data Quality management policy and standards will be established within the
data management program. They will be implemented among ADCMA team
members (Data Stewards) through change management and awareness initiatives.
The Data Management Policy will be expanded to include governance metrics and
process checkpoints in future iterations of the Data Assessment program.
Once the Data Governance board is operational, members will convene to review
and update policies as needed to ensure their ongoing relevance, adequacy, and
effectiveness.
After the Data Governance board becomes operational, its members will meet to
review and update policies as necessary to ensure alignment with relevant
legislation.
Once the Data Governance board is operational, members will meet to review and
update policies to align with relevant legislation and securely store evidence in the
ADCMA's secured environment.
The Data Management Policy will be updated in future Data Assessment program
iterations to include governance metrics and process checkpoints, ensuring
traceability to Control Standards. It will also incorporate quantifiable metrics, and a
registry will be maintained, allowing for traceability to the applicable Control
Standards.
ADCMA will review the current NDA to potentially incorporate the Data
Management Policy, thereby reinforcing the obligations of policy consumers.
In the next Data Assessment program iteration, define the Data Management
Program implementation plan, ensuring alignment with business strategy, goals,
and objectives. Post-implementation, the DG board will monitor plan effectiveness
and submit it to ADDA as per the applicable process.
In the next Data Assessment iteration, it is recommended to establish the Data
Management Program implementation plan, aligning it with business strategy,
goals, and objectives. After operationalization, the DG board will verify the plan's
effectiveness.
The Data Assessment program will create a baseline capability for the DW and BI
Analytics platform, focusing on Data Governance, Organisational Awareness and
Training, Data Architecture Management, and Inter-Entity Data Integration. It's
suggested to expand the plan to encompass Reference and Master Data
Management if relevant.
The Data Assessment Program will create a Data Governance blueprint, define the
Operating Model and Roles & Responsibilities. The organization will become
operational, and the first board meeting is scheduled for Q1 2024.
Data Governance change management will align with the existing Change
Management model, which includes Standard (pre-approved) changes, Critical
Changes (with defined types), and Emergency change management.
The Data Assessment program will define the roadmap for the BI Platform,
including Data Integration, Data Collection, Quality, Transformation, Storage, and
Visualization. Any proposed changes to these BI layers, the baseline Data
Governance Model, or the Checkpoint Process will necessitate a Change Impact
Assessment before being presented for review and approval by the Data
Management Board.
The data management program's change management process will align with
ongoing ADCMA initiatives and involve a review and approval process by the DG
board.
After the proposed Change Management plan is in place, any changes to the
proposed BI and Analytics platform layers will require a Change Impact Assessment
before submission for review and approval by the Data Management Board. This
assessment process will also apply to changes suggested for the baseline Data
Governance Model and Checkpoint Process.
Upon implementing the proposed Change Management plan, any modifications to
the proposed BI and Analytics platform layers must go through a Change Impact
Assessment before being submitted for review and approval by the Data
Management Board. This assessment process will also be applied to changes
suggested for the baseline Data Governance Model and Checkpoint Process.
As the next steps for the Data Management program, it's recommended to define
the Data Management Audit framework, ensuring alignment with the Data
Management Program, and align the Audit roles with the Internal Affairs sector.
The recommended next steps for the Data Assessment program include defining
Data Management performance metrics for the relevant data domains.
The Data Auditor from IA should be nominated as part of the Data Governance Council and can
function as an independent auditor of the Data Management Programme's audit, compliance, and
governance checkpoint results.
The Data Governance Board should closely track the budget, effectiveness, and performance of the
overall Data Management Programme and Governance.
i. The ADCMA Data Governance team (the Data Manager along with the DWG) should maintain a master
“Business Glossary”, periodically updated with the applicable business terms, in an Excel file
format.
ii. Ownership and maintenance of the “Business Glossary” should be aligned with the datasets
owned by the designated Data Owners (e.g., if the HR department head is designated as the Data
Owner of the HR datasets, then the HR “Business Glossary” associated with those datasets is owned
by the HR department head and maintained and managed by the HR department with the help of its
Business Data Stewards).
iii. The Data Manager can audit the “Business Glossary” periodically for conformance.
iv. There is no requirement for maintaining Data Lineage or Technical Data Dictionaries at this
point in time. ADCMA can evaluate Technical Metadata and Data Lineage at a later point.
v. No Metadata Management tools are required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
i. ADCMA Data Governance team (Data Manager along with DWG) should
maintain a master “Business Glossary” which is periodically updated with
applicable Business Terms (Glossary) in a excel file format.
ii. The ownership of maintenance of the “Business Glossary” should be aligned
with Datasets which are OWNED by “Data Owners” (e.g if HR Dept head is
designated as the Data Owner of HR ‘Datasets’, then HR “Business Glossary”
associated with HR datasets will be OWNED by HR dept head and maintained and
managed by the HR dept with the help of Business Data Stewards from the HR
department)
iii. The Data Manager can audit the “Business Glossary” periodically for
conformance.
iv. There is requirement for maintaining Data Lineage or Technical Data
Dictionaries at this point in time. ADCMA can evaluate the Technical Metadata and
Data Lineage at a later point
v. There are no Metadata Management tools required at this point for ADCMA.
ADCMA will implement data cataloguing and data governance controls with the
implementation of the BIA system. The data governance architect will assist in
defining the fine-grained controls and the technology solution for these controls.
For the in-house developed information systems (ADCMA website CMS, career
portal, etc.), ADCMA is advised to create and maintain data models. Governance
checkpoint processes should be enforced in the SDLC (Software Development Life
Cycle) to ensure that the data models are kept up to date for all projects
impacting an information system's data models. Data models should be promoted as
one of the core deliverables for an information system.
Data Governance board should recommend a specific data modelling tool to be used
across ADCMA information systems, and should suggest checkpoints for data model
reviews in the SDLC (mainly exit criteria for the Design, Build and
pre-implementation phases to validate data model changes as applicable).
ADCMA should create a governance body for data modelling for each of the
information systems. The Data Working group should create appropriate training
modules on data models for the different roles present within information
systems. Depending upon the role, training modules should cover reading and
interpreting data models at different levels (Conceptual, Logical and Physical),
as well as developing data models in line with the target state (Conceptual,
Logical and Physical) data models.
The Data Working group should be responsible for this activity.
Data architects of respective information systems are responsible for this activity.
ADCMA currently does not enforce maintaining identifiable information for
unstructured data to link it with structured data. The Data governance board
should consider defining guidelines and maintaining metadata with identifiable
information depending upon the type of unstructured data (for example, for
document-type unstructured data, identifiable information should be mandated).
Data architects of respective information systems are responsible for this activity.
Data governance board should be able to define some common standards and
guidelines to be followed by different information systems.
Data architects of respective information systems, in line with guidelines from
the data governance board, should be responsible for this activity.
Depending upon the business use cases, ADCMA should consider conversion of
semi-structured and unstructured data into structured form through
transformational or analytics conversion techniques. As per the current state of
ADCMA, this could be considered an advanced use case.
This specification is for developing and maintaining data models covering all
information systems in the enterprise. For all information systems within ADCMA,
physical data model information should be mapped to logical models and, at a
higher level, to conceptual models. The Data Assessment programme will establish
the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and
Analytics platform. ADCMA should consider developing Logical and Physical data
models for the Business Intelligence and Analytics platform in line with the
conceptual data model.
Data models for ADCMA information systems should be created and maintained
with each project impacting the data models of respective information systems,
giving equal importance to structured and unstructured data.
Data architects of respective information systems are responsible for this activity.
It is recommended that the data governance board publish reference data models
for the immediate reference of data model teams, along with data models that
could be re-used. When new information systems are introduced, data model teams
should consult the existing reference data models and re-usable data models that
could potentially be used.
Data architects of respective information systems are responsible for this activity.
The scope of this specification is to cover the data model for all applications
in the enterprise. The Data Assessment programme will establish the Conceptual
Data Model (CDM) for the newly proposed Business Intelligence and Analytics
Platform. While developing the Logical and Physical data models for the Business
Intelligence and Analytics platform, UML diagrams should be used as the primary
modelling notation.
Data architects of respective information systems are responsible for this activity.
The scope of this specification is to cover the data model for all applications
in the enterprise. The Data Assessment programme will establish the Conceptual
Data Model (CDM) for the newly proposed Business Intelligence and Analytics
Platform. Entity relationship diagrams and class diagrams documenting the
structure and relationships of data objects at the logical and physical levels
should be developed during the implementation of Business Intelligence and
Analytics use cases. For all other ADCMA applications, it is recommended to
prepare entity relationship diagrams at the Conceptual, Logical and Physical
levels.
Data architects of respective information systems are responsible for this activity.
Data governance board should mandate maintaining data flow diagrams to model
the movement of data within and between systems, including but not limited to
maintaining master profiles. Governance checkpoints within the SDLC should be
established to ensure data flow diagrams are maintained.
Data architects of respective information systems along with change management
team are responsible for this activity.
The scope of this specification is to cover the data model for all applications
in the enterprise. The Data Assessment programme will establish the Conceptual
Data Model (CDM) for the newly proposed Business Intelligence and Analytics
Platform. The conceptual model is defined at the subject area and entity level.
While developing logical and physical models, subject areas and logical
groupings of entities below subject areas should be created depending upon the
number of entities in each subject area. A similar approach should be followed
while creating data models for the other applications in ADCMA.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For the new Business Intelligence
and Analytics platform, the conceptual data model will be created under the Data
Assessment project and should be maintained along with each of the business
intelligence initiatives implemented on the Business Intelligence and Analytics
platform. Data model artefacts should differentiate the components of the model
that are implemented from those that are not, and data modellers should provide
guidelines to data model teams on a unified way of representing this.
Data modellers of respective information systems are responsible for this activity.
ADCMA should ensure that the data models are maintained by adhering to the
rules defined by ADCMA DMS.
When designing new conceptual data models, the below rules should be adhered to:
• Data objects are represented by nouns
• Data relationships are represented by verbs
The following rules should be adhered to when designing new logical data models:
• The appropriate data type shall be used for attributes within tables, taking
into account performance, storage, and data requirements. Where a String or
other variable character data type is used, consideration must first have been
given to more appropriate data types.
The following rules should be adhered to when designing new physical data models:
• Primary keys shall be numeric. Where there is not a suitable numeric candidate
key, a surrogate key in the form of an auto-numbering key shall be used
• Reference data tables shall have a numeric primary key (likewise, tables that use
reference data tables shall use the reference table's numeric primary key in the
foreign key relationship)
• Reference data tables will have, at a minimum, a numeric primary key and a code
value represented as a string. Additional payload information (such as textual
descriptions) may also exist as reference data (See RM.2.3)
• Physical data types that have a length or precision specifier shall have an
appropriate length or precision specified, and not left to the default value
Data modellers of respective information systems are responsible for this activity.
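A minimal sketch of how the physical-model rules above could be checked automatically is shown below. The table-metadata format is hypothetical; a real check would consume an export from the chosen data modelling tool.

```python
# Numeric types acceptable for primary keys under the rules above.
NUMERIC_TYPES = {"INT", "BIGINT", "SMALLINT", "NUMERIC", "DECIMAL"}

def check_physical_model(tables: dict) -> list[str]:
    """Flag rule violations in a (hypothetical) physical-model description."""
    findings = []
    for name, table in tables.items():
        pk_col = table["primary_key"]
        if table["columns"][pk_col]["type"].upper() not in NUMERIC_TYPES:
            findings.append(f"{name}: primary key '{pk_col}' is not numeric")
        for col, spec in table["columns"].items():
            # Character types must carry an explicit length, never the default.
            if spec["type"].upper() in {"VARCHAR", "CHAR"} and "length" not in spec:
                findings.append(f"{name}.{col}: character type without explicit length")
    return findings

tables = {
    "country_ref": {  # reference data table: numeric PK plus a code value
        "primary_key": "country_id",
        "columns": {
            "country_id": {"type": "INT"},
            "country_code": {"type": "VARCHAR", "length": 3},
            "description": {"type": "VARCHAR"},  # violation: no length given
        },
    },
}
print(check_physical_model(tables))
```

Such a check could be wired into the SDLC checkpoints suggested earlier, so that a model failing any rule cannot exit the design phase.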
This specification is applicable for Entities that have MDM implemented. Currently
ADCMA does not have MDM needs and this specification is not applicable.
Currently ADCMA does not have a business metadata / glossary maintenance
process to define business terms. With the Data Assessment project, the data
architecture considers (Business and Technical) Metadata management for the
Business Intelligence and Analytics platform. Business terms used in the data
model for the BIA platform and the business glossary should be kept in sync. For
business terms other than the ones used in the BIA Platform, ADCMA should ensure
that the respective business glossary is maintained.
Currently ADCMA does not have a (Business and Technical) metadata / glossary
maintenance process to define business and technical terms. With the Data
Assessment project, the data architecture considers (Business and Technical)
Metadata management for the Business Intelligence and Analytics platform.
Technical definitions for all business terms under ADCMA's ownership should take
input from the logical and physical data models, and should be populated within
the data dictionary of ADCMA's data catalogue. Only Business Intelligence and
Analytics platform related technical metadata are currently planned; technical
definitions should also be planned for the other ADCMA systems.
Data architects of respective information systems should be responsible for this
activity.
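One way technical definitions could be derived from the physical data models and kept in sync with the business glossary is sketched below; all field names and the input format are illustrative assumptions, not part of the specification.

```python
# Field names below are illustrative; the real inputs would come from the
# physical data model export and the master Business Glossary.
def build_dictionary(physical_columns: dict, glossary: dict) -> list[dict]:
    """Assemble data-dictionary entries, flagging glossary gaps."""
    entries = []
    for column, spec in physical_columns.items():
        term = spec.get("business_term")
        entries.append({
            "column": column,
            "type": spec["type"],
            "business_term": term,
            # "UNMAPPED" highlights terms missing from the glossary, i.e.
            # places where model and glossary have fallen out of sync.
            "definition": glossary.get(term, "UNMAPPED"),
        })
    return entries

physical_columns = {
    "emp_grade": {"type": "INT", "business_term": "Employee Grade"},
    "lv_bal": {"type": "DECIMAL(5,2)", "business_term": "Leave Balance"},
}
glossary = {"Employee Grade": "Grade of an employee within the HR structure"}
entries = build_dictionary(physical_columns, glossary)
print([e["definition"] for e in entries])
```

The "UNMAPPED" marker gives the Data Manager a concrete audit signal for terms that exist in a model but not in the glossary.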
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For the new Business Intelligence
and Analytics platform, the conceptual data model will be created under the Data
Assessment project. Considering that the conceptual data model delivered will be
a living document, data model versions should be maintained whenever there are
updates to the conceptual data model in future. In line with this, versions of
the respective logical and physical data models should also be maintained, with
appropriate naming conventions.
Data modellers of respective information systems are responsible for this activity.
All data models maintained for the different information systems of ADCMA
should maintain traceability between the different views of the data model. Some
standard data modelling tools allow traceability links to be maintained between
the different views (Conceptual, Logical and Physical) of the same model; ADCMA
should plan to use a data modelling tool that allows this. Lower-level
identifiers should be used from the subject area down to its lowest level.
Data modellers of respective information systems are responsible for this activity.
This specification is applicable to all ADCMA information systems that maintain
data models. Data modellers of the respective information systems should define
the mandatory metadata to be captured along with data model changes.
The data governance board should provide guidelines on the potential metadata to
be captured with data model changes, and data modellers of respective
information systems should ensure the metadata is reviewed in the SDLC phases.
Data governance board should mandate data model version maintenance. If the
data modelling tool being used does not have versioning enabled, an external
version control repository or document management system should be used to
manage data model versions.
Data modellers of respective information systems should follow the guidelines
from data governance board on version maintenance.
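Where an external repository is used, the version naming convention can be enforced programmatically. The sketch below assumes a hypothetical `<stem>_v<major>.<minor>.<ext>` convention; the actual convention should come from the data governance board.

```python
import re

# Hypothetical convention: <system>_<view>_v<major>.<minor>.<ext>; the real
# convention should be defined by the data governance board.
PATTERN = re.compile(r"^(?P<stem>.+)_v(?P<major>\d+)\.(?P<minor>\d+)\.(?P<ext>\w+)$")

def next_version(filename: str, major_change: bool = False) -> str:
    """Derive the next versioned file name for a data model artefact."""
    m = PATTERN.match(filename)
    if not m:
        raise ValueError(f"'{filename}' does not follow the naming convention")
    major, minor = int(m["major"]), int(m["minor"])
    if major_change:
        major, minor = major + 1, 0
    else:
        minor += 1
    return f"{m['stem']}_v{major}.{minor}.{m['ext']}"

print(next_version("bia_logical_v1.4.xml"))                     # bia_logical_v1.5.xml
print(next_version("bia_logical_v1.4.xml", major_change=True))  # bia_logical_v2.0.xml
```

Rejecting non-conforming names (the `ValueError` branch) is what makes the convention auditable rather than advisory.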
It is recommended to ensure that changes to the data model and its metadata go
through the approval of the data governance board for the respective information
systems. The data assessment programme will make recommendations on the data
governance board.
ADCMA should work on creating an ADCMA enterprise data model. While developing
new data models or amending existing data models for individual information
systems, the respective changes should be aligned to ADCMA's enterprise data model.
Data modellers of respective information systems are responsible for this activity.
ADCMA should align the Enterprise Data Model with new information systems
within ADCMA as they emerge.
Enterprise architect is responsible for this activity.
For all the information systems of ADCMA, conceptual data models should be
created to support architecture, development and operational processes. The data
governance board should enforce governance checkpoints during the system
development lifecycle to review data model artefacts.
Data modellers of respective information systems are responsible for this activity.
Conceptual Data models for existing ADCMA applications should be created and
maintained with each project impacting the data models of respective information
systems.
Creating the conceptual data model should include, but should not be limited to:
• Interviewing stakeholders, or otherwise undertaking business functional analysis
and requirements gathering to understand all relevant business concepts and
requirements
• Identifying candidate data profiles related to business processes, and capturing
associations between these profiles
• Combining candidate data profiles – as appropriate – into master data profiles,
transactional data profiles and reference data profiles, and modelling the high level
relationships between the data profiles
Data modellers of respective information systems are responsible for this activity.
ADCMA currently does not maintain any information that could be categorized as
master profiles. Digital products extract data from multiple digital platforms. There
are some posts for which responses are collected from different systems.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For the new Business Intelligence
and Analytics platform, the conceptual data model will be created under the Data
Assessment project. The logical data model should be created in line with the
conceptual data model, describing the data attributes and the relationship rules
between the profiles.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For the new Business Intelligence
and Analytics platform, the conceptual data model will be created under the Data
Assessment project, and Logical and Physical data models should be created in
line with it. While ingesting data into the conformed layer, data should be
de-duplicated to the extent possible. The logical modelling of relationships
between entities should describe referential integrity and normalisation
concerns. De-normalisation should be preferred in the data marts rather than in
the core model objects.
Data modellers of respective information systems are responsible for this activity.
As part of the Data Assessment programme, the conceptual data model will be
created for the new Business Intelligence and Analytics platform, and the
logical data model should be created in line with it. For other information
systems within ADCMA, governance checkpoints should be enforced to ensure that
logical data model artefacts are delivered which can be used for physical data
models, impact assessments and/or gap analysis between current and target state
data models.
Data working group of respective information systems should be responsible for
this activity.
As part of the Data Assessment programme, the conceptual data model will be
created for the new Business Intelligence and Analytics platform, and the
logical data model should be created in line with it. For other information
systems within ADCMA, governance checkpoints should be enforced to ensure that
physical data model artefacts are delivered which can be used for impact
assessments and/or gap analysis between current and target state data models.
Data working group of respective information systems should be responsible for
this activity.
Physical data models for respective information systems should be kept up to
date with each project implementation that could impact them. These data models
can be used to understand the relationships between different entities.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For the new Business Intelligence
and Analytics platform, the conceptual data model will be created under the Data
Assessment project, and Logical and Physical data models should be created in
line with it in a standard data modelling tool. For other information systems,
ADCMA should create Conceptual, Logical and Physical data models. ADCMA is
recommended to use standard data modelling tools which allow the different views
(Logical and Physical) of data models to be linked.
Data modellers of respective information systems are responsible for this activity.
Currently ADCMA does not maintain data models for existing applications. ADCMA
should consider reverse engineering data models from existing supported
information systems to create baseline physical data models, and then reverse
engineer logical and conceptual data models from the physical data models.
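The first step of that reverse-engineering exercise can be sketched as follows, using an in-memory SQLite schema as a stand-in for a real ADCMA system database; a production exercise would read the actual DBMS catalogue views and feed the result into the modelling tool.

```python
import sqlite3

# In-memory SQLite schema standing in for an existing system database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE department (dept_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE employee (
        emp_id INTEGER PRIMARY KEY,
        dept_id INTEGER REFERENCES department(dept_id),
        full_name TEXT
    );
""")

def extract_physical_model(conn) -> dict:
    """Return {table: [(column, declared_type), ...]} from the live schema."""
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        model[table] = [(c[1], c[2]) for c in cols]
    return model

model = extract_physical_model(conn)
print(sorted(model))  # → ['department', 'employee']
```

The extracted baseline physical model then becomes the input for abstracting logical entities and, above those, the conceptual subject areas.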
Data modellers of respective information systems are responsible for this activity.
Data architects for respective systems are responsible for this activity.
As part of the Data Assessment programme, data architecture for the new
Business Intelligence and Analytics platform will be recommended based on the
understanding of ADCMA's business needs. For other systems, ADCMA should
consider data architecture deliverables for:
• Data quality tooling including data profiling and cleansing
• Data security and privacy systems
• Open data management systems
• Document and content management, or workflow systems
• ERP, CRM, HR, Finance, Procurement, Audit, Legal and any other specialist
information systems appropriate to ADCMA
Data architects for respective information systems should be responsible for
this activity.
As part of the Data Assessment project, the target state data architecture for
the new Business Intelligence and Analytics platform will be proposed. ADCMA
should consider creating and maintaining baseline data architectures for all
information systems.
Data architects of respective information systems should be responsible for this.
As part of the Data Assessment project, the baseline data architecture for the
new Business Intelligence and Analytics platform will be proposed. The data
architecture document covers business and technical requirements; the
integration framework covers data architecture themes; and the risk assessment
report for the enterprise BI roadmap covers known constraints of the Business
Intelligence and Analytics platform.
ADCMA should consider creating and maintaining an enterprise data architecture
for all the systems supporting key business functions.
Data architects for respective information systems are responsible for this activity.
For all information system assets, ADCMA should maintain the current state
architecture along with the target state architecture. Gaps between the current
state and target state architectures should be documented as architecture gaps
to be bridged. All projects that could impact an information system's
architecture capabilities should ensure that the current state architecture
document, as well as the document recording gaps with the target state
architecture, are updated. All new business use cases identified should be run
against the target state architecture capabilities of the specific information
systems, and amendments should be made to the target state architecture as
required (and to the gaps between current and target state architectures).
Data governance board should have stage gates in system development life cycle
to ensure that current state architecture and gaps with target state architecture
are maintained.
Data architects of respective information systems are responsible for this activity.
Data governance board, along with data architects, should enforce stage gates
in system life cycle management to ensure that the current state architecture is
updated with all projects that impact the data architecture. While updating the
current state architectures, versions should be maintained.
ADCMA should create target state architectures for its information systems. The
target state architecture should be reviewed for all business use cases
identified for the specific information systems, with amendments made to the
target state architecture as needed (and to the gaps between current and target
state architectures). In different phases of the SDLC (ideally multiple phases,
including design phase closure, build phase closure and pre-implementation),
there should be checkpoints to validate changes to the current state
architecture in line with the target state architecture.
The target state (Enterprise / system) data architecture should ensure that
business and technology requirements are addressed. Any new business and
technology requirements identified should be checked against the target state
architecture, and the architecture amended if required. The target state
architecture should:
• Encourage data integration across the Entity between information systems and
services
• Seek removal of duplication in terminology
• Seek to remove duplication of data processes
• Seek alignment of reference and master data across the Entity's systems
• Align with emerging ADCMA-wide technology platforms
• Integrate with ADCMA-wide reference and master data services and standards as
they emerge
• Show re-use of data and system architectures both within the Entity itself and
through collaboration with other Entities
• Be influenced by the data management requirements emerging from the data
quality, data security, data privacy, data integration and interoperability, and data
storage domains, both within the Entity and as delivered from central government
programmes
ADCMA currently does not have an Enterprise architecture defined. Once the
target state enterprise architecture is defined and implemented, ADCMA should
ensure that a current state enterprise architecture is created and gaps between
the current state and target state architectures are identified.
ADCMA currently does not have an Enterprise architecture defined. Once the
target state enterprise architecture is defined and implemented, ADCMA should
ensure that a current state enterprise architecture is created and gaps between
the current state and target state architectures are identified. The roadmap to
reach the target state enterprise architecture should be revisited when there
are changes to the current state (with new initiatives to be implemented) or to
the target state enterprise architecture (new information systems being
introduced, existing information systems being retired, etc.).
ADCMA currently does not have an Enterprise Architecture defined. Once the target
state enterprise architecture is defined, all information systems within ADCMA
should align with it.
Data architects of information systems are responsible for this.
ADCMA currently does not have an Enterprise Architecture defined. Once the target
state enterprise architecture is defined, the effectiveness of the roadmap
implementation should be reported by identifying gaps between the current state
and target state enterprise data architectures.
The Enterprise Architect is responsible for this activity.
The Data Assessment programme will provide the DQ best practices along with
recommended DQ checklist. The DQ checklist will need to be automated along with
the DQ implementation.
It is recommended that, while defining the Data Modelling and Master Data
Management design, Data Quality be applied to the master profiles, and that the
ability to audit the implementation with appropriate DQ metrics be implemented.
The Data Architect, along with the Technical Data Architect and with the direction
of the Data Manager, shall define and apply the DQ SLAs to externally procured
datasets.
The Data Manager will use the recommended DQ tools and build the business case
for implementation of DQ tools across the organization. It is recommended to
prepare a comprehensive roadmap/plan for DQ tool implementation as part of the
DG Operationalization programme.
While defining the Data Catalogue and Metadata Management design, the Data
Quality measures used for auditing will be stored with the Data Catalogue.
The DQ Architect (part of the DG DWG) will table the recommendations for DQ
improvement initiatives to the DG Board for review and approval.
The DQ Architect (part of the DG DWG) will table the recommendations for DQ
improvement initiatives to the DG Board for review and approval.
The DQ Architect (part of the DG DWG) will table the recommendations for DQ
improvement initiatives to the DG Board for review and approval.
As next steps to the Data Assessment programme, the ADCMA ISMS Data Security
Policy V1.0 needs to be augmented to align with the Information Security
Standards defined in the DSP data domain covering architecture components.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
Implement Information Security standards while defining the standards required
for sharing datasets as "Open Data". The data security and privacy definitions must
be applied to all datasets/data attributes deemed to be shared as "Open Data" and
reviewed/approved by the Data Governance Board.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG to define the Data Privacy
"Metadata" for the Master profiles. This activity can be done while implementing
Data Catalogue or Metadata Management "Data Classification" at the attribute
level.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components. To comply with this
specification, it is recommended to cover the 'mosaic effect' within the "Data
Classification" process.
The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy
Along with the Data Privacy Policy, it is recommended to define the "Privacy by
Design" which is integrated with the Data Privacy Standards and general Data
Management Programme standards
The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy
As next steps to the data assessment programme, along with the Data Privacy
Policy, it is recommended to define the "Privacy by Design" which is integrated
with the Data Privacy Standards and general Data Management Programme
standards
The Data Privacy policy should incorporate the "Privacy by Design" principle which
will be integrated with the Data Governance checkpoint process for review and
approval. The DSP.3.3 specification which defines audit capabilities will need to be
integrated with the data governance checkpoint process.
ADCMA to define the infrastructure, data center and cloud enablement policy
across business applications, based on the business application criticality
assessment, and define the cloud policy.
ADCMA to define the infrastructure, data center and cloud enablement policy
across business applications, based on the business application criticality
assessment, and define the cloud policy.
ADCMA to define the infrastructure and data center standards and policy. The
associated processes and roles are to be enabled in line with the governance
operating model.
ADCMA to define the infrastructure, data center and cloud enablement policy
across business applications, based on the business application criticality
assessment, and define the cloud policy.
ADCMA to define the infrastructure and data center standards and policy. The
associated processes and roles are to be enabled in line with the governance
operating model.
ADCMA should conduct an assessment of G42 cloud enablement and the associated
cost of re-platforming applications, and develop a benchmark of the data center
costing.
ADCMA does not currently have this requirement. However, future enablement of
the G42 cloud is to be evaluated based on the BIA assessment.
ADCMA does not currently have this requirement. However, future enablement of
the G42 cloud is to be evaluated based on the BIA assessment.
ADCMA does not currently have this requirement. However, future enablement of
the G42 cloud is to be evaluated based on the BIA assessment.
ADCMA needs to evaluate the backup strategy based on the cloud enablement
strategy as part of the BIA assessment.
ADCMA to work on a data center and application BCP strategy and DR roadmap on
G42 in 2023.
ADCMA to work on a data center and application BCP strategy and DR roadmap on
G42 in 2023.
As part of the BIA implementation, ADCMA will have a well defined governance
operating model, and the data governance architect will lay out the specifics of the
governance procedure and technical framework for implementing data life cycle
management.
As part of the BIA implementation, ADCMA will have a well defined governance
operating model, and the data governance architect will lay out the specifics of the
governance procedure and technical framework for implementing data life cycle
management.
The Data Manager will appoint the Integration Data Architect as part of the BIA
implementation programme. The Integration Architect shall work with Data
Manager to propose a design of the "Data Integration framework/layer" within the
proposed Business Intelligence and Analytics platform. The Data Manager will audit
the requirements called out in this specification.
The Data Integration architect shall document a detailed set of framework,
standards and guidelines ensuring alignment of Data Integration Platform for
Metadata Management.
The "Trusted Data Sharing Framework" will use the "Data Integration Framework"
as input to define a comprehensive set of standards for "Data Sharing" with
Internal, external and trusted third parties. It is recommended to cover the
following areas while defining the "Trusted Data Sharing Framework";
- Data Sharing Strategy
- Legal & Regulatory Considerations
- Technical & Organizational Considerations
- Operationalizing Data Sharing
The Data Architect, in agreement with the business and under the DG Board's
guidance, should revisit, brainstorm and explore the current and possible future
data feeds which may be required into or out of the system and may be included in
the Strategic Integration Platform. Re-use of existing data feeds should also be
considered.
The Data Integration architect shall work with the Data Manager and Data Owners
to identify dataset exchange between Entities and define the process to exchange
datasets using ADDA ESB layer. The data integration document shall describe the
process and policy for ADCMA systems to exchange data with other Entities using
the ESB layer.
The Data Integration Architect shall define the data exchange process and adhere
to Information Security Standards defined in Data Security and Privacy.
The BIA platform's Data Integration layer will define the Data Exchange methods. It
is recommended that while designing the "Strategic Integration Platform" these
specifications on data exchange method are taken into consideration.
Migrating peer-to-peer connections via the SIP may not be applicable to ADCMA in
the immediate or near future, although the BIA platform will apply this
specification, limited to the identified data sources being integrated with the BIA
platform.
The data architect responsible for data integration across ADCMA will be
responsible along with other data architects to adhere to the controls
The BIA platform will need to have the capability to broker (transform) file-based
data exchanges and message-based data exchanges via its integration layer. The
Data Integration Architect will work with the Data Manager to define the
appropriate broker interactions while working on the BIA Data Integration design
specification.
The Data Integration Architect shall work with the Data Manager and comply with
this specification while designing the BIA platform integration layer.
The Data Integration Architect shall work with the Data Manager and comply with
this specification while designing the BIA platform integration layer.
It is recommended to implement one-way integration while designing the BIA
platform. Use the broadcast method to publish the dataset/data service to
downstream applications/systems.
If, in the future, a requirement arises for the BIA platform to be extended for
two-way or interactive integration for an identified data source or information
system, proper justification will be provided; the Data Governance Board and the
respective data architects will own and drive the activity as and when required.
The high-level plan for the BIA platform, for each identified data source or
information system, will incorporate the required constraints: detection of data
delivery failure, repeatable/idempotent retries, statelessness and high
availability.
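The delivery constraints named above (failure detection, idempotent retries) can be sketched as follows. This is an illustrative pattern under stated assumptions: the `deliver` callable, the in-memory de-duplication store and the hash-based idempotency key are all hypothetical stand-ins, not ADCMA components.

```python
# Sketch of idempotent, retried data delivery with failure detection.
# In a real integration layer the de-duplication store would be durable.
import hashlib
import json

_processed = set()  # stands in for a durable de-duplication store

def idempotency_key(payload):
    """Deterministic key so a re-sent payload is recognised as a duplicate."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def deliver_once(payload, deliver, max_retries=3):
    """Retry delivery on failure; duplicate sends are detected and skipped."""
    key = idempotency_key(payload)
    if key in _processed:
        return "duplicate-skipped"
    for attempt in range(1, max_retries + 1):
        try:
            deliver(payload)
            _processed.add(key)
            return "delivered"
        except ConnectionError:
            if attempt == max_retries:
                return "failed"  # surface the delivery failure for escalation

# Usage: a flaky endpoint that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky(payload):
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient outage")

result = deliver_once({"id": 1}, flaky)
```

Because `deliver_once` keeps no per-connection state beyond the de-duplication set, the pattern is compatible with stateless, highly available deployment.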
The data architects defined for every applicable domain (e.g. Data Integration,
data modelling, metadata) should define the enterprise-level data interoperability
requirements and SLAs. This is to be done in line with the business requirements.
SLA best practices and guidelines will be provided as part of deliverables
Existing contracts with service providers should be reviewed in the light of the
guidelines
The data architects defined for every applicable domain (e.g. Data Integration,
data modelling, metadata) should define the enterprise-level data interoperability
requirements and SLAs. This is to be done in line with the business requirements.
The data architects defined for every applicable domain (e.g. Data Integration,
data modelling, metadata) should define the enterprise-level data interoperability
requirements and SLAs. This is to be done in line with the business requirements.
Escalation metrics are to be planned with the Data Governance Board for any
failure.
Data Governance board should define Open data Policies. Data working group
should perform review of all data sources to be considered for Open Data in-line
with Open data policies defined.
ADCMA should keep systematic records of ADCMA's opened data sources with a
clear explanation of their open status (Open or Closed). ADCMA should provide a
definition in its Data Catalogue for each open dataset, written clearly and in
plain language in line with the context of its business.
The Data working group should maintain the systematic records for the data sources.
All datasets that are deemed 'open' in the Open Data Review exercise of ADCMA
are to be made available through:
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in machine-readable
form (this could include formats such as delimiter-separated values (CSV), XML
and JSON, along with their metadata)
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in human-readable
form (where practicable), i.e., providing metadata in support of data published as
open data
Data working group to seek approvals from respective data owners on which
datasets could be considered to publish. Prioritized and approved data sources to
be considered for publication by Data working group.
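The machine-readable publication described above can be sketched as follows. The export function, the chosen metadata fields and the sample dataset are illustrative assumptions; they are not the Abu Dhabi Open Data Portal's actual submission format.

```python
# Sketch: exporting a dataset in machine-readable form (CSV and JSON)
# together with a small plain-language metadata record for the catalogue.
# Metadata fields here are illustrative assumptions only.
import csv
import io
import json

def export_open_dataset(rows, title, description):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    metadata = {
        "title": title,
        "description": description,  # plain-language definition for the catalogue
        "record_count": len(rows),
        "formats": ["csv", "json"],
    }
    return buf.getvalue(), json.dumps(rows), metadata

rows = [
    {"year": "2022", "campaigns": "14"},
    {"year": "2023", "campaigns": "17"},
]
csv_text, json_text, meta = export_open_dataset(
    rows, "Campaigns by year", "Annual count of media campaigns."
)
```

Publishing both a data serialization and a metadata record together keeps the portal entry aligned with the Data Catalogue definition requirement above.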
ADCMA should ensure that, to the extent possible, all data is made available in the
form closest to the source, i.e., datasets should be as close as possible to the data
as collected.
Data should not be manipulated, aggregated, redacted, anonymized or obfuscated
to the extent possible and allowable, with due regard for privacy and security
concerns.
Where such concerns exist, aggregation, redaction, anonymization obfuscation and
other manipulations should be carried out to the minimum extent possible to
alleviate the concern.
The following should be considered:
• Is it reasonably likely that an individual can be identified from those data and
from other data?
• What other data is available, either to the public or to researchers or other
organizations?
• How and why could your data be linked to other datasets?
• What is the likelihood of re-identification being attempted?
• What is the likelihood the re-identification would be successful?
• Which anonymization techniques are available to use?
• What is the quality of the data after anonymization has taken place, and whether
this will meet the quality gate for this data set’s Open Data release?
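One of the checklist questions above, the likelihood of successful re-identification, can be made measurable with a k-anonymity check over quasi-identifier columns. The sketch below is a minimal illustration; the column names, sample records and release threshold are assumptions, not ADCMA rules.

```python
# Sketch of one anonymization quality gate: k-anonymity over
# quasi-identifier columns (illustrative threshold and data).
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest group size sharing the same quasi-identifier combination."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"age_band": "30-39", "emirate": "Abu Dhabi", "salary": 100},
    {"age_band": "30-39", "emirate": "Abu Dhabi", "salary": 120},
    {"age_band": "40-49", "emirate": "Dubai", "salary": 90},
]
k = k_anonymity(records, ["age_band", "emirate"])
release_ok = k >= 2  # a single-member group is a re-identification risk
```

Here the third record is unique on its quasi-identifiers, so the dataset would fail this particular quality gate until further aggregation or suppression is applied.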
The Data Architect for Open Data publication should ensure that published Open
Data is in the form closest to the source possible.
ADCMA team should develop an Open Data Plan, to release the data that is
identified as Open data to publish through the Open Data Portal.
The Open Data Plan shall allow for:
• The dataset to be reviewed and duly approved by data governance committee for
release as Open Data
• Data Quality assessment should be done for the datasets that are considered to
be published as Open.
• Any aggregation, redaction, anonymization or obfuscation required for privacy or
security concerns has been approved and undertaken
• The dataset to be released once it has passed its Open Data review and Data
Quality checks.
The Data working group should publish the Open Data Plan to release Open Data in
line with the data owners' approval and the prioritization done by the Data
Governance Group.
ADCMA should ensure that the Open Data Plan prioritizes the release of Open
Data. Some of the criteria that could be used include, but are not limited to:
• Addressing security and privacy concerns
• Addressing the business priorities of ADCMA
• Addressing the demand from third parties for data
• Addressing the measurable quality of the data
Data working group to prioritize Open data to be published in the Open Data plan.
ADCMA should ensure that the Open Data Plan systematically addresses all of the
datasets identified in the Open Data Review.
Data working group to ensure that open data plan systematically addresses all of
the datasets identified.
ADCMA should ensure that progress against the Open Data Plan is monitored, and
the plan is reviewed at regular frequency.
ADCMA should publish its Open Data in the Abu Dhabi Government Open Data
Portal.
ADCMA should take care to ensure that all published Open Data is reviewed
regularly (especially when related datasets are published by ADCMA or other
entities) and ensure that:
• The data continues to meet ADCMA's data quality definition
• Security and privacy concerns are continuously reviewed, specifically:
1. Is it reasonably likely that an individual can be identified from those data and
from other data?
2. What other data are available, either to the public or to researchers or other
organizations?
3. How and why could the published open data be linked to other datasets?
4. What is the likelihood of re-identification being attempted?
5. What is the likelihood the re-identification would be successful?
6. Which anonymization techniques are available to use?
7. What is the quality of the data after anonymization has taken place and whether
this will meet the quality gate for this data set’s Open Data release?
The Entity shall capture usage trends and statistics regarding access to the data
published as open data, and report these trends and statistics to the ADCMA Data
Governance Committee.
Business teams, along with the Data Governance group, are responsible for this activity.
In the event that ADCMA does not publish a dataset or datasets, it shall use its
annual awareness campaign to:
• Explain to the extent possible the reasons for withholding a dataset
• Indicate if and/or when a dataset will be published
• To provide a clear statement if a particular dataset is to remain unpublished for
the foreseeable future
Data working group is responsible for this activity.
The ECM programme will ensure the establishment of organisational document and
content management standards defining guidelines for document writing, uniform
document experience (look and feel), document naming conventions and document
editorial processes.
The Data Manager will work with the ECM Architect (Data Architect) and the
Technical Data Steward from the ECM program to identify compliance to the DCM
standards namely;
- Document & Content Management Standards
- Document Type usage for business cases
- Ability to capture Document Metadata through document lifecycle
Introduce Document and Content management policies defining what additions,
alterations or redactions can be done to any document and under what scenarios.
This can be added as a policy to the existing
"ADCMA_document_access_management_2.x.docx"
Introduce 'media content' retirement and disposal techniques for physical media
While defining requirements for the ECM implementation programme, ensure
establishing the following;
- Document & Content Management Standards
- Document Type usage for business cases
- The Document Metadata that needs to be captured through the document
lifecycle (SharePoint is already providing this functionality for document and
contents)
Include a detailed training plan for document and systems management as part of
the Organizational Awareness of the Data Management Programme. The Entity
should allocate a training budget for the ECM programme to develop self-paced
learning modules for the required specifications. Users should be offered
certification once they have passed the criteria. The Entity should aim for all
resources interacting with the ECM to be certified.
While implementing the BIA Platform in multiple phases, at each logical point,
effectiveness of data warehouse initiatives should be measured by Business
Sponsor of the BIA programme.
While measuring effectiveness, the evaluation should include, but not be limited
to, the points below:
• Technical alignment with the architectural roadmap
• Implementation and usage experiences (e.g., a large deviation in the anticipated
quality of data, any deviation in the performance of the warehouse, or any
deviation in the anticipated data volumes impacting the performance of the
platform)
When there are external datasets identified, ADCMA should be aligned on below
aspects with External dataset providers:
1) Have clear interface agreement with Functional and Non-functional
requirements on dataset sharing including data refresh cycles, data quality
requirements and other performance metrics.
2) Service level Agreements on dataset sharing
3) Have clear ownership within ADCMA and within the external supplier of the datasets
4) Clearly define the issue resolution workflow (the SLA definitions should call this out)
BIA architect should ensure that ADCMA Teams follow guidelines to define
interface agreements.
Depending upon the complexity of the transformations required from source data
to the target data model in the Business Intelligence and Analytics platform, a
data-staging environment should be used in line with the data architecture
definition for the Business Intelligence and Analytics platform.
BIA Architect should define the guidelines on using data-staging environment while
ingesting data from sources.
Data warehouse, Business Intelligence and analytics initiatives should consider the
following data management aspects:
Metadata Management, Data Catalogue, Data Modelling and Design, Data
Architecture, Data Quality, Data Storage, Data Integration and Interoperability,
Master Data Management and Reference Data Management.
For the new BIA platform, the Data Assessment programme will propose a data
architecture that includes the aspects of data management ADCMA should
consider for its BIA platform. The ADCMA BIA Architect should maintain the data
architecture to meet newly identified ADCMA business cases.
Available Open Data could be considered as one form of external data to be used
to fill any identified gaps in the existing ADCMA datasets for business use cases,
where suitable data is available from external sources (paid or as Open Data).
Data Architect (Integration Architect) to take lead on identifying any external data
to be sourced for meeting the business use cases.
For better supportability, ADCMA should prefer Commercial Off-The-Shelf (COTS)
or Open Source tooling over internally developed tooling, as the latter could
potentially lead to performance, supportability and scalability issues.
The Data Modeler, in consultation with the Data Architect, should own decisions
on special-purpose table types when modelling the data warehouse.
Using surrogate keys to avoid conflicts with any future data integrations should be
considered. Data flows should be designed to address common problems,
including but not limited to late-arriving dimensions.
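The surrogate-key and late-arriving-dimension handling mentioned above can be sketched as follows. The class, the natural-key values and the backfill convention are hypothetical illustrations, not a prescribed ADCMA design.

```python
# Sketch of surrogate-key assignment in a dimension load, with a placeholder
# ("late-arriving") member minted when a fact references an unseen natural key.

class DimensionKeyMap:
    def __init__(self):
        self._keys = {}          # natural key -> surrogate key
        self._next = 1
        self.late_arriving = set()

    def surrogate_for(self, natural_key, known=True):
        """Return the surrogate key, minting one for late-arriving members."""
        if natural_key not in self._keys:
            self._keys[natural_key] = self._next
            self._next += 1
            if not known:
                # Placeholder row: attributes are backfilled when the
                # dimension source eventually delivers this member.
                self.late_arriving.add(natural_key)
        return self._keys[natural_key]

dim = DimensionKeyMap()
dim.surrogate_for("CUST-001")                           # from the dimension source
fact_key = dim.surrogate_for("CUST-999", known=False)   # first seen in a fact feed
```

Because facts join on the surrogate key rather than the source system's natural key, later integrations with conflicting natural keys do not corrupt existing fact history.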
While designing schemas (marts) for specific business use cases, the simplest
schemas possible should be considered. Any specific business process should be
modelled in its own mart tables so that the shared schema objects (dimensions,
facts) remain re-usable.
Data modelers, in consultation with the Data Architect, should own this activity.
ADCMA should identify conformed dimensions while ingesting data into Business
Intelligence and analytics platforms so that these dimensions could be re-used
across multiple schemas.
Data modelers are to work with data architects on identifying conformed
dimensions to be re-used across multiple fact tables.
While creating data architecture, there should be an area maintained to retain the
source data without any manipulation for debugging purposes in case of any data
quality / process issues identified.
BIA Architect to ensure that un-altered version of source file is maintained within
BIA platform in-line with BIA target state architecture.
ADCMA should develop performance metrics to control the quality, volume and
timeliness of data within the data warehouse. These metrics should be reviewed
on regular basis to identify any potential SLAs to be defined.
BIA Architect should own this and ensure performance metrics are maintained
within data warehouse.
ADCMA should preferably share the tooling and technology used to create the
various marts, so that processes for common data processing purposes (e.g., data
standardization, data quality checks) can be re-used.
ADCMA should preferably use the same or compatible technology platforms for its
data marts.
As per current understanding of ADCMA's business use cases, ODS might not be
required. This is to be confirmed by Data Architecture deliverable.
As per current understanding of ADCMA's business use cases, ODS might not be
required. This is to be confirmed by Data Architecture deliverable.
As per current understanding of ADCMA's business use cases, ODS might not be
required. This is to be confirmed by Data Architecture deliverable.
The Enterprise Architect should be responsible for this activity and may need to
consult with the respective reporting solution architects.
ADCMA should develop and publish any identified statistical data in line with the
Statistics Centre Abu Dhabi (SCAD) requirements.
BIA Architect (Integration) should ensure that Service level agreements are placed
in-line with importance and criticality of data consumed from SCAD.
ADCMA should identify the Big Data use cases to encourage innovation.
Data Governance Board should identify the big data use cases.
ADCMA should implement event stream-based analytical processing to support
high-velocity data analysis (e.g., ADCMA IT asset-related threats). In the Data
Assessment programme, a data architecture will be proposed to meet the business
needs of ADCMA, which includes stream-based analytics processing for IT use
cases.
The BIA Architect should own this activity in line with the proposed data
architecture.
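Stream-based analytical processing of the kind described above can be sketched as a sliding-window aggregation over an event stream. The window length, event fields and burst threshold below are illustrative assumptions, not part of the proposed architecture.

```python
# Sketch of event-stream analytics: counting events per source over a
# sliding time window (e.g., spotting bursts of IT asset-related alerts).
from collections import defaultdict, deque

class SlidingWindowCounter:
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = defaultdict(deque)  # source -> event timestamps

    def observe(self, source, ts):
        """Record an event and return the live count within the window."""
        q = self.events[source]
        q.append(ts)
        # Evict timestamps that have aged out of the window.
        while q and ts - q[0] > self.window:
            q.popleft()
        return len(q)

counter = SlidingWindowCounter(window_seconds=60)
counts = [counter.observe("asset-42", t) for t in (0, 10, 20, 90)]
```

A downstream rule (for example, flagging any source whose windowed count exceeds a threshold) would turn this aggregation into the high-velocity alerting the use case calls for.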
The Data Governance Chairperson and Data Manager shall follow the DG structure
and operating model defined in DLV 9, refine the policies as appropriate and get
them approved by the Data Governance Board/Council.
Using the DG Operating Model definitions, R&R, nominate the Data Manager role
as part of the Data Governance core team within the IT support services section
Identify the key Data Domains applicable to ADCMA and appoint Data Architect(s)
for the appropriate Data Domains.
ADCMA can start with the appointment of a BI Data Architect for the BIA programme.
Data Owners will be identified from the departments and associated with Dataset
ownership. The Data Owner is accountable to ensure compliance for the OWNED
dataset.
The data manager shall enhance the existing Privacy policy (if required) and align
with the ADG published policies for Open Data.
The data manager shall update the data management policy to address data
lifecycle management for applicable data domains (Data Quality, Data Security &
Privacy, Master & Reference Data management, Data Retention & Disposition,
Data Access Management, Data Catalogue management, data classification)
The Data Manager, with direction from the DG Chairperson, will update the Data
Management Policy document to include the management intent, documenting
support for the DMS Data Domains, Controls and Specifications compliance.
The data manager shall enhance the existing Data Management policy and align it
with the DMS published Data Quality standards compliance.
DLV9 defines the Data Governance checkpoint process, describing the process to
be followed by the Data Governance team to resolve 'data issues'.
The data manager shall update the policy document with Change Management
process
Responsibility of the Data Manager. Defined as one of the functions of the Data
Manager role in DLV9
DG Board and DWG will periodically convene to review and update/modify the
policies as applicable to ensure alignment with the relevant legislation and
maintain evidence in a secured and shared location within the ADCMA secured
environment. The frequency of meetings, roles and approval mandates are defined
in DLV9.
The Data Manager should work with Legal ensuring compliance of Policies with
relevant legislations.
DG Board and DWG will periodically convene to review and update/modify the
policies as applicable to ensure alignment with the relevant legislation and
maintain evidence in a secured and shared location within the ADCMA secured
environment. The frequency of meetings, roles and approval mandates are defined
in DLV9.
The Data Manager shall incorporate the Data Management Policies as appropriate
within the existing NDA, such that internal, external, contractor or other
stakeholders agree to the applicable Data Management Policy. This NDA shall be
enforced within the Application Management Programmes, Data Management
Programmes and any Enterprise Capability Management programmes (e.g., the
Project Manager of the ECM programme will confirm compliance with the
applicable Data Policies as defined in the approved/published Data Management
Policy).
With direction from DG Chairperson, the Data Manager will document specific,
measurable and scheduled goals supporting the Data Management Programme,
aligning the goals with overall Business Strategy.
The Data Governance Board will decide on the date for reviewing the DG plan. The
same shall be approved by the Data Auditor (IA) from the Data Governance
Council.
The DG board will convene and submit the Data Management Programme plan to
ADDA as per applicable process.
The Data Manager will ensure alignment of the Data Management programme
governance plan with the overall business strategy, plan and objectives.
The Data Manager along with the DWG members will ensure DG plan and other
related artefacts are under version control as defined in the Document and
Content data domain specifications.
The Data Manager along with the DWG to assess the risk to operations while
defining the DG Plan (Operations risk like business criticality, impact to business,
budgeting etc)
DLV9 defines the Data Governance Checkpoint process for Data Governance. The
Data Manager is expected to enhance the DG Checkpoint Process as appropriate.
The Data Manager is expected to define the Disaster Recovery governance plan in
accordance with the Data Storage data domain
The Data Manager should align the DG plan with Programmes/Initiatives like
Document and Content management (ECM programme)
The data manager along with the DWG shall ensure adherence to core principles of
DMS
The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III)
The Data Assessment programme will define the roadmap and plan for the BI
Platform which covers Data Integration, Data Collection, Quality, Transformation,
Storage and Visualisation. Future change to any of the above BI layers will need to
undergo a Change Impact Assessment before sending for Data Management Board
review and approval. The change impact assessment will also be applicable to
changes proposed to the baseline Data Governance Model and Checkpoint Process
The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III)
The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III). Data Manager to follow the recommended
specification
The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III). Data Manager to follow the recommended
specification
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Manager will work with the DWG to create an Organisational Awareness
plan around Data Governance. The training plan will include a detailed list of
training modules, including tool-specific training.
The Data Management Audit framework is the Data Governance Operating Model, which defines the DG organisation structure, the operating model, the Data Governance Checkpoint process, roles and responsibilities, RACI, and inter-forum and departmental communications. This specification expects a DG audit capability, which is defined within the Operating Model.
Upon approval of the DG structure, plan, policy, and operating model by the DG Board, this specification can be marked as 'Fully Implemented'.
The Data Management Auditors will be nominated from Internal Affairs (IA).
After the first iteration of the Data Management Auditor (IA) review and approval, the Data Governance Chairperson and the Data Manager will upload the Statements of Compliance, audit results, etc., to ADDA or approved third parties.
The Data Manager, along with the DWG, will ensure that the data management audit results (Data Governance Checkpoint results) are versioned, classified and protected, and adhere to the defined InfoSec classification guidelines.
The Data Auditor from IA will be responsible for working with the DG Chairperson and the Data Manager to align the Data Management Governance (Audit) with other ADCMA internal audit mechanisms.
The Data Governance Council is responsible for performing the audit and maintaining the audit results with the help of the Data Manager.
The Data Manager is responsible for refining the Data Governance metrics and reporting to the Board and Council as applicable.
The Data Auditor from IA should be nominated as part of the Data Governance Council, to function as an independent auditor of the Data Management Programme's audits, compliance, and Governance Checkpoint results.
The Data Governance Board should closely track the budget, effectiveness, and performance of the overall Data Management Programme and Governance.
i. The ADCMA Data Governance team (the Data Manager along with the DWG) should maintain a master "Business Glossary", periodically updated with the applicable business terms, in an Excel file format.
ii. Ownership of the maintenance of the "Business Glossary" should be aligned with the datasets owned by the "Data Owners" (e.g. if the HR Department head is designated as the Data Owner of the HR datasets, then the HR "Business Glossary" associated with those datasets is owned by the HR Department head and is maintained and managed by the HR Department with the help of its Business Data Stewards).
iii. The Data Manager can audit the "Business Glossary" periodically for conformance.
iv. There is no requirement for maintaining Data Lineage or Technical Data Dictionaries at this point in time. ADCMA can evaluate Technical Metadata and Data Lineage at a later point.
v. No Metadata Management tools are required for ADCMA at this point.
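The periodic conformance audit in point iii can be sketched as a simple check over a CSV export of the Excel glossary. This is a minimal illustration only; the column names (`Business Term`, `Definition`, `Data Owner`, `Dataset`) are hypothetical and would in practice be agreed by the Data Manager and the DWG.

```python
import csv
import io

# Hypothetical column layout for the master Business Glossary export; the
# actual columns would be agreed by the Data Manager and the DWG.
REQUIRED_COLUMNS = {"Business Term", "Definition", "Data Owner", "Dataset"}

def audit_glossary(csv_text):
    """Return a list of conformance findings for a Business Glossary CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return [f"Missing columns: {sorted(missing)}"]
    findings = []
    for row_no, row in enumerate(reader, start=2):  # row 1 is the header
        for col in sorted(REQUIRED_COLUMNS):
            if not (row[col] or "").strip():
                findings.append(f"Row {row_no}: empty '{col}'")
    return findings
```

An empty findings list would mean the export conforms; each finding identifies the row and column a Data Owner's team needs to correct.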
ADCMA will implement data cataloging and data governance controls with the implementation of the BIA system. The Data Governance Architect will assist in defining the fine-grained controls and the technology solution for this control.
Currently, for the in-house developed information systems (ADCMA website CMS, career portal, etc.), ADCMA is advised to create and maintain data models. Governance checkpoint processes should be enforced in the SDLC (Software Development Life Cycle) to ensure that the data models are kept up to date for all projects impacting the data models of an information system. Data models should be promoted as one of the core deliverables for an information system. The Data Governance Board should recommend a data modelling tool to be used across ADCMA information systems, and should suggest checkpoints for data model reviews in the SDLC (mainly exit criteria for the Design, Build and Pre-implementation phases to validate data model changes as applicable).
ADCMA should create a governance body for data modelling for each of the information systems. The Data Working Group should create appropriate training modules on data models for the different roles present within information systems. Depending on the role, the training modules should cover reading and interpreting data models at the different levels (Conceptual, Logical and Physical), as well as developing data models in line with the target-state (Conceptual, Logical and Physical) data models.
Data architects of respective information systems are responsible for this activity.
Data architects of respective information systems are responsible for this activity.
The Data Governance Board should define common standards and guidelines to be followed by the different information systems.
Currently, metadata collection for semi-structured and unstructured data is not standardized in ADCMA. The ADCMA Data Governance Board should provide guidelines on metadata collection for semi-structured and unstructured data to ensure uniformity of the metadata collected for each specific type of unstructured or semi-structured data.
Data architects of respective information systems, in line with guidelines from the Data Governance Board, should be responsible for this activity.
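As an illustration of such guidelines, a uniform metadata record for unstructured and semi-structured items could be sketched as below. All field names here are hypothetical; the actual fields would come from the Data Governance Board's guidelines.

```python
from dataclasses import dataclass, field, asdict
from datetime import date

# Hypothetical minimal metadata record for an unstructured or semi-structured
# item; the actual fields would be defined by the Data Governance Board.
@dataclass
class UnstructuredItemMetadata:
    item_id: str                # identifying data, stable across systems
    source_system: str          # e.g. ECM, website CMS, career portal
    media_type: str             # MIME type, e.g. "application/pdf"
    owner_department: str       # aligns with the Data Owner of the dataset
    classification: str         # InfoSec classification label
    created_on: date
    related_entities: list = field(default_factory=list)  # links to structured records

record = UnstructuredItemMetadata(
    item_id="DOC-0001",
    source_system="ECM",
    media_type="application/pdf",
    owner_department="HR",
    classification="Internal",
    created_on=date(2023, 1, 15),
)
```

Capturing the same fields for every item of a given type is what makes later auditing, classification, and cross-system linking tractable.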
Depending upon the business use cases, ADCMA should consider converting semi-structured and unstructured data into structured form through transformational or analytics-based conversion techniques. Given the current state of ADCMA, this could be considered an advanced use case.
https://uima.apache.org/downloads/releaseDocs/2.2.1-incubating/docs/html/index.html
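As a minimal illustration of the transformational approach, the sketch below flattens semi-structured JSON records into structured rows; the record layout is invented for the example. Analytics-based techniques (such as Apache UIMA for free text, linked above) are the heavier alternative for fully unstructured content.

```python
import json

# Minimal sketch of a "transformational" conversion: flattening semi-structured
# JSON records into structured, tabular rows. The field names are hypothetical.
raw = '[{"id": 1, "meta": {"dept": "HR", "tags": ["policy", "leave"]}}]'

rows = []
for item in json.loads(raw):
    rows.append({
        "id": item["id"],
        "dept": item["meta"]["dept"],
        "tags": ";".join(item["meta"]["tags"]),  # collapse the list into one column
    })
```

Each resulting row can then be loaded into a relational table and governed with the same controls as other structured data.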
ADCMA should maintain the flow of unstructured information, along with its metadata and identifying data, between systems. Once metadata collection for unstructured data is standardized to include the identifying data associated with it, the relationships between the different entities should be maintained.
Data architects of respective information systems are responsible for this activity.
This specification is for developing and maintaining data models covering all information systems in the enterprise. For all information systems within ADCMA, physical data model information should be mapped to logical models and, at a higher level, to conceptual models. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics platform. ADCMA should consider developing Logical and Physical data models for the Business Intelligence and Analytics platform in line with the conceptual data model.
Data models for ADCMA information systems should be created and maintained with each project impacting the data models of the respective information systems, giving equal importance to structured and unstructured data.
Data architects of respective information systems are responsible for this activity.
It is recommended that the Data Governance Board publish reference data models for the immediate reference of data model teams, together with data models that could be re-used. When new information systems are introduced, data model teams should consult the existing reference data models and identify any re-usable data models that could potentially be applied.
Data architects of respective information systems are responsible for this activity.
The scope of this specification covers the data models for all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. While developing the Logical and Physical data models for the Business Intelligence and Analytics platform, UML diagrams should be used as the primary modelling notation.
Data architects of respective information systems are responsible for this activity
The Data Governance Board should create guidelines (pre-defined templates, diagrams and notations) with its data model teams to effectively represent their data models to different non-technical teams, including business teams.
Data architects of respective information systems should adhere to the guidelines suggested by the Data Governance Board.
The scope of this specification covers the data models for all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. Entity relationship diagrams and class diagrams documenting the structure and relationships of data objects at the logical and physical levels should be developed during the implementation of Business Intelligence and Analytics use cases. For all other ADCMA applications, it is recommended to prepare entity relationship diagrams at the Conceptual, Logical and Physical levels.
Data architects of respective information systems are responsible for this activity.
The Data Governance Board should mandate maintaining data flow diagrams to model the movement of data within and between systems, including but not limited to maintaining master profiles. Governance checkpoints within the SDLC should be established to ensure the data flow diagrams are maintained.
Data architects of respective information systems, along with the change management team, are responsible for this activity.
The scope of this specification covers the data models for all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. The conceptual model is defined at the subject area and entity level. While developing logical and physical models, subject areas and logical groupings of entities below the subject areas should be created, depending upon the number of entities in each subject area. A similar approach should be followed while creating data models for the other applications in ADCMA.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained with each project impacting them. For the new Business Intelligence and Analytics platform, the conceptual data model will be created within the Data Assessment project and should be maintained through each of the business intelligence initiatives implemented on the platform. Data-model artefacts should differentiate the components of the model that are implemented from those that are not, and data modellers should provide guidelines to data model teams on a unified way of representing this.
Data modellers of respective information systems are responsible for this activity.
ADCMA should ensure that the data models are maintained by adhering to the
rules defined by ADCMA DMS.
When designing new conceptual data models, the following rules should be adhered to:
• Data objects are represented by nouns
• Data relationships are represented by verbs
Following rules should be adhered to when designing new logical data models:
• The appropriate data type shall be used for attributes within tables. This shall
take into account performance, storage, and data requirements. Where a String or
other variable character data type is used, consideration must first have been given
to more appropriate data types
Following rules should be adhered to when designing new physical data models:
• Primary keys shall be numeric. Where there is not a suitable numeric candidate
key, a surrogate key in the form of an auto-numbering key shall be used
• Reference data tables shall have a numeric primary key (likewise, tables that use
reference data tables shall use the reference table's numeric primary key in the
foreign key relationship)
• Reference data tables will have, at a minimum, a numeric primary key and a code
value represented as a string. Additional payload information (such as textual
descriptions) may also exist as reference data (See RM.2.3)
• Physical data types that have a length or precision specifier shall have an
appropriate length or precision specified, and not left to the default value
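The physical-model rules above can be illustrated with a short sketch. The table and column names below are hypothetical, and SQLite is used only as a convenient stand-in for whichever database ADCMA adopts:

```python
import sqlite3

# Illustrative sketch only: hypothetical tables showing the rules above
# (numeric surrogate keys, numeric FK to reference tables, explicit lengths).
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Reference data table: numeric primary key plus a code value as a string,
-- with optional descriptive payload (see RM.2.3).
CREATE TABLE ref_channel (
    channel_id   INTEGER PRIMARY KEY AUTOINCREMENT,  -- numeric surrogate key
    channel_code VARCHAR(10) NOT NULL UNIQUE,        -- code value as string
    description  VARCHAR(200)                        -- textual payload
);

-- Transactional table: uses the reference table's numeric key as the foreign
-- key, and gives every variable-length column an explicit length.
CREATE TABLE post_response (
    response_id  INTEGER PRIMARY KEY AUTOINCREMENT,
    channel_id   INTEGER NOT NULL REFERENCES ref_channel(channel_id),
    responded_at TIMESTAMP NOT NULL,
    body         VARCHAR(4000)
);
""")
conn.execute("INSERT INTO ref_channel (channel_code, description) VALUES (?, ?)",
             ("WEB", "Website channel"))
row = conn.execute("SELECT channel_id, channel_code FROM ref_channel").fetchone()
```

The surrogate key is generated by the database, so downstream tables join on the stable numeric key rather than on the code string.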
Data modellers of respective information systems are responsible for this activity.
This specification is applicable for Entities that have MDM implemented. Currently
ADCMA does not have MDM needs and this specification is not applicable.
Currently ADCMA does not have a business metadata / glossary maintenance
process to define business terms. With the Data Assessment project, the data
architecture considers (business and technical) metadata management for the
Business Intelligence and Analytics platform. Business terms used in the data model
for the BIA platform and the business glossary should be in sync. For business terms
other than the ones used in the BIA platform, ADCMA should ensure that the
respective business glossary is maintained.
Currently ADCMA does not have a (business and technical) metadata / glossary
maintenance process to define business and technical terms. With the Data
Assessment project, the data architecture considers (business and technical)
metadata management for the Business Intelligence and Analytics platform.
Technical definitions for all business terms under ADCMA's ownership should take
input from logical and physical data models. Technical definitions should be
populated within the data dictionary of ADCMA's data catalogue. Currently, only
technical metadata related to the Business Intelligence and Analytics platform is
planned; technical definitions should also be planned for other ADCMA systems.
Data architects of respective information systems should be responsible for this
activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Considering that the conceptual data model delivered will be a living
document, data model versions should be maintained when there are any updates
to the conceptual data model in future. In line with this, the respective logical and
physical data models should also be versioned with appropriate naming conventions.
Data modellers of respective information systems are responsible for this activity.
Traceability between the different views of the data model should be maintained
for all data models of ADCMA's information systems. Some standard data
modelling tools allow traceability links to be maintained between the different
views (conceptual, logical and physical) of the same model. ADCMA should plan to
use a data modelling tool that allows traceability between these views. Lower-level
identifiers should be derived from the subject area down to its lowest level.
Data modellers of respective information systems are responsible for this activity.
This specification is applicable for all information systems of ADCMA that maintain
data models. Data modellers of the respective information system should define
mandatory metadata to be captured along with data model changes.
Data governance board to provide guidelines on potential metadata to be captured
with data model changes, and data modellers of respective information systems
should ensure the metadata is reviewed in SDLC phases.
Data governance board should mandate data model version maintenance. If the
data modelling tool being used does not have versioning enabled, an external
version control repository or document management system should be used to
manage data model versions.
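As one possible shape for such an external repository, the sketch below keeps each exported model file under a versioned naming convention with a small JSON index recording version, timestamp and content hash; the file layout and naming are assumptions, not a mandated tool:

```python
import datetime
import hashlib
import json
import pathlib
import tempfile

# Minimal sketch (assumed layout): each saved model gets a versioned file name
# plus an index entry, so model changes remain traceable outside the tool.
def register_model_version(repo: pathlib.Path, model_name: str, content: str) -> dict:
    repo.mkdir(parents=True, exist_ok=True)
    index_file = repo / "index.json"
    index = json.loads(index_file.read_text()) if index_file.exists() else {}
    versions = index.setdefault(model_name, [])
    entry = {
        "version": len(versions) + 1,
        "saved_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content.encode()).hexdigest(),
    }
    # Naming convention (hypothetical): <model>_v<version>.dmodel
    (repo / f"{model_name}_v{entry['version']}.dmodel").write_text(content)
    versions.append(entry)
    index_file.write_text(json.dumps(index, indent=2))
    return entry

# Example: two successive saves of a (hypothetical) BIA conceptual model.
repo = pathlib.Path(tempfile.mkdtemp())
v1 = register_model_version(repo, "bia_conceptual", "Customer -- places --> Order")
v2 = register_model_version(repo, "bia_conceptual",
                            "Customer -- places --> Order\nOrder -- contains --> Item")
```

The content hash makes it easy to detect whether a newly exported file actually differs from the last registered version.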
Data modellers of respective information systems should follow the guidelines
from data governance board on version maintenance.
It is recommended to ensure that changes to the data model and its metadata go
through the approval of the data governance board for the respective information
systems. The data assessment program will make recommendations on the data
governance board.
ADCMA should work on creating the ADCMA enterprise data model. While developing
new data models or amending existing data models for individual information
systems, the respective changes should be aligned to ADCMA's enterprise data model.
Data modellers of respective information systems are responsible for this activity.
ADCMA should align the Enterprise Data Model with new information systems within
ADCMA as they emerge.
Enterprise architect is responsible for this activity.
For all the information systems of ADCMA, conceptual data models should be
created to support architecture, development and operational processes. Data
governance board should enforce governance checkpoints during system
development lifecycle to review data model artefacts.
Data modellers of respective information systems are responsible for this activity.
Conceptual Data models for existing ADCMA applications should be created and
maintained with each project impacting the data models of respective information
systems.
Conceptual data model should include, but should not be limited to:
• Interviewing stakeholders, or otherwise undertaking business functional analysis
and requirements gathering to understand all relevant business concepts and
requirements
• Identifying candidate data profiles related to business processes, and capturing
associations between these profiles
• Combining candidate data profiles – as appropriate – into master data profiles,
transactional data profiles and reference data profiles, and modelling the high level
relationships between the data profiles
Data modellers of respective information systems are responsible for this activity.
ADCMA currently does not maintain any information that could be categorized as
master profiles. Digital products extract data from multiple digital platforms. There
are some posts for which responses are collected from different systems.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical data model should be created in-line with conceptual data model
describing the data attributes and the relationships rules between the profiles.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and Physical data models should be created in-line with conceptual
data model. While ingesting data into the conformed layer, duplication should be
avoided to the extent possible. The logical modelling of relationships between
entities should describe referential integrity and normalisation concerns. De-
normalisation should be preferred in the data marts rather than in the core model
objects.
Data modellers of respective information systems are responsible for this activity.
As part of the Data Assessment programme, a conceptual data model will be created
for the new Business Intelligence and Analytics platform. The logical data model
should be created in line with the conceptual data model for the Business
Intelligence and Analytics platform. For other information systems within ADCMA,
governance checkpoints should be enforced to ensure that logical data model
artefacts are delivered, which could be used for physical data models, impact
assessments and / or gap analysis between current and target state data models.
Data working group of respective information systems should be responsible for
this activity.
As part of the Data Assessment programme, a conceptual data model will be created
for the new Business Intelligence and Analytics platform. The logical data model
should be created in line with the conceptual data model for the Business
Intelligence and Analytics platform. For other information systems within ADCMA,
governance checkpoints should be enforced to ensure that physical data model
artefacts are delivered, which could be used for impact assessments and / or gap
analysis between current and target state data models.
Data working group of respective information systems should be responsible for
this activity.
Physical data models for respective information systems should be maintained up-
to-date with each of the project implementation which could impact physical data
models. These data models could be utilized to understand the relationships
between different entities.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and Physical data models should be created in-line with conceptual
data model in a standard data modelling tool. For other information systems,
ADCMA should create Conceptual, logical and physical data models. ADCMA is
recommended to use standard Data modelling tools which allow different views
(Logical and physical) of data models to be linked.
Data modellers of respective information systems are responsible for this activity.
Currently ADCMA does not maintain data models for existing applications. ADCMA
should consider reverse engineering data models from existing supported
information systems to create baseline physical data models, and then reverse
engineer logical and conceptual data models from the physical data models.
Data modellers of respective information systems are responsible for this activity.
Data architects for respective systems are responsible for this activity.
As part of Data assessment program, for new Business Intelligence and Analytics
platform, data architecture will be recommended based on the understanding of
ADCMA's business needs. For other systems, ADCMA should consider data
architecture deliverables for:
• Data quality tooling including data profiling and cleansing
• Data security and privacy systems
• Open data management systems
• Document and content management, or workflow systems
• ERP, CRM, HR, Finance, Procurement, Audit, Legal and any other specialist
information systems appropriate to ADCMA
Data architects for respective information systems should be responsible for this
activity
As part of the Data Assessment project, a baseline data architecture for the new
Business Intelligence and Analytics platform will be proposed.
The data architecture document covers business and technical requirements, the
integration framework covers data architecture themes, and the risk assessment
report for the enterprise BI roadmap will cover known constraints with the
Business Intelligence and Analytics platform.
ADCMA should consider creating and maintaining an enterprise data architecture
for all the systems supporting key business functions.
Data architects for respective information systems are responsible for this activity.
For all information system assets, ADCMA should maintain the current state
architecture along with the target state architecture. Gaps between the current
and target state architectures should be documented as architecture gaps to be
bridged. All projects that could impact an information system's architecture
capabilities should update the current state architecture document as well as the
document describing gaps with the target state architecture. All new business use
cases identified should be run against the target state architecture capabilities of
the specific information system, and amendments made to the target state
architecture as required (along with the gaps between current state and target
state architecture).
Data governance board should have stage gates in system development life cycle
to ensure that current state architecture and gaps with target state architecture
are maintained.
Data architects of respective information systems are responsible for this activity.
Data governance board along with data architects should enforce stage gates in
system life cycle management to ensure that the current state architecture is
updated with all projects that impact the data architecture. While updating current
state architectures, versions should be maintained.
ADCMA should create target state architectures for its information systems. The
target state architecture should be reviewed against all business use cases
identified for the specific information system, and amendments made to the target
state architecture (and to the gaps between current and target state architectures)
as needed. In different phases of the SDLC (ideally multiple phases, including
design phase closure, build phase closure and pre-implementation), there should
be checkpoints to validate that changes to the current state architecture are in line
with the target state architecture.
The target state (enterprise / system) data architecture should ensure that
business and technology requirements are addressed. Any new business or
technology requirements identified should be checked against the target state
architecture, and the architecture amended if required. Target state architecture
should
• Encourage data integration across the Entity between information systems and
services
• Seek removal of duplication in terminology
• Seek to remove duplication of data processes
• Seek alignment of reference and master data across the Entity's systems
• Align with emerging ADCMA-wide technology platforms
• Integrate with ADCMA-wide reference and master data services and standards as
they emerge
• Show re-use of data and system architectures both within the Entity itself and
through collaboration with other Entities
• Be influenced by the data management requirements emerging from the data
quality, data security, data privacy, data integration and interoperability, and data
storage domains, both within the Entity and as delivered from central government
programmes
ADCMA currently does not have Enterprise architecture defined. Once target state
enterprise architecture is defined and implemented, ADCMA should ensure to
create current state enterprise architecture and identify gaps between current
state and target state architectures.
ADCMA currently does not have Enterprise architecture defined. Once the target
state enterprise architecture is defined and implemented, ADCMA should ensure to
create the current state enterprise architecture and identify gaps between the
current state and target state architectures. The roadmap to reach the target state
enterprise architecture should be revisited when there are changes to the current
state (with new initiatives to be implemented) or to the target state enterprise
architecture (new information systems to be introduced, existing information
systems to be retired, etc.).
ADCMA currently does not have Enterprise architecture defined. Once target state
enterprise architecture is defined, all information systems within ADCMA should
ensure to align to target state enterprise architecture.
Data architects of information systems are responsible for this.
ADCMA currently does not have Enterprise architecture defined. Once target state
enterprise architecture is defined, effectiveness of the roadmap implementation
should be reported by identifying gaps between current state and target state
enterprise data architectures.
Enterprise architect is responsible for this activity.
Along with the DQ framework implementation, the DQ metadata identified for the
identified datasets will need to be defined in the data catalogue.
The Data Assessment programme will provide the DQ best practices along with
recommended DQ checklist. The DQ checklist will need to be automated along with
the DQ implementation.
It is recommended that, while defining the Data Modelling and Master Data
Management design, the application of Data Quality to the master profiles and the
ability to audit the implementation with appropriate DQ metrics be implemented.
The Data Architect along with the Technical Data Architect, with the direction of the
Data Manager, shall define and apply the DQ SLAs to externally procured datasets.
The Data Manager will use the recommended DQ tools and build the business case
for implementation of DQ tools across the organization. It is recommended to
prepare a comprehensive roadmap/plan of DQ tool implementation as part of DG
Operationalization programme
While defining the Data Catalogue and Metadata Management design, the Data
Quality measures used for auditing will be stored with the Data Catalogue.
The DQ Architect (part of the DG DWG) will table the recommendations for DQ
improvement initiatives to the DG board for review and approval.
As next steps to the Data Assessment programme, the ADCMA ISMS Data Security
Policy V1.0 needs to be augmented to align with the Information Security
Standards defined in the DSP data domain covering architecture components.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
Implement Information Security standards while defining the standards required
for sharing datasets as "Open Data". The data security & privacy definitions must
be applied to all datasets/data attributes deemed to be shared as "Open Data" and
reviewed/approved by the Data Governance board.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG to define the Data Privacy
"Metadata" for the Master profiles. This activity can be done while implementing
Data Catalogue or Metadata Management "Data Classification" at the attribute
level.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components. To comply with this
specification, it is recommended to cover the 'mosaic effect' with the "Data
Classification" process.
The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy
Along with the Data Privacy Policy, it is recommended to define the "Privacy by
Design" which is integrated with the Data Privacy Standards and general Data
Management Programme standards
The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy
As next steps to the data assessment programme, along with the Data Privacy
Policy, it is recommended to define the "Privacy by Design" which is integrated
with the Data Privacy Standards and general Data Management Programme
standards
The Data Privacy policy should incorporate the "Privacy by Design" principle which
will be integrated with the Data Governance checkpoint process for review and
approval. The DSP.3.3 specification which defines audit capabilities will need to be
integrated with the data governance checkpoint process.
As next steps to the data assessment programme, it is recommended to define and
implement the Privacy Management process and workflow, with specific metrics
built around Privacy Management that can be audited.
ADCMA should define the infrastructure, data center and cloud enablement policy
across business applications based on the business application criticality
assessment, and define the cloud policy.
ADCMA needs to conduct a business application criticality and availability
assessment and determine the applications that need cloud enablement on G42
based on the criticality scoring of the application.
ADCMA should define the infrastructure and data center standards and policy. The
associated processes and roles should be enabled in line with the governance
operating model.
ADCMA should define the infrastructure, data center and cloud enablement policy
across business applications based on the business application criticality
assessment, and define the cloud policy.
ADCMA should define the infrastructure and data center standards and policy. The
associated processes and roles should be enabled in line with the governance
operating model.
ADCMA should conduct an assessment of G42 cloud enablement and the associated
cost of re-platforming applications, and develop a benchmark of the data center
costing.
ADCMA does not currently have this requirement; however, future enablement of
the G42 cloud should be evaluated based on the BIA assessment.
ADCMA needs to evaluate the backup strategy based on the cloud enablement
strategy as part of the BIA assessment.
ADCMA should work on a data center and application BCP strategy and DR
roadmap on G42 in 2023.
As part of the BIA implementation, ADCMA will have a well-defined governance
operating model, and the data governance architect will lay out the specifics of the
governance procedure and technical framework for implementing data life cycle
management.
The Data Manager will appoint the Integration Data Architect as part of the BIA
implementation programme. The Integration Architect shall work with Data
Manager to propose a design of the "Data Integration framework/layer" within the
proposed Business Intelligence and Analytics platform. The Data Manager will audit
the requirements called out in this specification.
The "Trusted Data Sharing Framework" will use the "Data Integration Framework"
as input to define a comprehensive set of standards for "Data Sharing" with
Internal, external and trusted third parties. It is recommended to cover the
following areas while defining the "Trusted Data Sharing Framework";
- Data Sharing Strategy
- Legal & Regulatory Considerations
- Technical & Organizational Considerations
- Operationalizing Data Sharing
The Data Architect, in agreement with business and under the DG board's guidance,
should revisit, brainstorm and explore the current and possible future data feeds
which may be required into or out of the system and may be included in the
Strategic Integration Platform. Re-use of data feeds should also be considered.
The Data Integration architect shall work with the Data Manager and Data Owners
to identify dataset exchange between Entities and define the process to exchange
datasets using ADDA ESB layer. The data integration document shall describe the
process and policy for ADCMA systems to exchange data with other Entities using
the ESB layer.
The Data Integration Architect shall define the data exchange process and adhere
to Information Security Standards defined in Data Security and Privacy.
The BIA platform's Data Integration layer will define the Data Exchange methods. It
is recommended that while designing the "Strategic Integration Platform" these
specifications on data exchange methods are taken into consideration.
Migrating peer-to-peer connections via the SIP may not be applicable to ADCMA in
the immediate or near future, although the BIA platform will apply this
specification, limited to the identified data sources being integrated with the BIA
platform.
The data architect responsible for data integration across ADCMA will be
responsible, along with other data architects, for adhering to the controls.
The BIA platform will need to have the capability to broker (transform) file-based
data exchanges and message-based data exchanges via its integration layer. The
Data Integration Architect will work with the Data Manager to define the
appropriate broker interactions while working on the BIA Data Integration design
specification.
The Data Integration Architect shall work with the Data Manager and comply with
this specification while designing the BIA platform integration layer.
It is recommended to implement one-way integration while designing the BIA
platform. Use the broadcast method to publish the dataset/data service to
downstream applications/systems.
If, in future, a requirement arises for the BIA platform to be extended for two-way
or interactive integration for an identified data source or information system,
proper justification will be provided; the Data Governance Board and the respective
data architects will own and drive the activity as and when required.
The high-level plan for the BIA platform will incorporate, for each identified data
source or information system, the required constraints of data delivery failure
detection, repeatable/idempotent retries, statelessness and high availability.
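The delivery constraints named above (failure detection, repeatable/idempotent retries, statelessness) can be sketched as follows; the receiver, message and class names here are illustrative only, not part of any ADCMA design:

```python
import time
import uuid

# Hedged sketch: the sender detects transient failures, retries with the SAME
# idempotency key (so redelivery is safe), and keeps no state beyond that key.
class Receiver:
    def __init__(self, fail_times: int):
        self.fail_times = fail_times
        self.seen = {}          # idempotency key -> result (dedup on redelivery)

    def deliver(self, key: str, payload: str) -> str:
        if key in self.seen:    # repeated/idempotent retry: return prior result
            return self.seen[key]
        if self.fail_times > 0:
            self.fail_times -= 1
            raise ConnectionError("transient delivery failure")
        self.seen[key] = f"accepted:{payload}"
        return self.seen[key]

def send_with_retries(receiver: Receiver, payload: str, attempts: int = 5) -> str:
    key = str(uuid.uuid4())      # one idempotency key for all retries of this message
    for _ in range(attempts):
        try:
            return receiver.deliver(key, payload)
        except ConnectionError:  # detected delivery failure -> retry
            time.sleep(0)        # placeholder for real backoff
    raise RuntimeError("delivery failed after retries")

# Simulate a receiver that fails twice before accepting.
result = send_with_retries(Receiver(fail_times=2), "daily_extract")
```

Because the retry reuses the same key, a duplicate delivery after a lost acknowledgement is absorbed rather than processed twice.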
The data architects defined for every applicable domain (e.g. data integration, data
modelling, metadata) should define the enterprise-level data operability and SLAs,
in line with the business requirements.
SLA best practices and guidelines will be provided as part of deliverables
Existing contracts with service providers should be reviewed in the light of the
guidelines
The data architects defined for every applicable domain (e.g. data integration, data
modelling, metadata) should define the enterprise-level data operability and SLAs,
in line with the business requirements.
Escalation metrics should be planned along with the Data Governance Board for any failure.
Data Governance board should define Open data Policies. Data working group
should perform review of all data sources to be considered for Open Data in-line
with Open data policies defined.
ADCMA should keep systematic records of ADCMA opened data sources with a
clear explanation of their Open Status (Open or Closed). ADCMA should provide a
definition in their Data Catalogue for each open data set, written clearly and in
plain language in line with the context of its business.
Data working group should maintain the systematic records for the data sources.
All datasets that are deemed ‘open’ in the Open Data Review exercise of ADCMA
are to be made available through:
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in machine-readable
form (This could include the formats like Delimiter separated data (csv), XMLs,
JSON along with their metadata)
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in human-readable
form (where practicable) (i.e., to provide metadata in support of data published as
open data)
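A minimal sketch of publishing one dataset in machine-readable form (CSV) together with its supporting metadata (JSON), per the bullets above; the dataset, its columns and the metadata fields are all hypothetical:

```python
import csv
import io
import json

# Hypothetical open dataset: post responses by channel.
records = [
    {"post_id": 1, "channel": "WEB", "responses": 42},
    {"post_id": 2, "channel": "APP", "responses": 17},
]

# Machine-readable form: delimiter-separated values with a header row.
csv_buf = io.StringIO()
writer = csv.DictWriter(csv_buf, fieldnames=["post_id", "channel", "responses"])
writer.writeheader()
writer.writerows(records)
csv_text = csv_buf.getvalue()

# Human-readable support: plain-language metadata published alongside the data.
metadata = json.dumps({
    "title": "Post responses by channel",   # clear, plain-language definition
    "format": "csv",
    "columns": {"post_id": "integer", "channel": "string", "responses": "integer"},
    "open_status": "Open",
}, indent=2)
```

The same record set could be serialised as XML or JSON instead; the essential point is that the data and its metadata travel together.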
Data working group to seek approvals from respective data owners on which
datasets could be considered to publish. Prioritized and approved data sources to
be considered for publication by Data working group.
ADCMA should ensure that, to the extent possible, all data is made available in the
form closest to the source; i.e., datasets should remain as close as possible to the
data as collected.
Data should not be manipulated, aggregated, redacted, anonymized or obfuscated
to the extent possible and allowable, with due regard for privacy and security
concerns.
Where such concerns exist, aggregation, redaction, anonymization, obfuscation and
other manipulations should be carried out to the minimum extent possible to
alleviate the concern.
The following should be considered:
• Is it reasonably likely that an individual can be identified from those data and
from other data?
• What other data is available, either to the public or to researchers or other
organizations?
• How and why could your data be linked to other datasets?
• What is the likelihood of re-identification being attempted?
• What is the likelihood the re-identification would be successful?
• Which anonymization techniques are available to use?
• What is the quality of the data after anonymization has taken place, and whether
this will meet the quality gate for this data set’s Open Data release?
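One common way to put numbers behind the re-identification questions above is a k-anonymity check over the quasi-identifier columns before release; the dataset and column names below are hypothetical:

```python
from collections import Counter

# Hedged sketch: k-anonymity is the smallest group of records sharing the same
# combination of quasi-identifiers. A small k means a record is nearly unique
# and could plausibly be re-identified by linking with other datasets.
def k_anonymity(rows: list, quasi_identifiers: list) -> int:
    """Smallest group size sharing the same quasi-identifier combination."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

rows = [
    {"age_band": "30-39", "district": "Al Bateen", "responses": 4},
    {"age_band": "30-39", "district": "Al Bateen", "responses": 9},
    {"age_band": "40-49", "district": "Khalifa City", "responses": 2},
]
k = k_anonymity(rows, ["age_band", "district"])
# Here k is 1: the 40-49 / Khalifa City row is unique on its quasi-identifiers,
# so further aggregation or generalisation would be needed before release.
```

A quality gate for release might require k to stay above an agreed threshold after anonymization, which also feeds the "quality after anonymization" question above.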
The Data Architect for open data publication should ensure that open data is
published in the form closest to the source as possible.
ADCMA team should develop an Open Data Plan to release the data identified as
Open Data through the Open Data Portal.
The Open Data Plan shall allow for:
• The dataset to be reviewed and duly approved by the Data Governance Committee
for release as Open Data
• A data quality assessment to be performed on the datasets considered for
publication as Open Data
• Any aggregation, redaction, anonymization, or obfuscation required for privacy or
security concerns to be approved and undertaken
• The dataset to be released once it has passed its Open Data review and data
quality checks.
Data working group should publish the Open Data Plan in line with the data owners'
approval and the prioritization done by the Data Governance Group.
ADCMA should ensure that the Open Data Plan prioritizes the release of Open
Data. Criteria that could be used include, but are not limited to:
• Addressing security and privacy concerns
• Addressing the business priorities of ADCMA
• Addressing the demand from third parties for data
• Addressing the measurable quality of the data
Data working group to prioritize Open data to be published in the Open Data plan.
ADCMA should ensure that the Open Data Plan systematically addresses all of the
datasets identified in the Open Data Review.
Data working group to ensure that open data plan systematically addresses all of
the datasets identified.
ADCMA should ensure that progress against the Open Data Plan is monitored, and
that the plan is reviewed at a regular frequency.
ADCMA should publish its Open Data in the Abu Dhabi Government Open Data
Portal.
ADCMA should take care to ensure that all published Open Data is reviewed
regularly (especially when related datasets are published by ADCMA or other
entities) and that:
• The data continues to meet ADCMA's data quality definition
• Security and privacy concerns are continuously reviewed, specifically:
1. Is it reasonably likely that an individual can be identified from those data and
from other data?
2. What other data are available, either to the public or to researchers or other
organizations?
3. How and why could the published open data be linked to other datasets?
4. What is the likelihood of re-identification being attempted?
5. What is the likelihood the re-identification would be successful?
6. Which anonymization techniques are available to use?
7. What is the quality of the data after anonymization has taken place, and will it
meet the quality gate for this dataset’s Open Data release?
Data working group is responsible for this activity.
In the event that the published Open Data fails to meet its quality level or there are
concerns regarding security or privacy, ADCMA Team should:
• Suspend the publication of that dataset as Open Data
• Undertake a new Open Data Review for that dataset
• Establish and execute a mitigation plan for the new concerns and / or data
quality issue
• If necessary, relist the data as ‘Closed’ until such issues can be resolved
The Entity shall capture usage trends and statistics regarding access to the data
published as open data, and report these trends and statistics to the ADCMA Data
Governance Committee.
Business teams, along with the Data Governance group, are responsible for this activity.
In the event that ADCMA does not publish a dataset or datasets, it shall use its
annual awareness campaign to:
• Explain, to the extent possible, the reasons for withholding a dataset
• Indicate if and/or when a dataset will be published
• Provide a clear statement if a particular dataset is to remain unpublished for
the foreseeable future
Data working group is responsible for this activity.
The ECM programme will ensure the establishment of organisational document and
content management standards which define guidelines for document writing,
uniform document experience (look-and-feel), document naming conventions, and
document editorial processes.
The Data Manager will work with the ECM Architect (Data Architect) and the
Technical Data Steward from the ECM programme to assess compliance with the
DCM standards, namely:
- Document & Content Management Standards
- Document type usage for business cases
- Ability to capture document metadata through the document lifecycle
Introduce 'media content' retirement and disposal techniques for physical media
While defining requirements for the ECM implementation programme, ensure the
following are established:
- Document & Content Management Standards
- Document type usage for business cases
- The document metadata that needs to be captured through the document
lifecycle (SharePoint already provides this functionality for documents and
content)
Include a detailed training plan for document and systems management as part of
the Organizational Awareness of Data Management Programme. The entity should
allocate a training budget for the ECM programme to develop self-paced learning
modules for the specifications above. Users should be offered certification once
they have passed the assessment criteria. The entity should aim for all resources
interacting with ECM to be certified.
While implementing the BIA platform in multiple phases, the effectiveness of data
warehouse initiatives should be measured at each logical point by the Business
Sponsor of the BIA programme.
When measuring effectiveness, the evaluation should include, but not be limited to,
the following points:
• Technical alignment with the architectural roadmap
• Implementation and usage experiences (e.g., a large deviation from the
anticipated quality of data, any deviation in warehouse performance, or deviations
in anticipated data volumes impacting performance of the platform)
When external datasets are identified, ADCMA should align with the external
dataset providers on the following aspects:
1) A clear interface agreement covering functional and non-functional
requirements for dataset sharing, including data refresh cycles, data quality
requirements, and other performance metrics
2) Service level agreements on dataset sharing
3) Clear ownership of the datasets within ADCMA and within the external supplier
4) A clearly defined issue-resolution workflow (which the SLA definitions should
call out)
BIA architect should ensure that ADCMA Teams follow guidelines to define
interface agreements.
Depending upon the complexity of the transformations required from source data
to the target data model in the Business Intelligence and Analytics platform, a
data-staging environment should be used in line with the data architecture
definition for the platform.
The BIA Architect should define the guidelines for using the data-staging
environment while ingesting data from sources.
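The staging pattern can be sketched as follows. This is a minimal illustration only: the table names (`stg_sales_raw`, `fact_sales`), the JSON payload format, and the cleansing step are all assumed for the example, not part of ADCMA's target architecture. Source records are landed unchanged in a staging table, and only then transformed into the target model.

```python
import datetime
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales_raw (loaded_at TEXT, payload TEXT)")
conn.execute("CREATE TABLE fact_sales (sale_date TEXT, amount REAL)")

# Land the source record untouched in staging first.
raw = '{"sale_date": "2024-01-15", "amount": "1,250.00"}'
conn.execute(
    "INSERT INTO stg_sales_raw VALUES (?, ?)",
    (datetime.datetime.now(datetime.timezone.utc).isoformat(), raw),
)

# Then transform from staging into the target model
# (here: parse JSON and strip thousands separators).
for _loaded_at, payload in conn.execute("SELECT * FROM stg_sales_raw").fetchall():
    rec = json.loads(payload)
    conn.execute(
        "INSERT INTO fact_sales VALUES (?, ?)",
        (rec["sale_date"], float(rec["amount"].replace(",", ""))),
    )

print(conn.execute("SELECT * FROM fact_sales").fetchall())
```

Because staging keeps the payload verbatim, a failed or incorrect transformation can be re-run from staging without going back to the source system.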
Data warehouse, Business Intelligence, and analytics initiatives should consider the
following data management aspects:
Metadata Management, Data Catalogue, Data Modelling and Design, Data
Architecture, Data Quality, Data Storage, Data Integration and Interoperability,
Master Data Management, and Reference Data Management.
For the new BIA platform, the Data Assessment programme will propose a data
architecture covering the aspects of data management that ADCMA should
consider for its BIA platform. The ADCMA BIA Architect should ensure the data
architecture is maintained to meet newly identified ADCMA business cases.
Available Open Data could be considered as one form of external data, used to fill
any identified gaps in the existing ADCMA datasets where data suited to the
business use cases is available from external sources (paid or as Open Data).
The Data Architect (Integration Architect) should take the lead on identifying any
external data to be sourced to meet the business use cases.
For better supportability, ADCMA should prefer Commercial Off-The-Shelf (COTS)
or open-source tooling over internally developed tooling, as the latter could lead
to performance, supportability, and scalability issues.
The Data Modeler, in consultation with the Data Architect, should decide on
special-purpose table types when modelling the data warehouse.
Surrogate keys should be used to avoid conflicts with any future data integrations.
Data flows should be designed to address common problems, including but not
limited to late-arriving dimensions.
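The two techniques above can be sketched together. This is a simplified illustration (the class, the natural key format `CRM-0042`, and the attribute names are assumptions): the warehouse assigns its own surrogate keys, decoupled from source-system keys, and a fact arriving before its dimension member gets a placeholder row that is enriched later, which is the standard handling for late-arriving dimensions.

```python
class DimensionTable:
    """Assigns warehouse-owned surrogate keys, decoupled from source keys."""

    def __init__(self):
        self._next_key = 1
        self._by_natural_key = {}
        self.rows = {}

    def get_or_create(self, natural_key, attributes=None):
        # Late-arriving dimension: if a fact references a member we have
        # not yet loaded, insert a placeholder row and fill attributes later.
        if natural_key not in self._by_natural_key:
            key = self._next_key
            self._next_key += 1
            self._by_natural_key[natural_key] = key
            self.rows[key] = attributes or {"status": "placeholder"}
        elif attributes:
            self.rows[self._by_natural_key[natural_key]] = attributes
        return self._by_natural_key[natural_key]

customer_dim = DimensionTable()
# A fact arrives before its dimension row: a placeholder member is created.
fact_fk = customer_dim.get_or_create("CRM-0042")
# The dimension record arrives later and enriches the same surrogate key.
same_fk = customer_dim.get_or_create("CRM-0042", {"name": "Example LLC"})
print(fact_fk == same_fk)  # True: the surrogate key is stable
```

Because the surrogate key never changes, facts loaded against the placeholder need no correction once the full dimension record arrives.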
While designing schemas (marts) for specific business use cases, the simplest
schemas possible should be considered. Mart tables for a specific business process
should be designed for re-usability of the schema objects (dimensions, facts).
Data Modelers, in consultation with the Data Architect, should own this activity.
ADCMA should identify conformed dimensions while ingesting data into the
Business Intelligence and Analytics platform so that these dimensions can be
re-used across multiple schemas.
Data Modelers should work with Data Architects on identifying conformed
dimensions to be re-used across multiple fact tables.
Within the data architecture, there should be an area maintained to retain the
source data without any manipulation, for debugging purposes in case any data
quality or process issues are identified.
The BIA Architect should ensure that an unaltered version of the source file is
maintained within the BIA platform, in line with the BIA target-state architecture.
ADCMA should develop performance metrics to control the quality, volume, and
timeliness of data within the data warehouse. These metrics should be reviewed
on a regular basis to identify any potential SLAs to be defined.
The BIA Architect should own this and ensure performance metrics are maintained
within the data warehouse.
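A minimal sketch of such load metrics follows; the metric names, the reject-rate definition, and the 60-minute SLA are assumptions for illustration, not ADCMA's actual thresholds. Each load emits volume (rows loaded), quality (reject rate), and timeliness (load duration against an SLA) figures that can be trended over time.

```python
import datetime

def warehouse_load_metrics(rows_loaded, rows_rejected,
                           load_start, load_end, sla_minutes=60):
    """Compute basic volume, quality, and timeliness metrics for one load."""
    duration = (load_end - load_start).total_seconds() / 60
    total = rows_loaded + rows_rejected
    return {
        "rows_loaded": rows_loaded,
        "reject_rate": rows_rejected / total if total else 0.0,
        "load_minutes": duration,
        "sla_met": duration <= sla_minutes,
    }

metrics = warehouse_load_metrics(
    rows_loaded=98_000,
    rows_rejected=2_000,
    load_start=datetime.datetime(2024, 1, 15, 2, 0),
    load_end=datetime.datetime(2024, 1, 15, 2, 45),
)
print(metrics)  # reject_rate 0.02, load_minutes 45.0, sla_met True
```

Reviewing these figures across loads is what turns an informal expectation ("the nightly load finishes before business hours") into a defensible SLA.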
ADCMA should preferably share tooling and technology across the creation of the
various marts, to re-use processes for common data-processing purposes (such as
data standardization and data quality checks).
ADCMA should preferably use the same or compatible technology platforms for its
data marts.
As per current understanding of ADCMA's business use cases, ODS might not be
required. This is to be confirmed by Data Architecture deliverable.
The Enterprise Architect should be responsible for this activity and may need to
consult with the respective reporting solution architects.
ADCMA should develop and publish any identified statistical data in line with the
Statistics Centre Abu Dhabi (SCAD) requirements.
The BIA Architect (Integration) should ensure that service level agreements are in
place, in line with the importance and criticality of data consumed from SCAD.
ADCMA should identify the Big Data use cases to encourage innovation.
Data Governance Board should identify the big data use cases.
ADCMA should implement event-stream-based analytical processing to support
high-velocity data analysis (e.g., threats to ADCMA IT assets). In the Data
Assessment programme, a data architecture will be proposed to meet ADCMA's
business needs, including stream-based analytics processing for IT use cases.
The BIA Architect should own this activity in line with the proposed data
architecture.
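One common stream-analytics building block for the IT threat use case mentioned above is a sliding-window threshold. The sketch below is illustrative (the 60-second window, threshold of 3, and the failed-login scenario are assumptions): it raises a flag when too many events land inside a rolling time window.

```python
from collections import deque

class SlidingWindowThreshold:
    """Flags when event counts in a rolling time window exceed a threshold,
    e.g. repeated failed logins against an IT asset."""

    def __init__(self, window_seconds, threshold):
        self.window_seconds = window_seconds
        self.threshold = threshold
        self._events = deque()

    def observe(self, timestamp):
        self._events.append(timestamp)
        # Evict events that have fallen out of the window.
        while self._events and timestamp - self._events[0] > self.window_seconds:
            self._events.popleft()
        return len(self._events) >= self.threshold

detector = SlidingWindowThreshold(window_seconds=60, threshold=3)
# Three events within 60 seconds trigger the alert; the fourth arrives
# after the window has emptied, so it does not.
alerts = [detector.observe(t) for t in (0, 10, 20, 100)]
print(alerts)  # [False, False, True, False]
```

Production stream processors offer windowing of this kind as a built-in primitive, but the evaluation logic per window is the same as in this sketch.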
Quick Wins
The Data Governance organization structure (Chairperson, Data Manager, etc.)
Roles and Responsibilities of DG team
Name of Data Manager as part of the Data Governance core team within the IT support services section
Key Data Domains applicable to ADCMA and Data Architect(s) for the appropriate Data Domains.
Appointment of BI Data Architect for the BIA programme.
Business Data Stewards from departments & Technical Data Stewards from the IT department
Data Owners list of departments
The data management board
Data Management Policy
Data Governance metrics
Policy document with Change Management process
Responsibility of the Data Manager.
Plan of the DG Board to review and update/modify the policies as applicable, to ensure alignment with the relevant
legislation and to maintain evidence in a secured and shared location within the ADCMA secured environment.
Data Management Standards specifications.
The accountability and responsibility matrix of the data strategy
The change management team
The Data Assessment programme road map
The approval of the DG Structure, plan, policy, operating model by the DG Board
The Data Management Auditors list
Statement of compliance availability
Master “Business Glossary” with applicable Business Terms (Glossary) in an Excel file format.
Data Manager's plan to audit the “Business Glossary”
Definitions of quality data.
Data security policy