Data Governance Final v2

The document contains a list of control names and specifications related to data management. There are over 150 individual controls listed across various categories including organizational structure, data management programs, change management, metadata standards, data catalogues, data modeling, data architecture, data quality, data security and privacy, data systems, and data integration. Each control has a unique code and title.

Control Name Control Spec

Organisational Structure DG.1.1

Organisational Structure DG.1.2

Organisational Structure DG.1.3

Organisational Structure DG.1.4

Organisational Structure DG.1.5

Organisational Structure DG.1.6

Organisational Structure DG.1.7

Data Management Policy DG.2.1


Data Management Policy DG.2.2

Data Management Policy DG.2.3

Data Management Policy DG.2.4

Data Management Policy DG.2.5

Data Management Policy DG.2.6

Data Management Policy DG.2.7

Data Management Policy DG.2.8

Data Management Policy DG.2.9

Data Management Policy DG.2.10

Data Management Policy DG.2.11

Data Management Policy DG.2.12

Data Management Policy DG.2.13


Data Management Policy DG.2.14

Data Management Policy DG.2.15

Data Management Policy DG.2.16

Data Management Policy DG.2.17

Data Management Programme DG.3.1

Data Management Programme DG.3.2

Data Management Programme DG.3.3

Data Management Programme DG.3.4

Data Management Programme DG.3.5


Data Management Programme DG.3.6

Data Management Programme DG.3.7

Change Management DG.4.1

Change Management DG.4.2

Change Management DG.4.3

Change Management DG.4.4

Change Management DG.4.5


Change Management DG.4.6

Organisational Awareness DG.5.1

Organisational Awareness DG.5.2

Organisational Awareness DG.5.3

Organisational Awareness DG.5.4

Organisational Awareness DG.5.5

Organisational Awareness DG.5.6

Organisational Awareness DG.5.7

Capability Audit DG.6.1

Capability Audit DG.6.2


Capability Audit DG.6.3

Capability Audit DG.6.4

Capability Audit DG.6.5

Capability Audit DG.6.6

Capability Audit DG.6.7

Capability Audit DG.6.8

Capability Audit DG.6.9

Capability Audit DG.6.10

Performance Management DG.7.1

Performance Management DG.7.2


Performance Management DG.7.3

Performance Management DG.7.4

Performance Management DG.7.5

Performance Management DG.7.6

Metadata Standards Conformation MD.1.1

Metadata Standards Conformation MD.1.2

Metadata Standards Conformation MD.1.3

Metadata Standards Conformation MD.1.4

Metadata Management Programme MD.2.1

Metadata Management Programme MD.2.2

Metadata Management Programme MD.2.3

Metadata Management Programme MD.2.4

Metadata Management Programme MD.2.5

Metadata Management Programme MD.2.6

Metadata Management Programme MD.2.7
Metadata Architecture MD.3.1

Metadata Architecture MD.3.2


Metadata Monitoring MD.4.1

Metadata Monitoring MD.4.2


Metadata Monitoring MD.4.3

Metadata Monitoring MD.4.4

Data Catalogue Requirements DC.1.1

Data Catalogue Requirements DC.1.2


Data Catalogue Requirements DC.1.3

Data Catalogue Requirements DC.1.4

Data Catalogue Principles DC.2.1

Data Catalogue Principles DC.2.2

Data Catalogue Population DC.3.1

Data Catalogue Population DC.3.2

Data Catalogue Population DC.3.3

Data Catalogue Population DC.3.4

Data Catalogue Population DC.3.5

Data Catalogue Population DC.3.6

Data Catalogue Population DC.3.7


Data Catalogue Population DC.3.8

Data Catalogue Population DC.3.9

Data Catalogue Usage DC.4.1

Data Catalogue Usage DC.4.2

Data Catalogue Usage DC.4.3

Data Catalogue Usage DC.4.4

Data Catalogue Usage DC.4.5

Data Catalogue Usage DC.4.6

Data Catalogue Usage DC.4.7


Implement Tools and Methods DM.1.1

Implement Tools and Methods DM.1.2

Implement Tools and Methods DM.1.3

Implement Tools and Methods DM.1.4

Unstructured Data DM.10.1


Unstructured Data DM.10.2

Unstructured Data DM.10.3

Unstructured Data DM.10.4

Unstructured Data DM.10.5

Unstructured Data DM.10.6


Modelling Artefacts DM.2.1

Modelling Artefacts DM.2.10

Modelling Artefacts DM.2.11

Modelling Artefacts DM.2.12

Modelling Artefacts DM.2.2


Modelling Artefacts DM.2.3

Modelling Artefacts DM.2.4

Modelling Artefacts DM.2.5

Modelling Artefacts DM.2.6


Modelling Artefacts DM.2.7

Modelling Artefacts DM.2.8

Modelling Artefacts DM.2.9


Business Glossary and Data Dictionary DM.3.1

Business Glossary and Data Dictionary DM.3.2

Data Model Metadata DM.4.1

Data Model Metadata DM.4.2


Data Model Metadata DM.4.3

Data Model Metadata DM.4.4

Enterprise Data Model DM.5.1

Enterprise Data Model DM.5.2

Enterprise Data Model DM.5.3

Enterprise Data Model DM.5.4

Conceptual Data Models DM.6.1


Conceptual Data Models DM.6.2

Conceptual Data Models DM.6.3

Conceptual Data Models DM.6.4

Master Profiles DM.7.1

Master Profiles DM.7.2

Master Profiles DM.7.3


Master Profiles DM.7.4

Logical Data Model DM.8.1

Logical Data Model DM.8.2

Logical Data Model DM.8.3

Logical Data Model DM.8.4


Physical Data Model DM.9.1

Physical Data Model DM.9.2

Physical Data Model DM.9.3

Physical Data Model DM.9.4

Arch Methodology DA.1.1


Arch Methodology DA.1.2

Arch Methodology DA.1.3

Arch Methodology DA.1.4

Arch Methodology DA.1.5


Baseline Data Arch DA.2.1

Baseline Data Arch DA.2.2

Baseline Data Arch DA.2.3

Baseline Data Arch DA.2.4


Target DA DA.3.1

Target DA DA.3.2

Target DA DA.3.3
Target DA DA.3.4

DA Roadmap DA.4.1

DA Roadmap DA.4.2

DA Roadmap DA.4.3

DA Roadmap DA.4.4

Data Quality Plan DQ.1.1


Data Quality Plan DQ.1.2

Data Quality Plan DQ.1.3

Data Quality Plan DQ.1.4

Data Quality Plan DQ.1.5

Data Quality Plan DQ.1.6

Data Quality Plan DQ.1.7

Data Quality Plan DQ.1.8

Data Quality Audit DQ.2.1

Data Quality Audit DQ.2.2

Data Quality Audit DQ.2.3


Data Quality Audit DQ.2.4

Data Quality Audit DQ.2.5

Data Quality Uplift DQ.3.1

Data Quality Uplift DQ.3.2

Data Quality Uplift DQ.3.3

Information Security Standards DSP.1.1

Information Security Standards DSP.1.2

Information Security Standards DSP.1.3

Information Security Standards DSP.1.4

Information Security Standards DSP.1.5

Information Security Standards DSP.1.6
Data Privacy Policy DSP.2.1

Data Privacy Policy DSP.2.2

Data Privacy Policy DSP.2.3


Data Privacy Policy DSP.2.4

Data Privacy Policy DSP.2.5

Data Privacy Policy DSP.2.6

Data Privacy Policy DSP.2.7

Privacy By Design DSP.3.1

Privacy By Design DSP.3.2

Privacy By Design DSP.3.3

Privacy By Design DSP.3.4


Privacy Management DSP.4.1

Privacy Management DSP.4.2


Privacy Management DSP.4.3
Privacy Management DSP.4.4

Data System Protection DSP.5.1

Data System Protection DSP.5.2

Baseline DS Arch DS.1.1

Baseline DS Arch DS.1.2

Baseline DS Arch DS.1.3

Baseline DS Arch DS.1.4

Baseline DS Arch DS.1.5

Baseline DS Arch DS.1.6

Baseline DS Arch DS.1.7

Baseline DS Arch DS.1.8

Target DS Arch DS.2.1

Target DS Arch DS.2.2

Target DS Arch DS.2.3

Target DS Arch DS.2.4

Target DS Arch DS.2.5

Target DS Arch DS.2.6

Target DS Arch DS.2.7

Target DS Arch DS.2.8

DS Roadmap DS.3.1

DS Roadmap DS.3.2

DS Roadmap implementation DS.4.1

DS Roadmap implementation DS.4.2

DS Roadmap implementation DS.4.3

DS Roadmap implementation DS.4.4


DS Roadmap implementation DS.4.5

Data Backup and Recovery DS.5.1

Data Backup and Recovery DS.5.2

Data Backup and Recovery DS.5.3

Data Backup and Recovery DS.5.4

Data Backup and Recovery DS.5.5

Disaster Recovery and Business Continuity DS.6.1

Disaster Recovery and Business Continuity DS.6.2

Disaster Recovery and Business Continuity DS.6.3

Data Lifecycle DS.7.1


Data Lifecycle DS.7.2

Data Lifecycle DS.7.3

Data Lifecycle DS.7.4

Data Lifecycle DS.7.5

Strategic Integration Platform DIO.1.1

Strategic Integration Platform DIO.1.2

Strategic Integration Platform DIO.1.3

Strategic Integration Platform DIO.1.4

Strategic Integration Platform DIO.1.5

Strategic Integration Platform DIO.1.6

Integration Architecture DIO.2.1

Integration Architecture DIO.2.2

Integration Architecture DIO.2.3

Integration Architecture DIO.2.4

Integration Architecture DIO.2.5

Integration Patterns DIO.3.1

Integration Patterns DIO.3.2

Integration Patterns DIO.3.3

Service Level Agreements DIO.4.1

Service Level Agreements DIO.4.2

Service Level Agreements DIO.4.3


Open Data Identification OD.1.1

Open Data Identification OD.1.2

Open Data Identification OD.1.3


Open Data Identification OD.1.4

Open Data Publishing Plan OD.2.1


Open Data Publishing Plan OD.2.2

Open Data Publishing Plan OD.2.3

Open Data Publishing Plan OD.2.4

Open Data Publishing OD.3.1

Open Data Publishing OD.3.2


Open Data Publishing OD.3.3

Open Data Publishing OD.3.4

Open Data Awareness OD.4.1


Open Data Awareness OD.4.2

Reference Data Management Plan RM.1.1

Reference Data Management Plan RM.1.2

Identify Reference Data RM.2.1


Identify Reference Data RM.2.2

Identify Reference Data RM.2.3

Identify Reference Data RM.2.4

Identify Reference Data RM.2.5

Identify Reference Data RM.2.6


Reference Data Change Management RM.3.1

Reference Data Change Management RM.3.2

Reference Data Change Management RM.3.3

Reference Data Change Management RM.3.4

Reference Data Platform RM.4.1


Reference Data Platform RM.4.2

Reference Data Platform RM.4.3

Master Data Management Plan RM.5.1

Master Data Management Plan RM.5.2

Identify Master Data RM.6.1


Identify Master Data RM.6.2

Identify Master Data RM.6.3

Identify Master Data RM.6.4

Operate Master Data RM.7.1

Operate Master Data RM.7.2


Operate Master Data RM.7.3

Operate Master Data RM.7.4

Operate Master Data RM.7.5

Operate Master Data RM.7.6

Operate Master Data RM.7.7


Operate Master Data RM.7.8

Operate Master Data RM.7.9

Master Data Change Management RM.8.1

Master Data Change Management RM.8.2

Master Data Change Management RM.8.3

Master Data Change Management RM.8.4

Master Data Platform RM.9.1

Master Data Platform RM.9.2

Master Data Platform RM.9.3


Document and Content Quality Standards DCM.1.1

Document and Content Requirements DCM.2.1

Document and Content Requirements DCM.2.2

Document and Content Requirements DCM.2.3

Document and Content Requirements DCM.2.4

Document and Content Requirements DCM.2.5

Document and Content Requirements DCM.2.6

Document and Content Requirements DCM.2.7

Document and Content Requirements DCM.2.8

Document and Content Requirements DCM.2.9

Document and Content Requirements DCM.2.10

Document and Content Tools DCM.3.1

Document and Content Tools DCM.3.2

Data Warehouse, Business Intelligence and Analytics Business Goals DWBA.1.1

Data Warehouse, Business Intelligence and Analytics Business Goals DWBA.1.2

Data Warehouse, Business Intelligence and Analytics Business Goals DWBA.1.3

Data Warehouse, Business Intelligence and Analytics Business Goals DWBA.1.4

Data Warehouse, Business Intelligence and Analytics Architecture DWBA.2.1

Data Warehouse, Business Intelligence and Analytics Architecture DWBA.2.2

Data Warehouse, Business Intelligence and Analytics Architecture DWBA.2.3

Data Warehouse, Business Intelligence and Analytics Architecture DWBA.2.4

Data Warehouse Design and Modelling DWBA.3.1

Data Warehouse Design and Modelling DWBA.3.2

Data Warehouse Design and Modelling DWBA.3.3

Data Warehouse Design and Modelling DWBA.3.4

Data Warehouse Design and Modelling DWBA.3.5

Data Warehouse Design and Modelling DWBA.3.6

Data Warehouse Design and Modelling DWBA.3.7

Data Marts DWBA.4.1

Data Marts DWBA.4.2


Data Marts DWBA.4.3

Data Marts DWBA.4.4

Operational Data Stores DWBA.5.1

Operational Data Stores DWBA.5.2

Operational Data Stores DWBA.5.3

Business Intelligence DWBA.6.1

Business Intelligence DWBA.6.2


Business Intelligence DWBA.6.3

Business Intelligence DWBA.6.4

Business Intelligence DWBA.6.5

Business Intelligence DWBA.6.6

Analytics and Big Data DWBA.7.1

Analytics and Big Data DWBA.7.2


Analytics and Big Data DWBA.7.3
Definition

The Entity shall establish an organizational structure to support the Data Management Programme.

The Entity shall convene the Data Governance Board to manage delegated authority and responsibility within the Entity. The Board
will be the final arbiter within the Entity for all matters relating to data management.

The Entity shall appoint a Data Manager.


The Data Manager shall have delegated authority from the Data Governance Board.

The Entity shall identify and appoint Data Architects to support the Data Manager.

The Entity shall identify and appoint Data Stewards to support the Data Manager in both the business and technical areas of the
organisation.

The Entity shall identify and appoint Data Owners (who are responsible for a particular dataset) to support the Data Stewards. Data
Owners will be drawn from both the business and technical areas of the organisation.

The Entity shall regularly undertake monitoring and compliance checking to ensure that information systems and data related
processes are implemented in accordance with established policy, standards and best practices.

The Entity’s Data Management Policy shall address the scope of its data management systems, roles, responsibilities, management
commitment, coordination among organisational functions, and compliance obligations.
The policy shall contain a definition of data management; its overall objectives and scope, and the importance of data management as
a pillar of upholding high standards of data quality.

The policy shall contain a definition of data management; its overall objectives and scope, and the importance of data management as
a pillar of upholding high standards of data quality.

The policy shall be applicable to all business functions of the organisation and should be supplemented by supporting instructions and
guidance where appropriate for specific areas of activity.

The Entity shall establish its Data Management Policy (through implementing this control), describing how data will be managed
across the Entity.

In support of the Data Management Policy, the Entity shall establish policies for public consumption where there are external
stakeholders.

The policy shall cover the end-to-end data management lifecycle.

The policy should clearly express management's commitment to data management principles and highlight its alignment with
government strategy.

The policy should emphasize management expectations for data handling and the importance of maintaining high data quality
throughout the organization.

The Entity must include governance metrics and process checkpoints in its policy to measure ongoing data management
effectiveness in its systems and processes.

The policy should outline a system for users to report data issues and include an escalation plan for their resolution.
The policy must detail the change management process, specifically how it relates to the Data Management Program and its
initiatives.

The policy will undergo at least annual reviews, overseen by the Data Governance Board, to maintain its relevance and effectiveness.
More frequent reviews may be needed in response to major business or regulatory changes.

The Entity shall ensure that all policy developments are aligned with all relevant legislation.

The Entity shall collect and maintain evidence of compliance with its policies, and with the Control Specifications within these
standards.

The policy must be quantifiable and tied to the Control Standards in this document, with the Entity able to show how each control
supports a specific policy requirement.

The Entity must collect written confirmations from all personnel and stakeholders, including internal and external parties and
contractors, demonstrating their understanding and commitment to adhere to the Policy, with these signed records kept on file for
future reference.

The Entity is required to establish and maintain specific, measurable, and scheduled goals as part of its Data Management
Programme. These goals should align with the Entity's business strategy, risk management, compliance with data management
policies and legal requirements, and the promotion of a data-aware organizational culture.

The Plan shall be made available to ADSIC (ADDA) for review.


The Plan should outline data management initiatives, align with the Entity's strategic plan, undergo annual reviews, include key
performance indicators, and indicate budget requirements for planned initiatives.

The Entity must specify robust version control for all Data Management Programme documents in the plan.

The Entity's Data Management Programme must be approved by the accountable executive responsible for the associated
operational risks.

To support its Data Management Programme, the Entity will develop subsidiary plans for specific capabilities, like Data Governance,
Organizational Awareness and Training, Disaster Recovery, Document and Content Management, Data Architecture Management,
Inter-Entity Data Integration, and Reference and Master Data Management. These plans can either stand alone or be included as
appendices in the Entity's Data Management Programme Plan.

The Entity must follow the Government Data Management Model's principles and structure (Owned, Described, Quality, Access,
Implemented) in its Data Management Programme. These principles should be incorporated into subsidiary plans and integrated into
the business processes implemented throughout the Data Management Programme rollout.

The Entity’s Data Governance Board should approve all changes to the Data Management Programme (e.g. its Plan or Policy).

The Entity shall integrate its existing change management processes into each of the data management domains, or create a new
change management process if none already exists.

The Entity should establish a baseline for its Data Management Programme Plan, with proposed changes to the plan being analysed
for impact.

Changes to the Data Management Programme Plan should be coordinated with the organisation-wide Change Management
capabilities of the Entity to ensure ongoing alignment between Data Management and other organisational initiatives.

When these Standards necessitate changes to existing business processes, the Entity should conduct an impact assessment to identify
stakeholders and processes affected, ensuring coordinated communication of the change.
The Entity shall develop and maintain change management processes both for the Data Management Programme as a whole and for
the domain-level processes developed within the Data Management Programme.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall develop and execute organisation-wide awareness programmes for the required data domains.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.

The Entity shall perform an audit of its capabilities and/or current state for each data domain.
The Entity shall develop, report against and analyse key performance indicators relating to its Data Management Programme.

The Entity shall develop, report against and analyse key performance indicators relating to its Data Management Programme.

The Entity shall develop, report against and analyse key performance indicators relating to its Data Management Programme.

Data Management performance data shall be verified by a competent and independent party that is not directly connected with the
work that is the subject of measurement.

Data management performance reporting will evaluate technology and business process compliance, data
architecture maintenance, data model completeness, system-level data models, master profiles, data quality
milestones, master/reference data management achievements, and information/document lifecycles.

Performance data will inform ongoing changes, which the Data Governance Board will assess for cost,
benefit, and status.

The Entity will follow Abu Dhabi Government Metadata Standards, including eGIF, SCAD, and geographical
metadata.
The Entity must follow Abu Dhabi Government Metadata Standards for eGIF, SCAD, and GIS.

The Entity must follow ISO/IEC 11179 Part 4 for precise data definitions in its business glossary, data dictionary,
and other data management domains.

The Entity must adopt ISO/IEC 11179 Part 5 naming and identification principles to provide meaningful
names and identifiers (e.g., the Emirates ID) for persons and other data concepts.
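
As an illustrative sketch only, the following Python fragment shows what an ISO/IEC 11179-style data element entry might look like; the field names and identifier values are assumptions for illustration, not the standard's normative metamodel.

# Illustrative sketch of a data element record in the spirit of ISO/IEC 11179
# naming and definition rules. All field names and values are assumptions.
from dataclasses import dataclass

@dataclass
class DataElement:
    identifier: str   # unique registration identifier (hypothetical scheme)
    name: str         # meaningful name per Part 5 naming principles
    definition: str   # precise definition per Part 4 rules
    data_type: str

emirates_id = DataElement(
    identifier="AD-GOV-0001",   # hypothetical identifier
    name="Person Emirates Identifier",
    definition="The unique identifier issued to a person by the "
               "federal identity authority.",
    data_type="string",
)
print(emirates_id.name)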

The Entity will launch a metadata programme that includes metadata assessment, stakeholder interviews,
requirements gathering, metadata architecture development, data stewardship, and a metadata management
implementation strategy.
To customise metadata for its operations, the Entity will apply Abu Dhabi Government and international
standards including eGIF, SCAD, Geospatial, and ADMS. Metadata elements, refinements, and encoding
schemes compliant with the Abu Dhabi Government eGIF Metadata Standard will be included.

The Entity will capture metadata both manually and automatically. In line with the programme's timeline, automated
scanning and proprietary techniques will be used to verify metadata correctness. To ensure metadata quality, Data Stewards
will curate captured metadata and add business and technical metadata (Ref: MD.4.4).
The Data Governance Board will settle metadata definition and quality disputes not resolved by Data Stewards,
particularly where information spans departments.

Through the Data Catalogue, the Entity must make all metadata, data dictionary, business glossary, and
modelling and architectural deliverables available to users.
Based on user role, the Data Catalogue will index, search, and retrieve metadata.

Metadata values will indicate the capture version for Elements, Refinements, and Encoding Schemes, which the
Entity will version manage.
The Enterprise Data Architecture will include the Entity's metadata architecture, which must be documented per
DA Standards.

The Entity shall evaluate metadata architecture options for compliance with central standards and present the
rationale to the Data Governance Board. Options include a single repository, decentralised components with a
single access point, or hybrid components with central metadata management.
Based on the Data Quality standards, the Entity will develop quality measurements for metadata names and
definitions, potentially utilising subjective business experience, user surveys, and other techniques to assess
metadata collection, discovery, and utilisation.

The Entity will monitor and report on metadata quality based on established measurements.
The Entity will assess metadata coverage across its business functions, including metadata definition, capture, and
usage coverage, with a focus on cross-business function metadata.

The Entity will align its Data Catalog with mandatory standards to enable interoperability. Mandatory standards include
Abu Dhabi Government eGIF Schema Generation, DCAT (Data Catalog Vocabulary), and XSD (XML Schema
Definition) for dataset structure description.
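
A minimal sketch of what DCAT alignment could look like in practice, serialised as JSON-LD; the dataset details below are hypothetical, and only the dcat:/dct: vocabulary terms are standard.

# Minimal sketch of a DCAT-style dataset description serialised as JSON-LD.
# The dataset itself is hypothetical; the namespace URIs are the standard ones.
import json

dataset = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@type": "dcat:Dataset",
    "dct:identifier": "entity-dataset-001",        # hypothetical identifier
    "dct:title": "Registered Business Licences",   # hypothetical dataset
    "dct:description": "Master list of business licences issued by the Entity.",
    "dct:publisher": "Example Government Entity",
    "dcat:keyword": ["licence", "business", "master data"],
}

print(json.dumps(dataset, indent=2))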

The Entity should align its Data Catalog with recommended standards like ADMS for asset description and RDF for
semantic relationships. In cases where alignment is not possible due to vendor limitations, the Entity must document
and justify the non-alignment to the Data Governance Board.

The Entity is required to create a Data Catalog with key features, including a metadata repository, a publishing portal
for controlled access, a workflow management tool, a business glossary, a data dictionary, a data model repository,
and version control.
The Entity will align its data catalog requirements with emerging government-wide standards.

The Entity shall develop its Data Catalog according to principles like usability, common usage, representation,
accuracy, sufficiency, economy, consistency, and integration. Alignment will be demonstrated through the Governance
Checkpoint Process, with conflicts resolved through practical solutions approved by the Data Governance Board.

The Data Catalog will serve as a central access point for all users, both internal and external, seeking information
about the Entity's data assets, even though these assets are stored in various systems. It will offer a unified resource
for finding and understanding any data asset.

The Entity will select datasets for the Data Catalog, including transactional, reference, master data, statistical, and
geospatial datasets. Factors like user numbers, reusability, and data complexity will be considered.

The Entity should create semantic data models for captured data, defining data relationships with a machine-readable
vocabulary.

The Entity shall define metadata for data capture, following metadata standards and emphasizing reusability. This
includes Elements, Refinements, and Encoding Schemes, together with standard names and definitions. Compliance with
Abu Dhabi Government eGIF and other relevant domain standards' metadata requirements will be part of the Data
Catalog Population Roadmap.

The Entity will document the approach for capturing and populating metadata in the Data Catalog, with details for each
dataset in the Data Catalog Population Roadmap. Metadata will cover ownership, security classification, data quality,
validity periods, and version information.
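
A hypothetical sketch of a per-dataset catalogue entry carrying the metadata this control lists; the keys and values are illustrative assumptions, not a mandated schema.

# Hypothetical per-dataset metadata record for the Data Catalogue, covering
# ownership, security classification, quality, validity, and version, as the
# control describes. All keys and values are illustrative assumptions.
catalogue_entry = {
    "dataset": "business_licences",
    "owner": "Example Licensing Department",         # assumed owner
    "security_classification": "Public",
    "data_quality": {"accuracy": 0.98, "last_audit": "2024-01-15"},
    "valid_from": "2024-01-01",
    "valid_to": "2024-12-31",
    "version": "1.3.0",
    "refresh_period_days": 30,   # minimum refresh period per the roadmap
}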

The Entity will maintain metadata in the Data Catalog using the Governance Checkpoint Process. Additionally, the
Data Catalog Population Roadmap will define minimum metadata refresh periods for each dataset captured.

The Entity will categorize data assets within a hierarchy as follows: Metadata, Reference Data, Master Data,
Transactional Data, and Audit and Log Data. Data classes higher in the hierarchy are more critical, as lower-class
data relies on them. Higher-level data is relatively stable and has a longer lifespan, whereas lower-level data is more
dynamic and has a shorter useful life. The volume of data decreases as you move up the hierarchy, with the lower
classes having more data that is subject to frequent changes.
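
A minimal sketch of the hierarchy described above, assuming a simple numeric ranking; the class names follow the paragraph, while the helper function is illustrative.

# Sketch of the data class hierarchy: lower numbers indicate more critical,
# longer-lived, lower-volume data; higher numbers are more dynamic.
from enum import IntEnum

class DataClass(IntEnum):
    METADATA = 1        # most stable, smallest volume
    REFERENCE = 2
    MASTER = 3
    TRANSACTIONAL = 4
    AUDIT_AND_LOG = 5   # most dynamic, largest volume

def depends_on(lower: DataClass, higher: DataClass) -> bool:
    """Lower classes in the hierarchy rely on the classes above them."""
    return lower > higher

print(depends_on(DataClass.TRANSACTIONAL, DataClass.MASTER))  # True
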
The Entity will create and publish a data sharing licensing model accessible via the data catalog.

The Entity will run an awareness program to promote Data Catalog information to stakeholders, emphasizing data
reuse benefits and available datasets.

Data reuse considerations will be integrated into the System Development Lifecycle (SDLC) of information systems,
with monitoring through the Governance Checkpoint Process for Data Governance Board approval.

The Entity will encourage submissions of innovative data uses arising from Data Catalog usage from across its
functions, with submissions evaluated for merit by the Data Governance Board and promoted by the Data Manager.

The Entity will allow dataset consumers to register their data usage in the Data Catalog, ensuring they are informed of
dataset changes. Consumers can be individuals, application system representatives, or business function
representatives.

Registered consumers of datasets will be classified as Formal or Informal. Formal consumers have service level
agreements with data producers, while Informal consumers rely on published licenses and policies.

The Entity will monitor and report on Data Catalog effectiveness using metrics such as dataset coverage, registered
consumers, and metadata completeness. An annual report on data coverage effectiveness will be presented to the
Data Governance Board.
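
A sketch of how the named effectiveness metrics might be computed over a hypothetical catalogue structure; the datasets, counts, and metadata keys are illustrative assumptions.

# Computes dataset coverage, registered consumers, and metadata completeness
# over a hypothetical catalogue. Structure and numbers are illustrative only.
datasets = [
    {"name": "business_licences",
     "metadata": {"owner": "Dept A", "classification": "public", "quality": None},
     "consumers": 12},
    {"name": "citizen_profiles",
     "metadata": {"owner": "Dept B", "classification": "restricted", "quality": "audited"},
     "consumers": 4},
]
total_known_datasets = 5  # assumed count of datasets the Entity holds

coverage = len(datasets) / total_known_datasets
consumers = sum(d["consumers"] for d in datasets)
completeness = sum(
    sum(v is not None for v in d["metadata"].values()) / len(d["metadata"])
    for d in datasets
) / len(datasets)

print(f"coverage={coverage:.0%}, consumers={consumers}, completeness={completeness:.0%}")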

The Entity will ensure that the Data Governance Board reviews data models in the software development lifecycle as part
of the Governance Checkpoint Process. Data models are crucial deliverables for systems built, purchased, or
commissioned to support business and technology needs.

The Entity will implement data modeling tools with specific capabilities, including UML 2.x-compliant models, support
for UML model interchange using the XMI Interchange Format, modeling of structured and unstructured datasets,
Common Warehouse Metamodel (CWM) support for data warehouse systems, metadata association for reusability,
and model versioning with traceability. Existing toolsets will be certified to meet these requirements, or the Entity will
initiate an effort to fill any gaps.
The Entity will provide training and education programs for developing data models to enhance awareness and value
for both business and technical users, tailored to their engagement levels with information systems.

The Entity will develop data models at conceptual, logical, and physical levels, involving input from various
stakeholders. Documenting the as-is data model typically proceeds through conceptual, logical, and physical data
models, representing respectively the high-level ideas, system-independent views, and system-specific
implementations of data structures. Enterprise modeling will emphasize the conceptual and logical levels,
while information system modeling will focus on the logical and physical levels.

The Entity shall model unstructured data that is associated with structured data based on business terms and logical
concepts. This modeling can involve capturing concepts expressed in documents linked to records, such as medical
or education reports.

Semi-structured or unstructured data, including free text, images, audio, and video, must be modeled to document the
mandatory requirements, metadata describing concepts within unstructured data, and associated structured identifying
data. For instance, modeling may entail specifying requirements for citizen ID photos, including image characteristics
and associated structured data like Emirates ID and date.
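
A hypothetical sketch of such a model for a citizen ID photo, pairing metadata about the unstructured content with its structured identifying data; all field names and the sample ID value are illustrative assumptions.

# Hypothetical model of an unstructured artefact (a citizen ID photo) with its
# mandatory metadata and associated structured identifying data.
from dataclasses import dataclass
from datetime import date

@dataclass
class CitizenIdPhoto:
    # Metadata describing the unstructured content
    image_format: str        # e.g. "JPEG"
    width_px: int
    height_px: int
    # Associated structured identifying data
    emirates_id: str         # links the image to a master profile (sample value below)
    capture_date: date

photo = CitizenIdPhoto("JPEG", 600, 800, "784-1990-1234567-1", date(2023, 5, 1))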

The Entity will employ conversion techniques to transform semi-structured and unstructured data into structured
formats, allowing for the formal documentation and modeling of such data.
When converting unstructured data into structured forms, the Entity will align its processes with the Unstructured
Information Management Architecture (UIMA), enabling analysis of unstructured artifacts and the development and
modeling of artifact metadata. Governance of unstructured content lifecycles will be established through suitable
workflows (refer to DCM.2).

The Entity shall create Data Flow Diagrams and Entity Relationship Diagrams for unstructured data. Data Flow
Diagrams will illustrate the flow of unstructured information, along with associated metadata and identifying data,
between systems. Entity Relationship Diagrams will depict relationships between unstructured information concepts
and structured identifying data, as well as relationships between different unstructured information concepts.

The Entity shall model unstructured data that is associated with structured data based on business terms and logical
concepts. This modeling can involve capturing concepts expressed in documents linked to records, such as medical
or education reports.

The Entity will develop data models at the Conceptual, Logical, and Physical levels with interconnections to enable the
mapping of physical information systems to logical models and higher conceptual understanding. These data modeling
artifacts will be integral to the Entity's mandatory system design and architecture documentation.

Data modeling artifacts, including Entity Relationship Diagrams and Data Flow Diagrams, will be produced uniformly
for both structured and unstructured data.
The Entity will make data models available for reference and re-use within the organization. Data Architects will
evaluate pre-existing data models, align or re-use them for new information systems where feasible. Any exceptions to
this practice will require justification in the system design, with approval from the Data Governance Board.

UML diagrams will serve as the primary modeling notation throughout the software development lifecycle. Any
deviations from this standard will be documented and submitted for authorization by the Data Governance Board. The
primary use of UML diagrams will involve structural diagrams, including Class Diagrams, Entity Relationship
Diagrams, Component Diagrams, and Deployment Diagrams.

To enhance communication of data model concepts with business stakeholders, the Entity will employ tools better
suited for this purpose, such as text-based documents, presentation slides, and spreadsheets. The Data Governance
Board will contribute to the development of guidance to ensure effective communication with departments and
stakeholders.

Entity-Relationship diagrams and Class Diagrams will be used to document data object structure and relationships
across conceptual, logical, and physical levels.

Data Flow Diagrams will be employed to model data movement within and between systems, with a particular focus
on data forming part of the Entity's master profiles. This includes identifying and capturing points of data capture,
actions that transform or aggregate data, data export points (automatic or manual), and service endpoints emitting
master and common profiles.
Very large models that are challenging to read (e.g., models with over 200 tables or descriptive artifacts) should be
subdivided into smaller, subject-area-based models and aggregated into higher-level models to maintain clarity. The
primary purpose of data models is to aid understanding.

Data models will provide clear differentiation between aspects that are currently implemented and those that are not
yet implemented.

Data modeling artifacts shall form an integral part of the Entity's mandatory system design and architecture
documentation.

When designing new conceptual data models, the Entity shall ensure that data objects are represented by nouns, and
data relationships are represented by verbs.

For new logical data models, the Entity shall adhere to rules that include using appropriate data types for attributes
within tables, taking into account performance, storage, and data requirements. For instance, the Entity should
consider more suitable data types before using String or other variable character data types.
When designing new physical data models, the Entity shall follow specific rules, including the use of numeric primary
keys, particularly in reference data tables. Reference data tables will have, at a minimum, a numeric primary key and a
code value represented as a string. Physical data types with length or precision specifiers should have appropriate
lengths or precisions specified and not rely on default values.
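
A sketch of these rules using SQLAlchemy (an assumed tool choice; any DDL mechanism would serve): a reference data table with a numeric primary key, a string code value, and explicit column lengths rather than defaults.

# Reference data table following the physical-model rules above. SQLAlchemy
# is an assumed tool; the table and column names are illustrative.
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata = MetaData()

marital_status = Table(
    "ref_marital_status", metadata,
    Column("marital_status_id", Integer, primary_key=True),  # numeric PK
    Column("code", String(10), nullable=False),              # code value as string
    Column("description", String(100), nullable=False),      # explicit length, no default
)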

The data model should indicate master/slave/federation rules when the Entity identifies data duplication across the
organization or when datasets owned by another Entity are used by an information system. These rules determine
how the datasets are managed, identifying which datasets are master, slave, or federated across systems.

Business terms for data objects, attributes, relationships, and values with contextual business meaning shall be
captured and defined. Business definitions will ensure consistency in the use and meaning of data objects and
relationships across the Entity. Business definitions will be stored in the business glossary section of the Entity's Data
Catalogue.
Technical definitions for terms within the business glossary will be produced to aid data integration and development
projects that span multiple systems. These technical definitions will consider logical and physical models and may
include technical validations like state diagrams, flow charts, and regular expressions. Technical definitions will be
maintained within the data dictionary of the Data Catalogue.
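
A sketch of such a technical validation expressed as a regular expression; the Emirates ID pattern shown is an illustrative assumption, not an authoritative format specification.

# Regex-based technical validation of the kind a data dictionary entry might
# carry. The pattern is assumed: three digits, a four-digit year, seven
# digits, and a check digit, hyphen-separated.
import re

EMIRATES_ID_PATTERN = re.compile(r"^784-\d{4}-\d{7}-\d$")  # assumed format

def is_valid_emirates_id(value: str) -> bool:
    return EMIRATES_ID_PATTERN.fullmatch(value) is not None

print(is_valid_emirates_id("784-1990-1234567-1"))  # True
print(is_valid_emirates_id("not-an-id"))           # False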

Minimum data model metadata that the Entity shall maintain includes Model Identifier, Responsibility Assignment,
Published Status, and Change History. Additional metadata will be determined based on the Entity's requirements and
evaluated by the Data Governance Board.

The Entity shall maintain traceability links for different views of the same subject area using annotations that indicate
other existing views. Lower-level identifiers will be used as part of the Reference Number element of the model
identifier to pre-assign numbers to different subject areas. Other metadata for data models will be decided based on
the Entity's requirements, evaluated by the Data Governance Board, and issued to staff.

Data models shall be stored in a version-controlled repository. Recommended options include version control
repositories built into data modeling tooling, external version control repositories or document management systems
that support versioning, or a file system structure as an interim solution.
The Entity will develop an enterprise-wide data model that represents an organization-wide view of all data central to
the Entity's core business functions. The enterprise data model is a key aspect of the baseline and target enterprise
data architectures.

The Data Governance Board will maintain oversight and approval of enterprise data models and socialize the
enterprise data model through working groups to facilitate sharing with other Entities.

When developing new data models for system implementations, the Entity shall ensure alignment with the Entity's
Enterprise Data Model. Conceptual, logical, and physical data models will demonstrate alignment with master profiles
and common profiles in the government Data Catalogue.

The Entity will align its Enterprise Data Model with government-wide data models as they emerge.

The Entity shall develop conceptual data models to support the architecture, development, and operational processes
for its data. Conceptual data models will be required as part of the system development lifecycle and provided to the
Data Governance Board through the Governance Checkpoint Process.

Techniques to develop conceptual data models include interviewing stakeholders, identifying candidate data profiles,
and combining candidate data profiles into master data profiles, transactional data profiles, and reference data
profiles. Conceptual data modeling shall be performed at a system or enterprise level, depending on the data view
needed.

Conceptual data models shall be used for documentation to support development of logical data models, change
requests, impact assessments, and gap analyses between baseline and target state requirements.
The Entity shall identify and model all master profiles and relationships between them. Master profiles represent core
data for the Entity's line of business. For example, a 'Citizen' profile may include family relationships, contact details,
and name change history.

Master profiles shall be documented at conceptual and logical levels and form part of the Entity's enterprise data
model. Each system containing master profile data shall have its data modeled at conceptual, logical, and physical
levels.

Entity master profiles shall be made available to ADSIC upon request to facilitate government-wide common profile
development, and the Entity shall align local profiles with government-wide common profiles when appropriate.

Share master profiles for government-wide alignment.

Develop logical data models with relationship rules and denormalization process.

Ensure technology-independent logical data models, considering alternative data repositories.


Use logical data types and business rules for flexible data representation.

Logical data models support development, change requests, and impact assessments. They are shared with the Data
Governance Board.

Develop physical data models based on logical data models for detailed technical specifications.

Physical data models enable technical implementation and operational functions, e.g., SQL queries.

Map logical to physical design, specifying configuration details as needed.


Reverse engineer data models from existing systems to support data architecture baselining, linking them to logical
models for analysis and inclusion.

Data architecture in TOGAF includes component models, data profiles, lifecycle models, security, quality, and change
processes.

Data architecture deliverables cover various domains, including metadata, data quality, security, and management
systems.

Architectural elements are classified as emerging, current, strategic, or retirement.


Specialized data architecture standards from Abu Dhabi Government centers of excellence are used.

Baseline data architectures are developed for information systems under the Entity's control, including maintenance
and updates.

Development of baseline data architecture considers business and technical requirements, data architecture themes,
and constraints.

System-level baseline data architecture is used for system changes and reviews, guided by the Data Governance
Board.
Baseline data architectures are continuously maintained and versioned.

The Entity produces a target enterprise data architecture, informed by the baseline architecture but not dependent on
it.

Baseline data architectures provide a foundation for developing and validating target data architectures.

The Entity produces target data architectures as information systems go through change cycles, required in the
Governance Checkpoint Process.
Target data architectures, at system or enterprise levels, address gaps, encourage data integration, remove
duplication, align with standards, and promote reuse.

The target data architecture influences technology and data requirements for system changes alongside business and
quality requirements.

The Entity identifies gaps between baseline and target enterprise data architectures, covering business, technical, and
capability aspects.
Gap analysis results in a roadmap to move from baseline to target enterprise data architecture, periodically reviewed
by the Data Governance Board.

The roadmap includes timelines, budgets, and priorities for component and system changes, remaining flexible to
address business priorities.

The Entity follows the roadmap during system development and maintenance, ensuring alignment with the enterprise
target data architecture. Annual reports assess the roadmap's effectiveness by identifying gaps between the starting
and ending baseline enterprise data architectures.

The Entity establishes data quality definitions covering various data types, which are stored in the business glossary
and data dictionary.

Data quality definitions are linked to business processes to evaluate the impact of data quality on operations, ensuring
accuracy in processes like citizen contact.
Data quality definitions encompass key aspects such as validity, timeliness, integrity, accuracy, and reliability,
determining acceptable criteria and assessing business benefits.

Metadata is aligned with data quality definitions to populate the Data Catalogue, including quantitative and qualitative
measures.

The Entity creates a data quality checklist tailored to its datasets to facilitate data audits in line with established data
quality definitions.
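
A minimal sketch of checklist-style checks for two of the dimensions named above, validity and timeliness; the field names and thresholds are assumptions for illustration.

# Checklist-style data quality checks for a single record. Field names and
# the 365-day timeliness threshold are illustrative assumptions.
from datetime import date

def check_validity(record: dict) -> bool:
    """Validity: mandatory fields are present and non-empty."""
    return all(record.get(f) for f in ("emirates_id", "contact_number"))

def check_timeliness(record: dict, max_age_days: int = 365) -> bool:
    """Timeliness: the record was verified within the allowed period."""
    return (date.today() - record["last_verified"]).days <= max_age_days

record = {"emirates_id": "784-1990-1234567-1",
          "contact_number": "+971500000000",
          "last_verified": date(2024, 1, 15)}
print(check_validity(record), check_timeliness(record))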

Data Quality Management Plan: The Entity will create a plan for auditing, monitoring, and maintaining data quality. This
plan covers the quality of master profiles, dataset quality, and addressing issues reported by users. It involves defining
roles, using data profiling and cleansing tools, and producing data quality metadata and requirements. The Data
Governance Board will oversee this plan, including one-off and incremental audits.

Data Quality Requirements: The Entity ensures that all new information systems and changes include specific data
quality requirements. These requirements, documented using data quality metadata definitions, serve as the basis for
internal data quality SLAs and external contractual agreements.

Incorporating Data Quality Audits: The Entity integrates data quality audits into the Data Governance Checkpoint
Process. This includes audits during system changes, plans for improving data quality, and documenting
requirements. The Data Governance Board defines when and where these audits are required, considering factors like
system integrity, accuracy, and reliability at various checkpoints in the data lifecycle.

Master Profile Audits: The Entity will audit its master profiles, as per Data Modeling standards, every three months
across all data sources. If data quality is misaligned across sources or with defined standards, discrepancies will be
identified and root causes determined. Corrective action, if necessary, will be decided by the Data Governance Board.
Audit Intervals for Non-Common Profiles: The Entity will establish suitable audit intervals for data types not covered by
common profiles defined in DM2. The Data Governance Board will decide on corrective actions once the cause of
discrepancies is understood.

Third Party Data Quality Checks: The Entity will perform spot checks on third-party data to ensure compliance with
data supplier service level agreements. If no agreements exist, the Entity will develop its own data quality
requirements and share them with the data supplier.

Data Profiling Tools: The Entity will systematically use data profiling tools with various analysis capabilities, including
structured data column analysis, data structure-independent integrity analysis, pattern identification, reporting, and
change detection.
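
As an illustration only, the following minimal Python sketch shows the kind of column-level analysis such
profiling tools perform: completeness, distinctness, and pattern identification. The sample values are
hypothetical.

    # Minimal data profiling sketch (illustrative only): column-level
    # completeness, distinctness, and simple pattern detection.
    import re
    from collections import Counter

    def profile_column(values):
        """Return basic quality measures for one column of string data."""
        total = len(values)
        non_null = [v for v in values if v not in (None, "")]
        patterns = Counter(
            re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", v)) for v in non_null
        )
        return {
            "completeness": len(non_null) / total if total else 0.0,
            "distinct_ratio": len(set(non_null)) / len(non_null) if non_null else 0.0,
            "top_patterns": patterns.most_common(3),  # e.g. '999-9999' for phone-like values
        }

    # Hypothetical column of phone extensions; '12a-4567' surfaces as an outlier pattern.
    print(profile_column(["123-4567", "123-4568", "", "12a-4567"]))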

Metadata Storage: Data quality measures obtained during audits will be stored as metadata in the Data Catalogue.

Data Cleansing Initiative: The Entity will identify gaps between data quality definitions and measured data quality. It will
execute a data cleansing initiative to improve data quality, led by the Data Governance Board. Strategies may be
system-specific, data-type specific, or prioritized by business benefit, utilizing various tools and expertise.

Target Data Architectures: The Entity will ensure that target data architectures focus on improving data quality across
information systems and services. Master profiles will be a priority, extending to other data types as defined by the
Data Governance Board.

Data Cleansing Process: The Entity defines and documents an end-to-end data cleansing process.
The Entity must follow the latest approved Information Security Standards in Abu Dhabi Government, prioritizing them
over Data Management Standards in case of conflicts. The Data Governance Board is responsible for recording and
resolving conflicts.

The Entity's data architecture, information systems, and components should align with the approved Information
Security Standards in Abu Dhabi Government. The Data Governance Board will confirm this alignment through the
Governance Checkpoint Process.

When releasing data as Open Data, the Entity must demonstrate compliance with both the approved Information
Security Standards and Data Management Standards. The Data Governance Board will approve the data's
publication.

The Entity must classify systems to identify those at risk of privacy breaches according to the Entity's privacy policy
(see DSP.2).

The Entity must ensure compliance with Payment Card Industry (PCI) Security Standards for information systems
handling credit card data through the Governance Checkpoint Process.

The Entity must ensure that cloud suppliers adhere to ISO/IEC 27017 Cloud Security Standards and ISO/IEC 27018
Handling of Personally Identifiable Information Standards, as ratified by the International Organization for Standardization.
The Entity must create a privacy policy in line with government privacy laws, incorporating guidance from these
Standards, particularly regarding its line-of-business data.

The privacy policy should include a public privacy statement, clearly outlining stakeholders' privacy rights and the
Entity's privacy obligations. It should remain aligned with cross-government policies.

In consultation with legal experts, individuals from whom data is collected should have the rights to view, correct
inaccuracies, and request the removal of their data when no longer relevant.

The Entity should clarify the purpose and use of personal data at the point of collection and offer stakeholders a
mechanism to opt out of non-core activities.

The Entity must create and maintain privacy metadata for its master profiles, clearly identifying attributes containing
private data. This metadata should be stored in the Data Catalogue.

The Entity's Open Data policy should align with the Data Privacy policy, ensuring no data that breaches individual
privacy is made public. Special attention should be given to preventing the 'mosaic effect,' which combines data from
multiple sources to identify individuals.

Develop an awareness program for the data privacy policy, disseminating it to all users of private data to remind them of
their responsibilities regarding data privacy.
The Entity must adhere to the principles of 'Privacy by Design,' which include being proactive, making privacy the default
setting, embedding privacy into designs, accommodating all legitimate interests, ensuring end-to-end security, providing
visibility and transparency, and respecting user privacy. This approach helps the Entity detect privacy issues early and
reduce privacy risks and costs.

The Entity should develop training and awareness materials on the principles and objectives of 'Privacy by Design' for
technical and business users responsible for designing information systems and processes.

The Entity needs to identify and address any deficiencies in its existing data sources regarding compliance with the
principles of 'Privacy by Design.' The requirements from the gap analysis should inform the Entity's target data
architecture at the enterprise level and within specific information systems as needed.

Data governance checkpoints should be used to confirm alignment with the principles of 'Privacy by Design'
whenever information systems or processes are created or changed.

The Entity must establish a privacy management workflow for identifying, logging, investigating, and resolving data
privacy-related issues in line with its privacy policy. This workflow should encompass issues reported by both internal
users and external stakeholders, involving steps for evidence collection, post-incident analysis, reporting, and
corrective actions. It is used to monitor policy implementation effectiveness, and the Entity should report privacy-
related metrics to cross-government working groups.

The Entity should provide individuals with a route to update or correct their private data, and these updates should be
part of a data quality audit.

The Entity, following its privacy policy, must respond promptly to requests for data disclosure from individuals, with
response times established by the Data Governance Board. Requests should be monitored to ensure timely action.
The Entity shall evaluate requests for data removal in accordance with its privacy policy, balancing business needs
with individual privacy. Requests should be handled internally, with an appeal process available to individuals if
needed, potentially involving cross-Entity collaboration. The Data Manager makes the final decision.

The Entity should take steps to prevent data loss and privacy breaches, considering appropriate architectural
components to enhance information system protection. These components may include data-loss prevention tools,
database activity monitoring, and data discovery tools. The Data Governance Board assesses data loss risks for each
system and incorporates technical components into the target data architectures as needed.

Data security and privacy requirements that apply to production information systems must also be observed in test,
development, and training environments. When using a subset of production data, data masking technologies should be
applied to protect sensitive information; masking techniques involve transforming, obfuscating, or randomizing data.
Consideration should be given to preserving the characteristics of real "Live" data in test or training environments,
and data quality audits should be performed to confirm that masking preserves those characteristics.
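
The sketch below illustrates one masking technique consistent with this control: deterministic,
format-preserving pseudonymisation. The salt value and the sample field are assumptions for the example,
not prescribed by the Standards.

    # Illustrative data masking sketch: deterministic pseudonymisation that
    # preserves format, so masked test data behaves like "Live" data.
    import hashlib

    SALT = b"entity-specific-secret"  # assumption: managed as a protected secret

    def mask_digits(value: str) -> str:
        """Replace each digit with one derived from a salted hash (repeatable)."""
        digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
        digits = (int(c, 16) % 10 for c in digest)
        return "".join(str(next(digits)) if ch.isdigit() else ch for ch in value)

    # The same input always masks to the same output, preserving referential links.
    print(mask_digits("050-1234567"))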

The Entity must engage an infrastructure audit team knowledgeable about platform utilization metrics
and the prevalent hardware and software configurations across the Entity.
The Entity shall perform a comprehensive audit of physical inventory, encompassing Data Centers and
other sites. The audit should record various fields for each system, including location,
service/application name, server details, hardware, software, and more.

The Entity is required to conduct a logical audit of network inventory to reconcile with the physical
inventory. Tools such as Spiceworks, SolarWinds, Open Computers and Software Inventory Next Generation
(OCS Inventory NG), or a Configuration Management Database (CMDB) instance should be used for this
purpose. Any discrepancies between the two audits should be reconciled through a remediation plan.

ADCMA maintains its IT storage landscape and conducts regular audits.

ADCMA has completed the physical inventory audit and provided evidence.

ADCMA is transitioning from SolarWinds to ManageEngine OpManager for its operations.

Classify information systems as Legacy, Virtualize-able, or Cloud-able for migration suitability.

Create a migration list based on portability, criticality, and precedence factors.

Engage an infrastructure Architecture team to determine the target architecture for Data Centres.

Ensure the target architecture includes flexible infrastructure capabilities like IaaS and PaaS models.

Choose an appropriate cloud deployment model: Private, Community, Public, or Hybrid, while avoiding public cloud for
Abu Dhabi Government Data.

Determine Data Centre Tier: Decide on a Data Centre Tier based on availability and infrastructure criteria.

Compliance with Tiers: Ensure that Data Centre Standards align with the selected Tier and Cloud Deployment Model.
Consider All Options: Explore various data center strategies, including government solutions.

Cost-Benefit Analysis: Evaluate the financial aspects of data center and cloud investments.

Data Centre Transformation: Plan a transition program considering capacity, budget, and sharing resources.

Transformation Program Review: Seek ADSIC approval for the Data Centre Transformation Program.

Execute Transformation Plan: Implement the approved Data Centre Transformation Plan.

Establish a Cloud Centre of Excellence: Create a team with various cloud management roles.

Continuous Monitoring: Keep track of capacity and resource utilization.

Regular Capacity Audits: Conduct quarterly capacity audits and updates.

Keep Data Centre Plan Updated: Maintain an up-to-date development plan with periodic reviews.

Backup Plan Implementation: Implement an approved backup plan.

Define RPO and RTO: Specify Recovery Point and Time Objectives for backup plans.

Regular Backup Tests: Periodically test backup and restoration processes.

Offsite Backup: Store backup copies securely offsite with monitoring and fire protection.

Cost/Benefit Analysis: Analyze backup processes, favoring cloud-based solutions.

BCDR Plan Implementation: Implement a Business Continuity and Disaster Recovery plan.

BCDR Strategy: Define a strategy for protecting, stabilizing, and recovering critical activities.

BCDR Plan Contents: Include roles, activation process, mitigation actions, communication, recovery, media, and
stand-down plans.
Regular BCDR Drills: Plan and execute annual BCDR drills and quarterly paper scenario exercises.

The Entity should establish a policy and standards for managing all recorded information across its lifecycle, ensuring
high quality, security, and availability.

All data within the Entity should be authentic, reliable, complete, unaltered, and usable, with clear chain of custody and
metadata.

The Entity should identify data owners, establish creation and disposal requirements, determine sharing requirements,
and train staff in data management.

The Entity must maintain an inventory of data in a Data Catalogue, provide an annual report to the Data Governance
Board, and address areas of non-compliance.

Data held by the Entity should follow the Information Lifecycle process, including creation, retention, maintenance,
use, retirement, and disposal, with a focus on security and efficiency.

The Entity shall implement a Strategic Integration Platform to facilitate data transfer, transformation, access auditing,
performance monitoring, security controls, and transaction management. It should be part of the target enterprise data
architecture.
The Entity's strategic integration platform should align with the metadata requirements of the Abu Dhabi Government
Interoperability Framework.

The Entity shall develop and publish a policy for usage of its strategic integration platform, covering internal, trusted
third-party, and external data sharing.

Consideration should be given to migrating existing data feeds into and out of information systems through the
Strategic Integration Platform. The Data Governance Board should assess the business value and reusability of each
data feed.

External integration with data from other Entities should be made through the ADSIC Enterprise Service Bus (ESB).
The Entity should not engage in peer-to-peer data transfers. Datasets available through the ESB should be published
in the Entity's Data Catalogue.
Data exchange through the Strategic Integration Platform should be secure and audited, complying with information
exchange requirements of the approved Information Security Standards.

The Entity should consider appropriate data exchange methods when integrating data between applications and
systems, including file-based, message-based, and database-to-database exchanges.

The Entity should plan to migrate peer-to-peer application data sharing to the Strategic Integration Platform in its target
data architecture, enabling data reusability. Justification is required for cases where migration is not possible due to
proprietary software.

The integration platform should have the capability to broker interactions across different integration patterns, such as
file-based and message-based data exchanges.

Data architectural consideration should be given to the data formats allowed by each data service integrated, with
XML and JSON formats preferred for data transfer between Entities. Industry or proprietary formats are allowed with
justification and should be documented in the Data Catalogue.

Data Transfer Protocols: Choose suitable protocols for connecting systems to the Integration Platform, such as FTP,
HTTP, SOAP, and more. Document these in the Data Catalogue.
Prefer One-Way Integration: Favor one-way data sharing methods like Publish/Subscribe, Request/Response, or
Broadcast.
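
A minimal in-process sketch of the Publish/Subscribe pattern follows; the topic and message contents are
hypothetical, and a production platform would use a message broker rather than direct function calls.

    # Minimal Publish/Subscribe sketch (illustrative): one-way data sharing
    # where publishers need no knowledge of subscribers.
    from collections import defaultdict
    from typing import Callable

    class MessageBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[dict], None]):
            self._subscribers[topic].append(handler)

        def publish(self, topic: str, message: dict):
            for handler in self._subscribers[topic]:
                handler(message)

    bus = MessageBus()
    # Hypothetical topic name for an address-change event.
    bus.subscribe("citizen.address.updated", lambda m: print("received", m))
    bus.publish("citizen.address.updated", {"citizen_id": "C-001", "emirate": "Abu Dhabi"})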

Justify Two-Way Integration: Provide reasons for using complex two-way data sharing and address concerns like
transaction management and data concurrency.

Data Integration Design: Plan for detecting data delivery failures, ensuring repeatable and idempotent data retries,
maintaining statelessness, and ensuring high availability. The Data Governance Board will review these design
considerations.
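
The following sketch illustrates one way to make retries repeatable and idempotent, as this control requires:
delivery is keyed by a message identifier so that duplicates are ignored. The identifiers and the in-memory
store are assumptions for illustration; a real platform would persist this state.

    # Illustrative sketch of repeatable, idempotent message delivery:
    # retries are safe because processed message IDs are remembered.
    processed_ids = set()

    def deliver(message_id: str, payload: dict, apply_change):
        """Apply a change exactly once, however many times delivery is retried."""
        if message_id in processed_ids:
            return "duplicate ignored"
        apply_change(payload)
        processed_ids.add(message_id)
        return "applied"

    store = {}
    update = lambda p: store.update(p)
    print(deliver("msg-42", {"record": "A1"}, update))  # applied
    print(deliver("msg-42", {"record": "A1"}, update))  # duplicate ignored (idempotent retry)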

Service Level Agreements (SLAs): Define agreements covering data quality, volume, service availability, data structure,
change control, exception handling, and SLA monitoring frequency.
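
As a hypothetical illustration, an SLA for a shared data feed could be captured as a machine-readable record
covering the parameters listed above; all names and values shown are invented.

    # Hypothetical machine-readable SLA record for a shared data feed (illustrative only).
    data_feed_sla = {
        "dataset": "master-citizen-profile",          # hypothetical dataset name
        "data_quality": {"completeness_min": 0.98, "accuracy_min": 0.95},
        "volume": {"expected_records_per_day": 10_000},
        "service_availability": "99.5%",
        "data_structure": "citizen-profile-v2.xsd",   # agreed schema version
        "change_control": "30 days notice for breaking changes",
        "exception_handling": "escalate to Data Governance Board",
        "monitoring_frequency": "monthly",
    }
    print(data_feed_sla["service_availability"])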

Internal SLAs: Create agreements for data sharing within your organization and resolve disputes through the Data
Governance Board.

Binding SLAs for External Entities: Establish binding service-level agreements with other government entities via the
ADSIC ESB. In case of non-compliance, follow the exception escalation process and engage cooperatively to
investigate issues with the service level agreement.

Open Data Review: The Entity must systematically evaluate data sources and prioritize making them open unless
there are security, privacy, or data quality concerns. The Data Governance Board reviews and sets criteria for closing
data sources.

Record Keeping: The Entity should maintain systematic records of data sources, clearly indicating their open or
closed status, and provide plain language definitions in the Data Catalogue.

Open Data Access: All data deemed open in the Open Data Review should be available through the Open Data
Portal in machine-readable and human-readable forms.
Data Authenticity: Data should be made available as close to the source as possible, with minimal manipulation.
Privacy and security concerns should be addressed with minimal data changes.

Open Data Plan: Develop a plan for releasing open data, including data review, quality checks, and any required
privacy or security adjustments.
Prioritizing Open Data: Prioritize data release by addressing security, privacy, business needs, and data quality.

Systematic Planning: The Open Data Plan should systematically address all datasets identified in the Open Data
Review.

Plan Monitoring: Regularly monitor progress against the Open Data Plan and review it quarterly.

Publication on Data Portal: Publish open data on the Abu Dhabi Government Open Data Portal.

Data Quality Maintenance: Continuously review open data to ensure it meets quality standards and address security
and privacy concerns.
Dealing with Issues: If open data fails quality or faces security/privacy concerns, suspend its publication, conduct a
new review, and address the issues before relisting.

Usage Tracking: Capture usage trends and statistics on data access and report to the Government Data Governance
Committee.

Annual Awareness Campaign: Conduct annual campaigns to inform stakeholders about open data, its quality, and
any security/privacy measures. Inform and educate internal and external stakeholders and the public.
Transparency in Withholding Data: If a dataset isn't published, use the awareness campaign to explain the reasons,
provide a publication timeline, or clarify if the dataset will remain unpublished.


Plan and publish a schedule for identifying reference data in information systems, including resource
allocation and reviews.

Establish a team responsible for managing reference data, including discovery, alignment, and change
management.

Identify and document reference data used in information systems, specifying values and definitions.

Ensure reference data values are codified, unique, and not case-sensitive, with associated
descriptions.
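
A minimal sketch of this check follows; the example codes and descriptions are hypothetical.

    # Illustrative check that reference data codes are unique and not
    # case-sensitive, each with an associated description.
    emirate_codes = [
        {"code": "AUH", "description_en": "Abu Dhabi"},
        {"code": "DXB", "description_en": "Dubai"},
        {"code": "auh", "description_en": "Abu Dhabi (differs only by case)"},
    ]

    seen = set()
    for entry in emirate_codes:
        normalised = entry["code"].upper()        # case-insensitive comparison
        assert entry["description_en"], "every code needs a description"
        if normalised in seen:
            print("case-insensitive duplicate:", entry["code"])
        seen.add(normalised)
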
Align reference data with relevant standards, creating a "master reference data" dataset.

Regularly review the master reference data to accommodate new systems or changes.

Align reference data used in systems with the master reference data or provide mapping schemas.

Describe reference data values in both Arabic and English.

Develop processes for actively managing reference data values, including requests and evaluations.
Define the Reference Data Change process, including requests, evaluations, and updates.

Capture and record requests, consultations, and decisions related to reference data changes.

Implement processes to audit reference data population across information systems.

Implement reference data export features for monitoring alignment with the master reference data.

Establish a Reference Data Management platform with various features.


Implement processes to detect and identify new or unrecognized reference data values for audit.

Plan and publish a schedule for identifying master data in information systems, with resource allocation
and reviews.

Establish a team responsible for managing master data, including discovery, alignment, and cleansing.

Identify and define master data profiles, including semantic definitions and lifecycle details.

Ensure master data records have unique, non-case-sensitive codification.


Publish KPIs and metrics to measure duplicate master data records in information systems.

Implement controls to limit the use of non-primary master data records when duplicates exist.

Match and link equivalent master data records within information systems to identify duplicates.
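
The sketch below illustrates a simple deterministic matching approach (normalised name plus birth date as the
comparison key); real master data matching typically adds probabilistic or fuzzy techniques, and all records
shown are invented.

    # Illustrative record-matching sketch: link equivalent master data
    # records across systems using a normalised key.
    def match_key(record):
        """Normalise name and birth date into a simple comparison key."""
        name = " ".join(record["full_name"].lower().split())
        return (name, record["birth_date"])

    system_a = [{"id": "A1", "full_name": "Fatima  Al Mansouri", "birth_date": "1990-01-01"}]
    system_b = [{"id": "B7", "full_name": "fatima al mansouri", "birth_date": "1990-01-01"}]

    index = {match_key(r): r["id"] for r in system_a}
    links = [(r["id"], index[match_key(r)]) for r in system_b if match_key(r) in index]
    print(links)  # [('B7', 'A1')] -- candidate duplicates for steward review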

Assess master data profiles for tangible benefits in merging duplicated records.

Execute master data initiatives to cleanse and deduplicate records when compelling benefits are
identified.
Match and link equivalent master data records across all Entity-owned systems and government-wide
systems.

Develop and publish KPIs and metrics to measure the numbers of master data records across
systems.

Identify master data records without equivalent links for data stewardship activities.

Implement safeguards to monitor reference data values in master data records.

Conduct regular reviews to accommodate new systems or assess changes.


Ensure master data values can be described in multiple languages.

Develop and execute processes to actively manage master data records, prioritizing issues based on
importance and urgency.

Define the Master Data Change process, including identifying the primary information system,
maintaining master data records, and handling external data sources and publications.

Ensure that the process execution is documented and recorded for changes, consultations, and
decisions.

Implement processes to audit the population of master data across all information systems, including
measuring latency and data value alignment.
Implement master data export features for monitoring alignment with the primary master data dataset.

Implement a Master Data Management platform with various features, including workflow
management, multiple versions, import and export support, and data security measures.

Implement system processes to detect and identify new or unrecognized master data values for audit
and process review.

Establish quality standards for document and content management, including language style guides,
naming conventions, review processes, and version management.
Define requirements for document and content management, covering document standards, metadata,
retrieval procedures, retention policies, and more.
Ensure documents are authentic, reliable, complete, unaltered, and usable, with proper metadata.

Implement document systems and processes, including file plans, repositories, training, and
performance measurement.
During decommissioning of document systems, no new documents can be created, but existing
documents must remain accessible or be converted to a new system.

Determine retention policies based on business need, regulations, accountability, risks, privacy, and
stakeholder interests.

Establish a document classification scheme for consistent naming, security, access control, and
retention policies.

Ensure correct retirement and disposal techniques are employed, with options like physical destruction
or archiving.
Clearly document and regularly review the document lifecycle and associated processes.

Monitor and ensure compliance with document management processes, retention policies, and user
satisfaction.

Establish and maintain a training and awareness program for document and content management.

Choose a software solution that enables various aspects of document and records management,
including classification, metadata, versioning, retention policies, access control, audit trails, and ease
of use.

Consider international standards when selecting a software platform for document management.


Business Vision for Initiatives: Data warehouse and analytics initiatives should be driven by a clear
business vision. The Data Governance Board plays a key role in overseeing these initiatives.
Service Level Agreements (SLAs): SLAs should be developed to regulate data usage within the data
warehouse. They should include parameters such as data availability, data load latency, data retention,
and data quality.

Monitoring and Reporting: The entity should monitor the effectiveness of data warehouse initiatives
and report findings to the Data Governance Board. This should include technical alignment with the
architectural roadmap, implementation experiences, lessons learned, and business successes.

SLAs with External Data Suppliers: SLAs should be agreed upon with external data suppliers to
ensure confidence in externally sourced data. This includes defining ownership, issue resolution
workflows, data refresh cycles, and data quality requirements.

Data Staging Environment: A data staging environment should be used to collect, cleanse, match,
and merge source system data before adding it to the data warehouse. This can be a separate store or
part of an ETL tool.
Integration with Other Data Management Domains: Data warehouse initiatives should consider
other data management domains, including metadata, data catalog, data modeling, data architecture,
data quality, data security, data storage, data integration, and more.

Enriching Data with External Sources: The entity should explore sourcing and using external data to
enrich its own data for better business intelligence.

Use of COTS or Open Source Tools: Commercial Off The Shelf (COTS) or Open Source tools should
be preferred over internally developed tools, with justification required for internal development.

Usability and Complexity in Architectural Designs: Data warehouse designs should favor usability
but also consider implementation complexity. An incremental, business-focused approach is
recommended.

Data Warehouse Table Types: Different table types (staging, dimension, fact) should be used when
modeling the data warehouse, and data modeling should enhance understanding by stakeholders.
Use of Surrogate Keys: Dimension tables should have synthetic or surrogate primary keys to support
performance optimization.

Data Warehouse Schema Design: The simplest schema types, such as star schemas, are preferred, and
deviations from star schemas should be justified.
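
As an illustration of the two preceding controls, the following sketch (using SQLite via Python; table and
column names are hypothetical) shows a dimension table with a surrogate key joined to a fact table in a
simple star schema.

    # Minimal star schema sketch (illustrative): a fact table referencing
    # a dimension through a surrogate key.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE dim_date (
            date_key INTEGER PRIMARY KEY,   -- surrogate key, not the natural date
            full_date TEXT NOT NULL
        );
        CREATE TABLE fact_service_request (
            request_id INTEGER PRIMARY KEY,
            date_key INTEGER NOT NULL REFERENCES dim_date(date_key),
            requests_count INTEGER NOT NULL
        );
    """)
    con.execute("INSERT INTO dim_date VALUES (1, '2024-01-01')")
    con.execute("INSERT INTO fact_service_request VALUES (100, 1, 3)")
    for row in con.execute("""
            SELECT d.full_date, SUM(f.requests_count)
            FROM fact_service_request f JOIN dim_date d USING (date_key)
            GROUP BY d.full_date"""):
        print(row)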

Conforming Dimensions: Dimensions should be conformed for reuse across multiple fact tables to
support a gradual development of multiple data marts.

Sources for Data Calculations: Sources for data calculations should be present and maintained in
the data warehouse, with audited workflows for management.

Performance Metrics: Performance metrics should be developed to control data quality, volume, and
timeliness within the data warehouse.

Federated Data Warehouse: Data marts should be consolidated into a federated data warehouse with
common tooling and technology across all data marts.

Consolidation of Data Marts: The entity should include the timeline for consolidating data marts into a
federated data warehouse on the data architecture roadmap.
Reuse of Dimensions: Dimensions should be normalized and reused across data marts for efficient
data processing.

Maturity and Competency: The entity should identify effective data marts to develop maturity and
competency across various data marts.

Operational Data Store (ODS): An ODS should act as a data source for the enterprise data
warehouse.

Separation Between ODS and Data Warehouse: A clear separation between data for an ODS and
data in a data warehouse should be maintained.

Use of ODS for Current Data: The ODS should integrate, analyze, and report on current data when it
meets business requirements.

Use of Realistic Data: Realistic data should be used during the design and development of business
intelligence solutions, and reference to the data dictionary and business glossary is recommended.

Classification of Business Intelligence Initiatives: Business intelligence initiatives should be classified as
tactical, strategic, or operational and appropriately located in the data architecture roadmap.
Integration with Enterprise Reporting: Business intelligence reporting should integrate with or
become the enterprise reporting solution, which is distinct from application reporting.

Avoidance of Non-Authoritative VGI: Non-authoritative Volunteered Geographical Information (VGI) should be
avoided in compliance with government directives.

Production of KPIs and Dashboards: Business intelligence tools should be used to produce KPIs,
dashboards, and scorecards that reflect the entity's business objectives.

Publication of Statistical Data: The entity should publish statistical data in line with the Statistics
Centre Abu Dhabi (SCAD) requirements and establish SLAs for data provided by SCAD.

Data Analysis Capabilities: The entity should develop data analysis capabilities suitable for its data
types and evaluate training opportunities.

Big Data Analysis: The entity should explore the use of 'Big Data' analysis techniques for high-
volume, high-velocity, or high-variety data.
Event Stream-Based Analytical Processing: Event stream-based analytical processing should be
implemented for high-velocity data analysis, and justifications for its implementation should be
evaluated.


The Entity shall establish an organizational structure to support the Data Management Program.

The organization shall be positioned in the Entity with sufficient authority such that it is empowered to
do its job effectively.

The organization will take responsibility and accountability for Data Management.

The organization will be based on the Roles and Responsibilities described in this control. An
illustrative example of an appropriate RACI matrix is provided in the appendix.

The Entity shall convene the Data Governance Board to manage delegated authority and responsibility
within the Entity.

The Board will be the final arbiter within the Entity for all matters relating to data management.

This Board should have representatives from each area affected by data management initiatives, with
the Data Manager responsible for the execution of the Board's actions through the program
management function of the Entity.

The Data Governance Board shall meet regularly (weekly, initially) to provide independent oversight
and support for the Data Management initiatives being undertaken by the Entity.
The Entity shall appoint a Data Manager.

The Data Manager shall have delegated authority from the Data Governance Board.

The Data Manager shall oversee the implementation of change.

The Data Manager shall ensure compliance with governance, policy, and standards.

The Data Manager shall ensure the coordinated training and awareness programs are executed within
the Entity.

The Data Manager shall share best practices with other Entities.

The Entity shall identify and appoint Data Architects to support the Data Manager.

The Data Architects shall work with the Data Manager and the Data Governance Board to ensure the
implementation of the Data Management Standards in all designs across the Entity.

The Data Architects shall establish a clearly defined target state for all data sources.

The Data Architects shall establish a clearly defined roadmap to achieve the target state for all data
sources.

The Data Architects shall be responsible for developing and maintaining a formal description of the
data and data structures within the Entity, including data designs and design artifacts, dataset
metadata definitions, and data flows throughout the Entity.

The Entity shall identify and appoint Data Stewards to support the Data Manager in both the business
and technical areas of the organization.
The Data Stewards will take responsibility for the lifecycle of the data as it passes through information
systems and ownership boundaries.

The Data Stewards will take responsibility for the quality of the data under their stewardship and
cleanse the data as necessary.

The Entity shall identify and appoint Data Owners (who are responsible for a particular dataset) to
support the Data Stewards.

Data Owners will be drawn from both the business and technical areas of the organization.

The Data Owners will take responsibility for a particular dataset throughout the lifecycle across
systems.

The Data Owners will ensure the quality standards for their dataset are met.

The Data Owners will liaise between the business and technical stakeholders to ensure that their
dataset is maintained to the highest standards possible.

The Entity shall regularly undertake monitoring and compliance checking to ensure that information
systems and data-related processes are implemented in accordance with established policy,
standards, and best practices.

Such reviews should include coverage of the performance of the domain processes and user
satisfaction.
The Entity’s Data Management Policy shall address the scope of its data management systems, roles,
responsibilities, management commitment, coordination among organizational functions, and
compliance obligations.

The policy document shall be approved by the Entity's Data Governance Board, Data Manager, and
the Entity's executive management.

The policy shall be published and communicated to all employees and relevant stakeholders.

The policy shall contain a definition of data management, its overall objectives and scope, and the
importance of data management as a pillar of upholding high standards of data quality.

The policy shall be applicable to all business functions of the organization and should be
supplemented by supporting instructions and guidance where appropriate for specific areas of activity.

The Entity shall establish its Data Management Policy, describing how data will be managed across
the Entity.

The Data Management Policy shall be supported by the production of an internal Document Retention
Policy, describing the Entity’s policy for retaining, archiving, and destroying documents (See Document
and Content controls).
In support of the Data Management Policy, the Entity shall establish policies for public consumption
where there are external stakeholders.

The Entity shall make publicly available policies including the Privacy Policy and Open Data Policy.

The policy shall cover the end-to-end data management lifecycle.

The policy shall include a clear statement of management intent, showing support for the principles of
data management and reinforcing its importance in alignment with government strategy.

The policy shall underline management expectations of teams and individuals when handling data and
highlight the importance of maintaining high levels of data quality at all points within the organization’s
operations.

The policy shall include governance metrics and process checkpoints, describing how the Entity will measure
the effectiveness of data management throughout its information systems and processes on a continuous basis.

Measures and metrics should be maintained continuously and tracked to reveal trends, available for
audit purposes at all times.

The policy shall describe the mechanism allowing business and technical users to raise data-related
issues, including a clear escalation plan to ensure such issues are appropriately handled and resolved.

The policy shall describe the change management process and how it applies to the Data
Management Program and its initiatives.

The policy shall be regularly reviewed and updated (annually at a minimum).


The Data Governance Board shall ensure the policy's continued relevance, adequacy, and
effectiveness, with more frequent reviews if significant changes occur.

The Entity shall ensure that all policy developments are aligned with all relevant legislation.

The Entity shall collect and maintain evidence of compliance with their policies and with the Control
Specifications within these standards.

The policy shall be quantifiable and traceable back to the Control Standards of this document; the
Entity should be able to demonstrate how each control will contribute to achieving a given policy
requirement.

Ensure that audit findings are thoroughly analyzed to confirm potential risks associated with
unaddressed issues.

Make certain that audit results are classified and protected at a level equivalent to the highest data
source's security classification being audited.

Efficiently coordinate Data Management audit activities with other audits within the organization to
achieve effective performance and compliance reporting while minimizing disruptions.

The organization must keep its Data Management Program Plan and Policy updated in response to
audit findings in each data domain.

Data management performance reporting should be based on specific, measurable, achievable, realistic, and
timetabled goals outlined by the Data Governance Board and the Abu Dhabi Data Management Programme. These
goals must align with the Entity's business needs and legal obligations.

Develop outcome-based performance metrics to evaluate the effectiveness and efficiency of the Data
Management Program. The Data Governance Board should oversee the definition of these metrics,
their alignment with the Program Plan, and data performance reporting to stakeholders.
The Data Governance Board's role includes setting performance metrics, analyzing data from various
domains, and reporting Data Management Program performance to relevant stakeholders at specified
intervals and in agreed formats.

Ensure that the organization's Data Management performance metrics align with the Abu Dhabi
Government Data Management Programme's indicators, enabling timely and accurate status reporting
to relevant stakeholders.

Data Management performance reporting should encompass several aspects, including compliant
technology and business processes, data architecture maintenance, completeness of data models,
data quality, and information lifecycle management.

Implement mechanisms for continuous improvement based on performance data analysis, and closely
monitor the cost, benefit, and status of proposed and implemented improvements.

The organization must adhere to applicable Abu Dhabi Government Metadata Standards, such as
eGIF, SCAD standards, and geospatial metadata standards.
Ensure that metadata management tools comply with ISO/IEC 11179 Metadata Registry Standards.

Comply with ISO/IEC 11179 Part 4, 'Formulation of Data Definitions', for defining data; it presents
steps for developing unambiguous data definitions.
Follow the principles documented in ISO/IEC 11179 Part 5, 'Naming and identification principles', for
developing meaningful names and identifiers.
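
As a hedged illustration of naming in this spirit (object class, property, and representation term composed
into a data element name), the sketch below checks names against an assumed list of representation terms; it
is not the standard's normative rule set.

    # Illustrative checker for data element names loosely following an
    # ISO/IEC 11179-5 style. The representation terms are an assumption.
    REPRESENTATION_TERMS = {"name", "code", "date", "amount", "identifier", "text"}

    def check_name(data_element: str) -> bool:
        """e.g. 'person_birth_date' -> object class 'person', property 'birth',
        representation term 'date'."""
        parts = data_element.lower().split("_")
        return len(parts) >= 3 and parts[-1] in REPRESENTATION_TERMS

    for name in ["person_birth_date", "employee_salary_amount", "misc_stuff"]:
        print(name, "->", "ok" if check_name(name) else "review naming")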

Execute a metadata initiative to gather, store, and use metadata effectively, covering assessment of
existing metadata sources, requirements gathering, metadata architecture, data stewardship, and a
rollout plan.
Utilize Abu Dhabi government and international standards when developing metadata, including eGIF,
SCAD, Geospatial, and ADMS standards, ensuring that they align with the Entity's operational context.

Manage metadata using a combination of automated scanning and manual techniques, ensuring data
accuracy per a defined schedule.
In cases of metadata conflicts that cannot be resolved by Data Stewards, the Data Governance Board
is responsible for arbitration.

Ensure that all metadata is accessible through the Data Catalogue, serving as the user access point
for metadata, data dictionary, business glossary, and modeling and architectural deliverables.
The Data Catalogue should support indexing, search, and retrieval of metadata relevant to the user's
role.

Document metadata architecture according to Data Architecture standards, making it a component of
the Enterprise Data Architecture.
Evaluate and choose the most appropriate metadata architecture in alignment with central standards,
considering centralized, decentralized, or hybrid approaches.

Define measures for the quality of metadata names and definitions, including subjective business
experience and user surveys to assess metadata effectiveness.
Monitor and report on metadata quality, ensuring that metadata values identify the version they were
captured against.

Monitor metadata coverage across the organization's business functions, assessing metadata
definition, capture, and usage across departments.
Monitor the effectiveness of metadata stewardship through workflow monitoring, issue tracking,
training, and awareness programs.

Align the Data Catalogue with mandatory standards like Abu Dhabi Government eGIF Schema
Generation, DCAT, and XSD to facilitate interoperability.

Also, consider aligning the Data Catalogue with recommended standards such as ADMS and RDF,
providing justifications for non-alignment to the Data Governance Board where necessary.
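
For illustration, a DCAT-style dataset description can be serialised as JSON-LD as sketched below; the
dataset details and URL are hypothetical, while the vocabulary terms are drawn from DCAT.

    # Illustrative DCAT-style dataset description serialised as JSON-LD.
    import json

    dataset = {
        "@context": {"dcat": "http://www.w3.org/ns/dcat#",
                     "dct": "http://purl.org/dc/terms/"},
        "@type": "dcat:Dataset",
        "dct:title": "Registered Businesses",          # hypothetical dataset
        "dct:publisher": "Example Entity",
        "dcat:keyword": ["business", "registration"],
        "dcat:distribution": [{
            "@type": "dcat:Distribution",
            "dcat:mediaType": "application/json",
            "dcat:downloadURL": "https://example.gov/data/businesses.json"
        }],
    }
    print(json.dumps(dataset, indent=2))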

Develop a Data Catalogue with key features, including a metadata repository, publishing portal,
workflow management tool, business glossary, data dictionary, data model repository, and version
control.
Align Data Catalogue requirements with government-wide data catalogue requirements as they evolve.

Design the Data Catalogue with usability in mind, using a standard vocabulary that represents real-
world concepts accurately.

Provide the Data Catalogue as the central access point for all data assets, even though the actual data
may reside in various systems.

Identify datasets suitable for inclusion in the Data Catalogue, including transactional data, reference
datasets, master data profiles, statistical data, and geospatial data.

Discover datasets using a combination of human interactions and technical tools for scanning data
sources.

Prioritize datasets for inclusion in the Data Catalogue based on past demand, business-level metadata
typically taking precedence.

Develop and store data models for captured datasets at both the business and technical levels.

Consider developing semantic data models that describe data relationships using a defined
vocabulary.

Define appropriate metadata for data capture, including ownership, security classification, data quality,
and version information.

Capture and populate metadata for each dataset, ensuring comprehensive information is included.

Regularly maintain metadata in the Data Catalogue through a defined process.


Classify data assets according to a hierarchy that includes metadata, reference data, master data,
transactional data, and audit and log data, considering data importance, volume, lifecycle, and
complexity.

Establish a licensing model for data sharing and make it accessible through the Data Catalogue.

Create and execute an awareness program to promote data availability to business and technical
stakeholders, emphasizing data reusability.

Ensure that the System Development Lifecycle (SDLC) includes considerations for reusing datasets
from the Data Catalogue.

Encourage submissions for innovative data usage from various functions, evaluating their merit
through the Data Governance Board.

Allow consumers of datasets to register their usage, providing them with information about dataset
changes and updates.

Classify registered consumers as Formal or Informal, depending on the presence of service level
agreements.

Monitor and report Data Catalogue effectiveness using metrics like dataset coverage, registered
consumers, and completeness of metadata.

Review data models in the software development lifecycle as part of the Governance Checkpoint
Process.
Data models are critical for system development and should align with business and technology
requirements.

Implement data modeling tools with UML 2.x compliance, XMI interchange support, metadata
association, versioning, and traceability.

Provide training for data modeling, tailored to user roles (e.g., business vs. DB admins).

Develop data models (conceptual, logical, physical) to document Entity's data structure.

Model unstructured data linked to structured data through business terms and concepts.
Model semi-structured/unstructured data with mandatory requirements and metadata.

Convert unstructured data into structured formats for formal modeling.

Align unstructured data conversion with Unstructured Information Management Architecture (UIMA).

Govern unstructured content lifecycle with appropriate workflows.

Create Data Flow Diagrams and Entity Relationship Diagrams for unstructured data.
Develop Data Models at the Conceptual, Logical, and Physical levels with references.

Include data modeling in mandatory system design and architecture documentation.

Produce data models for structured and unstructured data.

Publish data models for reference and re-use; justify deviations to Data Governance Board.

Use UML diagrams as the primary modeling notation; seek approval for exceptions.


Use models best suited for communication with stakeholders.

Utilize Entity-Relationship and Class Diagrams for data structure documentation.

Employ Data Flow Diagrams for data movement modeling.

Divide large models into smaller, subject area-based models for clarity.
Clearly indicate current and unimplemented aspects in data models.

Apply naming conventions for data objects and relationships in data modeling.

Indicate master/slave/federation rules for duplicate datasets and shared data.


Define business terms for data objects, attributes, relationships, and values.

Produce technical definitions for terms in the business glossary.

Maintain metadata including model identifiers, responsibility assignment, and status.

Capture traceability links and lower-level identifiers for data models.


Evaluate and maintain appropriate metadata for data models.

Store data models in a version-controlled repository.

Develop an enterprise-wide data model aligned with Entity's core functions.

Ensure Data Governance Board oversight and socialization of enterprise data models.

Align system data models with the Enterprise Data Model.

Align the Enterprise Data Model with government-wide data models.

Develop conceptual data models for architecture and development processes.


Use techniques like stakeholder interviews to create conceptual data models.

Perform conceptual data modeling at a system or enterprise level.

Use conceptual data models for documentation, change requests, and analysis.

Identify and model all master profiles and their relationships.

Document master profiles in the Data Catalogue.

Develop logical data models for data attributes and relationship rules.
Ensure alignment with Enterprise Data Model.

Describe referential integrity and normalization in logical data models.

Document de-normalization justifications when necessary.

Ensure logical data models are independent of technical implementation details.

Use logical data models for documentation, change requests, and analysis.
Develop physical data models based on logical models.

Use physical data models for technical implementation and operational functions.

Provide mappings between logical and physical data models.

Reverse engineer data models from existing systems and link to logical models.


Develop data architecture within an Enterprise Architecture Framework (TOGAF).


Create components, data profiles, data security, and quality compliance designs.

Produce Data Architecture deliverables for all Data Management Programme domains.

Classify architectural elements into Emerging, Current, Strategic, or Retirement categories.

Use specialized data architecture standards.


Develop baseline data architectures for information systems.

Consider business and technical requirements for data architecture.

Use baseline data architecture for system validation.

Continuously maintain and version baseline data architectures.


Produce a target enterprise data architecture.

Create target data architectures for evolving systems.

Target data architectures should align with various criteria.


Influence technology requirements based on target data architectures.

Identify gaps between baseline and target data architectures.

Develop a roadmap for aligning data architectures.

Follow the roadmap for system and component changes.

Annually report on roadmap implementation effectiveness.

Provide data quality definitions in various categories.

Map data quality definitions to business processes.

Define measures for data quality (validity, timeliness, integrity, accuracy, reliability).

Populate Data Catalogue with data quality metadata.

Develop a data quality audit plan.

Establish roles and tools for data quality initiatives.

Include data quality requirements in business requirements.

Integrate data quality audits into Governance Checkpoint Process.

Audit master profiles and align data quality.

Define audit intervals for non-standard data types.


Perform spot checks on third-party data quality.

Use data profiling tools for systematic data audits with structured data column analysis, integrity
analysis, pattern recognition, and reporting.

Store data quality measures in the Data Catalogue.

Identify gaps between data quality definitions and measured quality, and execute data cleansing
initiatives.

Ensure target data architectures improve data quality with monitoring and cleansing components.

Detail an end-to-end data cleansing process.

Apply Information Security Standards, resolving conflicts through the Data Governance Board.

Certify alignment with Information Security Standards for various information systems and
components.

Demonstrate compliance when releasing Open Data.

Identify information systems at risk of privacy breaches.

Ensure PCI Security Standards compliance for credit card data.


Ensure cloud suppliers meet ISO/IEC 27017 and ISO/IEC 27018 standards.

Develop a privacy policy in alignment with government privacy legislation.

Include a public privacy statement.


Consider individuals' rights related to their data.

Clarify data usage and provide opt-out mechanisms for stakeholders.

Produce and maintain privacy metadata for master profiles.

Ensure Open Data policy aligns with Data Privacy policy to avoid privacy breaches.

Develop an awareness program for data privacy.

Adopt 'Privacy by Design' principles.

Provide training and awareness materials for 'Privacy by Design.'

Identify shortcomings and use gap analysis in target data architecture.


Validate 'Privacy by Design' principles in data governance checkpoints.

Develop a privacy management workflow for issue resolution.


Provide individuals with a route to maintain/correct private data.
Respond to data disclosure requests within established time targets.

Evaluate Data Removal Requests: The Entity will thoroughly review and consider data removal
requests concerning an individual's information, striking a balance between business needs and
privacy. The process will include an appeals mechanism and involve cross-Entity collaboration, if
necessary, with the Data Manager having the final say.

Strengthen Data Protection Measures: To safeguard against data loss and privacy breaches, the
Entity will deploy a range of security components. These include data-loss prevention tools, database
activity monitoring, and data discovery tools. Systems will be evaluated based on business value and
data sensitivity to assess the risk of data loss.

Secure Data in Testing Environments: Data security and privacy standards will extend to test,
development, and training environments. When utilizing subsets of production data, data masking
techniques will be employed to ensure data integrity while aligning with quality standards.
Engage Infrastructure Audit Team: An infrastructure audit team, well-versed in platform utilization
metrics and hardware configurations, will be brought in to comprehensively audit the Entity's
infrastructure.

Audit Physical Inventory: A comprehensive audit will be conducted for all systems within Data
Centers and other sites. The audit will involve recording specific details such as hardware models,
power requirements, and current statuses.

Conduct Logical Network Audit: The Entity will perform a logical audit of the network inventory to
ensure it aligns with the physical audit. Any discrepancies identified will be resolved through
remediation plans.

Audit Infrastructure Utilization: Infrastructure utilization audits will be carried out on all information
systems and servers to assess actual loads during usage scenarios, both at peak and baseline levels.

Determine Infrastructure Capacity: Based on the audits, the Entity will determine the current
infrastructure's capacity and utilization trends, providing insights into consolidation ratios and future
capacity requirements.

Categorize Systems by Criticality: Information systems will be categorized based on criticality levels,
allowing for classification as Core Infrastructure, Critical, High, Medium, or Low. This classification will
align with the Entity's business priorities.

Classify Systems for Portability: The categorization of systems into Legacy, Virtualize-able, or
Cloud-able will aid in assessing their suitability for migration to target architectures.

Create Migration List: A migration list will be developed, considering portability and criticality, to
identify systems earmarked for migration.

Engage Infrastructure Architecture Team: A specialized infrastructure architecture team will be involved in
determining the suitable target architecture for the Entity's data centers.

Adopt a Flexible Target Architecture: The target architecture will reflect the latest flexible
infrastructure capabilities, such as Infrastructure-as-a-Service (IaaS) or Platform-as-a-Service (PaaS),
ensuring alignment with the Entity's requirements.
Choose a Cloud Deployment Model: A suitable cloud deployment model—Private Cloud, Community
Cloud, Public Cloud, or Hybrid Cloud—will be selected based on alignment with the Entity's
requirements and the Abu Dhabi Government's data center capabilities.

Refer to TIA-942 Standards: The Entity will adhere to TIA-942 standards for its infrastructure, including
access, power, cooling, and networking, ensuring compliance.

Set Data Center Standards: Establishment of data center standards, aligning with the determined Tier
level and chosen Cloud deployment model, is crucial to meet specific criteria.

Explore All Options: The Entity will thoroughly explore various data center strategies before making
final commitments, considering emerging solutions in the Abu Dhabi Government's data center
landscape.

Consider Cost-Sharing and Resilience: To optimize costs and enhance resilience, the Entity will
explore opportunities to share data center capacity and resources with other government entities.

Plan a Data Center Transformation Program: The Entity will plan a structured Data Center
Transformation program to transition from the current state to the target architecture within the
timeframe of the Abu Dhabi Government Data Management Programme.

Submit the Program for Review: The Data Center Transformation program will be submitted to
ADSIC for review and approval, ensuring alignment with government policies and standards.

Execute the Data Center Transformation Plan: Upon approval by ADSIC, the Entity will execute the
approved Data Center Transformation Plan, ensuring compliance with the planned changes.

Establish a 'Cloud' Center of Excellence Team: A dedicated 'Cloud' Center of Excellence team will
be formed to manage various aspects of the cloud infrastructure and administration.

Continuously Monitor Capacity and Utilization: Regular monitoring of capacity and utilization will
ensure optimal performance and allow the Entity to address any potential issues proactively.

Regularly Audit Capacity and Utilization: Periodic audits will follow the methodology outlined in DS1
to ensure the efficient functioning of the infrastructure.
Keep a Data Center Development Plan Up-to-Date: Maintaining an up-to-date development plan will
ensure that the data center's growth aligns with evolving requirements, reviewed annually and
refreshed every three years.

Implement a Backup Plan: A comprehensive backup plan, compliant with approved Information
Security Standards, will be implemented to ensure data integrity and availability in the event of loss or
system failure.

Define Recovery Point Objectives (RPO) and Recovery Time Objectives (RTO): Specific RPO and
RTO for each system covered by the backup plan will be determined and approved by the Data
Governance Board to align with business needs.
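
A worked illustration follows: given hypothetical RPO, RTO, and backup-interval values per system, a simple
check confirms whether the backup interval can meet the RPO (worst-case data loss is one backup interval).

    # Illustrative RPO check per system; all objectives shown are invented.
    from datetime import timedelta

    systems = {
        "payments": {"rpo": timedelta(minutes=15), "rto": timedelta(hours=1),
                     "backup_interval": timedelta(minutes=10)},
        "archive":  {"rpo": timedelta(hours=24), "rto": timedelta(hours=48),
                     "backup_interval": timedelta(hours=36)},
    }

    for name, s in systems.items():
        ok = s["backup_interval"] <= s["rpo"]  # worst-case data loss = one interval
        print(f"{name}: RPO {s['rpo']}, interval {s['backup_interval']} ->",
              "meets RPO" if ok else "backup too infrequent")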

Conduct Regular Backup Availability Tests: Regular tests of the availability and effectiveness of the
backup and recovery procedures will be conducted, validating RPO and RTO objectives, backup
schedules, and restoration processes.

Prefer Remote Disk Backup for Offsite Storage: Remote disk backup will be favored for offsite
storage, ensuring secure and environmentally protected backup copies.

Conduct Cost/Benefit Analysis for Backup Processes: A thorough evaluation of the costs and
benefits of different backup processes will be undertaken, emphasizing modern solutions for backup
efficiency.
Implement a Business Continuity and Disaster Recovery (BCDR) Plan: A comprehensive BCDR
plan will be developed and implemented in compliance with approved Information Security Standards,
ensuring effective response to potential disasters.
Determine Appropriate BCDR Strategy: The Entity will assess and determine the most suitable
Business Continuity and Disaster Recovery (BCDR) strategy, taking into account critical activities,
incident mitigation, and vendor evaluations.
BCDR Drills and Scenario Exercises: The Entity will plan and execute regular BCDR drills and paper-based
scenario exercises, ensuring that teams are well prepared for various disaster scenarios.
Comprehensive BCDR Plan Contents: The Entity's BCDR plan will encompass defined roles and
responsibilities, incident response processes, actions for mitigating consequences, communication
plans, and clear recovery priorities.
Effective Information Lifecycle Management: The Entity will establish a policy and standards for
managing recorded information throughout its lifecycle, ensuring that data is authentic, reliable,
complete, unaltered, and usable.
Data Management Ownership: Data ownership and responsibilities will be clearly assigned, including
data class creation, disposal requirements, information-sharing, and security protocols.
Data Inventory Maintenance: The Entity will maintain an up-to-date Data Catalogue to facilitate
reporting on the status of the Data Inventory, departmental compliance with Data Management
Standards, and areas of risk and recommendations.
Information Lifecycle Stages: Data will be managed throughout the Information Lifecycle stages,
including creation, retention, maintenance, use, retirement, and disposal, ensuring data quality,
accuracy, and security.
Tight Governance and Monitoring: Stringent governance and monitoring will be maintained, with
data classification, retention, and disposal adhering to policies and procedures to prevent unauthorized
changes and ensure data security.
Strategic Integration Platform: The Entity will implement a Strategic Integration Platform as part of its
target enterprise data architecture, enabling data transfer, transformation, access auditing,
performance monitoring, security controls, and transaction management for efficient data integration.
Strategic Integration Platform and eGIF Metadata Alignment The Entity ensures that its
strategic integration platform aligns with the metadata requirements of the Abu Dhabi
Government Interoperability Framework (eGIF).
Policy for Usage of Strategic Integration Platform The Entity develops and publishes a
policy for using its strategic integration platform, covering internal, trusted third-party, and
external data sharing, encouraging cross-functional data sharing and considering service
level agreements.
Migration of Data Feeds Consideration is given to migrating existing data feeds into the
Strategic Integration Platform, evaluating business value and reusability.
External Integration Through ADSIC ESB Data integration with other Entities occurs
through the ADSIC Enterprise Service Bus (ESB) to avoid peer-to-peer data transfers.
Secure and Audited Data Exchange Data is exchanged securely and in compliance with
information exchange requirements outlined in Abu Dhabi Government's Information
Security Standards.
Data Exchange Methods The Entity considers appropriate data exchange methods,
including file-based, message-based, and database-to-database exchange methods.
Migration of Peer-to-Peer Data Sharing The Entity plans to migrate peer-to-peer application
data sharing to the Strategic Integration Platform for data reusability.
Integration Patterns Capability The integration platform allows different integration patterns,
including file-based and message-based exchanges.
Data Format Consideration Data architectural considerations are given to data formats
used in data services, with a preference for XML and JSON.
Data Transfer Protocols Data architectural considerations are given to data transfer
protocols, including FTP, HTTP, SOAP, ODBC, and JDBC.
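To make the format preference concrete, the snippet below renders the same hypothetical record in both preferred formats using only the Python standard library; the field names are invented for illustration and are not drawn from any Entity schema.

    # One record rendered as JSON and as XML (both stdlib; field names are invented).
    import json
    import xml.etree.ElementTree as ET

    record = {"license_id": "LIC-001", "status": "active"}

    json_payload = json.dumps(record)  # compact, widely supported by lightweight services

    root = ET.Element("record")
    for key, value in record.items():
        ET.SubElement(root, key).text = value
    xml_payload = ET.tostring(root, encoding="unicode")

    print(json_payload)  # {"license_id": "LIC-001", "status": "active"}
    print(xml_payload)   # <record><license_id>LIC-001</license_id><status>active</status></record>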
One-Way Integration Patterns The Entity favors one-way integration patterns, including
publish/subscribe, request/response, and broadcast.
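The in-process dispatcher below is a minimal stand-in for a real broker, included only to show why publish/subscribe is one-way: the publisher never waits for a reply. The class, topic names, and payload are all illustrative assumptions.

    # Minimal in-process publish/subscribe; a stand-in for a real message broker.
    from collections import defaultdict

    class PubSubBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, message):
            # One-way: the publisher neither waits for nor receives a reply.
            for handler in self._subscribers[topic]:
                handler(message)

    bus = PubSubBus()
    bus.subscribe("entity/licences", lambda msg: print("audit feed:", msg))
    bus.subscribe("entity/licences", lambda msg: print("reporting feed:", msg))
    bus.publish("entity/licences", {"licence_id": "LIC-001", "event": "issued"})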
Two-Way or Interactive Integration Patterns Two-way data integration patterns are
considered and require justification through the Governance Checkpoint Process.
Data Integration Designs Consideration is given to data integration designs for detecting
delivery failure, repeatable retries, statelessness, and high availability.
Service Level Agreements Service level agreements (SLAs) are established, covering data
quality, data volume, availability, data variety, change control, exception escalation, and
SLA monitoring frequency.
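One way to keep such agreements checkable is to capture them as structured definitions rather than prose. The sketch below is a minimal illustration; the dataclass fields, values, and contact address are assumptions, not a prescribed SLA schema.

    # An SLA captured as a structured, checkable definition (field names are illustrative).
    from dataclasses import dataclass

    @dataclass
    class DataSharingSLA:
        dataset: str
        min_quality_score: float    # e.g. composite completeness/validity score, 0..1
        max_load_latency_hours: int
        availability_pct: float     # e.g. 99.5 means 99.5% availability
        escalation_contact: str
        review_frequency_days: int

    sla = DataSharingSLA(
        dataset="media-licensing-feed",
        min_quality_score=0.98,
        max_load_latency_hours=24,
        availability_pct=99.5,
        escalation_contact="data-governance-board@example.gov",
        review_frequency_days=90,
    )
    print(sla)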
Internal Service Level Agreements Internal SLAs are produced for data sharing within the
Entity, with dispute resolution through the Data Governance Board.
Binding Service-Level Agreements Binding SLAs are produced for data sharing between
Abu Dhabi Government Entities through the ADSIC ESB, with escalation procedures in
case of non-compliance.
Open Data Review The Entity systematically reviews all data sources, considering 'Open
By Default,' and criteria for closing sources are defined.
Open Data Records Systematic records are maintained, indicating the open or closed
status of data sources in the Entity's Data Catalogue.
Open Data Publication Open data is made available through the Open Data Portal in
machine-readable and, where practicable, human-readable forms.
Data Manipulation and Privacy Data is made available in its closest-to-source form, with
minimal manipulation, aggregation, redaction, or anonymization, taking into account privacy
and security concerns.
Open Data Plan An Open Data Plan is developed based on the Open Data Review,
ensuring data quality, addressing security and privacy, business priorities, and demand.
Open Data Plan Prioritization The Open Data Plan prioritizes data release based on various
criteria, including security, business priorities, demand, and data quality.
Systematic Dataset Address The Open Data Plan systematically addresses all datasets
identified in the Open Data Review.
Open Data Plan Monitoring Progress against the Open Data Plan is monitored and
reviewed quarterly.
Open Data Publication The Entity publishes its Open Data on the Abu Dhabi Government
Open Data Portal.
Continuous Data Quality and Privacy Review Open Data is regularly reviewed for
continuous data quality and privacy compliance.
Handling Open Data Failures In case of data quality or security concerns, the Entity
suspends Open Data publication, reviews the dataset, and executes a mitigation plan.
Data Usage Monitoring Usage trends and statistics regarding data access are captured and
reported to the Government Data Governance Committee.
Awareness Campaign Annual awareness campaigns are conducted to inform potential
users and stakeholders about the Entity's Open Data, its quality, and context.
Non-Published Datasets Explanation In cases of datasets not published, the Entity uses the
awareness campaign to explain the reasons, potential future publication, or the dataset's
non-publication status.
Reference Data Management Plan The Entity plans activities to identify reference data
used in its information systems, covering resources, scoping, and regular reviews.
Reference Data Management Team A dedicated team is established for reference data
management, including discovery, alignment, change management, and coordination.
Definition of Reference Data Reference data used in information systems is identified and
defined, including values and semantic definitions.
Codification of Reference Data All reference data values are codified as unique, non-case-
sensitive, contiguous values with associated descriptions and metadata.
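A short sketch of how these codification rules might be checked follows; the helper function and sample codes are illustrative assumptions, not part of the standard.

    # Validate candidate reference codes against the codification rules above.
    def validate_reference_codes(codes):
        """Codes must contain no whitespace and be unique ignoring case."""
        problems, seen = [], set()
        for code in codes:
            if any(ch.isspace() for ch in code):
                problems.append(f"'{code}' contains whitespace")
            key = code.lower()  # non-case-sensitive comparison
            if key in seen:
                problems.append(f"'{code}' duplicates an existing code (ignoring case)")
            seen.add(key)
        return problems

    print(validate_reference_codes(["AE-AZ", "ae-az", "AE DU"]))
    # ["'ae-az' duplicates an existing code (ignoring case)", "'AE DU' contains whitespace"]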
Alignment with Standards Reference data values are aligned with government and local
standards, forming the 'master reference data' dataset.
Regular Reviews Regular reviews of the 'master reference data' dataset are conducted to
incorporate new information systems and changes.
Alignment or Mapping Reference data used in information systems is aligned with the
'master reference data' dataset or mapped, accounting for bi-directional transformations.
Multilingual Reference Data Reference data values are described in Arabic and English.
Active Management Processes are developed to actively manage reference data values,
allowing requests, evaluations, and applications.
Reference Data Change Process The Reference Data Change process is defined.
Process Execution and Evidence The Entity shall ensure that processes are executed in a manner
that allows for the capture and recording of requests, consultations, and decisions.
Reference Data Auditing The Entity shall establish processes to audit the population of reference
data across all its information systems to ensure data integrity.
Reference Data Export for Alignment The Entity shall implement features to export reference data
from information systems for comparison with the 'master reference data' dataset, facilitating alignment
monitoring and analysis.
Reference Data Management Platform The Entity shall implement a comprehensive Reference Data
Management platform with features such as workflow management, support for multiple versions of
reference data, API integration, and more, to effectively manage reference data.
Detection of Unrecognized Reference Data The Entity shall create system processes to detect and
identify the use of new or unrecognised reference data values, triggering audit and process reviews.
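A minimal sketch of such a detection process is shown below, assuming the 'master reference data' values can be loaded into a set; the values themselves are invented for illustration.

    # Flag extracted values that are absent from the master reference set.
    MASTER_REFERENCE = {"active", "suspended", "revoked"}  # illustrative master values

    def detect_unrecognized(system_values, master=MASTER_REFERENCE):
        """Return values that should trigger an audit and process review."""
        return {value.lower() for value in system_values} - master

    extract = ["Active", "pending", "revoked", "PENDING"]
    print(detect_unrecognized(extract))  # {'pending'}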
Master Data Management Plan The Entity shall plan and publish a schedule for identifying and
managing master data across its information systems, encompassing resource allocation, data
discovery, ongoing stewardship, and more.
Master Data Governance Team The Entity shall establish a dedicated team responsible for managing
all master data, including ownership, accountability, and consultation for significant dataset changes.
Master Data Definition and Lifecycle The Entity shall define master data elements, their semantic
definitions, and their lifecycles in both business and technical contexts.
Unique Codification of Master Data The Entity shall ensure that all master data records are uniquely
identified and codified with contiguous non-whitespace values, and such code values shall be non-
case-sensitive.
Metrics for Duplicate Master Data Records The Entity shall develop and publish key performance
indicators and metrics for measuring the number of duplicated master data records within each
information system.
Controls for Non-Primary Master Data Records The Entity shall implement systematic controls to
limit the use of non-primary master data records within information systems where feasible.
Matching and Linking of Master Data Records The Entity shall match and link equivalent master
data records within each information system to identify duplicate records.
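The sketch below shows one simple way such matching might start: exact matching on normalized key fields. Real record linkage typically adds fuzzy matching and survivorship rules; the field names and sample records here are hypothetical.

    # Exact-match linking of master data records on normalized key fields.
    from collections import defaultdict

    def link_records(records):
        """Group records whose normalized name and phone match; return duplicate sets."""
        groups = defaultdict(list)
        for rec in records:  # each rec: {'id': ..., 'name': ..., 'phone': ...}
            key = (rec["name"].strip().lower(), rec["phone"].replace(" ", ""))
            groups[key].append(rec["id"])
        return [ids for ids in groups.values() if len(ids) > 1]

    records = [
        {"id": 1, "name": "Aisha Al Mansoori", "phone": "050 111 2222"},
        {"id": 2, "name": "aisha al mansoori", "phone": "0501112222"},
        {"id": 3, "name": "Omar Haddad",       "phone": "050 333 4444"},
    ]
    duplicates = link_records(records)
    print(duplicates)  # [[1, 2]]
    print(sum(len(g) for g in duplicates), "of", len(records), "records in duplicate sets")

The duplicate sets produced this way can also feed the duplicate-record metrics described above.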
Benefit Analysis for Data Merging For master data profiles that could benefit from merging
duplicated records, the Entity shall conduct a benefit analysis, considering data that references these
records and ensuring data references are updated accordingly.
Master Data Cleansing Initiative When a compelling benefit case or government-wide mandate
exists, the Entity shall schedule and execute a master data initiative to cleanse master data and
associated data to eliminate duplicate entries.
Cross-System Data Matching and Linking The Entity shall match and link equivalent master data
records across all information systems, including those operated by third parties, with special attention
to primary systems.
Metrics for Master Data Records The Entity shall develop and publish metrics to measure the
number of master data records and their equivalents across different information systems.
Unlinked Master Data Focus The Entity should be capable of identifying master data records that
have not been linked to any equivalent records, focusing on data stewardship and conducting regular
reviews.
Safeguarding Reference Data Values The Entity shall implement appropriate system safeguards to
monitor the use of reference data values to ensure their compliance with approved reference data.
Regular System Reviews Regular reviews, as outlined in the Master Data Initiatives plan, shall be
conducted to address new information systems and assess changes not identified through operational
processes.
Multi-Lingual Master Data The Entity shall ensure that all master data values can be described in
more than one language to accommodate linguistic diversity.
Active Master Data Management The Entity shall establish processes for actively managing master
data records, prioritizing issues based on importance and urgency.
Master Data Change Process The Entity shall define the process for maintaining master data
records, including identification of primary systems, maintenance processes, and incorporation of
external data.
Process Execution and Evidence The Entity shall ensure that process execution is well-documented,
including the capture and recording of changes, consultations, and decisions.
Master Data Auditing The Entity shall implement processes to audit the population of master data
across its information systems and develop metrics to measure data updates and alignment.
Master Data Export for Alignment The Entity shall create features for exporting master data from
information systems to compare with the 'primary master data' dataset, enabling alignment monitoring
and initial analysis.
Master Data Management Platform The Entity shall implement a comprehensive Master Data
Management platform with various features, including workflow management, support for multiple
versions of master data, API integration, and more.
Detection of Unrecognized Master Data The Entity shall establish processes to detect and identify
new or unrecognised master data values, ensuring they align with the change management process.
Quality Standards for Documents and Content The Entity shall define quality standards for
managing documents and content, encompassing language style guides, naming conventions, review
processes, and version management.
Document Management Requirements The Entity shall define requirements for managing
documents and content, covering document standards, metadata, retrieval procedures, and legal
considerations.
Document Integrity and Reliability The Entity shall ensure documents are authentic, reliable,
complete, unaltered, and usable, with proper records of changes and access.
Document System Implementation When implementing document systems, the Entity shall establish
file plans, repositories, training, data transfer, standards, retention timelines, and strategic integration.
Decommissioning Document Systems When decommissioning document systems, the Entity shall
discontinue new document creation, ensure accessibility for existing documents, and consider
migration to new systems.
Retention Policy Determination The Entity shall determine appropriate retention policies for
document types based on business needs, regulatory requirements, privacy concerns, and risk
assessments.
Document Classification Scheme The Entity shall establish a document classification scheme for
consistent naming, efficient retrieval, security provisions, access control, and retention policies.
Document Retirement and Disposal The Entity shall employ proper techniques for retiring and
disposing of documents, considering physical destruction, offline retention, archive handover, or
transfer of responsibility.
Document Lifecycle Management The Entity shall document and regularly review the lifecycle and
processes associated with documents and content.
Monitoring and Compliance The Entity shall conduct regular monitoring and compliance checks to
ensure that document systems and processes adhere to established policies and standards.
Training and Awareness The Entity shall maintain an ongoing training and awareness program for
document and content management, covering training requirements, policies, legal frameworks, and
Document Management Solution Criteria The solution chosen by the Entity should meet criteria
enabling effective document management, including classification, metadata management, versioning,
search, access control, and auditing.
International Standards for Software Selection When selecting a software platform for document
management, the Entity may refer to related international standards for guidance.
Business Vision-Driven Initiatives The Entity ensures data warehouse and business intelligence
initiatives align with a clear business vision. The Data Governance Board plays a crucial role in
overseeing these initiatives.
Service Level Agreements (SLAs) for Data Warehouse The Entity develops SLAs based on
business needs, defining SLAs for data availability, load latency, retention, and data quality.
Effective Data Warehouse Monitoring The Entity monitors and reports on initiative effectiveness,
sharing findings with the Data Governance Board for knowledge sharing across government entities.
SLAs with External Data Suppliers The Entity agrees on SLAs with external data suppliers to gain
confidence in externally supplied data, covering aspects like ownership, issue resolution, data refresh
cycles, and quality standards.
Data Staging Environment The Entity employs data staging for source data cleansing and merging,
utilizing various data-staging options.
Data Warehouse Initiative Considerations The Entity ensures alignment with various data
management domains, covering metadata, data catalog, modeling, architecture, quality, security,
storage, integration, master, reference, and open data.
Enrichment with External Data The Entity explores the use of external data sources to enhance
business intelligence.
Preference for COTS and Open Source The Entity prioritizes Commercial Off The Shelf (COTS) and
Open Source tools over internally developed ones, providing justification when necessary.
Usability-Oriented Architectural Designs The Entity favors data warehouse architectural designs
that prioritize usability while considering implementation complexity. An incremental, business-focused
approach is recommended.
Data Warehouse Table Types The Entity explains the different types of data warehouse tables,
including data staging, dimension, and fact tables.
Synthetic Primary Keys for Dimension Tables The Entity uses synthetic or surrogate primary keys
for dimension tables to optimize performance.
Schema Design Preference The Entity prefers star schemas for simplicity but justifies deviations
when necessary.
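The two preceding points can be shown together in a small worked example. The sketch below builds a one-dimension star schema in an in-memory SQLite database; the table and column names are invented for illustration, and the surrogate key is the integer channel_key rather than the business code.

    # A one-dimension star schema with a synthetic (surrogate) dimension key, via sqlite3.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dim_channel (
        channel_key  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key, no business meaning
        channel_code TEXT NOT NULL,                      -- natural key kept as an attribute
        channel_name TEXT NOT NULL
    );
    CREATE TABLE fact_broadcast (
        channel_key    INTEGER NOT NULL REFERENCES dim_channel(channel_key),
        broadcast_date TEXT NOT NULL,
        minutes_aired  INTEGER NOT NULL
    );
    """)
    conn.execute("INSERT INTO dim_channel (channel_code, channel_name) VALUES ('CH1', 'Channel One')")
    conn.execute("INSERT INTO fact_broadcast VALUES (1, '2024-01-15', 120)")
    print(conn.execute("""
        SELECT d.channel_name, SUM(f.minutes_aired)
        FROM fact_broadcast f JOIN dim_channel d USING (channel_key)
        GROUP BY d.channel_name
    """).fetchone())  # ('Channel One', 120)

Joining facts to dimensions through compact integer surrogate keys, as here, is what keeps star-schema queries simple and fast even as dimension attributes change.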
Conformed Dimensions for Reuse The Entity promotes conformed dimensions for reuse and
explains the concept and benefits.
Sources for Data Calculations The Entity ensures sources for data calculations are present and
managed within the data warehouse.
Performance Metrics for Data Quality The Entity develops performance metrics to control data
quality, volume, and timeliness.
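A minimal sketch of such load-level metrics follows, assuming rows arrive as dictionaries; the field names, deadline, and completeness definition are illustrative choices, not prescribed metrics.

    # Simple per-load metrics for quality (completeness), volume, and timeliness.
    from datetime import datetime, timedelta, timezone

    def load_metrics(rows, required_fields, load_finished_at, load_deadline):
        complete = sum(
            all(row.get(field) not in (None, "") for field in required_fields)
            for row in rows
        )
        return {
            "volume_rows": len(rows),
            "completeness_pct": round(100 * complete / len(rows), 1) if rows else 0.0,
            "on_time": load_finished_at <= load_deadline,
        }

    rows = [{"id": 1, "status": "active"}, {"id": 2, "status": ""}]
    now = datetime.now(timezone.utc)
    print(load_metrics(rows, ["id", "status"], now, now + timedelta(hours=1)))
    # {'volume_rows': 2, 'completeness_pct': 50.0, 'on_time': True}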
Federated Data Warehouse The Entity normalizes data warehouse technology and explains the
concept of a federated data warehouse.
Consolidating Data Marts The Entity includes the consolidation of data marts in the data architecture
roadmap.
Dimension Normalization and Reuse The Entity normalizes and reuses dimensions for data
processing and reporting.
Maturity Development for Data Marts The Entity identifies effective data marts for organizational
maturity and competency.
Operational Data Store as Data Source The Entity uses the operational data store (ODS) as a data
source for the enterprise data warehouse.
Separation of ODS and Data Warehouse The Entity explains the distinction between ODS and data
warehouse data, highlighting their purposes and usage.
ODS Functionality and Integration The Entity utilizes ODS functionality for integrating, analyzing,
and reporting on current data.
Realistic Data for Business Intelligence The Entity emphasizes using realistic data for clarity, with
reference to the data dictionary and business glossary.
Classify Business Intelligence Initiatives The Entity classifies BI initiatives according to types,
explaining tactical, strategic, and operational BI.
Integration with Enterprise Reporting The Entity integrates BI reporting with enterprise reporting and
discusses the differentiation between enterprise reporting and application reporting.
Non-Authoritative VGI Compliance The Entity complies with government directives on non-
authoritative Volunteered Geographical Information (VGI).
KPIs, Dashboards, and Scorecards The Entity develops and uses BI tooling for key performance
indicators, dashboards, and scorecards.
Statistical Data Publication The Entity publishes statistical data in line with Statistics Centre Abu
Dhabi (SCAD) requirements, establishing SLAs for SCAD-provided statistical data.
Comments and Compliance Status
No identified structure or model supports the Data Management program in this assessment.
Not implemented

No Data Governance board currently exists with delegated authority and defined responsibilities.
Not implemented

The Data Manager role is undefined and not in operation as of this assessment.
Not implemented

The Data Architects' role is undefined and not in operation in this assessment.
Not implemented

The Data Stewards' role is not defined or operationalized in this assessment.
Not implemented

As of this assessment, no 'Datasets' are identified across the organization, and therefore, roles as Data Owners are neither defined nor operationalized.
Not implemented

As of this assessment, no organizational structure, R&R, or operating model supports the Data Management program, and there is a lack of defined DG standards, best practices, and policies.
Not implemented
In this assessment, ADCMA has already implemented the "Privacy Policy" on its website. However, there is no overarching Data Management policy, organizational structure, or operating model established to support the Data Management program. Furthermore, the Data Management Policy remains undefined, and there is no documentation for the systems' span-of-control.
Not implemented

As of this assessment, there is no established organizational structure, R&R, or operating model to govern the data management policy.
Not implemented

As of this assessment, there's no established structure, R&R, or operating model to oversee the data management policy.
Not implemented

In this assessment, there is no established organizational structure, R&R, or operating model to oversee the data management policy.
Not implemented

In this assessment, there is no established organizational structure, roles and responsibilities (R&R), or operating model to govern the data management policy.
Not implemented

As of this assessment, ADCMA has implemented the "Privacy Policy," available on its website. ADCMA complies with Open Data policy requirements when accessing Open Data but currently does not share any data.
Not implemented
As of this assessment, there is no defined end-to-end data management lifecycle.
Not implemented

As of this assessment, there is no statement of intent from ADCMA management regarding Data Management.
Not implemented

As of this assessment, there's no defined end-to-end data management lifecycle, and the team lacks awareness about maintaining high data quality.
Not implemented

As of this assessment, no governance metrics or process checkpoints are defined to measure the effectiveness of data management controls.
Not implemented

In this assessment, there is an ongoing ITSM initiative, but there is no defined procedure for business and technical users to report data-related issues.
Not implemented

There is no defined change management process for the data management program in this assessment.
Not implemented
In this assessment, there is no established organizational structure, R&R, or operating model to support the Data Management program. Additionally, there is no DG board in place with the authority to review Data Management policies.
Not implemented

In this assessment, no organizational structure, roles and responsibilities (R&R), or operating model exists to support the Data Management program. Additionally, there is no DG board with the authority to review policies and ensure alignment with legislation.
Not implemented

In this assessment, the process for gathering and maintaining evidence for the DM Statement of Compliance is not established.
Not implemented

In this assessment, there is no implementation of data management policy traceability to control standards.
Not implemented

ADCMA follows a standard NDA process when engaging with stakeholders, ensuring stakeholders adhere to relevant policies in writing.
Not implemented

The Entity will set and uphold specific, measurable, and scheduled goals in support of its Data Management Program, focusing on business strategy, risk management, compliance with data management policies and standards, and fostering an organizational culture that is mindful of data management responsibilities.
Not implemented
As of this assessment, there is no established data management program plan.
Not implemented

In this assessment, no data management program plan is defined.
Not implemented

In this assessment, there are no version control specifications for data management program artifacts.
Not implemented

The Data Assessment program, starting in Q4 2023, will serve as the baseline for the Data Management Program.
Not implemented

In this assessment, ADCMA has established a Data Management capability plan for Disaster Recovery and Document & Content Management.
Not implemented
In this assessment, the ADG's data management model principles (Owned, Described, Quality, Access, Implemented) have not been put into practice.
Not implemented

In this assessment, there is no Data Governance board with delegated authority and defined responsibilities.
Not implemented

The existing CM process defines three tiers of approvals (Standard, Critical, and Emergency).
Not implemented

In this assessment, there is no established change management process baseline for the Data Management Program.
Not implemented

In this assessment, there is no established change management process baseline for the Data Management Program.
Not implemented

In this assessment, there is no established change management process baseline for the Data Management Program.
Not implemented

In this assessment, there is no established change management process baseline for the Data Management Program.
Not implemented
In this assessment, the training and organizational awareness components of the Data Management program are not defined or operationalized.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not defined or operationalized.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not defined or operationalized.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not established or in operation.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not established or in operation.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not established or in operation.
Not implemented

In this assessment, the training and organizational awareness components of the Data Management program are not established or in operation.
Not implemented
In this assessment, no Data Management Audit framework is in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Furthermore, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework to ensure compliance with the Data Management Policy and Standards defined by the ADG-DMS and recommended policies. Furthermore, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards defined by the ADG-DMS and ADG recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and ADG recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and ADG recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented

In this assessment, there is no Data Management Audit framework in place to ensure compliance with the Data Management Policy and Standards as defined by the ADG-DMS and recommended policies. Additionally, there is no established alignment of the Data Management Audit Framework with the Internal Affairs sector.
Not implemented
In this assessment, Data Management performance metrics have been partially implemented for Enterprise Service management and Data Security, but they do not cover all data domains as defined in the Data Management standards. Additionally, the Performance Management module is partially implemented in the ERP system to measure the performance of each employee. Employee objectives are linked with their Key Performance Indicators (KPIs), and these KPIs are aligned with departmental KPIs.
Not implemented

In this assessment, Data Management performance metrics have been partially implemented for Enterprise Service management and Data Security, but they do not cover all data domains as defined in the Data Management standards. Additionally, the Performance Management module is partially implemented in the ERP system to measure the performance of each employee. Employee objectives are linked with their Key Performance Indicators (KPIs), and these KPIs are aligned with departmental KPIs.
Not implemented

In this assessment, Data Management performance metrics have been partially implemented for Enterprise Service management and Data Security, but they do not cover all data domains as defined in the Data Management standards. Additionally, the Performance Management module is partially implemented in the ERP system to measure the performance of each employee. Employee objectives are linked with their Key Performance Indicators (KPIs), and these KPIs are aligned with departmental KPIs.
Not implemented

As of this assessment, Data Management performance metrics are partially implemented for Enterprise Service management and Data Security but do not cover all data domains defined in the Data Management standards.
Not implemented

As of this assessment, Data Management performance metrics are partially implemented for Enterprise Service management and Data Security but do not cover all data domains defined in the Data Management standards.
Not implemented

As of this assessment, Data Management performance metrics are partially implemented for Enterprise Service management and Data Security but do not cover all data domains defined in the Data Management standards.
Not implemented
In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented

In the current landscape, the metadata management tool is not available.
Not implemented
In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented

In the current landscape, there is no data catalogue implementation.
Not implemented
Purchased/COTS applications come with a defined data model; information systems custom built for ADCMA should have data modelling as one of the core deliverables.
Not implemented

In the current ADCMA data landscape, no data-model-specific tools are used.
Not implemented

In the current ADCMA landscape, there are no programs designed for training and educating the ADCMA team on data modelling.
Not implemented

In the current ADCMA landscape, the conceptual / logical / physical data models are not maintained. For some of the source systems, COTS products are used and the respective products have documented data models.
Partially Implemented

Currently, ADCMA does not maintain links between structured and unstructured data.
Planned

The semi-structured data includes hashtag tweets, folders organized by topic, etc. Currently, metadata collection for semi-structured and unstructured data is not standardized in ADCMA.
Not Implemented

In the current data landscape, the dataflow to convert semi-structured or unstructured data into structured data is not present.
Planned

In the current data landscape, there are no dataflows present in ADCMA to convert semi-structured or unstructured data into structured data.
Not Implemented

In the current landscape, the unstructured content lifecycle is not governed through workflows.
Not Implemented

In the current landscape, the dataflow to convert semi-structured or unstructured data into structured data is not present.
Not Implemented

In the current ADCMA landscape, the conceptual / logical / physical data models are not maintained.
Not Implemented

In the existing change management process, data modelling artefacts are not mandatory.
Not Implemented

In the current landscape, the data model is not available for unstructured data.
Not Implemented

In the current landscape, the data model evaluation needs to be implemented.
Planned
Currently, UML diagrams are not used for modelling notation in ADCMA.
Not Implemented

In the current data landscape, the data governance board is not established.
Planned

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented

ADCMA currently does not maintain Data Flow diagrams to model the movement of data within and between ADCMA systems.
Planned

In the current landscape, the conceptual / logical / physical data models are not maintained.
Not Implemented

In the current ADCMA landscape, the conceptual / logical / physical data models are not maintained.
Planned

In the current ADCMA landscape, the conceptual / logical / physical data models are not maintained.
Not Implemented

In the current landscape, the MDM is not implemented.
Planned

In the current landscape, business terms for data objects, attributes, relationships, and values are not maintained within ADCMA.
Not Implemented

In the current landscape, the business glossary-to-technical definition mappings are not available.
Planned

In the current landscape, the conceptual / logical / physical data models are not available.
Not applicable

In the current landscape, the data modelling tools are not available.
Planned

In the current landscape, the data modelling tools are not available.
Planned

In the current landscape, the data modelling tools are not available.
Planned

In the current landscape, the conceptual / logical / physical data models are not available.
Not implemented
In the current landscape, the data governance board is not established.
Not implemented

In the current landscape, the data governance board is not established.
Not implemented

In the current landscape, the data governance board is not established.
Not Implemented

In the current landscape, conceptual data model artefacts are not mandatory for the change implementation process.
Not implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented

In the current ADCMA landscape, the master data profiles are not available.
Planned

In the current landscape, the master data profiles are not available.
Planned

In the current landscape, the master data profiles are not available.
Planned

In the current landscape, the master data profiles are not available.
Not applicable

In the current landscape, the conceptual / logical / physical data models are not available.
Not applicable

In the current landscape, the conceptual / logical / physical data models are not available.
Not applicable

In the current landscape, the conceptual / logical / physical data models are not available. There are no guidelines provided for logical data modelling.
Not applicable

In the current landscape, the conceptual / logical / physical data models are not available. There are no governance checkpoints in the SDLC for logical data model artefacts review.
Not Implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented

In the current landscape, the conceptual / logical / physical data models are not available.
Not Implemented
As of this assessment, there is no Data Architecture framework for the ADCMA data environment.
Not Implemented

As of this assessment, there is no Data Architecture framework for the ADCMA data environment.
Not Implemented

As of this assessment, there is no Data Architecture framework for the ADCMA data environment.
Not Implemented

As of this assessment, there is no Data Architecture defined for the ADCMA data environment with architectural elements classified.

As of this assessment, data architecture standards are not drawn from a centre of excellence within the Abu Dhabi government.
Not Implemented

As of this assessment, baseline architectures are not maintained for the ADCMA data environment.
Planned

As of this assessment, baseline architectures are not maintained for the ADCMA data environment.
Planned

As of this assessment, baseline architectures are not maintained for the ADCMA data environment.
Not Implemented

As of this assessment, baseline architectures are not maintained for the ADCMA data environment.
Not Applicable

As of this assessment, only the base architecture is planned. The target enterprise data architecture is planned in the next phases.
Planned

As of this assessment, only the base architecture is planned for the business intelligence and analytics platform. The target enterprise data architecture should be planned in subsequent phases.
Planned

As of this assessment, only the base architecture is planned. The target enterprise data architecture is planned in the next phases.
Planned

As of this assessment, only the base architecture is planned. The target enterprise data architecture will be planned in the next phases.
Planned

As of this assessment, only the base architecture is planned. The target enterprise data architecture will be planned in the next phases.
Not Implemented

As of this assessment, only the base architecture is planned. The target enterprise data architecture will be planned in the next phases.
Not Implemented

As of this assessment, only the base architecture is planned. The target enterprise data architecture will be planned in the next phases.
Not Implemented

As of this assessment, only the base architecture is planned. The target enterprise data architecture will be planned in the next phases.
Not Implemented
As of this assessment, the Data Quality framework, definitions and plan are not implemented. Some data is available, such as logs, system data, and website reports.
Not Implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not Implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not Implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented. The Data Catalogue for the ADCMA datasets is not implemented, and there is no Data Quality metadata defined.
Not Implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions, plan and SLAs are not defined or implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
Not implemented

As of this assessment, the Data Quality framework, definitions and plan are not implemented.
A Performance Management module is implemented in the ERP to measure the performance of each employee. Employee objectives are linked with their KPIs, and the KPIs are aligned with departmental KPIs.
Not implemented

As of this assessment, approved comprehensive Information Security Standards applicable to all data domains in the Data Management Standards are not implemented. Currently, ADCMA is UAE IA (Information Assurance) compliance certified (ISO 27001). The current scope is limited to IT, but the standards will be replicated to other business units.
Not implemented

As of this assessment, approved comprehensive Information Security Standards applicable to all data domains in the Data Management Standards are not implemented. Currently, ADCMA is UAE IA (Information Assurance) compliance certified (ISO 27001). The current scope is limited to IT, but the standards will be replicated to other business units.
Not implemented

As of this assessment, approved comprehensive Information Security Standards applicable to all data domains in the Data Management Standards are not implemented. Currently, ADCMA is UAE IA (Information Assurance) compliance certified (ISO 27001). The current scope is limited to IT, but the standards will be replicated to other business units.
Not implemented

As of this assessment, the ISO 27001 audit report does not cover data "Privacy" or "Breach".
Not implemented
Not applicable
Not implemented

Not applicable

As of this assessment, there are no privacy policies developed for Information Security Standards which align with ADG DM Policy specifications, covering structured data (data stored in tables), data collected from external entities, websites (internal/external), video data, sensor data, surveillance data, and metadata.
Partially implemented

As of this assessment, the public privacy terms are defined within the ADCMA website as the "Privacy Policy".
Not implemented

The existing T&C on the ADCMA website cover individual rights along with an opt-out clause for subscribers.
Not implemented

The existing T&C on the ADCMA website cover individual rights along with an opt-out clause for subscribers.
Fully implemented

As of this assessment, there is no Privacy metadata defined for master profiles (for example, 'Citizen' and 'Service' as master profiles; a master profile may have a complex structure, e.g. a 'Citizen' profile may include family relationships, multiple contact details, and the history of name changes).
Not Applicable

As of this assessment, there is no Open Data Policy aligning with the Data Privacy policy. An open data policy is not defined; guidelines have been received from the Government to share open data on the Abu Dhabi Open Data platform.
Not Applicable

As of this assessment, there is no organisation-wide Data Privacy awareness programme implemented. There is a Cyber Security awareness program implemented to provide training to end users and management.
Not implemented
Currently, the privacy principles defined within ADCMA do not follow the "Privacy by Design" principle.
Fully implemented

Currently, the privacy principles defined within ADCMA do not follow the "Privacy by Design" principle.
Fully implemented

Currently, the privacy principles defined within ADCMA do not follow the "Privacy by Design" principle.
Fully implemented

Currently, the privacy principles defined within ADCMA do not follow the "Privacy by Design" principle.
Not implemented

As of this assessment, Privacy Management is implemented at the infrastructure level. There is no Data Privacy management implemented.
Not implemented

As of this assessment, the public privacy terms are defined within the ADCMA website as the "Privacy Policy". Individuals (website visitors) are informed about the privacy data collection policy, but there is no requirement for website visitors to maintain/correct their private data. Within the organisation (Intranet), employees (who are also classified as individuals) can maintain/correct their private data.
Not implemented

As such, this specification is not applicable, since no individuals' (website visitors') data is being stored.
Not implemented

As such, this specification is not applicable, since no individuals' (website visitors') data is being stored.
Not implemented
As of this assessment:
DLP: Data Loss Prevention is not implemented. The following products are installed:
- Network End Point Security
- Trend Micro Apex One (endpoint)
DAM tools: a SIEM solution is currently used to gather DB access/modification data and provide alerts for exception activities. The SIEM is collecting audit logs from the database.
Data Discovery: there is no Data Discovery tool implemented.
Not implemented

Data classification for IT assets and data leakage policies are defined but not implemented yet. Data classification needs to be defined at 'dataset' level before implementing DLP solutions. There are no tools for data labelling and leakage prevention. A logging mechanism is in place to keep the audit trail.

As of this assessment, there are no data masking (anonymisation or pseudonymisation) mechanisms implemented.
Not implemented
Not implemented

Currently, ADCMA is maintaining its IT storage landscape along with timely audits.
Not Applicable

A physical inventory audit has been done by ADCMA; evidence has been submitted.
Not Applicable

Currently, ADCMA is using SolarWinds but plans to shift to ManageEngine OpManager.
Not Applicable

Server health checks and performance audits? Any monitoring done? Any baselines defined?
Partially implemented

Capacity planning is being maintained by ADCMA.
Not implemented

Business criticality information is being maintained by ADCMA.

All environments are virtualized; a portability check for cloud migration has been done.
Fully implemented

Cloud migration is being considered by ADCMA in the near future for G42 DR.
Fully implemented
ADCMA has an IT team which is handling the infrastructure setup and storage capabilities.
Fully implemented

ADCMA is not considering any hyperscaler-based target architecture for its applications management. G42 Cloud is being considered by ADCMA in the near future for DR only.
Fully implemented

ADCMA is not considering any hyperscaler-based target architecture for its applications management. G42 Cloud is being considered by ADCMA in the near future for DR only.
Fully implemented

ADCMA has to finalize the application availability to be followed as per the Cloud policy.
Fully implemented

ADCMA has to finalize the data center standards.
Fully implemented

ADCMA has to finalize the application availability to be followed as per the Cloud policy.
Fully implemented

ADCMA has to finalize the data center standards.
Fully implemented

Data center costing? Is that a cost hit?
Not Applicable

ADCMA data storage requires no change.
Not Applicable

ADCMA data storage requires no change.
Not Applicable

ADCMA data storage requires no change.
Not Implemented

ADCMA data storage requires no change.
Not Applicable

ADCMA data storage requires no change.
Not Implemented

ADCMA data storage requires no change.
Not Implemented

ADCMA data storage requires no change.
Not Applicable

ADCMA has a backup plan.
Not Applicable

ADCMA has RPO/RTO information documented.
Not Applicable

ADCMA has scheduled backup plans.
Not Applicable

Whether using tape backup media or remote disk, the Entity shall ensure that backup copies are stored in an environmentally protected and access-controlled secure offsite location.
Fully implemented

A backup cost/benefit analysis has not been initiated right now by ADCMA.
Fully implemented

A Business Continuity and Disaster Recovery (BCDR) plan and scheduled drills are not being followed by ADCMA right now.
Not Applicable

Backup policy defined by ADCMA.
Fully implemented

Fully implemented
Information Lifecycle Management for any form of recorded information is not designed/implemented in ADCMA.
Fully Implemented

Information Lifecycle Management for any form of recorded information is not designed/implemented in ADCMA.
Fully Implemented

Data ownership is currently not defined within ADCMA.
Not Implemented

This specification is currently not applicable since no Data Governance Board setup exists within ADCMA.
Partially Implemented

This specification is currently not applicable since no Data Governance Board setup exists within ADCMA.
Fully implemented
As of this assessment, the ADCMA IT systems landscape manages data transfer and data transformation in silos, with peer-to-peer interactions between systems and manual data collection and hand-offs. Access auditing is partially implemented for a limited set of IT systems restricted to the Security domain. ADCMA has implemented access controls for certain applications/data, and data in transit is encrypted. Data transfer, data transformation, and user logging are implemented only for a few IT (Security) applications. No guideline or system is available for performance monitoring, transaction management, or data transformation.
Not Implemented

This specification uses the data operability standards defined in the ADG eGIF and in the UAE SMART Data Framework. The expectation for this specification is to ensure data is easily discoverable, interoperable, reusable, trustworthy, and shareable. These include documenting metadata of the data being integrated to efficiently search data and help users understand the content and context of the data, and documenting data formats and data schemas to facilitate interoperability of ADCMA data with external data. As of this assessment, a Metadata Management solution or Data Integration framework is not implemented.
Not Implemented

As of this assessment, ADCMA does not have any policy for usage of its strategic integration platform or a "Trusted Data Sharing framework".
Not Implemented

ADCMA does not have a Strategic Integration Platform defined or operationalized. The Data Governance Board will be proposed as part of this project's recommendations.
Not Implemented

As of this assessment, ADCMA does not have information exchanged with other Entities. There is no blueprint or implementation of the Strategic Integration Platform within ADCMA to integrate with external entities via the ADSIC (now ADDA) ESB platform. The current data exchange pattern between systems is manual and peer-to-peer. Only manual data exchange is under practice; a few IT/security applications have an integration for logs, and data in transit is encrypted.
Not Implemented

As of the assessment, no data exchange methods exist. A few IT/security applications have an integration for logs.
Not implemented

As of the assessment, only peer-to-peer application data sharing exists at present, working in silos on a need basis. Only a few IT/security applications are integrated for logs.
Not implemented

A message integration broker is software that enables applications, systems, and services to communicate with each other and exchange information. The message broker does this by translating messages between formal messaging protocols. This allows interdependent services to "talk" with one another directly, even if they were written in different languages or implemented on different platforms.

As of the assessment, no message broker interactions have been designed or implemented within ADCMA.
Not implemented
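For illustration only, a minimal publish/consume interaction through a message broker might look like the sketch below, assuming a RabbitMQ broker on localhost and the pika client library; the queue name is hypothetical:

import pika  # RabbitMQ client library (assumed available)

# Connect to a hypothetical local RabbitMQ broker.
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Declare a durable queue; the broker creates it if it does not exist.
channel.queue_declare(queue="adcma.demo.events", durable=True)

# Producer side: publish a message without knowing who will consume it.
channel.basic_publish(
    exchange="",                      # default exchange routes by queue name
    routing_key="adcma.demo.events",
    body=b'{"event": "record-updated", "id": 42}',
)

# Consumer side: fetch one message (polling style, for brevity).
method, properties, body = channel.basic_get(queue="adcma.demo.events", auto_ack=True)
if body is not None:
    print("received:", body.decode())

connection.close()

Because the broker mediates the exchange, the producer and consumer can be written in different languages or run on different platforms.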

As of this assessment, a "Strategic Integration Platform" design/blueprint does not exist. Data is being sourced from internal/external systems manually, with no defined data formats enforced. Much of the sourced data is in Excel format.
Not implemented

As of the assessment, there are no defined formats or data transfer protocols for connecting information systems to the Strategic Integration Platform. No existing information system or SIP is present in the ADCMA environment.
Not implemented

As of the assessment, there is no defined integration pattern available for sharing data with other systems.
Not implemented

As of this assessment, there is no two-way or multi-way interaction within ADCMA.
Not implemented

As of the assessment, there is no existing integration design in the ADCMA environment.
Not implemented

The assessment found no existing integration framework.
Not implemented

In ADCMA a limited amount of data is shared between systems/applications; e.g., logs from other IT/security systems are integrated into the LogRhythm SIEM.
Not implemented

As of the assessment, ADCMA has no integration requirements implemented with the ADDA ESB and, in turn, no external SLA defined.
Not implemented
The ADCMA team has no documented open data identification and prioritization.
Not implemented

The ADCMA team has no documented open data identification and prioritization.
Not implemented

The ADCMA team has no documented open data identification and prioritization.
Not implemented

The ADCMA team has no documented open data identification and prioritization.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented plan for publishing Open Data.
Not implemented

The ADCMA team has no documented training and awareness campaigns for Open Data awareness within and outside ADCMA.
Not implemented

The ADCMA team has no documented training and awareness campaigns for Open Data awareness within and outside ADCMA.
Not implemented

Not implemented

Not implemented

Not implemented

Not implemented
Not implemented

Not implemented

Not implemented

Not applicable
Not applicable

Not applicable

Not applicable

Not applicable

Not applicable
Not applicable

Not applicable

Not applicable

Not applicable

Not applicable
Not applicable

Not applicable

Not applicable

Not applicable

Not applicable
Not applicable

Not applicable

Not applicable

Not applicable

Not applicable
Not applicable

Not applicable

Not applicable

Not applicable

Not applicable
Not applicable

Not applicable

Not applicable
As of this assessment:
1. There is no organisational document management standard available which mandates quality standards for documents and content. With the ECM programme planned in Q2 2023, it is recommended that ADCMA establish requirements for the programme to create organisational document and content management standards defining guidelines for document writing, a uniform document experience (look-and-feel), document naming conventions, and document editorial processes.

2. Document version management is currently implemented within the SharePoint platform ("ADCMA_versionmanagement_sharepoint_1.1.docx").
Not applicable
As of this assessment:
1. There is no organisational document management standard available. With the ECM programme planned in Q2 2023, it is recommended that ADCMA define requirements for the programme to establish organisational document and content management standards.

2. Document retrieval, usage, and sharing across business processes within the ADCMA organisation is defined in "ADCMA_document_sharing_2.1.docx".

3. Determination of how long documents need to be kept, and persisting documents and their availability over time to meet business needs, is defined in "ADCMA_sharepoint_retentionpolicy_setting_2.7".

4. ADCMA's ADCMA-ISMS-Data Security Policy V1.0.pdf governs document and content safety.

5. Retirement and disposal of documents is defined in "ADCMA_document_disposal_2.7.docx".
Not applicable
Authentic: Fully Implemented. Provenance of documents in SharePoint is established using the unique Document ID assigned to every document, which also gets a permanent URL in case a document is moved to another location within the same site. SharePoint also provides traceability of recorded information. With document access management in place, ADCMA resources creating documents in SharePoint can be traced, along with a timestamp.

Reliable: Fully Implemented. Documents deemed 'deliverables' contain signatory information.

Complete and Unaltered: Partially Implemented. Document safety against unauthorised changes is implemented using "ADCMA_document_access_management_2.6.docx".

Useable: Fully Implemented. SharePoint provides the required features.
Not applicable

As of this assessment, there is no organisational document management strategy developed as part of an overall strategic plan. It is recommended that ADCMA develop an overall document and content management strategy along with, or before, the ECM programme.
SharePoint is being used as the document system, which provides inherent functions for a document and content repository. ADCMA is in the process of implementing specialty tools for content management (DAM/MAM and CMS).

Retention and disposition are defined in "ADCMA_sharepoint_retentionpolicy_setting_2.7.docx".
Not applicable
As of this assessment, we observed there is a defined SharePoint 2013 decommissioning plan. The document was created during the SharePoint upgrade to the latest version. It is recommended to include a "Document Management system decommissioning/migration plan" while defining the ECM programme implementation requirements.
Not applicable

ADCMA currently uses SharePoint for DMS and applies retention policies in line with SharePoint features. The evidence provided fully complies with the specification. The retention policies and procedures defined in the ADCMA_sharepoint_retentionpolicy will be enhanced for any future programmes being planned or implemented (such as ECM, DAM, MAM, and CMS).
Not applicable

As of this assessment, there is no "document classification" scheme other than the default SharePoint document classification feature. Documents are generally prefixed with ADCMA_, but a uniform classification scheme does not exist.

All other specification bullets are fully implemented by deploying the following:
ADCMA_document_access_management_x.x.docx
ADCMA_document_sharing_x.x.docx
ADCMA_sharepoint_retentionpolicy_setting_x.x.docx
Not applicable

Retirement and disposal of documents is defined in "ADCMA_document_disposal_2.7.docx". The media content is currently archived on tape. It is recommended to define appropriate media disposal (physical destruction), overwriting, and secure deletion of media.
Not applicable
As of this assessment, there is no "document lifecycle and processing" management defined at the organisation level. There is no established process for regular monitoring and compliance checking. Monitoring and compliance checking of document and content management standards and policies will be operationalised with the Data Governance Checkpoint process.
Partially Implemented

A comprehensive Organisational Awareness of Data Management Programme is not implemented. Documentation is provided for document system usage.
Partially Implemented

ADCMA currently uses SharePoint as the Document Management System. The evidence provided fully complies with the specification.
Partially Implemented

ADCMA currently uses SharePoint as the Document Management System. The evidence provided fully complies with the specification.
Partially Implemented

As of the assessment, different systems in the ADCMA landscape work in silos, and there is little to no data integration present. ADCMA has no DWH/BIA system in place.
Fully implemented

As of the assessment, ADCMA has no Service Level Agreements in place.
Fully implemented

As of the assessment, ADCMA has no DWH/BIA system in place.
Partially Implemented

As of the assessment, ADCMA has no external data suppliers.
Partially Implemented

As of the assessment, in the absence of a data integration environment, the ADCMA systems have no data-staging environment.
Not implemented

As of the assessment, the ADCMA landscape works in silos and there is no DWH/BIA system in place. There is very little or no integration of data from multiple systems.
Not implemented

As of the assessment, there is no integration of external data in ADCMA.
Partially Implemented

As of the assessment, ADCMA has Commercial Off-The-Shelf (COTS) systems in place. At times they are configured to align with requirements.
Fully implemented

As of the assessment, ADCMA has no DWH/BIA system in place.
Fully implemented

As of the assessment, ADCMA has no DWH/BIA system in place.

As of the assessment, ADCMA has no DWH/BIA system in place.
Planned

As of the assessment, ADCMA has no DWH/BIA system in place.
Not implemented

As of the assessment, ADCMA has no DWH/BIA system in place.
Not implemented

As of the assessment, ADCMA has no DWH/BIA system in place.
Not applicable

As of the assessment, no DWH/BIA is present in the ADCMA environment, and no quality, volume, or timeliness metrics are available.
Planned

As of the assessment, no Data Mart or DWH/BIA is present in the ADCMA environment.
Planned

As of the assessment, no Data Mart or DWH/BIA is present in the ADCMA environment.
Not applicable

As of the assessment, no Data Mart or DWH/BIA is present in the ADCMA environment.
Fully implemented

As of the assessment, no Data Mart or DWH/BIA is present in the ADCMA environment.
Planned

As of the assessment, the ADCMA landscape works in silos and there is no DWH/BIA system in place. No ODS exists in the ADCMA environment.
Not implemented

As of the assessment, the ADCMA landscape works in silos and there is no DWH/BIA system in place. No ODS or DWH exists in the ADCMA environment.
Not applicable

As of the assessment, the ADCMA landscape works in silos and there is no DWH/BIA system in place. No ODS exists in the ADCMA environment.
Not applicable

As of the assessment, no BIA framework/solution exists in the ADCMA environment.
Not applicable

As of the assessment, no BIA framework/solution exists in the ADCMA environment.
Planned

At the time of assessment, there is no existing reporting solution for ADCMA to be integrated with the newly proposed BIA platform.
Not applicable

As of the assessment, no reporting framework exists in the ADCMA environment.
Not applicable

Standard KPIs and use cases are not implemented.
Not applicable

No statistical data is published as of now.
Not applicable

No analysis capabilities are present (manual analysis is done when required).
Not applicable

Currently there are no Big Data use cases identified


for ADCMA. Not applicable
Currently there are no event / stream-based
analytics use cases identified for ADCMA. Not applicable

Not applicable

Not applicable

Not implemented

Not applicable

Not applicable

Planned

Not implemented

Not implemented
Not implemented

Planned
Recommendation

The Data Governance Chair and Manager must use the DG structure from DLV 9,
refine policies as needed, and seek approval from the Data Governance Board.

Nominate DG members for DWG, DGB, and DGC.


Empower DG Forums/Members through ADCMA (DG Office).
Kick off the DG SteerCo, define the DG Charter, and introduce DG team roles to the
SteerCo.

Before implementing the DG structure and operating model, identify the Data
Manager role. According to the DG Operating Model, nominate the Data Manager
as part of the IT support services Data Governance core team.

The Data Assessment Program will shape Data Governance, Operating Model, and
Roles & Responsibilities, including Data Architects. ADCMA will identify the Data
Manager before implementing DG structure, model, and R&R.

The Data Assessment Program will establish Data Governance, define the
Operating Model, and Roles & Responsibilities, including Data Stewards for
relevant Data Domains. ADCMA will identify the Data Manager before
operationalizing the DG structure and model.

The Data Assessment Program will outline Data Governance, establish the
Operating Model, and define Roles & Responsibilities, including the role of the
Data Manager. ADCMA will identify the Data Manager before putting the DG
structure, model, and R&R into operation.

Once the Data Governance Board is in place, regularly monitor and enforce
compliance with data domain control policies, standards, and best practices across
Information Systems.

The Data Management program will propose a policy that outlines system control
and roles in alignment with ADG Policies.
The Data Assessment Program will define the Data Governance blueprint,
Operating Model, Roles & Responsibilities, and the policy. The policy will outline
data management, its objectives, scope, and importance in maintaining high data
quality standards. ADCMA aims to operationalize the DG structure, operating
model, and R&R.

The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model and Roles & Responsibilities. The policy will define data
management, its objectives, scope, and its critical role in maintaining high data
quality standards.

The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model and Roles & Responsibilities. The policy will define data
management, its objectives, scope, and underscore its essential role in upholding
high data quality standards.

The Data Assessment Program will establish the Data Governance blueprint, define
the Operating Model, and Roles & Responsibilities, including the role of the Data
Manager. The policy will define data management, its objectives, scope, and
underscore its pivotal role in upholding high data quality standards.

The Data Assessment Program will shape the Data Governance blueprint,
Operating Model, and Roles & Responsibilities, including the Data Manager's role.
The policy will define data management, its goals, scope, and underscore its pivotal
role in upholding high data quality standards.

The data management policy will encompass data lifecycle management for
specific data domains, such as Data Quality, Data Security & Privacy, Master &
Reference Data Management, Data Retention & Disposition, Data Access
Management, Data Catalog Management, and data classification.

The proposed data management policy will align with the defined data
management principles as per the ADG-DMS and ADG Policies.

The Data Quality management policy and standards will be established within the
data management program. They will be implemented among ADCMA team
members (Data Stewards) through change management and awareness initiatives.

The Data Management Policy will be expanded to include governance metrics and
process checkpoints in future iterations of the Data Assessment program.

It is recommended to include a process for users to report data-related incidents in


the ITSM program. The Data Management Policy will align with the overall incident
management process.
The Data Governance deliverable will establish the change management process
for data management scenarios.

Once the Data Governance board is operational, members will convene to review
and update policies as needed to ensure their ongoing relevance, adequacy, and
effectiveness.

After the Data Governance board becomes operational, its members will meet to
review and update policies as necessary to ensure alignment with relevant
legislation.

Once the Data Governance board is operational, members will meet to review and
update policies to align with relevant legislation and securely store evidence in the
ADCMA's secured environment.

The Data Management Policy will be updated in future Data Assessment program
iterations to include governance metrics and process checkpoints, ensuring
traceability to Control Standards. It will also incorporate quantifiable metrics, and a
registry will be maintained, allowing for traceability to the applicable Control
Standards.

ADCMA will review the current NDA to potentially incorporate the Data
Management Policy, thereby reinforcing the obligations of policy consumers.

To operationalize the Data Management program, it's recommended to define and


prioritize Business Strategies and Goals, aligning them with proposed Data
Management Policies and standards.

In the next Data Assessment program iteration, define the Data Management
Program implementation plan, ensuring alignment with business strategy, goals,
and objectives. Post-implementation, the DG board will monitor plan effectiveness
and submit it to ADDA as per the applicable process.
In the next Data Assessment iteration, it is recommended to establish the Data
Management Program implementation plan, aligning it with business strategy,
goals, and objectives. After operationalization, the DG board will verify the plan's
effectiveness.

This specification will be considered when developing the Data Management


Program implementation plan.

After the DG board becomes operational, it is advisable to require approval from


the board for future Data Management Program initiatives.

The Data Assessment program will create a baseline capability for the DW and BI
Analytics platform, focusing on Data Governance, Organisational Awareness and
Training, Data Architecture Management, and Inter-Entity Data Integration. It's
suggested to expand the plan to encompass Reference and Master Data
Management if relevant.

The Data Assessment program's deliverables will document recommendations


aligned with the ADG Data Management Model (Owned, Described, Quality,
Access, and Implemented).

The Data Assessment Program will create a Data Governance blueprint, define the
Operating Model and Roles & Responsibilities. The organization will become
operational, and the first board meeting is scheduled for Q1 2024.

Data Governance change management will align with the existing Change
Management model, which includes Standard (pre-approved) changes, Critical
Changes (with defined types), and Emergency change management.

The Data Assessment program will define the roadmap for the BI Platform,
including Data Integration, Data Collection, Quality, Transformation, Storage, and
Visualization. Any proposed changes to these BI layers, the baseline Data
Governance Model, or the Checkpoint Process will necessitate a Change Impact
Assessment before being presented for review and approval by the Data
Management Board.

The data management program's change management process will align with
ongoing ADCMA initiatives and involve a review and approval process by the DG
board.

After the proposed Change Management plan is in place, any changes to the
proposed BI and Analytics platform layers will require a Change Impact Assessment
before submission for review and approval by the Data Management Board. This
assessment process will also apply to changes suggested for the baseline Data
Governance Model and Checkpoint Process.
Upon implementing the proposed Change Management plan, any modifications to
the proposed BI and Analytics platform layers must go through a Change Impact
Assessment before being submitted for review and approval by the Data
Management Board. This assessment process will also be applied to changes
suggested for the baseline Data Governance Model and Checkpoint Process.

An additional detailed training module, in alignment with the ADG-DMS training


plan, will be created to cover relevant data domains.

The same training-module recommendation applies to each of the remaining Organisational Awareness controls.

As the next steps for the Data Management program, it's recommended to define
the Data Management Audit framework, ensuring alignment with the Data
Management Program, and align the Audit roles with the Internal Affairs sector.
The same audit-framework recommendation applies to each of the remaining Capability Audit controls.
The recommended next steps for the Data Assessment program include defining
Data Management performance metrics for the relevant data domains.

The same performance-metrics recommendation applies to the remaining Performance Management controls.
The Data Auditor from IA should be nominated as part of the Data Governance Council, where they can function as an independent auditor of the Data Management Programme's audit, compliance, and Governance checkpoint results.

The Data Governance Checkpoint process defined in DLV 9 provides information on


the Data Domains to be considered as appropriate for the Data Governance
Checkpoint.

The Data Governance Board should closely track the budget, effectiveness, and performance of the overall Data Management Programme and Governance.

i. The ADCMA Data Governance team (the Data Manager along with the DWG) should maintain a master "Business Glossary", periodically updated with applicable business terms, in an Excel file format.
ii. Ownership and maintenance of the "Business Glossary" should be aligned with the datasets OWNED by "Data Owners" (e.g., if the HR Dept head is designated as the Data Owner of HR datasets, then the HR "Business Glossary" associated with those datasets is OWNED by the HR dept head and maintained and managed by the HR dept with the help of Business Data Stewards from the HR department).
iii. The Data Manager can audit the "Business Glossary" periodically for conformance.
iv. There is no requirement for maintaining Data Lineage or Technical Data Dictionaries at this point in time. ADCMA can evaluate Technical Metadata and Data Lineage at a later point.
v. No Metadata Management tools are required at this point for ADCMA.
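As a sketch of how the Data Manager's periodic conformance audit (point iii) could be supported, assuming the glossary workbook uses the hypothetical columns "Term", "Definition", and "Data Owner":

import pandas as pd

REQUIRED_COLUMNS = ["Term", "Definition", "Data Owner"]  # hypothetical layout

def audit_glossary(path: str) -> pd.DataFrame:
    # Return glossary rows that fail basic conformance checks.
    glossary = pd.read_excel(path)

    missing = [c for c in REQUIRED_COLUMNS if c not in glossary.columns]
    if missing:
        raise ValueError(f"Glossary is missing required columns: {missing}")

    # Rows with no definition or no accountable Data Owner are non-conformant.
    incomplete = glossary[glossary["Definition"].isna() | glossary["Data Owner"].isna()]
    # Duplicate business terms are also flagged for the Data Manager.
    duplicates = glossary[glossary.duplicated(subset="Term", keep=False)]
    return pd.concat([incomplete, duplicates]).drop_duplicates()

# Example usage (file name is hypothetical):
# issues = audit_glossary("ADCMA_Business_Glossary.xlsx")
# print(issues[["Term", "Data Owner"]])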
The same Business Glossary recommendation (points i-v above) applies to each of the remaining Metadata controls.

ADCMA will implement data cataloging and data governance controls with the implementation of the BIA system. A data governance architect will assist in defining the fine-grained controls and the technology solution for this control.

The same data cataloging and data governance recommendation applies to each of the remaining Data Catalogue controls.

For the in-house developed information systems (ADCMA website CMS, career portal, etc.), it is suggested that ADCMA create and maintain data models. Governance checkpoint processes should be enforced in the SDLC (Software Development Life Cycle) to ensure that data models are kept up to date for all projects impacting an information system's data models. Data models should be promoted as one of the core deliverables for an information system.

Data architects of the respective information systems are responsible for ensuring this activity.

The Data Governance Board should recommend a specific data modelling tool to be used across ADCMA information systems, and should suggest checkpoints for data model reviews in the SDLC (mainly exit criteria for the Design, Build, and pre-implementation phases to validate data model changes as applicable).

The Data Governance Board should recommend a tool, and data architects should use it to maintain the data models of their respective information systems.

ADCMA should create a governance body for data modelling for each of the information systems. The Data Working Group should create appropriate training modules on data models for the different roles present within information systems. Depending upon the role, training modules should cover reading and interpreting data models at different levels (Conceptual, Logical, and Physical), as well as developing data models in line with the target-state (Conceptual, Logical, and Physical) data models.
The Data Working Group is responsible for this activity.

ADCMA should consider developing and maintaining data models at the conceptual, logical, and physical levels for all information systems.

Data architects of the respective information systems are responsible for this activity.
ADCMA currently does not enforce maintaining identifiable information for unstructured data to link it with structured data. The Data Governance Board should consider defining guidelines and maintaining metadata with identifiable information depending upon the type of unstructured data (for example, for document-type unstructured data, identifiable information should be mandated).
Data architects of the respective information systems are responsible for this activity. The Data Governance Board should be able to define common standards and guidelines to be followed by the different information systems.

Currently, metadata collection for semi-structured and unstructured data is not standardized in ADCMA. The ADCMA Data Governance Board should provide guidelines on metadata collection for semi-structured and unstructured data to ensure uniformity of the data collected for a specific type of unstructured or semi-structured data.

Data architects of the respective information systems, in line with guidelines from the Data Governance Board, are responsible for this activity.
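As a minimal sketch (the field names are hypothetical, not taken from any ADCMA guideline) of what a uniform metadata record for one type of unstructured asset could look like:

# Hypothetical uniform metadata record for a "document" type unstructured asset.
# The attribute set is illustrative; the Data Governance Board's guideline
# would define the authoritative fields per asset type.
document_metadata = {
    "asset_type": "document",            # drives which fields are mandatory
    "title": "Media plan Q3",
    "author": "Marketing Department",
    "created": "2023-04-02",
    "format": "docx",
    "identifying_keys": {"campaign_id": "CMP-0042"},  # links to structured data
    "classification": "Internal",
}

# A guideline can then be enforced as a simple completeness check:
REQUIRED_FIELDS = {"asset_type", "title", "author", "created", "identifying_keys"}
missing = REQUIRED_FIELDS - document_metadata.keys()
assert not missing, f"metadata record is missing: {missing}"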

Depending upon the business use cases, ADCMA should consider conversion of semi-structured and unstructured data into structured form through transformational or analytics conversion techniques. Given the current state of ADCMA, this could be considered an advanced use case.

Data architects of the respective information systems are responsible for this activity.

When attempting to convert unstructured data (text, voice, or video) into a structured form, ADCMA should align its processes with the Unstructured Information Management Architecture (UIMA). Unstructured-to-structured conversion could be, for example, text extraction from unstructured data (text, voice, or video). At the time of this maturity assessment, UIMA documentation is available at:
https://uima.apache.org/downloads/releaseDocs/2.2.1-incubating/docs/html/index.html
Data architects of the respective information systems are responsible for this activity.
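As a simple illustration of such a conversion (a sketch only, not a UIMA implementation; the patterns and field names are hypothetical), text extraction into a structured record can start as small as:

import re

def extract_record(text: str) -> dict:
    # Pull a few identifiable fields out of free text into structured form.
    # A real pipeline would follow UIMA-style analysis engines and type systems.
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
    date = re.search(r"\b\d{4}-\d{2}-\d{2}\b", text)
    return {
        "source_excerpt": text[:80],  # identifying data, useful for lineage
        "email": email.group(0) if email else None,
        "date": date.group(0) if date else None,
    }

sample = "Meeting notes 2023-05-14: contact [email protected] for follow-up."
print(extract_record(sample))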

Data architects of respective information systems should govern the lifecycle


management of unstructured data in-line with Document and Content
Management Control DCM.2.
ADCMA should maintain flow of unstructured information along with its metadata
and identifying data between systems. Once metadata collection for unstructured
data is standardized to include identifying data associated with unstructured data,
relationships between different entities should be maintained.
Data architects of respective information systems are responsible for this activity.

This specification is for developing and maintaining data models covering all information systems in the enterprise. For all information systems within ADCMA, physical data model information should be mapped to logical models and to a higher conceptual level. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics platform. ADCMA should consider developing logical and physical data models for the Business Intelligence and Analytics platform in line with the conceptual data model.

It is recommended that the Data Governance Board propose a Governance Checkpoint Process to ensure the mandatory inclusion of data modelling artefacts in the SDLC. Data model artefacts should be reviewed in the different phases of a project (usually Design, Build, and pre-implementation) rather than only at the pre-implementation stage, to avoid potential rework.
The Data Working Group is responsible for this activity.

Data models for ADCMA information systems should be created and maintained
with each project impacting the data models of respective information systems
giving equal importance to structured and unstructured data.

Data architects of respective information systems are responsible for this activity.

It is recommended that the data governance board should publish reference data
models for immediate reference of data model teams. Data governance board
should publish data models that could be re-used. When new information systems
are introduced, data model teams should refer to existing reference data models
and re-usable data models that could potentially be used.
Data architects of respective information systems are responsible for this activity.
The scope of the specification is to cover the data models of all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. While developing the logical and physical data models for the Business Intelligence and Analytics platform, UML diagrams should be used as the primary modelling notation.

Data architects of the respective information systems are responsible for this activity.

The Data Governance Board should create guidelines (pre-defined templates, diagrams, and notations) with its data model teams to effectively represent their data models to different non-technical teams, including business teams.
Data architects of the respective information systems should adhere to the guidelines suggested by the Data Governance Board.

The scope of the specification is to cover the data models of all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. Entity relationship diagrams and class diagrams documenting the structure and relationships of data objects at the logical and physical levels should be developed during the implementation of Business Intelligence and Analytics use cases. For all other ADCMA applications, it is recommended to prepare entity relationship diagrams at the conceptual, logical, and physical levels.

Data architects of the respective information systems are responsible for this activity.

Data governance board should mandate to maintain data flow diagrams to model
the movement of data within and between the systems including but not limited to
maintaining master profiles. Governance Checkpoints within SDLC should be
established to ensure data flow diagrams are maintained.
Data architects of respective information systems along with change management
team are responsible for this activity.

The scope of the specification is to cover the data models of all applications in the enterprise. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics Platform. The conceptual model is defined at the subject area and entity level. While developing logical and physical models, subject areas, and logical groupings of entities below subject areas, should be created depending upon the number of entities in each subject area. A similar approach should be followed when creating data models for other applications in ADCMA.
Data modellers of the respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained with each project impacting those data models. For the new Business Intelligence and Analytics platform, the conceptual data model will be created within the Data Assessment project and should be maintained through each business intelligence initiative implemented on the platform. Data model artefacts should differentiate the components of the model that are implemented from those that are not, and data modellers should provide guidelines to data model teams on a unified way of representing this.
Data modellers of the respective information systems are responsible for this activity.

ADCMA should ensure that the data models are maintained by adhering to the
rules defined by ADCMA DMS.
when designing new conceptual data models below rules should be adhered:
• Data objects are represented by nouns
• Data relationships are represented by verbs

Following rules should be adhered to when designing new logical data models:
• The appropriate data type shall be used for attributes within tables. This shall
take into account performance, storage, and data requirements. Where a String or
other variable character data type is used, consideration must have first been given
for more appropriate data types

Following rules should be adhered to when designing new physical data models:
• Primary keys shall be numeric. Where there is not a suitable numeric candidate
key, a surrogate key in the form of an auto-numbering key shall be used
• Reference data tables shall have a numeric primary key (likewise, tables that use
reference data tables shall use the reference table's numeric primary key in the
foreign key relationship)
• Reference data tables will have, at a minimum, a numeric primary key and a code
value represented as a string. Additional payload information (such as textual
descriptions) may also exist as reference data (See RM.2.3)
• Physical data types that have a length or precision specifier shall have an
appropriate length or precision specified, and not left to the default value
Data modellers of respective information systems are responsible for this activity.
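
As an illustration, the physical model rules above can be checked mechanically before a model is approved at an SDLC checkpoint. The following is a minimal sketch, assuming models are exported as simple column descriptions; the type names and rule wording are illustrative, not part of the ADCMA DMS.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Column:
    name: str
    data_type: str                 # e.g. "INTEGER", "VARCHAR", "DECIMAL"
    length: Optional[int] = None   # length / precision specifier
    is_primary_key: bool = False

NUMERIC_TYPES = {"INTEGER", "BIGINT", "SMALLINT", "NUMERIC"}

def validate_physical_table(table: str, columns: List[Column],
                            is_reference_table: bool = False) -> List[str]:
    """Return violations of the physical data model rules above."""
    issues = []
    pks = [c for c in columns if c.is_primary_key]
    # Primary keys shall be numeric (use a surrogate auto-number key if needed).
    if not pks or any(c.data_type.upper() not in NUMERIC_TYPES for c in pks):
        issues.append(f"{table}: primary key must be numeric")
    # Length / precision specifiers shall be explicit, never left to defaults.
    for c in columns:
        if c.data_type.upper() in {"VARCHAR", "CHAR", "DECIMAL"} and c.length is None:
            issues.append(f"{table}.{c.name}: explicit length/precision required")
    # Reference data tables need a string code value alongside the numeric key.
    if is_reference_table and not any(
            c.data_type.upper() in {"VARCHAR", "CHAR"} and not c.is_primary_key
            for c in columns):
        issues.append(f"{table}: reference table needs a string code column")
    return issues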

This specification is applicable to entities that have MDM implemented. Currently ADCMA does not have MDM needs, so this specification is not applicable.
Currently ADCMA does not have a business metadata / glossary maintenance process to define business terms. With the Data Assessment project, the data architecture considers (business and technical) metadata management for the Business Intelligence and Analytics platform. Business terms used in the data model for the BIA platform and the business glossary should be kept in sync. For business terms other than those used in the BIA platform, ADCMA should ensure that the respective business glossary is maintained.

Data architects of respective information systems should be responsible for this activity.

Currently ADCMA does not have a (business and technical) metadata / glossary maintenance process to define business and technical terms. With the Data Assessment project, the data architecture considers (business and technical) metadata management for the Business Intelligence and Analytics platform. Technical definitions for all business terms under ADCMA's ownership should take input from the logical and physical data models, and should be populated within the data dictionary of ADCMA's data catalogue. Currently only technical metadata for the Business Intelligence and Analytics platform is planned; technical definitions should also be planned for other ADCMA systems.
Data architects of respective information systems should be responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Considering that the delivered conceptual data model will be a living document, data model versions should be maintained whenever the conceptual data model is updated in future. In line with this, the respective logical and physical data models should also be versioned, with appropriate naming conventions.
Data modellers of respective information systems are responsible for this activity.

All data models maintained for the different information systems of ADCMA should maintain traceability between the different views of the data model. Some standard data modelling tools allow traceability links to be maintained between the different views (conceptual, logical and physical) of the same model. ADCMA should plan to use a data modelling tool that allows traceability between the different views of the data model. Lower-level identifiers should be derived from the subject area down to its lowest level.
Data modellers of respective information systems are responsible for this activity.
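
Where the chosen tool does not maintain such links natively, a minimal sketch of an externally maintained traceability register is shown below; the identifiers are hypothetical placeholders, not ADCMA names.

# Traceability links from a conceptual entity down to logical entities and
# physical tables; all identifiers below are illustrative placeholders.
traceability = {
    "party.customer": {
        "logical": ["party.customer", "party.customer_address"],
        "physical": ["CRM.T_CUSTOMER", "CRM.T_CUST_ADDR"],
    },
}

def physical_objects_for(conceptual_id: str) -> list:
    """Trace a conceptual entity to the physical tables that implement it."""
    return traceability.get(conceptual_id, {}).get("physical", [])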
This specification is applicable to all information systems of ADCMA that maintain data models. Data modellers of the respective information systems should define the mandatory metadata to be captured along with data model changes.
The data governance board should provide guidelines on the metadata to be captured with data model changes, and data modellers of the respective information systems should ensure the metadata is reviewed in the SDLC phases.

The data governance board should mandate data model version maintenance. If the data modelling tool being used does not have versioning enabled, ADCMA should use an external version control repository or document management system to manage data model versions.
Data modellers of respective information systems should follow the data governance board's guidelines on version maintenance.
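
Where versioning must be handled outside the modelling tool, a minimal sketch using a Git repository of exported model files is shown below; the repository path and file names are assumptions.

import subprocess

MODEL_REPO = "data-models"  # hypothetical repository of exported model files

def commit_model_version(model_file: str, message: str) -> None:
    """Record a new version of an exported data model in the repository."""
    subprocess.run(["git", "-C", MODEL_REPO, "add", model_file], check=True)
    subprocess.run(["git", "-C", MODEL_REPO, "commit", "-m", message], check=True)

# e.g. commit_model_version("bia_conceptual.xml", "BIA CDM v1.2: new subject area")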

ADCMA should develop an enterprise-wide data model that represents an organization-wide view of all data that is central to ADCMA's core business functions.
The enterprise architect, in coordination with data modeller groups, is responsible for this activity.

It is recommended to ensure that changes to the data model and its metadata go through the approval of the data governance board for the respective information systems. The data assessment program will make recommendations on the data governance board.

ADCMA should work on creating the ADCMA enterprise data model. While developing new data models or amending existing data models for individual information systems, the respective changes should be aligned to ADCMA's enterprise data model.
Data modellers of respective information systems are responsible for this activity.

ADCMA should align the enterprise data model with new information systems within ADCMA as they emerge.
The enterprise architect is responsible for this activity.

For all the information systems of ADCMA, conceptual data models should be
created to support architecture, development and operational processes. Data
governance board should enforce governance checkpoints during system
development lifecycle to review data model artefacts.
Data modellers of respective information systems are responsible for this activity.
Conceptual Data models for existing ADCMA applications should be created and
maintained with each project impacting the data models of respective information
systems.
The conceptual data modelling activity should include, but not be limited to:
• Interviewing stakeholders, or otherwise undertaking business functional analysis
and requirements gathering to understand all relevant business concepts and
requirements
• Identifying candidate data profiles related to business processes, and capturing
associations between these profiles
• Combining candidate data profiles – as appropriate – into master data profiles,
transactional data profiles and reference data profiles, and modelling the high level
relationships between the data profiles
Data modellers of respective information systems are responsible for this activity.

ADCMA should consider the definition of conceptual model components at the information system and enterprise levels. Data modellers should ensure the data model components are viewed in line with the specific information system's view.
Data modellers of respective information systems and the enterprise architect are responsible for this activity.

This specification is applicable to all information systems of ADCMA. The Data Assessment program will establish the Conceptual Data Model (CDM) for the Business Intelligence and Analytics platform, and the respective logical and physical data models should be defined starting from the conceptual data model. ADCMA should consider developing conceptual data models for its other information systems; these should be documented and referenced when creating logical and physical data models.
Data modellers of respective information systems are responsible for this activity.

ADCMA currently does not maintain any information that could be categorized as master profiles. Digital products extract data from multiple digital platforms. There are some posts for which responses are collected from different systems.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. The logical data model should be created in line with the conceptual data model, describing the data attributes and the relationship rules between the profiles.
Data modellers of respective information systems are responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and physical data models should be created in line with the conceptual data model. While ingesting data into the conformed layer, duplication should be avoided to the extent possible. The logical modelling of relationships between entities should describe referential integrity and normalisation concerns. De-normalisation should be preferred in the data marts rather than in the core model objects.
Data modellers of respective information systems are responsible for this activity.

Logical models should be independent of the technology to be used for the physical implementation of the data models. While defining physical models, the relevant elements of the target physical environment may be considered.

As part of the Data Assessment program, a conceptual data model will be created for the new Business Intelligence and Analytics platform. The logical data model should be created in line with the conceptual data model for the Business Intelligence and Analytics platform. For other information systems within ADCMA, governance checkpoints should be enforced to ensure that logical data model artefacts are delivered, which can be used for physical data models, impact assessments and/or gap analysis between current and target state data models.
The data working group of respective information systems should be responsible for this activity.
As part of the Data Assessment program, a conceptual data model will be created for the new Business Intelligence and Analytics platform, and the logical data model should be created in line with it. For other information systems within ADCMA, governance checkpoints should be enforced to ensure that physical data model artefacts are delivered, which can be used for impact assessments and/or gap analysis between current and target state data models.
The data working group of respective information systems should be responsible for this activity.

Physical data models for respective information systems should be kept up to date with each project implementation that could impact them. These data models can be used to understand the relationships between different entities.
Data modellers of respective information systems are responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and physical data models should be created in line with the conceptual data model in a standard data modelling tool. For other information systems, ADCMA should create conceptual, logical and physical data models. ADCMA is recommended to use standard data modelling tools which allow different views (logical and physical) of data models to be linked.
Data modellers of respective information systems are responsible for this activity.

Currently ADCMA does not maintain data models for existing applications. ADCMA should consider reverse engineering data models from existing supported information systems to create baseline physical data models, and then reverse engineer logical and conceptual data models from the physical data models.
Data modellers of respective information systems are responsible for this activity.

Considering the number of information systems currently managed within ADCMA, and knowing the additional systems to be introduced, it is recommended to create and maintain an enterprise architecture framework in line with standard enterprise architecture frameworks, including but not limited to TOGAF.

The enterprise architect is responsible for this activity.


ADCMA should maintain component models for ADCMA information systems. Currently, as part of the Data Assessment project (DLV3), datasets for different information systems of ADCMA are created. These datasets should be enhanced to create and maintain a data profile/function matrix for ADCMA applications. Data lifecycle models for ADCMA information systems should be considered for maintenance. In line with data security and compliance standards, ADCMA should maintain key security touch points. Measures should be taken to profile ADCMA information systems for data quality, and data quality measurement should be considered part of the change management process. Currently, ADCMA information systems do not have a standard process for making data model changes. The data governance board should consider creating stage gates in the systems development life cycle to ensure data model changes are reviewed and approved by the respective system's data model governance body.

Data architects for respective systems are responsible for this activity.

As part of the Data Assessment program, a data architecture will be recommended for the new Business Intelligence and Analytics platform based on the understanding of ADCMA's business needs. For other systems, ADCMA should consider data architecture deliverables for:
• Data quality tooling including data profiling and cleansing
• Data security and privacy systems
• Open data management systems
• Document and content management, or workflow systems
• ERP, CRM, HR, Finance, Procurement, Audit, Legal and any other specialist
information systems appropriate to ADCMA

Data architects for respective information systems should be responsible for this
activity

ADCMA should classify architectural elements of existing ADCMA systems as Emerging, Current, Strategic or Retirement. The new Business Intelligence and Analytics platform's architectural elements would be classified as Strategic, while architectural elements related to silo reporting should be classified accordingly.

Data architects should be responsible for this activity.

As part of the Data Assessment project, a target state data architecture for the new Business Intelligence and Analytics platform will be proposed. ADCMA should consider creating and maintaining a baseline data architecture for all information systems.
Data architects of respective information systems should be responsible for this.

As part of the Data Assessment project, a baseline data architecture for the new Business Intelligence and Analytics platform will be proposed. The data architecture document covers business and technical requirements, the integration framework covers the data architecture themes, and the risk assessment report for the enterprise BI roadmap covers known constraints with the Business Intelligence and Analytics platform.
ADCMA should consider creating and maintaining an enterprise data architecture for all systems supporting key business functions.
Data architects for respective information systems are responsible for this activity.

For all information system assets, ADCMA should maintain the current state architecture along with the target state architecture. Gaps between the current state and target state architectures should be documented as architecture gaps to be bridged. All projects that could impact an information system's architecture capabilities should ensure the current state architecture document, as well as the documented gaps with the target state architecture, are updated. All new business use cases identified should be run against the target state architecture capabilities of the specific information systems, and amendments made to the target state architecture (and to the documented gaps between current state and target state architecture) as required.
The data governance board should have stage gates in the system development life cycle to ensure that the current state architecture and its gaps with the target state architecture are maintained.

Data architects of respective information systems are responsible for this activity.

The data governance board, along with data architects, should enforce stage gates in system life cycle management to ensure that the current state architecture is updated with all projects that impact the data architecture. While updating current state architectures, versions should be maintained.

Data architects of information systems are responsible for this activity.


In line with earlier suggestions, ADCMA should consider creating a target state enterprise architecture. All ADCMA information systems should ensure that their information system architecture is in line with the enterprise architecture.

The enterprise architect is responsible for this activity.

ADCMA should create target state architectures for its information systems. The target state architecture should be reviewed against all business use cases identified for the specific information systems, and amendments made to the target state architecture (and to the gaps between current state and target state architecture) as needed. In different phases of the SDLC (ideally multiple phases, including design phase closure, build phase closure and pre-implementation), there should be checkpoints to validate that changes to the current state architecture are in line with the target state architecture.

Data architects are responsible for this activity.

The target state (enterprise / system) data architecture should ensure that business and technology requirements are addressed. Any new business and technology requirements identified should be checked against the target state architecture, and the architecture amended if required. The target state architecture should:
• Encourage data integration across the Entity between information systems and
services
• Seek removal of duplication in terminology
• Seek to remove duplication of data processes
• Seek alignment of reference and master data across the Entity's systems
• Align with emerging ADCMA-wide technology platforms
• Integrate with ADCMA-wide reference and master data services and standards as
they emerge
• Show re-use of data and system architectures both within the Entity itself and
through collaboration with other Entities
• Be influenced by the data management requirements emerging from the data
quality, data security, data privacy, data integration and interoperability, and data
storage domains, both within the Entity and as delivered from central government
programmes

The enterprise architect is responsible for the enterprise architecture; data architects of respective information systems are responsible for the architecture of the individual systems.
The target data architecture should influence technology and data requirements
for system changes, in addition to the standard business and quality (non-
functional) requirements. Data architects should consult business teams and consider target state architecture capabilities to address short- and long-term business use cases, thus influencing the technology options chosen to meet the data architecture.
Data architects of respective information systems are responsible for this activity.

ADCMA currently does not have an enterprise architecture defined. Once the target state enterprise architecture is defined and implemented, ADCMA should create the current state enterprise architecture and identify the gaps between the current state and target state architectures.

Enterprise Architect is responsible for this activity.

ADCMA currently does not have an enterprise architecture defined. Once the target state enterprise architecture is defined and implemented, ADCMA should create the current state enterprise architecture and identify the gaps between the current state and target state architectures. The roadmap to reach the target state enterprise architecture should be revisited when there are changes to the current state (new initiatives to be implemented) or to the target state enterprise architecture (new information systems to be introduced, existing information systems to be retired, etc.).

Enterprise Architect is responsible for this activity.

ADCMA currently does not have an enterprise architecture defined. Once the target state enterprise architecture is defined, all information systems within ADCMA should ensure they align to it.
Data architects of information systems are responsible for this.

ADCMA currently does not have an enterprise architecture defined. Once the target state enterprise architecture is defined, the effectiveness of the roadmap implementation should be reported by identifying gaps between the current state and target state enterprise data architectures.
The enterprise architect is responsible for this activity.

Implement a comprehensive Data Quality framework defining the DQ validations across data collected from multiple sources for the BI and Analytics platform, including a detailed Data Quality Audit framework. The Data Assessment programme will provide the DQ best practices and a baseline for the DQ framework.

As next steps to the Data Assessment Program, it is recommended to implement the Data Catalogue for the identified datasets.
Along with the DQ framework implementation, the DQ metadata identified for the identified datasets will need to be defined in the data catalogue.
It is also recommended to define and implement DQ monitoring metrics along with the DQ implementation programme.

The Data Assessment programme will provide the DQ best practices along with a recommended DQ checklist. The DQ checklist will need to be automated along with the DQ implementation.
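
A minimal sketch of such automation is shown below, assuming datasets are available as CSV extracts; the rule names and column names are illustrative and would come from the recommended DQ checklist.

import csv
from typing import Dict

# Illustrative checklist rules; real rules would come from the DQ framework.
CHECKLIST = {
    "completeness.email": lambda row: bool(row.get("email", "").strip()),
    "validity.age": lambda row: row.get("age", "").isdigit(),
}

def run_dq_checklist(path: str) -> Dict[str, float]:
    """Return the pass rate per rule, usable as a DQ monitoring metric."""
    totals = {rule: 0 for rule in CHECKLIST}
    passed = {rule: 0 for rule in CHECKLIST}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for rule, check in CHECKLIST.items():
                totals[rule] += 1
                if check(row):
                    passed[rule] += 1
    return {r: passed[r] / totals[r] for r in CHECKLIST if totals[r]}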

As next steps to the Data Assessment Program, it is recommended to implement a Data Quality workstream inclusive of DQ framework definition, DQ audit mechanisms, DQ SLAs, and DQ metrics monitoring and reporting, as applicable to systems across ADCMA.

It is recommended that, while defining the Data Modelling and Master Data Management design, the application of Data Quality to the master profiles, and the ability to audit the implementation with appropriate DQ metrics, be implemented.

The Data Architect, along with the Technical Data Architect and under the direction of the Data Manager, shall define and apply the DQ SLAs to externally procured datasets.

The Data Manager will use the recommended DQ tools and build the business case for implementation of DQ tools across the organization. It is recommended to prepare a comprehensive roadmap/plan for DQ tool implementation as part of the DG Operationalization programme.

While defining the Data Catalogue and Metadata Management design, the Data
Quality measures used for auditing will be stored with the Data Catalogue.

The DQ Architect (part of the DG DWG) will table the recommendations for DQ improvement initiatives to the DG board for review and approval.
As next steps to the Data Assessment programme, the ADCMA ISMS Data Security
Policy V1.0 needs to be augmented to align with the Information Security
Standards defined in the DSP data domain covering architecture components.

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.

Implement Information Security standards while defining the standards required
for sharing datasets as "Open Data". The data security & privacy definitions must
be applied to all datasets/data attributes deemed to be shared as "Open Data", and reviewed/approved by the Data Governance Board.

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG to define the Data Privacy
"Metadata" for the Master profiles. This activity can be done while implementing
Data Catalogue or Metadata Management "Data Classification" at the attribute
level.
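
A minimal sketch of attribute-level privacy metadata in a catalogue entry is shown below; the dataset, attribute names and classification labels are illustrative assumptions.

# Attribute-level privacy metadata for one catalogue entry (illustrative).
catalogue_entry = {
    "dataset": "media_contacts",
    "attributes": [
        {"name": "full_name", "classification": "Confidential", "pii": True},
        {"name": "email", "classification": "Confidential", "pii": True},
        {"name": "outlet", "classification": "Public", "pii": False},
    ],
}

def pii_attributes(entry: dict) -> list:
    """List the attributes that privacy controls (masking, DLP) must cover."""
    return [a["name"] for a in entry["attributes"] if a["pii"]]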

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components. To comply with this specification, it is recommended to cover the 'mosaic effect' within the "Data Classification" process.

The existing Cyber Security awareness programme will need to be integrated with the awareness module for Data Privacy.
Along with the Data Privacy Policy, it is recommended to define "Privacy by Design", integrated with the Data Privacy Standards and the general Data Management Programme standards.

As next steps to the data assessment programme, along with the Data Privacy Policy, it is recommended to define "Privacy by Design", integrated with the Data Privacy Standards and the general Data Management Programme standards.

It is recommended to perform periodic audits of ADCMA data sources to ensure that the data being processed follows the "Privacy by Design" principles.

The Data Privacy policy should incorporate the "Privacy by Design" principle which
will be integrated with the Data Governance checkpoint process for review and
approval. The DSP.3.3 specification, which defines audit capabilities, will need to be integrated with the data governance checkpoint process.

As next steps to the data assessment programme, it is recommended to define and implement the Privacy Management process and workflow, with specific metrics built around Privacy Management that can be audited.

The specification is 'Partially Implemented' for DAM (Database Activity Monitoring) using the SIEM solution.

For DLP, it is recommended to conduct Data Classification for priority datasets before implementing the DLP solution.
As part of Apex One, there are two plugins available, which are not enabled as of now:
• Apex One Data Protection plugin
• Apex One DLP (Data Loss Prevention policies)
https://success.trendmicro.com/dcx/s/solution/1059472-installing-and-configuring-officescan-data-loss-prevention-dlp-plug-in?language=en_US&sfdcIFrameOrigin=null

For Data Discovery across the IT landscape, including desktops/laptops: once Data Classification is done and the DLP solution is being considered/implemented, the Data Discovery specification will be addressed. With the DLP solution from Trend Micro, Data Discovery can also be performed.
https://docs.trendmicro.com/all/ent/dlp/v5.6/en-us/dlpe_5.6_olh/dat_dat-dsc.html#id124TL0QH0E9

As next steps of the Data System Protection evaluation, it is recommended to consider classification of data and masking it when sharing data within the organisation (production environment to lower environments). The Data Quality Standards will address the need for data masking as applicable to sensitive/classified/PII data.

ADCMA to define the infrastructure, data center and cloud enablement policy across the business applications, based on the business application criticality assessment, and define the cloud policy.

ADCMA needs to conduct a business application criticality and availability assessment and determine the applications that need cloud enablement on G42, based on the criticality scoring of each application.

ADCMA to define the infrastructure and data center standards and policy. The related processes and roles are to be enabled in line with the governance operating model.
ADCMA to define the infrastructure, data center and cloud enablement policy across the business applications, based on the business application criticality assessment, and define the cloud policy.

ADCMA to define the infrastructure and data center standards and policy. The related processes and roles are to be enabled in line with the governance operating model.

ADCMA should conduct an assessment of G42 cloud enablement and the associated cost of re-platforming applications, and develop a benchmark of the data center costs.

ADCMA at the current state does not have this requirement. However, future enablement of the G42 cloud is to be evaluated based on the BIA assessment.

ADCMA at the current state is managing this effectively. However, future enablement of the cloud is to be evaluated based on the BIA assessment.

ADCMA needs to evaluate the backup strategy based on the cloud enablement strategy, as part of the BIA assessment.

ADCMA to work on a data center and application BCP strategy and DR roadmap on G42 in 2023.

As part of the BIA system governance implementation program, ADCMA must conduct a thorough review of the data storage controls and develop the data storage and retention strategy and implementation plan.

As part of the BIA system governance implementation program, ADCMA must conduct a thorough review of the data storage controls and their ownership, and develop the data storage and retention strategy and implementation plan.
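
A minimal sketch of how retention rules could feed such an implementation plan is shown below; the record classes and retention periods are illustrative assumptions, not agreed policy.

from datetime import date, timedelta

# Illustrative retention periods in days, by record class.
RETENTION_DAYS = {"financial_records": 3650, "social_media_posts": 1825, "web_logs": 365}

def due_for_disposal(record_class: str, created: date, today: date) -> bool:
    """True when a record has exceeded the retention period for its class."""
    return (today - created) > timedelta(days=RETENTION_DAYS[record_class])

# e.g. due_for_disposal("web_logs", date(2021, 1, 1), date.today())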

As part of the BIA implementation, ADCMA will have a well-defined governance operating model, and the data governance architect will lay out the specifics of the governance procedure and technical framework for implementing data life cycle management.

The Data Manager will appoint the Integration Data Architect as part of the BIA
implementation programme. The Integration Architect shall work with Data
Manager to propose a design of the "Data Integration framework/layer" within the
proposed Business Intelligence and Analytics platform. The Data Manager will audit
the requirements called out in this specification.
The Data Integration Architect shall document a detailed set of frameworks, standards and guidelines ensuring alignment of the Data Integration Platform with Metadata Management.

Further to the "Data Integration framework", it is recommended that the Data Integration Architect define the design specifications of the Data Integration Platform and its conformance to this specification.

The "Trusted Data Sharing Framework" will use the "Data Integration Framework"
as input to define a comprehensive set of standards for "Data Sharing" with
internal, external and trusted third parties. It is recommended to cover the following areas while defining the "Trusted Data Sharing Framework":
- Data Sharing Strategy
- Legal & Regulatory Considerations
- Technical & Organizational Considerations
- Operationalizing Data Sharing

The Data Architect, in agreement with business and under the DG board's guidance, should revisit, brainstorm and explore the current and possible future data feeds which may be required into or out of the system and may be included in the Strategic Integration Platform. Re-use of existing data feeds should also be considered.

The Data Integration architect shall work with the Data Manager and Data Owners
to identify dataset exchange between Entities and define the process to exchange
datasets using ADDA ESB layer. The data integration document shall describe the
process and policy for ADCMA systems to exchange data with other Entities using
the ESB layer.
The Data Integration Architect shall define the data exchange process and adhere
to Information Security Standards defined in Data Security and Privacy.

The BIA platform's Data Integration layer will define the Data Exchange methods. It
is recommended that while designing the "Strategic Integration Platform" these
specifications on data exchange method are taken into consideration.

Migrating peer-to-peer connections via the SIP may not be applicable to ADCMA in the immediate or near future, although the BIA platform will apply this specification, limited to the identified data sources being integrated with the BIA platform.
The data architect responsible for data integration across ADCMA will be responsible, along with the other data architects, for adhering to the controls.

The BIA platform will need to have the capability to broker (transform) file-based data exchanges and message-based data exchanges via its integration layer. The Data Integration Architect will work with the Data Manager to define the appropriate broker interactions while working on the BIA Data Integration design specification.
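
A minimal sketch of one such broker interaction, transforming a file-based exchange into message-based events, is shown below; the file layout and the publish target are assumptions.

import csv
import json

def file_to_messages(csv_path: str, publish) -> int:
    """Read a delimited file and publish each record as a JSON message."""
    count = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for record in csv.DictReader(f):
            publish(json.dumps(record))  # hand off to an ESB / queue client
            count += 1
    return count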

The Data Integration Architect shall work with the Data Manager and comply with this specification while designing the BIA platform integration layer.
It is recommended to implement one-way integration while designing the BIA platform, using the broadcast method to publish the dataset/data service to downstream applications/systems.

If in future a requirement arises for the BIA platform to be extended to two-way or interactive integration for an identified data source or information system, proper justification will be provided; the Data Governance Board and the respective data architects will own and drive the activity as and when required.

The high-level plan for the BIA platform, for the identified data sources and information systems, will incorporate the required constraints: detection of data delivery failure, repeatable/idempotent retries, statelessness and high availability.
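
A minimal sketch of these delivery constraints is shown below; the send() callable and the message's id field are assumptions.

import time

def deliver(message: dict, send, max_attempts: int = 3) -> bool:
    """Deliver a message with failure detection and idempotent retries."""
    # A stable idempotency key lets the receiver discard duplicate retries.
    message.setdefault("idempotency_key", message["id"])
    for attempt in range(1, max_attempts + 1):
        try:
            send(message)             # stateless: all context travels in the message
            return True
        except Exception:             # delivery failure detected
            time.sleep(2 ** attempt)  # back off before the repeatable retry
    return False                      # exhausted: escalate per the agreed SLA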

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs, in line with the business requirements.
SLA best practices and guidelines will be provided as part of the deliverables.
Existing contracts with service providers should be reviewed in light of the guidelines.

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs, in line with the business requirements.

Should be planned in the next phase.

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs, in line with the business requirements.

Escalation metrics are to be planned along with the Data Governance Board for any failure.

Should be planned in the next phase.


ADCMA should perform review of all of its data sources (structured and
unstructured) in a systematic audit using its Risk Assessment process to consider as
Open Data.
All data sources should be deemed ‘Open’ unless there is a quantifiable reason for
keeping the sources closed.
The criteria and decision log for closing a source are to be documented and reviewed regularly (typically annually) by the ADCMA Data Governance Board.
In the event that data quality is a concern for not considering a source as open
data, a remediation plan with a clear open data quality threshold is to be put in
place to allow publication.
ADCMA should define the extent of the data source that is to be made available to
users that are both internal – and external – to ADCMA. ADCMA should include
definitions of what constitutes an internal or an external user.

The Data Governance Board should define open data policies. The data working group should review all data sources being considered for Open Data, in line with the defined open data policies.

ADCMA should keep systematic records of ADCMA opened data sources with a
clear explanation of their Open Status (Open or Closed). ADCMA should provide a
definition in their Data Catalogue for each open data set, written clearly and in
plain language in line with the context of its business.
Data working group should maintain the systematic records for the data sources.

All datasets that are deemed ‘open’ in the Open Data Review exercise of ADCMA are to be made available through (an export sketch follows this list):
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in machine-readable form (this could include formats such as delimiter-separated data (CSV), XML and JSON, along with their metadata)
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in human-readable form (where practicable), i.e. providing metadata in support of data published as open data
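
A minimal sketch of producing such a machine-readable release (CSV plus a JSON metadata sidecar) is shown below; the field names and metadata keys are illustrative, not portal requirements.

import csv
import json

def export_open_dataset(rows: list, name: str) -> None:
    """Write a dataset as CSV with a plain-language JSON metadata sidecar."""
    # Assumes at least one row; column names come from the first record.
    with open(f"{name}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
    metadata = {
        "dataset": name,
        "format": "csv",
        "record_count": len(rows),
        "description": "Plain-language description for the Data Catalogue",
    }
    with open(f"{name}.meta.json", "w", encoding="utf-8") as f:
        json.dump(metadata, f, indent=2)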

Data working group to seek approvals from respective data owners on which
datasets could be considered to publish. Prioritized and approved data sources to
be considered for publication by Data working group.
ADCMA should ensure that, to the extent possible, all data is made available in the form closest to the source, i.e. datasets should be closest to the data as collected.
Data should not be manipulated, aggregated, redacted, anonymized or obfuscated, to the extent possible and allowable, with due regard for privacy and security concerns.
Where such concerns exist, aggregation, redaction, anonymization, obfuscation and other manipulations should be carried out to the minimum extent needed to alleviate the concern.
The following should be considered (a simple re-identification check is sketched below):
• Is it reasonably likely that an individual can be identified from those data and
from other data?
• What other data is available, either to the public or to researchers or other
organizations?
• How and why could your data be linked to other datasets?
• What is the likelihood of re-identification being attempted?
• What is the likelihood the re-identification would be successful?
• Which anonymization techniques are available to use?
• What is the quality of the data after anonymization has taken place, and will this meet the quality gate for this dataset’s Open Data release?

The Data Architect for open data publication should ensure that open data is published in the form closest to the source possible.
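
One safeguard behind the questions above can be checked mechanically: k-anonymity over the quasi-identifiers of a candidate release. A minimal sketch follows; the column names and threshold are illustrative assumptions.

from collections import Counter

def k_anonymity(rows: list, quasi_identifiers: list) -> int:
    """Smallest group size sharing one quasi-identifier combination."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

# Publish only if every combination is shared widely enough, e.g.:
# if k_anonymity(rows, ["age_band", "district"]) < 5: generalize or withhold.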

ADCMA team should develop an Open Data Plan, to release the data that is
identified as Open data to publish through the Open Data Portal.
The Open Data Plan shall allow for:
• The dataset to be reviewed and duly approved by data governance committee for
release as Open Data
• Data Quality assessment should be done for the datasets that are considered to
be published as Open.
• Any aggregation, redaction, anonymization or obfuscation required for privacy or
security concerns has been approved and undertaken
• The dataset to be released once it has passed its Open Data review and data quality checks.

The data working group should publish the Open Data Plan in line with the data owners' approval and the prioritization done by the Data Governance Group.
ADCMA should ensure that the Open Data Plan prioritizes the release of Open Data. Criteria that could be used include, but are not limited to:
• Addressing security and privacy concerns
• Addressing the business priorities of ADCMA
• Addressing the demand from third parties for data
• Addressing the measurable quality of the data
Data working group to prioritize Open data to be published in the Open Data plan.

ADCMA should ensure that the Open Data Plan systematically addresses all of the
datasets identified in the Open Data Review.

Data working group to ensure that open data plan systematically addresses all of
the datasets identified.

ADCMA should ensure that progress against the Open Data Plan is monitored, and that the plan is reviewed at a regular frequency.

Data working group is responsible for this activity.

ADCMA should publish its Open Data in the Abu Dhabi Government Open Data
Portal.

Data working group is responsible for this activity.

ADCMA should take care to ensure that all Open Data that is published should be
reviewed regularly (Especially when related datasets are published by ADCMA or
other entities) and ensure that:
• The data continues to meet ADCMA's data quality definition
• Security and privacy concerns are continuously reviewed, specifically:
1. Is it reasonably likely that an individual can be identified from those data and
from other data?
2. What other data are available, either to the public or to researchers or other
organizations?
3. How and why could the published open data be linked to other datasets?
4. What is the likelihood of re-identification being attempted?
5. What is the likelihood the re-identification would be successful?
6. Which anonymization techniques are available to use?
7. What is the quality of the data after anonymization has taken place and whether
this will meet the quality gate for this data set’s Open Data release?

Data working group is responsible for this activity.


In the event that the published Open Data fails to meet its quality level or there are
concerns regarding security or privacy, ADCMA Team should:
• Suspend the publication of that dataset as Open Data
• Undertake a new Open Data Review for that dataset
• Establish and execute a mitigation plan for the new concerns and / or data
quality issue
• If necessary, relist the data as ‘Closed’ until such issues can be resolved

Data working group is responsible for this activity.

The Entity shall capture usage trends and statistics regarding access to the data
published as open data, and report these trends and statistics to the ADCMA Data
Governance Committee.

Data working group is responsible for this activity.

ADCMA should undertake annual awareness campaigns on Open Data to ensure potential users and stakeholders are aware of the existence, nature and quality of the Open Data being offered by ADCMA.
The awareness campaign needs to consider providing information on below:
• Progress of the Open Data Plan
• The need to inform and educate internal stakeholders
• The need to inform and educate external stakeholders
• The need to inform and educate the wider public
The awareness campaign should include:
• Details on where to find Open Data
• Details on where to find the Open Data Catalogue
• Information on privacy and security concerns, including (in a general sense) the
provisions made for:
1. Aggregation
2. Redaction
3. Anonymization
4. Obfuscation
• Explanations in plain language on the type of data and its context
• An indication of the age (or age window) of the data
• An indication of the quality that can be expected from the data

Business teams, along with the Data Governance group, are responsible for this activity.
In the event that ADCMA does not publish a dataset or datasets, it shall use its annual awareness campaign to:
annual awareness campaign to:
• Explain to the extent possible the reasons for withholding a dataset
• Indicate if and/or when a dataset will be published
• To provide a clear statement if a particular dataset is to remain unpublished for
the foreseeable future
Data working group is responsible for this activity.

a. Identification and classification of datasets as Reference Data or Master Data should be a recommended activity, since this is subsequent to defining “Datasets”. A dataset contains “Data Elements” which can have the characteristics of “Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but the creation and maintenance of “Datasets” and their classification as “Master Data”, “Transactional Data”, etc. must be maintained by the Data Governance team (namely the Data Manager, along with the Data Architect and Data Stewards from the DWG).

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)
a. Identification and classification of Datasets as Reference Data or Master Data
should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)
a. Identification and classification of Datasets as Reference Data or Master Data
should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

a. Identification and classification of Datasets as Reference Data or Master Data


should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)
a. Identification and classification of Datasets as Reference Data or Master Data
should be a recommended activity, since this is subsequent to defining “Datasets”.
A Dataset contains “Data Elements” which can be of having a characteristic of
“Master Data” or “Reference/Lookup Data”.
b. MDM or RDM tools are not applicable, but creation and maintenance of
‘Datasets” and their classification as “Master Data”, “Transactional Data” etc must
be maintained by the Data Governance team (namely Data Manager along with the
Data Architect, Data Stewards from DWG)

The Data Manager will work with the ECM Architect (Data Architect) and the Technical Data Steward from the ECM programme to identify the applicability of the Document and Content Management quality standards when selecting the ECM product. The quality standards will align with the specifications.

The ECM programme will establish organisational document and content management standards defining guidelines for document writing, a uniform document experience (look and feel), document naming conventions and document editorial processes.
The Data Manager will work with the ECM Architect (Data Architect) and the Technical Data Steward from the ECM programme to verify compliance with the DCM standards, namely:
- Document & Content Management Standards
- Document Type usage for business cases
- Ability to capture Document Metadata through the document lifecycle
Introduce Document and Content Management policies defining what additions, alterations or redactions can be made to any document, and under what scenarios. This can be added as a policy to the existing "ADCMA_document_access_management_2.x.docx".

With the implementation of specialty content management tools such as DAM/MAM and CMS, including the ECM programme, ADCMA can plan to attain "Fully Implemented" compliance.

Develop a Document Classification Scheme along with the ECM programme, establishing:
- Document Standards
- Document Type usage for business cases
- The Document Metadata that needs to be captured through the document lifecycle (SharePoint already provides this functionality for documents and content)

Introduce 'media content' retirement and disposal techniques for physical media
While defining requirements for the ECM implementation programme, ensure the following are established:
- Document & Content Management Standards
- Document Type usage for business cases
- The Document Metadata that needs to be captured through the document lifecycle (SharePoint already provides this functionality for documents and content)
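As an illustration only (the document types and field names below are assumptions, not ADCMA-mandated values), such a classification scheme can be expressed as a mapping from document type to the metadata that must be captured through the lifecycle, with a simple conformance check:

```python
from dataclasses import dataclass

# Hypothetical document types and metadata fields, for illustration only.
REQUIRED_METADATA = {
    "policy":   ["title", "owner", "classification", "review_date"],
    "contract": ["title", "owner", "classification", "counterparty", "expiry_date"],
    "media":    ["title", "owner", "classification", "format", "retention_class"],
}

@dataclass
class DocumentMetadata:
    doc_type: str
    values: dict  # metadata captured so far in the document's lifecycle

    def missing_fields(self) -> list:
        """Return required metadata fields not yet captured for this document."""
        required = REQUIRED_METADATA.get(self.doc_type, [])
        return [f for f in required if not self.values.get(f)]

# A contract uploaded without an expiry date fails the checkpoint.
doc = DocumentMetadata("contract", {"title": "NDA", "owner": "Legal",
                                    "classification": "Confidential",
                                    "counterparty": "Vendor X"})
print(doc.missing_fields())  # ['expiry_date']
```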

Monitoring and compliance checks of document systems and processes, including auditing and reporting on the performance of document management systems, the implementation of the document and (media) content retention and disposition policies, and monitoring of user satisfaction, will be established with the operationalisation of Data Governance. Once the document systems and processes are in place, the Data Governance Board can identify a RACI matrix and plan regular audits of the content processes. The Data Architect can also introduce a feedback mechanism in line with the specification.

Include a detailed training plan for document and systems management as part of the Organisational Awareness stream of the Data Management Programme. The entity should allocate a training budget for the ECM programme to develop self-paced learning modules for the specifications concerned. Users should be offered certification once they have passed the assessment criteria. The entity should aim for all resources interacting with the ECM to be certified.

ADCMA's Business Intelligence and Analytics roadmap is based on identified use cases from ADCMA business teams. Before the Business Intelligence and Analytics platform is implemented, business teams should have a clear vision of the anticipated use cases to be implemented. With fine-tuned business use cases, the roadmap should be re-validated and implemented. Establishing the business vision for the BIA will be the responsibility of the Business Sponsor of the BIA programme. The Business Case template provided with DLV9 shall be used to provide input on the business vision, clearly articulating the business value from the BIA programme.
The BIA Programme Architect, with the direction of the Data Manager, will define the Non-Functional Requirements, which will form the baseline for defining the SLAs. These SLAs shall be approved by the Data Governance Board before implementation.
While implementing the Business Intelligence and Analytics platform, service level agreements should be defined by the platform owner covering the following points:
- Data warehouse availability (must consider user needs across locations)
- Data load latency
- Data retention period (should also consider audit and regulatory reporting needs along with specific use case needs)
- Data quality
- Concurrent users (based on location and peak usage times)
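A sketch of how these SLA points could be recorded as a reviewable artefact follows; the metric names and thresholds are illustrative assumptions, and the actual targets would be set by the platform owner and approved by the Data Governance Board:

```python
# Illustrative metric names and thresholds; not approved ADCMA values.
BIA_SLAS = {
    "warehouse_availability_pct": 99.5,  # availability across user locations
    "max_load_latency_hours": 4,         # source-to-warehouse load latency
    "data_retention_years": 7,           # must also cover audit/regulatory needs
    "min_dq_pass_rate_pct": 98.0,        # share of records passing DQ rules
    "max_concurrent_users": 200,         # sized for peak usage times per location
}

def sla_breaches(measured: dict) -> list:
    """Compare measured platform values against the SLA targets."""
    breaches = []
    if measured["availability_pct"] < BIA_SLAS["warehouse_availability_pct"]:
        breaches.append("availability")
    if measured["load_latency_hours"] > BIA_SLAS["max_load_latency_hours"]:
        breaches.append("load latency")
    if measured["dq_pass_rate_pct"] < BIA_SLAS["min_dq_pass_rate_pct"]:
        breaches.append("data quality")
    return breaches

print(sla_breaches({"availability_pct": 99.9, "load_latency_hours": 6,
                    "dq_pass_rate_pct": 99.1}))  # ['load latency']
```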

While implementing the BIA platform in multiple phases, the effectiveness of data warehouse initiatives should be measured at each logical point by the Business Sponsor of the BIA programme.
When measuring effectiveness, the evaluation should include, but not be limited to, the following points:
- Technical alignment with the architectural roadmap
- Implementation and usage experiences (e.g. a large deviation from the anticipated quality of data, any deviation in the performance of the warehouse, or any deviation from the anticipated data volumes impacting the performance of the platform)

When external datasets are identified, ADCMA should align with the external dataset providers on the following aspects:
1) A clear interface agreement covering functional and non-functional requirements on dataset sharing, including data refresh cycles, data quality requirements and other performance metrics
2) Service level agreements on dataset sharing
3) Clear ownership of the datasets within ADCMA and within the external supplier
4) A clearly defined issue resolution workflow (the SLA definitions should call this out)

The BIA Architect should ensure that ADCMA teams follow the guidelines to define interface agreements.

Depending upon the complexity of the transformations required from source data to the target data model in the Business Intelligence and Analytics platform, a data-staging environment should be used in line with the data architecture definition for the platform.

The BIA Architect should define the guidelines on using the data-staging environment while ingesting data from sources.
Data warehouse, Business Intelligence and Analytics initiatives should consider the following data management aspects: Metadata Management, Data Catalogue, Data Modelling and Design, Data Architecture, Data Quality, Data Storage, Data Integration and Interoperability, Master Data Management and Reference Data Management.

For the new BIA platform, the Data Assessment programme will propose a data architecture covering the aspects of data management that ADCMA should consider for its BIA platform. The ADCMA BIA Architect should ensure the data architecture is maintained to meet newly identified ADCMA business cases.

During identification of detailed use cases on the new Business Intelligence and Analytics platform, ADCMA should exercise opportunities to utilise external data that could aid and maximise business intelligence.

Open Data could be considered as one form of external data, used to fill any identified gaps in the existing ADCMA datasets where data suiting the business use cases is available from external sources (paid or as Open Data).

The Data Architect (Integration Architect) should take the lead on identifying any external data to be sourced to meet the business use cases.

For better supportability, ADCMA should prefer Commercial Off The Shelf (COTS) or open-source tooling over internally developed tooling, as the latter could potentially lead to performance, supportability and scalability issues.

The BIA Architect should be responsible for ensuring this.

Considering that ADCMA is planning to implement a Business Intelligence and Analytics platform, usability should be favoured over ease of implementation. The platform implementation should also consider scalability and re-usability.

The Data Architect should be responsible for ensuring this.

During physical data model creation, special-purpose table types should be considered for performance, standardisation and data quality purposes. While designing the marts, consider creating staging, dimension and fact tables with re-usability of these objects across different sources, marts and reporting needs.

The Data Modeler, in consultation with the Data Architect, should own decisions on special-purpose table types when modelling the data warehouse.
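A minimal sketch of these special-purpose table types, using pandas and assumed column names, showing a staging table feeding a re-usable dimension and a fact table:

```python
import pandas as pd

# Staging table as ingested from a source (column names are assumptions).
staging = pd.DataFrame({
    "campaign_code": ["C1", "C2", "C1"],
    "channel": ["TV", "Radio", "TV"],
    "spend": [100.0, 50.0, 70.0],
})

# Dimension table: one row per campaign, re-usable across marts that share it.
dim_campaign = (staging[["campaign_code", "channel"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_campaign["campaign_key"] = dim_campaign.index + 1  # surrogate key

# Fact table: measures keyed by the surrogate key, not the business code.
fact_spend = (staging.merge(dim_campaign, on=["campaign_code", "channel"])
                     [["campaign_key", "spend"]])
print(fact_spend)
```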
Using surrogate keys to avoid conflicts with any future data integrations should be considered. Data flows should be designed to address common problems including, but not limited to, late-arriving dimensions.

The Data Architect should own this activity.
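A sketch of the late-arriving-dimension pattern with surrogate keys (the table and column names are assumptions):

```python
import pandas as pd

# Existing dimension with one known customer (names are assumptions).
dim = pd.DataFrame({"customer_key": [1], "customer_id": ["A100"],
                    "name": ["Known Customer"]})

def surrogate_for(customer_id: str, dim: pd.DataFrame):
    """Return the surrogate key, inserting a placeholder row if the
    dimension member has not arrived yet (late-arriving dimension)."""
    match = dim.loc[dim["customer_id"] == customer_id, "customer_key"]
    if not match.empty:
        return int(match.iloc[0]), dim
    new_key = int(dim["customer_key"].max()) + 1
    placeholder = pd.DataFrame({"customer_key": [new_key],
                                "customer_id": [customer_id],
                                "name": ["<late arriving - to be enriched>"]})
    return new_key, pd.concat([dim, placeholder], ignore_index=True)

key, dim = surrogate_for("B200", dim)  # fact arrived before its dimension row
print(key)  # 2 - the fact can be loaded now and the row enriched later
```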

While designing schemas (marts) for specific business use cases, the simplest possible schemas should be used. Any specific business process should be handled in the specific mart tables, with re-usability of the schema objects (dimensions, facts). Data Modelers, in consultation with the Data Architect, should own this activity.

ADCMA should identify conformed dimensions while ingesting data into the Business Intelligence and Analytics platform so that these dimensions can be re-used across multiple schemas.
Data Modelers should work with the Data Architects on identifying conformed dimensions to be re-used across multiple fact tables.

While creating the data architecture, an area should be maintained to retain the source data without any manipulation, for debugging purposes in case any data quality or process issues are identified.

The BIA Architect should ensure that an un-altered version of each source file is maintained within the BIA platform, in line with the BIA target state architecture.
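A minimal sketch, under assumed paths, of landing an un-altered copy of each source file with a checksum for later debugging:

```python
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

def land_raw(source: Path, raw_zone: Path) -> Path:
    """Copy a source file into the raw zone, byte for byte, with a checksum."""
    raw_zone.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    target = raw_zone / f"{stamp}_{source.name}"
    shutil.copy2(source, target)  # no manipulation of the source content
    digest = hashlib.sha256(target.read_bytes()).hexdigest()
    (raw_zone / f"{target.name}.sha256").write_text(digest)
    return target  # downstream pipelines read from copies, never modify this
```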

ADCMA should develop performance metrics to control the quality, volume and timeliness of data within the data warehouse. These metrics should be reviewed on a regular basis to identify any potential SLAs to be defined.
The BIA Architect should own this and ensure performance metrics are maintained within the data warehouse.
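An illustrative computation of such timeliness and quality metrics from a hypothetical load log (the field names are assumptions):

```python
from datetime import datetime

# Hypothetical load log entries produced by the warehouse load jobs.
load_log = [
    {"table": "fact_spend", "rows": 10_000, "rows_rejected": 12,
     "extracted": datetime(2024, 1, 1, 1, 0),
     "loaded": datetime(2024, 1, 1, 3, 30)},
]

for entry in load_log:
    latency_h = (entry["loaded"] - entry["extracted"]).total_seconds() / 3600
    reject_pct = 100 * entry["rows_rejected"] / entry["rows"]
    # Reviewed regularly; recurring breaches become candidate SLAs.
    print(entry["table"], f"latency={latency_h:.1f}h rejects={reject_pct:.2f}%")
```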

ADCMA should preferably share tooling and technology across the creation of the various marts, re-using processes for common data processing purposes (such as data standardisation and data quality checks).

The BIA Architect, in consultation with the Enterprise Architect, should own this.

ADCMA should preferably use the same or compatible technology platforms for its data marts.

The BIA Architect, in consultation with the Enterprise Architect, should own this.


ADCMA should identify conformed dimensions while ingesting data into the Business Intelligence and Analytics platform so that these dimensions can be re-used across multiple schemas.
The BIA Architect, in consultation with the Data Modeler, should ensure conformed dimensions are identified and re-used across data marts.

While building the Business Intelligence and Analytics platform in phases, ADCMA should identify the most effective and most utilised data marts within the organisation, in order to develop ADCMA's maturity and personal competency across the range of data marts within ADCMA.

The BIA Architect is responsible for this.

As per the current understanding of ADCMA's business use cases, an ODS might not be required. This is to be confirmed by the Data Architecture deliverable.

The Enterprise Architect should own this activity.


Throughout the design and development of business intelligence solutions, ADCMA should ensure that realistic data is used to the extent possible to provide clarity when engaging with business stakeholders.
BIA Architects, in consultation with business owners, should own this activity.

While defining the business intelligence initiatives for implementation, they should be categorised according to tactical, strategic and operational use cases.

Business teams should classify business intelligence initiatives according to type.


ADCMA should ensure that business intelligence reporting integrates with any existing enterprise reporting solution (if an enterprise reporting solution is created before the strategic Business Intelligence and Analytics platform).

The Enterprise Architect should be responsible for this activity and might need to consult the respective reporting solution architects.

ADCMA teams are not using any non-authoritative Volunteered Geographical Information (VGI), in compliance with government directives. When a business use case demands it, the same base map data shall be used for all location-based analytics across government, as provided to Entities by the ADSIC Spatial Data Centre.

The Enterprise Architect should provide guidelines on geographical information usage across ADCMA to comply with government directives.

ADCMA should use business intelligence tooling to produce the key metrics, performance indicators, dashboards and scorecards that reflect its business objectives.
Business teams should provide the key performance indicators and metrics used to monitor ADCMA business objectives.

ADCMA should develop and publish any identified statistical data in line with the Statistics Centre Abu Dhabi (SCAD) requirements.

The BIA Architect (Integration) should ensure that service level agreements are put in place in line with the importance and criticality of data consumed from SCAD.

ADCMA should produce an initiative to develop data analysis capabilities suitable for the types of data within its ownership.
ADCMA should evaluate suitable training opportunities within the Data Management Programme and its roadmap for data architecture, in order to enhance data analytics capabilities within the organisation.

The Data Governance Board should own this activity.

ADCMA should identify Big Data use cases to encourage innovation.
The Data Governance Board should identify the Big Data use cases.
ADCMA should implement event stream-based analytical processing to support high-velocity data analysis (e.g. threats to ADCMA IT assets). In the Data Assessment programme, a data architecture will be proposed to meet ADCMA's business needs, including stream-based analytics processing for IT use cases.

The BIA Architect should own this activity, in line with the proposed data architecture.
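A minimal sketch of event stream-based processing, counting events per source over a sliding window and flagging bursts; the window size, threshold and event shape are assumptions:

```python
from collections import deque

WINDOW_SECONDS = 60   # illustrative sliding-window length
THRESHOLD = 100       # illustrative burst threshold per source

def detect_bursts(events):
    """events: iterable of (timestamp_seconds, source_id), in arrival order."""
    window = deque()  # (timestamp, source) pairs inside the current window
    counts = {}       # events per source within the window
    for ts, source in events:
        window.append((ts, source))
        counts[source] = counts.get(source, 0) + 1
        # Evict events that have fallen out of the sliding window.
        while window and window[0][0] < ts - WINDOW_SECONDS:
            old_ts, old_src = window.popleft()
            counts[old_src] -= 1
        if counts[source] > THRESHOLD:
            yield ts, source  # candidate burst for downstream alerting
```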

The Data Governance Chairperson and Data Manager shall follow the DG structure
and operating model defined in DLV 9, refine the policies as appropriate and get
them approved by the Data Governance Board/Council.

1. Nominate DG members to the DWG, DGB and DGC
2. Ensure DG forums/members have been sufficiently empowered by ADCMA (DG Office)
3. Convene a DG SteerCo kick-off and define the DG Charter; introduce the roles and responsibilities of the DG team to the SteerCo

Identify the Data Manager before operationalising the DG structure, operating model and R&R.

Using the DG Operating Model definitions and R&R, nominate the Data Manager role as part of the Data Governance core team within the IT support services section.

Identify the key Data Domains applicable to ADCMA and appoint Data Architect(s) for the appropriate Data Domains.

ADCMA can start with the appointment of a BI Data Architect for the BIA programme.

Business Data Stewards will be identified from the departments.
Technical Data Stewards will be assigned from the IT department with core technical skills related to the corresponding Data Domain (e.g. Data Quality - Informatica Data Quality engineer).

Data Owners will be identified from the departments and associated with Dataset ownership. The Data Owner is accountable for ensuring compliance for the OWNED dataset.

The Data Governance Checkpoint process is defined in DLV 9, explaining the checkpoint process for continuous monitoring of, and compliance with, the Data Management policy, standards and best practices.

The data management board will be accountable & authorised to maintain, manage, monitor and enhance the data management policies as appropriate.
The Data Management Policy document will be refined by the Data Manager as appropriate to ADCMA and submitted for review and approval by the DG Chairperson (DG Board). The policy document will be published internally and circulated to all departments/sections.


The Data Manager shall enhance the existing Privacy policy (if required) and align it with the ADG published policies for Open Data.

The Data Manager shall update the Data Management Policy to address data lifecycle management for the applicable data domains (Data Quality, Data Security & Privacy, Master & Reference Data Management, Data Retention & Disposition, Data Access Management, Data Catalogue Management, Data Classification).

The Data Manager, with direction from the DG Chairperson, will update the Data Management Policy document to include the management intent, documenting support for compliance with the DMS Data Domains, Controls and Specifications.

The Data Manager shall enhance the existing Data Management Policy and align it with the DMS published Data Quality standards.

The Data Manager should refine the Data Governance metrics as appropriate to ADCMA, explaining how the metrics can be deployed to measure the effectiveness of the Data Management Programme.
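For illustration, a few candidate Data Governance metrics (the names and formulas are assumptions, to be refined by the Data Manager) might be computed as follows:

```python
# Illustrative Data Governance metrics reported to the Board and Council.
def dg_metrics(datasets: list) -> dict:
    """datasets: dicts with 'owner', 'classified' and 'glossary_linked' keys."""
    total = len(datasets) or 1  # avoid division by zero on an empty register
    return {
        "datasets_with_owner_pct": 100 * sum(1 for d in datasets if d["owner"]) / total,
        "datasets_classified_pct": 100 * sum(1 for d in datasets if d["classified"]) / total,
        "glossary_coverage_pct": 100 * sum(1 for d in datasets if d["glossary_linked"]) / total,
    }

print(dg_metrics([
    {"owner": "HR", "classified": True, "glossary_linked": True},
    {"owner": None, "classified": False, "glossary_linked": False},
]))  # each metric = 50.0
```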

DLV9 defines the Data Governance Checkpoint process to be followed by the Data Governance team to resolve 'data issues'.

The Data Manager shall update the policy document with the Change Management process.

This is the responsibility of the Data Manager, defined as one of the functions of the Data Manager role in DLV9.
DG Board and DWG will periodically convene to review and update/modify the
policies as applicable to ensure alignment with the relevant legislation and
maintain evidence in a secured and shared location within the ADCMA secured
environment. The frequency of meetings, roles and approval mandates are defined
in DLV9.
The Data Manager should work with Legal to ensure compliance of the policies with the relevant legislation.


The Data Manager should maintain traceability of the Data Management Policies to the Data Management Standards specifications as appropriate.

The Data Manager shall incorporate the Data Management Policies as appropriate within the existing NDA, such that internal, external, contractor or other stakeholders agree to the applicable Data Management Policy. This NDA shall be enforced within the Application Management Programmes, Data Management Programmes and any Enterprise Capability Management programmes (e.g. the Project Manager of the ECM programme will confirm adherence to the applicable Data Policies as defined in the approved/published Data Management Policy).

With direction from DG Chairperson, the Data Manager will document specific,
measurable and scheduled goals supporting the Data Management Programme,
aligning the goals with overall Business Strategy.

The Data Governance Board will decide on the date for reviewing the DG plan. The
same shall be approved by the Data Auditor (IA) from the Data Governance
Council.

The DG board will convene and submit the Data Management Programme plan to
ADDA as per applicable process.

The Data Manager will ensure alignment of the Data Management programme
governance plan with the overall business strategy, plan and objectives.

The Data Manager along with the DWG members will ensure DG plan and other
related artefacts are under version control as defined in the Document and
Content data domain specifications.

The Data Manager, along with the DWG, should assess the risk to operations while defining the DG Plan (operational risks such as business criticality, impact to business, budgeting, etc.).
DLV9 defines the Data Governance Checkpoint process for Data Governance. The
Data Manager is expected to enhance the DG Checkpoint Process as appropriate.

The Organisational Awareness and Training plan is to be built/enhanced by the Data Manager. A baseline is provided with DLV9.

The Data Manager is expected to define the Disaster Recovery governance plan in
accordance with the Data Storage data domain

The Data Manager should align the DG plan with Programmes/Initiatives like
Document and Content management (ECM programme)

The data manager along with the DWG shall ensure adherence to core principles of
DMS

DLV 9 defines the accountability and responsibility matrix, which needs to be refined as appropriate before convening the Data Governance Board meeting for approval of the DG Policy, Roles & Responsibilities, RACI, etc.

The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III)

The Data Assessment programme will define the roadmap and plan for the BI Platform, covering Data Integration, Data Collection, Quality, Transformation, Storage and Visualisation. Future changes to any of the above BI layers will need to undergo a Change Impact Assessment before being sent for Data Management Board review and approval. The change impact assessment will also apply to changes proposed to the baseline Data Governance Model and Checkpoint Process.


The Data Manager should work with the organisation Change Management team
to create a common Change Management standard. Examples and references are
provided with DLV 9 (Part III). Data Manager to follow the recommended
specification

The Data Manager will work with the DWG to create an Organisational Awareness plan around Data Governance. The training plan will include a detailed list of training modules, including tool-specific training.


The Data Management Audit framework is the Data Governance Operating Model, which defines the DG organisation structure, the operating model, the Data Governance Checkpoint process, Roles & Responsibilities, RACI, and inter-forum and departmental communications. This specification expects a DG audit capability, which is defined within the operating model.

Upon approval of the DG Structure, plan, policy and operating model by the DG Board, this specification can be marked as 'Fully Implemented'.

The Data Management Auditors will be nominated from Internal Affairs (IA).


After the first iteration of the Data Management Auditor (IA) review and approval, the Data Governance Chairperson and Data Manager will upload the statements of compliance, audit results, etc. to ADDA or approved third parties.

Data Manager is expected to appropriately follow the recommended specification

The Data Manager is to create the Statement of Compliance (DM_Assessment) at the programme level, which is then aggregated at the organisation level before submission.

The Data Manager is expected to appropriately follow the recommended specification to assess Data Management Risk arising from non-compliance or from 'data issues' not being resolved.

The Data Manager along with the DWG will ensure that the Data management
audit results (Data Governance Checkpoint results) are versioned, classified and
protected and adhere to the defined Infosec classification guidelines.

The Data Auditor from IA will be responsible to work with the DG Chairperson and
Data Manager to align the Data Management Governance (Audit) with other
ADCMA internal Audit mechanisms

The Data Governance Council is responsible for performing the audit and maintaining the audit results with the help of the Data Manager.

The Data Manager is responsible for refining the Data Governance metrics and reporting to the Board and Council as applicable.


The Data Auditor from IA should be nominated as part of the Data Governance Council, and can function as an independent auditor of the Data Management Programme audit, compliance and Governance checkpoint results.

The Data Governance Checkpoint process defined in DLV 9 provides information on the Data Domains to be considered as appropriate for the Data Governance Checkpoint.

The Data Governance Board should closely track the budget, effectiveness,
performance of the overall Data Management Programme and Governance.

i. The ADCMA Data Governance team (Data Manager along with the DWG) should maintain a master “Business Glossary”, periodically updated with the applicable business terms, in an Excel file format.
ii. Ownership of the maintenance of the “Business Glossary” should be aligned with the Datasets OWNED by the “Data Owners” (e.g. if the HR department head is designated as the Data Owner of the HR ‘Datasets’, then the HR “Business Glossary” associated with the HR datasets will be OWNED by the HR department head and maintained and managed by the HR department with the help of Business Data Stewards from the HR department).
iii. The Data Manager can audit the “Business Glossary” periodically for conformance.
iv. There is no requirement for maintaining Data Lineage or Technical Data Dictionaries at this point in time; ADCMA can evaluate Technical Metadata and Data Lineage at a later point.
v. There are no Metadata Management tools required at this point for ADCMA.
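A minimal sketch of the periodic conformance audit in point iii, assuming the glossary is kept in an Excel file with hypothetical column names:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
glossary = pd.read_excel("business_glossary.xlsx")

REQUIRED_COLUMNS = ["term", "definition", "dataset", "data_owner", "steward"]

missing_cols = [c for c in REQUIRED_COLUMNS if c not in glossary.columns]
if missing_cols:
    raise ValueError(f"Glossary is missing columns: {missing_cols}")

# Terms with no recorded Data Owner fail the conformance audit.
unowned = glossary[glossary["data_owner"].isna()]
print(f"{len(unowned)} of {len(glossary)} terms have no Data Owner assigned")
```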

ADCMA will implement data cataloging and data governance controls with the implementation of the BIA system. The Data Governance Architect will assist in defining the fine-grained controls and the technology solution for this control.

Currently, for the in-house developed information systems (ADCMA website CMS, career portal, etc.), ADCMA is advised to create and maintain data models. Governance checkpoint processes should be enforced in the SDLC (Software Development Life Cycle) to ensure that the data models are kept up to date for all projects impacting the data models of an information system. Data models should be promoted as one of the core deliverables for an information system.

Data architects of the respective information systems are responsible for ensuring this activity.

The Data Governance board should recommend a specific data modelling tool to be used across ADCMA information systems, and should suggest checkpoints for data model reviews in the SDLC (mainly exit criteria for the Design, Build and pre-implementation phases to validate data model changes as applicable).

Once the Data Governance board recommends a tool, data architects of the respective information systems should use it to maintain the data models of their information systems.

ADCMA should create a governance body for data modelling for each of the information systems. The Data Working group should create appropriate training modules on data models for the different roles present within information systems. Depending upon the role, training modules should cover reading and interpreting data models at different levels (Conceptual, Logical and Physical), as well as developing data models in line with the target state (Conceptual, Logical and Physical) data models.

Data Working group should be responsible for this activity.

ADCMA should consider developing and maintaining data models at the conceptual, logical and physical levels for all information systems.

Data architects of the respective information systems are responsible for this activity.

ADCMA currently does not enforce maintaining identifiable information for unstructured data to link it with structured data. The Data governance board should consider defining guidelines and maintaining metadata with identifiable information depending upon the type of unstructured data (for example, for document-type unstructured data, identifiable information should be mandated).

Data architects of the respective information systems are responsible for this activity. The Data governance board should define common standards and guidelines to be followed by the different information systems.
Currently, metadata collection for semi-structured and unstructured data is not standardized in ADCMA. The ADCMA data governance board should provide guidelines on metadata collection for semi-structured and unstructured data to ensure uniformity of the data collected for a specific type of unstructured or semi-structured data.

Data architects of the respective information systems, in line with guidelines from the data governance board, are responsible for this activity.

Depending upon the business use cases, ADCMA should consider conversion of semi-structured and unstructured data into structured form through transformational or analytics conversion techniques. As per the current state of ADCMA, this could be considered an advanced use case.

Data architects of the respective information systems should be responsible for this activity.

When attempting to convert unstructured data (text, voice or video) into a structured form, ADCMA should align its processes with the Unstructured Information Management Architecture (UIMA). Unstructured-to-structured conversion could include, for example, text extraction from unstructured data (text, voice or video). At the time of this maturity assessment, the website below hosts the UIMA documentation:

https://uima.apache.org/downloads/releaseDocs/2.2.1-incubating/docs/html/index.html

Data architects of the respective information systems should be responsible for this activity.

Data architects of the respective information systems should govern the lifecycle management of unstructured data in line with Document and Content Management Control DCM.2.

ADCMA should maintain the flow of unstructured information, along with its metadata and identifying data, between systems. Once metadata collection for unstructured data is standardized to include identifying data associated with the unstructured data, relationships between the different entities should be maintained.
Data architects of respective information systems are responsible for this activity.
This specification covers developing and maintaining data models for all information systems in the enterprise. For all information systems within ADCMA, physical data model information is to be mapped to logical models and to a higher conceptual level. The Data Assessment program will establish the Conceptual Data Model (CDM) for the newly proposed Business Intelligence and Analytics platform. ADCMA should consider developing logical and physical data models for the Business Intelligence and Analytics platform in line with the conceptual data model.

It is recommended that the Data governance board propose a Governance Checkpoint Process to ensure mandatory inclusion of data modelling artefacts in the SDLC. Data model artefacts should be reviewed in the different phases of a project (usually the Design, Build and pre-implementation phases) rather than only at the pre-implementation stage, to avoid potential rework.
The Data working group is responsible for this activity.

Data models for ADCMA information systems should be created and maintained
with each project impacting the data models of respective information systems
giving equal importance to structured and unstructured data.

Data architects of respective information systems are responsible for this activity.

It is recommended that the data governance board should publish reference data
models for immediate reference of data model teams. Data governance board
should publish data models that could be re-used. When new information systems
are introduced, data model teams should refer to existing reference data models
and re-usable data models that could potentially be used.
Data architects of respective information systems are responsible for this activity.

Scope of the specification is to cover the data model for all applications in the
enterprise. The Data Assessment program will establish the Conceptual Data Model
(CDM) for the newly proposed Business Intelligence and Analytics Platform. While
developing the Logical, Physical data models for Business Intelligence and Analytics
platform, UML diagrams should be used as the primary modelling notation.

Data architects of respective information systems are responsible for this activity
The Data Governance Board should create guidelines (pre-defined templates, diagrams and notations) with its data model teams to effectively represent their data models to different non-technical teams, including business teams.
Data architects of the respective information systems should adhere to the guidelines suggested by the Data Governance Board.

Scope of the specification is to cover the data model for all applications in the
enterprise. The Data Assessment program will establish the Conceptual Data Model
(CDM) for the newly proposed Business Intelligence and Analytics Platform. Entity
relationship diagrams and class diagrams to document the structure and
relationships of data objects at logical and physical levels should be developed
during the implementation of Business Intelligence and Analytics use cases. For all other ADCMA applications, it is recommended to prepare entity relationship diagrams at the conceptual, logical and physical levels.

Data architects of respective information systems are responsible for this activity.

The Data governance board should mandate maintaining data flow diagrams to model
the movement of data within and between the systems including but not limited to
maintaining master profiles. Governance Checkpoints within SDLC should be
established to ensure data flow diagrams are maintained.
Data architects of respective information systems along with change management
team are responsible for this activity.

Scope of the specification is to cover the data model for all applications in the
enterprise. The Data Assessment program will establish the Conceptual Data Model
(CDM) for the newly proposed Business Intelligence and Analytics Platform.
The conceptual model is defined at the subject area and entity level. While developing logical and physical models, subject areas and logical groupings of entities below subject areas should be created, depending upon the number of entities in each subject area. While creating data models for other applications in ADCMA, a similar approach should be followed.
Data modellers of respective information systems are responsible for this activity.
Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. This should be maintained along with each of the business intelligence
initiatives implemented in the Business Intelligence and Analytics platform. Data model artefacts should differentiate between the components of the model that are implemented and those that are not. Data modellers should provide guidelines to data model teams on a unified way of representing this.
Data modellers of respective information systems are responsible for this activity.

ADCMA should ensure that the data models are maintained by adhering to the rules defined by the ADCMA DMS.
When designing new conceptual data models, the following rules should be adhered to:
• Data objects are represented by nouns
• Data relationships are represented by verbs

The following rules should be adhered to when designing new logical data models:
• The appropriate data type shall be used for attributes within tables. This shall take into account performance, storage, and data requirements. Where a String or other variable character data type is used, consideration must first have been given to more appropriate data types

The following rules should be adhered to when designing new physical data models:
• Primary keys shall be numeric. Where there is no suitable numeric candidate key, a surrogate key in the form of an auto-numbering key shall be used
• Reference data tables shall have a numeric primary key (likewise, tables that use reference data tables shall use the reference table's numeric primary key in the foreign key relationship)
• Reference data tables will have, at a minimum, a numeric primary key and a code value represented as a string. Additional payload information (such as textual descriptions) may also exist as reference data (see RM.2.3)
• Physical data types that have a length or precision specifier shall have an appropriate length or precision specified, and not be left to the default value
Data modellers of the respective information systems are responsible for this activity.
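As an illustration only, the physical-model rules above can be expressed directly as DDL. The sketch below uses Python with an in-memory SQLite database so it is self-contained; the table and column names are hypothetical and not part of the ADCMA DMS.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Reference data table: numeric surrogate primary key plus a code value as a
    # string, with optional payload (textual description) -- per the rules above.
    conn.execute("""
        CREATE TABLE ref_media_type (
            media_type_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- numeric surrogate key
            code          VARCHAR(10) NOT NULL UNIQUE,        -- code value as a string
            description   VARCHAR(200)                        -- payload information
        )
    """)

    # Transactional table: numeric primary key; the foreign key uses the reference
    # table's numeric primary key, never its code value.
    conn.execute("""
        CREATE TABLE publication (
            publication_id INTEGER PRIMARY KEY AUTOINCREMENT,
            title          VARCHAR(250) NOT NULL,             -- explicit length, not the default
            media_type_id  INTEGER NOT NULL REFERENCES ref_media_type(media_type_id)
        )
    """)

    conn.execute("INSERT INTO ref_media_type (code, description) VALUES ('VID', 'Video content')")
    conn.execute("INSERT INTO publication (title, media_type_id) VALUES ('Launch film', 1)")
    print(conn.execute(
        "SELECT p.title, r.code FROM publication p JOIN ref_media_type r USING (media_type_id)"
    ).fetchall())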

This specification is applicable for Entities that have MDM implemented. Currently
ADCMA does not have MDM needs and this specification is not applicable.
Currently, ADCMA does not have a business metadata / glossary maintenance process to define business terms. With the Data assessment project, the data architecture considers (Business and Technical) metadata management for the Business Intelligence and Analytics platform. Business terms used in the data model for the BIA platform and the business glossary should be in sync. For business terms other than the ones used in the BIA platform, ADCMA should ensure that the respective business glossary is maintained.

Data architects of the respective information systems should be responsible for this activity.

Currently, ADCMA does not have a (Business and Technical) metadata / glossary maintenance process to define business and technical terms. With the Data assessment project, the data architecture considers (Business and Technical) metadata management for the Business Intelligence and Analytics platform. Technical definitions for all business terms under ADCMA's ownership should take input from the logical and physical data models. Technical definitions should be populated within the data dictionary of ADCMA's data catalogue. Only Business Intelligence and Analytics platform related technical metadata are currently planned; technical definitions should also be planned for the other ADCMA systems.
Data architects of the respective information systems should be responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Considering that the conceptual data model delivered will be a living document, data model versions should be maintained whenever there are updates to the conceptual data model in future. In line with this, versions of the respective logical and physical data models should also be maintained, with appropriate naming conventions.
Data modellers of respective information systems are responsible for this activity.

ADCMA should maintain traceability between the different views of each data model maintained for its information systems. Some standard data modelling tools allow traceability links to be maintained between the different views (Conceptual, Logical and Physical) of the same model. ADCMA should plan to use a data modelling tool that allows traceability between the different views of the data model. Lower-level identifiers should be used from the subject area down to its lowest level.
Data modellers of respective information systems are responsible for this activity.
This specification is applicable to all information systems of ADCMA that maintain data models. Data modellers of the respective information systems should define the mandatory metadata to be captured along with data model changes.
The Data governance board should provide guidelines on the metadata to be captured with data model changes, and data modellers of the respective information systems should ensure the metadata is reviewed in the SDLC phases.
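As a minimal sketch of what such mandatory metadata could look like (every field name below is an assumption for illustration, not a defined ADCMA standard), a data model change record might be captured as follows:

    # Hypothetical change record accompanying a data model change; the fields
    # shown are assumptions that the Data governance board would refine.
    data_model_change_record = {
        "model_name": "BIA Conformed Layer",            # hypothetical model name
        "model_view": "logical",                        # conceptual | logical | physical
        "version": "1.4.0",
        "change_description": "Added engagement-metrics entities",
        "changed_by": "data.modeller@example.gov",      # placeholder identity
        "reviewed_by": "data.architect@example.gov",
        "sdlc_phase": "design",                         # checkpoint where the review occurred
        "approval_reference": "DGB-2023-017",           # hypothetical board reference
        "date": "2023-05-14",
    }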

The Data governance board should mandate data model version maintenance. If the data model tool being used does not have versioning enabled, an external version control repository or document management system should be used to manage data model versions.
Data modellers of respective information systems should follow the guidelines
from data governance board on version maintenance.
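As a minimal sketch, assuming the modelling tool can export each model to a file and that git is available as the external version control repository (a document management system would serve equally well):

    import subprocess

    MODEL_FILE = "models/bia_logical_model.xml"   # hypothetical export path

    def commit_model_version(version: str, message: str) -> None:
        """Record an exported data model file in an external git repository."""
        subprocess.run(["git", "add", MODEL_FILE], check=True)
        subprocess.run(["git", "commit", "-m", f"{version}: {message}"], check=True)
        subprocess.run(["git", "tag", f"model-{version}"], check=True)  # immutable version marker

    commit_model_version("1.4.0", "Added engagement-metrics entities")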

ADCMA should develop an enterprise-wide data model that represents an organization-wide view of all data that is central to ADCMA's core business functions.
The Enterprise architect, in co-ordination with the data modeller groups, is responsible for this activity.

It is recommended to ensure that changes to a data model and its metadata go through the approval of the data governance board for the respective information systems. The data assessment program will make recommendations on the data governance board.

ADCMA should work on creating the ADCMA enterprise data model. While developing new data models or amending existing data models for individual information systems, the respective changes should be aligned to ADCMA's enterprise data model.
Data modellers of respective information systems are responsible for this activity.

ADCMA should align the Enterprise Data Model with new information systems within ADCMA as they emerge.
Enterprise architect is responsible for this activity.

For all the information systems of ADCMA, conceptual data models should be
created to support architecture, development and operational processes. Data
governance board should enforce governance checkpoints during system
development lifecycle to review data model artefacts.
Data modellers of respective information systems are responsible for this activity.
Conceptual Data models for existing ADCMA applications should be created and
maintained with each project impacting the data models of respective information
systems.
Creating the conceptual data model should include, but not be limited to:
• Interviewing stakeholders, or otherwise undertaking business functional analysis
and requirements gathering to understand all relevant business concepts and
requirements
• Identifying candidate data profiles related to business processes, and capturing
associations between these profiles
• Combining candidate data profiles – as appropriate – into master data profiles,
transactional data profiles and reference data profiles, and modelling the high level
relationships between the data profiles
Data modellers of respective information systems are responsible for this activity.

ADCMA should consider the definition of conceptual model components at the information system and enterprise levels. Data modellers should ensure that they view the data model components in line with the specific information system's view.
Data modellers of the respective information systems and the enterprise architect are responsible for this activity.

This specification is applicable to all information systems of ADCMA. The Data Assessment program will establish the Conceptual Data Model (CDM) for the Business Intelligence and Analytics platform. The respective logical and physical data models should be defined starting from the conceptual data model. The other information systems of ADCMA should consider developing conceptual data models; these should be documented and referenced for creating logical and physical data models.
Data modellers of the respective information systems are responsible for this activity.

ADCMA currently does not maintain any information that could be categorized as
master profiles. Digital products extract data from multiple digital platforms. There
are some posts for which responses are collected from different systems.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. A logical data model should be created in line with the conceptual data model, describing the data attributes and the relationship rules between the profiles.
Data modellers of respective information systems are responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and physical data models should be created in line with the conceptual data model. While ingesting data into the conformed layer, duplication should be avoided to the extent possible. The logical modelling of relationships between entities should describe referential integrity and normalisation concerns. De-normalisation should be preferred in the data marts rather than in the core model objects.
Data modellers of respective information systems are responsible for this activity.

Logical models should be independent of the technology to be used for the physical implementation of the data models. While defining physical models, the elements of the specific physical environment to be used could be considered.

As part of the Data assessment program, a conceptual data model will be created for the new Business Intelligence and Analytics platform. A logical data model should be created in line with the conceptual data model for the Business Intelligence and Analytics platform. For the other information systems within ADCMA, governance checkpoints should be enforced to ensure that logical data model artefacts are delivered; these can be used for physical data models, impact assessments and / or gap analysis between current and target state data models.
The Data working group of the respective information systems should be responsible for this activity.
As part of the Data assessment program, a conceptual data model will be created for the new Business Intelligence and Analytics platform. A logical data model should be created in line with the conceptual data model for the Business Intelligence and Analytics platform. For the other information systems within ADCMA, governance checkpoints should be enforced to ensure that physical data model artefacts are delivered; these can be used for impact assessments and / or gap analysis between current and target state data models.
The Data working group of the respective information systems should be responsible for this activity.

Physical data models for the respective information systems should be kept up to date with each project implementation that could impact them. These data models can be utilized to understand the relationships between the different entities.
Data modellers of respective information systems are responsible for this activity.

Data models for existing ADCMA applications should be created and maintained
with each project impacting the data models. For new Business Intelligence and
Analytics platform, conceptual data model will be created with Data Assessment
project. Logical and physical data models should be created in line with the conceptual data model in a standard data modelling tool. For the other information systems, ADCMA should create conceptual, logical and physical data models. ADCMA is recommended to use standard data modelling tools which allow the different views (logical and physical) of data models to be linked.
Data modellers of respective information systems are responsible for this activity.

Currently, ADCMA does not maintain data models for existing applications. ADCMA should consider reverse engineering data models from the existing supported information systems to create baseline physical data models, and then reverse engineer towards logical and conceptual data models from the physical data models.
Data modellers of respective information systems are responsible for this activity.
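A minimal sketch of that reverse-engineering step is shown below against an in-memory SQLite database so it is self-contained; a production system would query its own catalogue views (e.g. INFORMATION_SCHEMA) instead. The captured structure forms the baseline physical model from which logical and conceptual views can be abstracted.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE career_application ("
        "app_id INTEGER PRIMARY KEY, applicant_name TEXT, submitted_on TEXT)"
    )

    baseline = {}
    for (table_name,) in conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
        # PRAGMA table_info yields (cid, name, type, notnull, default, pk) per column
        columns = conn.execute(f"PRAGMA table_info({table_name})").fetchall()
        baseline[table_name] = [
            {"column": c[1], "type": c[2], "nullable": not c[3], "primary_key": bool(c[5])}
            for c in columns
        ]

    print(baseline)  # starting point for the baseline physical data model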

Considering the number of information systems currently managed within ADCMA, and knowing the additional systems to be introduced, it is recommended to create and maintain an enterprise architecture framework in line with standard enterprise architecture frameworks, including but not limited to TOGAF.

Enterprise architect is responsible for this activity.


ADCMA should maintain component models for ADCMA information systems. Currently, as part of the Data Assessment project (DLV3), datasets for the different information systems of ADCMA are created. These datasets should be enhanced to create and maintain a data profile function matrix for ADCMA applications. Data lifecycle models for ADCMA information systems should also be considered for maintenance. In line with data security and compliance standards, ADCMA should maintain key security touch points. Measures should be taken to profile ADCMA information systems for data quality, and data quality measurement should be considered part of the Change management process. Currently, ADCMA information systems do not have a standard process for making data model changes. The Data Governance board should consider creating stage gates in the Systems Development Life Cycle to ensure data model changes are reviewed and approved by the respective system's data model governance body.

Data architects for the respective systems are responsible for this activity.

As part of the Data assessment program, for the new Business Intelligence and Analytics platform, a data architecture will be recommended based on the understanding of ADCMA's business needs. For the other systems, ADCMA should consider data architecture deliverables for:
• Data quality tooling including data profiling and cleansing
• Data security and privacy systems
• Open data management systems
• Document and content management, or workflow systems
• ERP, CRM, HR, Finance, Procurement, Audit, Legal and any other specialist
information systems appropriate to ADCMA

Data architects for respective information systems should be responsible for this
activity

ADCMA should classify the architectural elements of existing ADCMA systems as Emerging, Current, Strategic or Retirement, where the new Business Intelligence and Analytics platform related architectural elements would be classified as Strategic. Other architectural elements related to silo reporting should be classified accordingly.

Data Architects should be responsible for this activity.

All applicable statistical and geospatial data will be considered as part of the suggestions in the planned Data Architecture framework for the new Business Intelligence and Analytics platform. Outside the Business Intelligence and Analytics platform, adherence to specialist data architecture standards should be taken care of by ADCMA.

Data architects should be responsible for this activity.


As part of the Data Assessment project, a target state data architecture for the new Business Intelligence and Analytics platform will be proposed. ADCMA should consider creating and maintaining a baseline data architecture for all the information systems.
Data architects of the respective information systems should be responsible for this.

As part of the Data Assessment project, a baseline data architecture for the new Business Intelligence and Analytics platform will be proposed.
The data architecture document covers business and technical requirements, the integration framework covers the data architecture themes, and the risk assessment report for the enterprise BI roadmap covers the known constraints of the Business Intelligence and Analytics platform.
ADCMA should consider creating and maintaining an enterprise data architecture for all the systems supporting key business functions.
Data architects for the respective information systems are responsible for this activity.

For all information system assets, ADCMA should maintain the current state architecture along with the target state architecture. Gaps between the current state and target state architecture should be documented as architecture gaps to be bridged. All projects that could impact the architecture capabilities of an information system should ensure that the current state architecture document, as well as the document of gaps against the target state architecture, is updated. All new business use cases identified should be run against the target state architecture capabilities of the specific information systems, and amendments made to the target state architecture as required (and to the gaps between current state and target state architecture).
The Data governance board should have stage gates in the system development life cycle to ensure that the current state architecture, and its gaps against the target state architecture, are maintained.

Data architects of respective information systems are responsible for this activity.

The Data governance board, along with the data architects, should enforce stage gates in system life cycle management to ensure that the current state architecture is updated with all projects that impact the data architecture. While updating the current state architectures, versions should be maintained.

Data architects of the information systems are responsible for this activity.


In line with earlier suggestions, ADCMA should consider creating a target state enterprise architecture. All ADCMA information systems should ensure that their information system architecture is in line with the enterprise architecture.

The Enterprise architect is responsible for this activity.

ADCMA should create target state architectures for its information systems. The target state architecture should be reviewed for all business use cases identified for the specific information systems, and amendments made to the target state architecture (and to the gaps between current state and target state architecture) as needed. In the different phases of the SDLC (ideally multiple phases, including design phase closure, build phase closure and pre-implementation), there should be checkpoints to validate changes to the current state architecture in line with the target state architecture.

Data architects are responsible for this activity.

The target state (enterprise / system) data architecture should ensure that business and technology requirements are addressed. Any new business and technology requirements identified should be checked against the target state architecture, and the architecture amended if required. The target state architecture should:
• Encourage data integration across the Entity between information systems and services
• Seek removal of duplication in terminology
• Seek to remove duplication of data processes
• Seek alignment of reference and master data across the Entity's systems
• Align with emerging ADCMA-wide technology platforms
• Integrate with ADCMA-wide reference and master data services and standards as they emerge
• Show re-use of data and system architectures both within the Entity itself and through collaboration with other Entities
• Be influenced by the data management requirements emerging from the data quality, data security, data privacy, data integration and interoperability, and data storage domains, both within the Entity and as delivered from central government programmes

The Enterprise architect is responsible for the enterprise architecture; data architects of the respective information systems are responsible for the individual system architectures.
The target data architecture should influence the technology and data requirements for system changes, in addition to the standard business and quality (non-functional) requirements. Data architects should consult business teams and should consider the target state architecture capabilities to address short and long term business use cases, thus influencing the technology options chosen to meet the data architecture.
Data architects of respective information systems are responsible for this activity.

ADCMA currently does not have an Enterprise architecture defined. Once the target state enterprise architecture is defined and implemented, ADCMA should ensure that a current state enterprise architecture is created and the gaps between the current state and target state architectures are identified.

Enterprise Architect is responsible for this activity.

ADCMA currently does not have an Enterprise architecture defined. Once the target state enterprise architecture is defined and implemented, ADCMA should create a current state enterprise architecture and identify the gaps between the current state and target state architectures. The roadmap to reach the target state enterprise architecture should be revisited when there are changes to the current state (new initiatives to be implemented) or to the target state enterprise architecture (new information systems to be introduced, existing information systems to be retired, etc.).

Enterprise Architect is responsible for this activity.

ADCMA currently does not have an Enterprise architecture defined. Once the target state enterprise architecture is defined, all information systems within ADCMA should align to it.
Data architects of the information systems are responsible for this.

ADCMA currently does not have an Enterprise architecture defined. Once the target state enterprise architecture is defined, the effectiveness of the roadmap implementation should be reported by identifying the gaps between the current state and target state enterprise data architectures.
Enterprise architect is responsible for this activity.

Implement a comprehensive Data Quality framework defining the DQ validations across data collected from multiple sources for the BI and Analytics platform, including a detailed Data Quality Audit framework. The Data Assessment programme will provide the DQ best practices and a baseline for the DQ framework.
As next steps to the Data Assessment Program, it is recommended to proceed with the Data Catalogue implementation for the identified datasets.

Along with the DQ framework implementation, the DQ metadata for the identified datasets will need to be defined in the data catalogue.

It is also recommended to define and implement DQ monitoring metrics along with the DQ implementation programme.

The Data Assessment programme will provide the DQ best practices along with a recommended DQ checklist. The DQ checklist will need to be automated along with the DQ implementation.
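A minimal sketch of such automation is shown below, assuming pandas is available; the column names, checks and pass threshold are illustrative assumptions rather than the recommended checklist itself.

    import pandas as pd

    df = pd.DataFrame({
        "post_id": [1, 2, 2, 4],
        "platform": ["instagram", "x", "x", None],
        "engagement": [120, -5, 40, 88],
    })

    metrics = {
        # Completeness: share of non-null values in a mandatory column
        "platform_completeness": df["platform"].notna().mean(),
        # Uniqueness: share of non-duplicated identifiers
        "post_id_uniqueness": (~df["post_id"].duplicated()).mean(),
        # Validity: engagement counts must be non-negative
        "engagement_validity": (df["engagement"] >= 0).mean(),
    }

    failures = {name: value for name, value in metrics.items() if value < 0.95}
    print(metrics)
    print(failures)  # feeds DQ monitoring, SLA reporting and audit evidence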

As next steps to the Data Assessment Program, it is recommended to implement a Data Quality workstream inclusive of DQ framework definition, DQ audit mechanisms, DQ SLAs, and DQ metrics monitoring and reporting, as applicable to systems across ADCMA.

It is recommended that, while defining the Data Modelling and Master Data management design, the Data Quality application to the master profiles and the ability to audit the implementation with appropriate DQ metrics must be implemented.

The Data Architect, along with the Technical Data Architect and under the direction of the Data Manager, shall define and apply the DQ SLAs to externally procured datasets.
The Data Manager will use the recommended DQ tools and build the business case for implementation of DQ tools across the organization. It is recommended to prepare a comprehensive roadmap/plan of the DQ tool implementation as part of the DG Operationalization programme.

While defining the Data Catalogue and Metadata Management design, the Data
Quality measures used for auditing will be stored with the Data Catalogue.

The DQ Architect (part of the DG DWG) will table the recommendations for DQ improvement initiatives to the DG Board for review and approval.

As next steps to the Data Assessment programme, the ADCMA ISMS Data Security
Policy V1.0 needs to be augmented to align with the Information Security
Standards defined in the DSP data domain covering architecture components.

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.
Implement Information Security standards while defining the standards required for sharing datasets as "Open Data". The data security and privacy definitions must be applied to all datasets/data attributes deemed to be shared as "Open Data" and reviewed/approved by the Data Governance board.
The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS
Data Security Policy V1.0 with the Information Security Standards defined in the
DSP data domain covering architecture components.

The Data Manager shall work with the Data Architect/Technical Steward from the
Data Security and Privacy domain as part of the DWG to define the Data Privacy
"Metadata" for the Master profiles. This activity can be done while implementing
Data Catalogue or Metadata Management "Data Classification" at the attribute
level.

The Data Manager shall work with the Data Architect/Technical Steward from the Data Security and Privacy domain as part of the DWG and align the ADCMA ISMS Data Security Policy V1.0 with the Information Security Standards defined in the DSP data domain covering architecture components. To comply with this specification, it is recommended to cover the 'mosaic effect' within the "Data Classification" process.

The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy

Along with the Data Privacy Policy, it is recommended to define "Privacy by Design" principles, integrated with the Data Privacy Standards and the general Data Management Programme standards.

The existing Cyber Security awareness programme will need to be integrated with
the awareness module for Data Privacy

As next steps to the data assessment programme, along with the Data Privacy Policy, it is recommended to define "Privacy by Design" principles, integrated with the Data Privacy Standards and the general Data Management Programme standards.

It is recommended to perform periodic audits of ADCMA data sources to ensure that the data being processed follows the "Privacy by Design" principles.

The Data Privacy policy should incorporate the "Privacy by Design" principle which
will be integrated with the Data Governance checkpoint process for review and
approval. The DSP.3.3 specification which defines audit capabilities will need to be
integrated with the data governance checkpoint process.
As next steps to the data assessment programme, it is recommended to define and implement the Privacy Management process and workflow, with specific metrics built around Privacy Management that can be audited.

The specification is 'Partially Implemented' for DAM (Database Activity Monitoring) using the SIEM solution.

For DLP, it is recommended to conduct Data Classification for priority datasets before implementing the DLP solution.
As part of Apex One, there are two plugins available; these plugins are not enabled as of now:
• Apex One Data Protection plugin
• Apex One DLP (Data Loss Prevention policies)
https://success.trendmicro.com/dcx/s/solution/1059472-installing-and-configuring-officescan-data-loss-prevention-dlp-plug-in?language=en_US&sfdcIFrameOrigin=null

For Data Discovery across the IT landscape, including desktops/laptops: once the Data Classification is done and a DLP solution is being considered/implemented, the Data Discovery specification will be addressed. With the DLP solution from Trend Micro, Data Discovery can also be performed.
https://docs.trendmicro.com/all/ent/dlp/v5.6/en-us/dlpe_5.6_olh/dat_dat-dsc.html#id124TL0QH0E9

As next steps of the Data System Protection evaluation, it is recommended to consider classification of data, and masking it when sharing data within the organisation (production environment to lower environments). The Data Quality Standards will address the need for data masking as applicable to sensitive/classified/PII data.
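A minimal sketch of one masking approach, assuming pandas: deterministic hashing pseudonymizes a classified column while preserving referential consistency across tables. The column names and salt handling are illustrative assumptions.

    import hashlib
    import pandas as pd

    SALT = "rotate-me-per-refresh"  # assumption: managed as a secret, not hard-coded

    def pseudonymise(value: str) -> str:
        # Same input always yields the same token, so joins still work downstream
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

    prod = pd.DataFrame({"emirates_id": ["784-1990-1234567-1"], "city": ["Abu Dhabi"]})
    lower_env = prod.assign(emirates_id=prod["emirates_id"].map(pseudonymise))
    print(lower_env)  # classified column masked; non-sensitive columns pass through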
ADCMA to define the infrastructure, data center and cloud enablement policy across the business applications based on the business application criticality assessment, and define the cloud policy.

ADCMA to define the infrastructure, data center and cloud enablement policy across the business applications based on the business application criticality assessment, and define the cloud policy.
ADCMA needs to conduct a business application criticality and availability assessment and determine the applications that need cloud enablement on G42 based on the criticality scoring of each application.

ADCMA to define the infrastructure and data center standards and policy. This will require processes and roles to be enabled in line with the governance operating model.

ADCMA to define the infrastructure, data center and cloud enablement policy across the business applications based on the business application criticality assessment, and define the cloud policy.

ADCMA to define the infrastructure and data center standards and policy. This will require processes and roles to be enabled in line with the governance operating model.

ADCMA should conduct an assessment of G42 cloud enablement and the associated cost of re-platforming applications, and develop a benchmark of the data center costs.

ADCMA at the current state does not have this requirement. However, future enablement of the G42 cloud is to be evaluated based on the BIA assessment.

ADCMA at the current state is managing this effectively. However, future enablement of the cloud is to be evaluated based on the BIA assessment.

ADCMA needs to evaluate the backup strategy based on the cloud enablement strategy as part of the BIA assessment.

ADCMA to work on a data center and application BCP strategy and DR roadmap on
G42 in 2023.

As part of the BIA system governance implementation program, ADCMA must conduct a thorough review of the data storage controls and develop the data storage and retention strategy and implementation plan.
As part of the BIA system governance implementation program, ADCMA must conduct a thorough review of the data storage controls and ownership, and develop the data storage and retention strategy and implementation plan.

As part of the BIA implementation, ADCMA will have a well-defined governance operating model, and the data governance architect will lay out the specifics of the governance procedure and technical framework for implementing data life cycle management.

The Data Manager will appoint the Integration Data Architect as part of the BIA
implementation programme. The Integration Architect shall work with Data
Manager to propose a design of the "Data Integration framework/layer" within the
proposed Business Intelligence and Analytics platform. The Data Manager will audit
the requirements called out in this specification.

The Data Integration architect shall document a detailed set of frameworks, standards and guidelines ensuring alignment of the Data Integration Platform for Metadata Management.

Further to the "Data Integration framework", it is recommended that the Data


Integration architect will define the design specifications of the Data Integration
Platform and conformance to this specification.

The "Trusted Data Sharing Framework" will use the "Data Integration Framework"
as input to define a comprehensive set of standards for "Data Sharing" with
Internal, external and trusted third parties. It is recommended to cover the
following areas while defining the "Trusted Data Sharing Framework";
- Data Sharing Strategy
- Legal & Regulatory Considerations
- Technical & Organizational Considerations
- Operationalizing Data Sharing
The Data Architect, in agreement with business and under the DG board's guidance, should revisit, brainstorm and explore the current and possible future data feeds which may be required into or out of the system and may be included in the Strategic Integration Platform. Re-use of data feeds should also be considered.

The Data Integration architect shall work with the Data Manager and Data Owners
to identify dataset exchange between Entities and define the process to exchange
datasets using ADDA ESB layer. The data integration document shall describe the
process and policy for ADCMA systems to exchange data with other Entities using
the ESB layer.

The Data Integration Architect shall define the data exchange process and adhere
to Information Security Standards defined in Data Security and Privacy.

The BIA platform's Data Integration layer will define the Data Exchange methods. It
is recommended that while designing the "Strategic Integration Platform" these
specifications on data exchange method are taken into consideration.

Migrating peer-to-peer connections via the SIP may not be applicable to ADCMA in the immediate or near future, although the BIA platform will apply this specification, limited to the identified data sources being integrated with the BIA platform.
The data architect responsible for data integration across ADCMA will be responsible, along with the other data architects, for adhering to the controls.

The BIA platform will need to have the capability to broker (transform) file-based data exchanges and message-based data exchanges via its integration layer. The Data Integration Architect will work with the Data Manager to define the appropriate broker interactions while working on the BIA Data Integration design specification.

The Data Integration Architect shall work with the Data Manager and comply with this specification while designing the BIA platform integration layer.
It is recommended to implement one-way integration while designing the BIA
platform. Use the broadcast method to publish the dataset/data service to
downstream applications/systems.

If, in future, a requirement arises for the BIA platform to be extended for two-way or interactive integration for an identified data source or information system, proper justification will be provided; the Data Governance Board and the respective data architects will own and drive the activity as and when required.

The high-level plan for the BIA platform, for the identified data sources and information systems, will incorporate the required constraints: detection of data delivery failure, repeatable/idempotent retries, statelessness and high availability.
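A minimal sketch of how these constraints fit together is given below; the send function and idempotency-key convention are assumptions, not the BIA platform's actual interface. The sender detects delivery failure, retries with backoff, holds no state between calls, and the receiver de-duplicates on the key so retries are idempotent.

    import time
    import uuid

    def deliver(payload: dict, send_fn, max_attempts: int = 5) -> bool:
        """Deliver payload at-least-once; the key lets the receiver de-duplicate."""
        idempotency_key = str(uuid.uuid4())     # same key reused on every retry of this payload
        for attempt in range(1, max_attempts + 1):
            try:
                send_fn(payload, idempotency_key)
                return True
            except Exception:                   # delivery failure detected
                if attempt == max_attempts:
                    return False                # surface to the escalation process
                time.sleep(2 ** attempt)        # exponential backoff between retries
        return False

    sent_keys = set()
    def demo_send(payload, key):                # stand-in receiver with de-duplication
        if key in sent_keys:
            return                              # duplicate delivery ignored -> idempotent
        sent_keys.add(key)

    deliver({"dataset": "daily_engagement"}, demo_send)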

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs. This is to be done against the business requirements.
SLA best practices and guidelines will be provided as part of the deliverables.
Existing contracts with service providers should be reviewed in light of the guidelines.

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs. This is to be done against the business requirements.

Should be planned in the next phase.

The data architects defined for every applicable domain (e.g. data integration, data modelling, metadata) should define the enterprise-level data operability and SLAs. This is to be done against the business requirements.

Escalation metrics are to be planned along with the Data Governance Board for any failure.

Should be planned in the next phase.


ADCMA should perform a review of all of its data sources (structured and unstructured) in a systematic audit, using its Risk Assessment process, to consider them as Open Data.
All data sources should be deemed 'Open' unless there is a quantifiable reason for keeping the sources closed.
The criteria and decision log for closing a source are to be documented and reviewed regularly (the usual preference would be annually) by the ADCMA Data Governance Board.
In the event that data quality is a concern for not considering a source as open data, a remediation plan with a clear open data quality threshold is to be put in place to allow publication.
ADCMA should define the extent of the data source that is to be made available to users that are both internal and external to ADCMA, and should include definitions of what constitutes an internal or an external user.

Data Governance board should define Open data Policies. Data working group
should perform review of all data sources to be considered for Open Data in-line
with Open data policies defined.

ADCMA should keep systematic records of its opened data sources with a clear explanation of their Open Status (Open or Closed). ADCMA should provide a definition in the Data Catalogue for each open data set, written clearly and in plain language in line with the context of its business.
Data working group should maintain the systematic records for the data sources.

All datasets that are deemed ‘open’ in the Open Data Review exercise of ADCMA
are to be made available through:
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in machine-readable form (this could include formats like delimiter-separated values (CSV), XML and JSON, along with their metadata)
• The Open Data Portal (an adjunct of the Abu Dhabi Portal) in human-readable form (where practicable), i.e., providing metadata in support of data published as open data

Data working group to seek approvals from respective data owners on which
datasets could be considered to publish. Prioritized and approved data sources to
be considered for publication by Data working group.
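A minimal sketch of publishing one dataset in machine-readable form with a metadata sidecar is shown below; the metadata fields are assumptions, and the actual portal upload step is out of scope.

    import csv
    import json

    rows = [{"month": "2023-01", "campaigns": 14}, {"month": "2023-02", "campaigns": 11}]

    # Machine-readable form: delimiter-separated values
    with open("campaigns_by_month.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["month", "campaigns"])
        writer.writeheader()
        writer.writerows(rows)

    # Human-readable support: metadata published alongside the dataset
    metadata = {
        "title": "Campaigns by month",
        "description": "Monthly count of public media campaigns.",  # plain language
        "format": "CSV",
        "open_status": "Open",
        "data_window": "2023-01 to 2023-02",     # age of the data
        "quality_note": "Passed the open data quality gate",
    }
    with open("campaigns_by_month.metadata.json", "w", encoding="utf-8") as f:
        json.dump(metadata, f, indent=2)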
ADCMA should ensure that, to the extent possible, all data is made available in the form closest to the source, i.e., datasets should be closest to the data as collected.
Data should not be manipulated, aggregated, redacted, anonymized or obfuscated to the extent possible and allowable, with due regard for privacy and security concerns.
Where such concerns exist, aggregation, redaction, anonymization, obfuscation and other manipulations should be carried out to the minimum extent possible to alleviate the concern.
The following should be considered:
• Is it reasonably likely that an individual can be identified from those data and
from other data?
• What other data is available, either to the public or to researchers or other
organizations?
• How and why could your data be linked to other datasets?
• What is the likelihood of re-identification being attempted?
• What is the likelihood the re-identification would be successful?
• Which anonymization techniques are available to use?
• What is the quality of the data after anonymization has taken place, and whether
this will meet the quality gate for this data set’s Open Data release?

The Data Architect for Open data publication should ensure that Open data is published in the form closest to the source as possible.

ADCMA team should develop an Open Data Plan, to release the data that is
identified as Open data to publish through the Open Data Portal.
The Open Data Plan shall allow for:
• The dataset to be reviewed and duly approved by data governance committee for
release as Open Data
• Data Quality assessment should be done for the datasets that are considered to
be published as Open.
• Any aggregation, redaction, anonymization or obfuscation required for privacy or
security concerns has been approved and undertaken
• The dataset to be released once it has passed its Open data review, Data quality
checks.

Data working group should publish Open Data Plan to publish Open data in-line
with the data owners approval and prioritization done by Data Governance Group.
ADCMA should ensure that the Open Data Plan prioritizes the release of Open Data. Some of the criteria that could be used include, but are not limited to:
• Addressing security and privacy concerns
• Addressing the business priorities of ADCMA
• Addressing the demand from third parties for data
• Addressing the measurable quality of the data
Data working group to prioritize Open data to be published in the Open Data plan.

ADCMA should ensure that the Open Data Plan systematically addresses all of the
datasets identified in the Open Data Review.

Data working group to ensure that open data plan systematically addresses all of
the datasets identified.

ADCMA should ensure that progress against the Open Data Plan is monitored, and
the plan is reviewed at regular frequency.

Data working group is responsible for this activity.

ADCMA should publish its Open Data in the Abu Dhabi Government Open Data
Portal.

Data working group is responsible for this activity.

ADCMA should take care to ensure that all Open Data that is published should be
reviewed regularly (Especially when related datasets are published by ADCMA or
other entities) and ensure that:
• The data continues to meet ADCMA's data quality definition
• Security and privacy concerns are continuously reviewed, specifically:
1. Is it reasonably likely that an individual can be identified from those data and
from other data?
2. What other data are available, either to the public or to researchers or other
organizations?
3. How and why could the published open data be linked to other datasets?
4. What is the likelihood of re-identification being attempted?
5. What is the likelihood the re-identification would be successful?
6. Which anonymization techniques are available to use?
7. What is the quality of the data after anonymization has taken place and whether
this will meet the quality gate for this data set’s Open Data release?
Data working group is responsible for this activity.
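As a minimal sketch of one such re-identification check, assuming pandas: a k-anonymity test over assumed quasi-identifier columns, where any combination occurring fewer than k times is a candidate for further aggregation or suppression before release.

    import pandas as pd

    df = pd.DataFrame({
        "age_band": ["20-29", "20-29", "30-39", "30-39", "30-39"],
        "emirate": ["Abu Dhabi", "Abu Dhabi", "Al Ain", "Al Ain", "Al Ain"],
        "value": [5, 7, 3, 9, 4],
    })

    QUASI_IDENTIFIERS = ["age_band", "emirate"]  # assumption: chosen during the review
    K = 3

    group_sizes = df.groupby(QUASI_IDENTIFIERS).size()
    risky_groups = group_sizes[group_sizes < K]
    print(f"k-anonymity satisfied: {risky_groups.empty}")
    print(risky_groups)  # combinations needing aggregation before release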
In the event that the published Open Data fails to meet its quality level or there are
concerns regarding security or privacy, ADCMA Team should:
• Suspend the publication of that dataset as Open Data
• Undertake a new Open Data Review for that dataset
• Establish and execute a mitigation plan for the new concerns and / or data
quality issue
• If necessary, relist the data as ‘Closed’ until such issues can be resolved

Data working group is responsible for this activity.

The Entity shall capture usage trends and statistics regarding access to the data
published as open data, and report these trends and statistics to the ADCMA Data
Governance Committee.

Data working group is responsible for this activity.

ADCMA should undertake annual awareness campaigns on Open Data to ensure potential users and stakeholders are aware of the existence, nature and quality of the Open Data being offered by ADCMA.
The awareness campaign needs to consider providing information on below:
• Progress of the Open Data Plan
• The need to inform and educate internal stakeholders
• The need to inform and educate external stakeholders
• The need to inform and educate the wider public
The awareness campaign should include:
• Details on where to find Open Data
• Details on where to find the Open Data Catalogue
• Information on privacy and security concerns, including (in a general sense) the
provisions made for:
1. Aggregation
2. Redaction
3. Anonymization
4. Obfuscation
• Explanations in plain language on the type of data and its context
• An indication on the Age (or Age Window) of the data
• An Indication on the quality that can be expected form the data

Business teams, along with the Data Governance Group, are responsible for this activity.
In the event that ADCMA does not publish a dataset or datasets, it shall use its annual awareness campaign to:
• Explain, to the extent possible, the reasons for withholding a dataset
• Indicate if and/or when a dataset will be published
• Provide a clear statement if a particular dataset is to remain unpublished for the foreseeable future
The Data Working Group is responsible for this activity.

a. Identification and classification of datasets as Reference Data or Master Data should be a recommended activity, since it is subsequent to defining "Datasets". A dataset contains "Data Elements", each of which can have the characteristics of "Master Data" or "Reference/Lookup Data".
b. MDM or RDM tools are not applicable, but the creation and maintenance of "Datasets" and their classification as "Master Data", "Transactional Data", etc. must be handled by the Data Governance team (namely the Data Manager, along with the Data Architect and Data Stewards from the DWG).

The Data Manager will work with the ECM Architect (Data Architect) and the Technical Data Steward from the ECM programme to identify the applicability of the quality standards for document and content management when deciding on the ECM product. The quality standards will align with the specifications.

The ECM programme will establish organisational document and content management standards that define guidelines for document writing, a uniform document experience (look and feel), document naming conventions, and document editorial processes.

The Data Manager will work with the ECM Architect (Data Architect) and the Technical Data Steward from the ECM programme to verify compliance with the DCM standards, namely:
- Document & Content Management Standards
- Document Type usage for business cases
- The ability to capture document metadata through the document lifecycle

Introduce document and content management policies defining what additions, alterations or redactions can be made to any document, and under what scenarios. This can be added as a policy to the existing "ADCMA_document_access_management_2.x.docx".

With the implementation of specialty content management tools like DAM/MAM and CMS, including the ECM programme, ADCMA can plan to attain "Fully Implemented" compliance.

Develop a Document Classification Scheme along with the ECM programme, establishing:
- Document standards
- Document Type usage for business cases
- The document metadata that needs to be captured through the document lifecycle (SharePoint already provides this functionality for documents and content); a metadata-record sketch follows
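A minimal sketch of the kind of document metadata record that could be captured through the document lifecycle; the field names are illustrative assumptions, not the ECM product's or SharePoint's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class DocumentMetadata:
    document_id: str
    title: str
    document_type: str            # from the agreed Document Type list
    classification: str           # e.g. "Open", "Internal", "Confidential"
    owner: str
    created_at: datetime
    retention_until: Optional[datetime] = None
    version: str = "1.0"
    audit_trail: list = field(default_factory=list)  # lifecycle events

doc = DocumentMetadata("DOC-0001", "Press Release Q1", "PressRelease",
                       "Internal", "Media Team", datetime(2023, 1, 5))
doc.audit_trail.append("2023-01-05 created")
print(doc)
```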

Introduce 'media content' retirement and disposal techniques for physical media.
While defining requirements for the ECM implementation programme, ensure the following are established:
- Document & Content Management Standards
- Document Type usage for business cases
- The document metadata that needs to be captured through the document lifecycle (SharePoint already provides this functionality for documents and content)

Monitoring and compliance checks of document systems and processes, including auditing and reporting on the performance of document management systems, the implementation of the document and (media) content retention and disposition policy, and the monitoring of user satisfaction, will be established with the operationalization of Data Governance. Once the document systems and processes are in place, the Data Governance Board can define a RACI matrix and plan regular audits of the content processes. The Data Architect can also introduce a feedback mechanism in line with the specification.

Include a detailed training plan for document and systems management as part of the Organizational Awareness stream of the Data Management Programme. The Entity should allocate a training budget for the ECM programme to develop self-paced learning modules covering the required specifications. Users should be offered certification once they have passed the criteria. The Entity should aim for all resources interacting with the ECM to be certified.

ADCMA's business intelligence and analytics roadmap is based on identified use cases from ADCMA business teams. Before the Business Intelligence and Analytics (BIA) platform is implemented, business teams should have a clear vision of the anticipated use cases; once the business use cases are fine-tuned, the roadmap should be re-validated and implemented. Establishing the business vision for the BIA is the responsibility of the Business Sponsor of the BIA programme. The Business Case template provided with DLV9 shall be used to provide input on the business vision, clearly articulating the business value expected from the BIA programme.
The BIA Programme Architect, under the direction of the Data Manager, will define the non-functional requirements, which form the baseline for defining the SLAs. These SLAs shall be approved by the Data Governance Board before implementation.
While implementing the Business Intelligence and Analytics platform, service level agreements should be defined by the platform owner covering at least the following (a sketch of how they might be recorded follows this list):
• Data warehouse availability (must consider user needs across locations)
• Data load latency
• Data retention period (should also consider audit and regulatory reporting needs, along with specific use case needs)
• Data quality
• Concurrent users (based on location and peak usage times)
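A sketch of how the NFR-derived SLAs listed above might be recorded for Data Governance Board approval and checked against observed platform metrics; all target values are placeholders, not agreed SLAs.

```python
# SLA targets mirroring the list above; every value is a placeholder.
BIA_SLAS = {
    "warehouse_availability_pct": 99.5,  # availability across user locations
    "max_load_latency_hours": 4,         # source-to-warehouse latency
    "retention_years": 7,                # audit and regulatory needs included
    "min_quality_score_pct": 95,         # agreed data quality threshold
    "max_concurrent_users": 100,         # sized for peak usage times
}

def sla_breaches(observed: dict) -> list:
    """Compare observed platform metrics against the SLA targets."""
    failed = []
    if observed["availability_pct"] < BIA_SLAS["warehouse_availability_pct"]:
        failed.append("warehouse_availability_pct")
    if observed["load_latency_hours"] > BIA_SLAS["max_load_latency_hours"]:
        failed.append("max_load_latency_hours")
    return failed

print(sla_breaches({"availability_pct": 99.1, "load_latency_hours": 3}))
```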

While implementing the BIA platform in multiple phases, the effectiveness of data warehouse initiatives should be measured at each logical point by the Business Sponsor of the BIA programme.
When measuring effectiveness, the evaluation should include, but not be limited to:
• Technical alignment with the architectural roadmap
• Implementation and usage experience (for example, a large deviation from the anticipated quality of data, deviations in warehouse performance, or deviations from the anticipated data volumes impacting platform performance)

When external datasets are identified, ADCMA should align with the external dataset providers on the following aspects:
1) A clear interface agreement covering functional and non-functional requirements for dataset sharing, including data refresh cycles, data quality requirements and other performance metrics
2) Service level agreements on dataset sharing
3) Clear ownership of the datasets within ADCMA and within the external supplier
4) A clearly defined issue resolution workflow (which the SLA definitions should call out)

The BIA Architect should ensure that ADCMA teams follow these guidelines when defining interface agreements; a data-contract sketch follows.
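An illustrative data-contract sketch for such an interface agreement; every value shown is an assumption used to demonstrate the shape of the agreement, not actual supplier terms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InterfaceAgreement:
    dataset_name: str
    external_provider: str
    adcma_owner: str                 # clear ownership within ADCMA
    provider_owner: str              # clear ownership within the supplier
    refresh_cycle: str               # e.g. "daily", "monthly"
    quality_requirements: str        # agreed data quality expectations
    issue_resolution_sla_hours: int  # issue-resolution workflow target

agreement = InterfaceAgreement(
    dataset_name="tourism_statistics",
    external_provider="SCAD",
    adcma_owner="Data Manager",
    provider_owner="SCAD Data Office",
    refresh_cycle="monthly",
    quality_requirements=">=98% completeness on mandatory fields",
    issue_resolution_sla_hours=48,
)
print(agreement)
```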

Depending on the complexity of the transformations required from the source data to the target data model in the Business Intelligence and Analytics platform, a data-staging environment should be used, in line with the data architecture defined for the platform.

The BIA Architect should define guidelines on using the data-staging environment when ingesting data from sources.
Data warehouse, business intelligence and analytics initiatives should consider the following data management disciplines: metadata management, data catalogue, data modelling and design, data architecture, data quality, data storage, data integration and interoperability, master data management, and reference data management.

For the new BIA platform, the Data Assessment programme will propose a data architecture covering the aspects of data management that ADCMA should consider for its BIA platform. The ADCMA BIA Architect should maintain the data architecture so that it meets newly identified ADCMA business cases.

During the identification of detailed use cases for the new Business Intelligence and Analytics platform, ADCMA should exercise opportunities to utilize external data that could aid and maximize business intelligence.

Available Open Data could be considered as one such external source, alongside any data that fills identified gaps in existing ADCMA datasets for the business use cases and is available from external sources (paid or Open Data).

The Data Architect (Integration Architect) should take the lead on identifying any external data to be sourced to meet the business use cases.

For better supportability, ADCMA should prefer Commercial Off-The-Shelf (COTS) or open source tooling over internally developed tooling, as the latter could potentially lead to performance, supportability and scalability issues.

The BIA Architect is responsible for ensuring this.

Considering that ADCMA is planning to implement a Business Intelligence and Analytics platform, usability should be favoured over ease of implementation, and the platform implementation should be designed for scalability and re-usability.

The Data Architect is responsible for ensuring this.

During physical data model creation, special-purpose table types should be considered for performance, standardization and data quality purposes. While designing the marts, consider creating staging, dimension and fact tables so that these objects can be re-used across different sources, marts and reporting needs.

The Data Modeler, in consultation with the Data Architect, should decide on special-purpose table types when modelling the data warehouse.
Surrogate keys should be used to avoid conflicts with any future data integrations. Data flows should be designed to address common problems, including but not limited to late-arriving dimensions; see the sketch below.

The Data Architect owns this activity.
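A minimal sketch of surrogate-key assignment in a dimension load, including one common handling of late-arriving dimensions via an inferred placeholder member; the table and column names are assumptions.

```python
# In-memory stand-in for a dimension table: natural key -> surrogate key.
dimension = {}
_next_sk = iter(range(1, 1_000_000))  # monotonically increasing surrogate keys

def surrogate_key(natural_key: str) -> int:
    """Return the surrogate key for a natural key, creating an 'inferred'
    placeholder member when the dimension row is late-arriving."""
    if natural_key not in dimension:
        dimension[natural_key] = next(_next_sk)
    return dimension[natural_key]

# Fact rows reference the dimension only through surrogate keys, so future
# source-system key changes or new integrations do not conflict.
fact_rows = [{"customer_sk": surrogate_key(nk), "amount": amt}
             for nk, amt in [("CRM-001", 120.0), ("CRM-999", 75.5)]]
print(dimension, fact_rows)
```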

While designing schemas (marts) for specific business use cases, the simplest possible schemas should be preferred. Each business process should be modelled in its specific mart tables so that the schema objects (dimensions, facts) remain re-usable.
Data Modelers, in consultation with the Data Architect, own this activity.

ADCMA should identify conformed dimensions while ingesting data into the Business Intelligence and Analytics platform, so that these dimensions can be re-used across multiple schemas; see the sketch below.
Data Modelers should work with the Data Architects to identify conformed dimensions that can be re-used across multiple fact tables.
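A small sketch of a conformed dimension: one shared date dimension referenced by fact tables in two different marts, so that a value like "Q1 2023" means the same thing in every report. The names and values are illustrative.

```python
# Shared (conformed) date dimension, keyed by a date surrogate key.
dim_date = {20230105: {"date": "2023-01-05", "quarter": "Q1", "year": 2023}}

fact_campaign_spend = [{"date_sk": 20230105, "spend": 5000.0}]   # marketing mart
fact_media_coverage = [{"date_sk": 20230105, "articles": 12}]    # media mart

# Both marts join to the same dimension, so time attributes stay consistent
# across every schema that uses it.
for fact in (fact_campaign_spend, fact_media_coverage):
    for row in fact:
        print(dim_date[row["date_sk"]]["quarter"], row)
```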

Within the data architecture, an area should be maintained that retains the source data without any manipulation, for debugging purposes in case any data quality or process issues are identified.

The BIA Architect should ensure that an unaltered version of each source file is maintained within the BIA platform, in line with the BIA target-state architecture.

ADCMA should develop performance metrics to control the quality, volume and timeliness of data within the data warehouse. These metrics should be reviewed on a regular basis to identify any potential SLAs to be defined; a metrics sketch follows.
The BIA Architect owns this and ensures performance metrics are maintained within the data warehouse.
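A sketch of the kind of per-load-run timeliness and volume metrics that could feed such SLAs; the metric names and the four-hour latency target are assumptions.

```python
from datetime import datetime, timedelta

def load_run_metrics(started: datetime, finished: datetime,
                     rows_loaded: int, rows_rejected: int) -> dict:
    """Timeliness and volume metrics for one warehouse load run."""
    duration = finished - started
    total = rows_loaded + rows_rejected
    return {
        "load_duration_minutes": duration.total_seconds() / 60,
        "rows_loaded": rows_loaded,
        "reject_rate_pct": 100 * rows_rejected / total if total else 0.0,
        "met_latency_target": duration <= timedelta(hours=4),  # assumed SLA
    }

print(load_run_metrics(datetime(2023, 1, 5, 1, 0), datetime(2023, 1, 5, 3, 30),
                       rows_loaded=980_000, rows_rejected=1_200))
```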

ADCMA should preferably share tooling and technology across the various marts, re-using processes for common data processing purposes (such as data standardization and data quality checks).

The BIA Architect, in consultation with the Enterprise Architect, owns this.

ADCMA should preferably use the same or compatible technology platforms for its data marts.

The BIA Architect, in consultation with the Enterprise Architect, owns this.

ADCMA should identify conformed dimensions while ingesting data into the Business Intelligence and Analytics platform, so that these dimensions can be re-used across multiple schemas.
The BIA Architect, in consultation with the Data Modeler, should ensure conformed dimensions are identified and re-used across the data marts.

While building the business intelligence and analytics platform in phases, ADCMA should identify the most effective and most utilized data marts within the organization, in order to develop ADCMA's maturity and personal competency across the range of data marts within ADCMA.

The BIA Architect is responsible for this.

As per the current understanding of ADCMA's business use cases, an Operational Data Store (ODS) might not be required; this is to be confirmed by the Data Architecture deliverable.

The Enterprise Architect owns this activity.

Throughout the design and development of business intelligence solutions, ADCMA should ensure that realistic data is used to the extent possible, to provide clarity when engaging with business stakeholders.
BIA Architects, in consultation with business owners, own this activity.

When defining business intelligence initiatives for implementation, they should be categorized according to tactical, strategic and operational use cases.

Business teams should classify business intelligence initiatives according to type.


ADCMA should ensure that business intelligence reporting integrates with any existing enterprise reporting solution (if an enterprise reporting solution is created before the strategic Business Intelligence and Analytics platform).

The Enterprise Architect is responsible for this activity and may need to consult the respective reporting solution architects.

ADCMA teams do not use any non-authoritative Volunteered Geographic Information (VGI), in compliance with government directives. Where a business use case demands location-based analytics, the same base map data shall be used across government, as provided to Entities by the ADSIC Spatial Data Centre.

The Enterprise Architect should provide guidelines on geographic information usage across ADCMA to comply with government directives.

ADCMA should use business intelligence tooling to produce key metrics, performance indicators, dashboards and scorecards that reflect its business objectives.
Business teams should provide the key performance indicators and metrics used to monitor ADCMA business objectives.

ADCMA should develop and publish any identified statistical data in line with the Statistics Centre Abu Dhabi (SCAD) requirements.

The BIA Architect (Integration) should ensure that service level agreements are in place, in line with the importance and criticality of data consumed from SCAD.

ADCMA should produce an initiative to develop data analysis capabilities suitable for the types of data within its ownership. ADCMA should evaluate suitable training opportunities within the Data Management Programme and its data architecture roadmap, in order to enhance data analytics capabilities within the organization.

The Data Governance Board owns this activity.

ADCMA should identify Big Data use cases to encourage innovation; the Data Governance Board should identify these use cases.
ADCMA should implement event stream-based analytical processing to support high-velocity data analysis (for example, threats to ADCMA IT assets). In the Data Assessment programme, a data architecture will be proposed to meet ADCMA's business needs, including stream-based analytics processing for IT use cases; a minimal windowing sketch follows.

The BIA Architect should own this activity, in line with the proposed data architecture.
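A minimal sketch of event stream-based analytical processing: a tumbling-window count over a stream of security events, in the spirit of the IT-asset threat use case. In practice a streaming engine would be used; the event shape and 60-second window are assumptions.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # assumed tumbling-window size

def window_counts(events):
    """Yield (window_start, event_type, count) per tumbling window.
    `events` is an iterable of (epoch_seconds, event_type) tuples."""
    counts = defaultdict(int)
    for ts, event_type in events:
        window_start = ts // WINDOW_SECONDS * WINDOW_SECONDS
        counts[(window_start, event_type)] += 1
    for (window_start, event_type), n in sorted(counts.items()):
        yield window_start, event_type, n

stream = [(3, "failed_login"), (42, "failed_login"), (61, "malware_alert")]
for window_start, event_type, n in window_counts(stream):
    print(window_start, event_type, n)
```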
Quick Wins
• The Data Governance organization structure (Chairperson, Data Manager, etc.)
• Roles and responsibilities of the DG team
• Name of the Data Manager as part of the Data Governance core team within the IT support services section
• Key data domains applicable to ADCMA, and Data Architect(s) for the appropriate data domains
• Appointment of the BI Data Architect for the BIA programme
• Business Data Stewards from departments and Technical Data Stewards from the IT department
• Data Owners list by department
• The Data Management Board
• Data Management Policy
• Data Governance metrics
• Policy document with the Change Management process
• Responsibilities of the Data Manager
• Plan for the DG Board to review and update/modify the policies as applicable, to ensure alignment with the relevant legislation, and to maintain evidence in a secured, shared location within the ADCMA secure environment
• Data Management Standards specifications
• The accountability and responsibility matrix for the data strategy
• The change management team
• The Data Assessment programme roadmap
• Approval of the DG structure, plan, policy and operating model by the DG Board
• The Data Management Auditors list
• Statement of compliance availability
• Master "Business Glossary" with applicable business terms, in Excel file format
• Data Manager's plan to audit the "Business Glossary"
• Definitions of quality data
• Data security policy
(The quick wins are scheduled across quarters Q1 to Q4.)