
WHAT IS UNIFIED ENTERPRISE DATA LIFECYCLE MANAGEMENT? (ALSO CALLED UNIFIED EDM)


Successful IT organizations must be able to incorporate the best talent and tools in order to stay ahead of the
game. With so many moving parts in the Data Integration and Migration process, visibility and collaboration
are now more important than ever.

Unified Enterprise Data Lifecycle Management is the unification of:
1. Data Governance
2. Metadata Management - Business & Technical with Data Lineage
3. Data Quality
4. Enterprise Development Methodologies/Lifecycle - Agile & SDLC/Waterfall
5. Code and Collaboration Automation

DATA ATTRIBUTES

[Figure: the six data attributes of the lifecycle - 1. Data Governance, 2. Business Metadata (Dictionaries, Glossary), 3. Technical Metadata, 4. Data Quality, 5. Data Lineage, 6. Data Development Lifecycle]
Without an intentional focus on things like Data Governance, Business and Technical Metadata, Data Lineage and Data Quality, the deliverables of the entire Data Development Lifecycle may be at risk.
There is a necessary synergy between people, processes and technology that must be considered when planning for success. However, fragmentation or silos within organizations can impact our ability to see things in a unified way. They can prevent us from working with maximum transparency and collaboration. This can result in confusion, churn, and inefficiencies. Resources struggle to determine where each part of the lifecycle begins and ends among the various organizations. Ultimately it is a breakdown of Unified Enterprise Data Lifecycle Management, leading to a lack of trust among the business users who depend on it.
Although Unified Enterprise Data Lifecycle Management comprises many parts, there is ONE that is a key element to the success of any IT organization:

1. Data Governance
Without standards and practices that drive the proper use and format of the data, there can be no clear path to
success.
Many organizations do not know where to begin. Others start by making rules and assigning Data Stewards, which can give the Governance Initiative a negative connotation, resulting in short-term rejection and long-term failure.
Successful organizations know that data needs to be governed through a strategic approach. Data Stewards must be knowledgeable and have the right tools and methodologies at their disposal, making the data governance initiative a seamlessly embedded part of every step of the process.
It also requires the appropriate organizational structure, support and technology tools to set the trajectory for success. Maintaining the relevant policies, processes and standards with regard to change management, Data Stewardship and Subject Matter Experts (SMEs) is key.
Governance is also an effective means of measuring and monitoring the data. Making it part of the process will ensure adoption and long-term growth of the overall initiative.
Page 1 of 8

DATA GOVERNANCE

[Figure: the pillars of Data Governance - Strategy; Organization; Policies, Processes & Standards; Stewardship; Collaboration & Communication; Measurements & Monitoring]


2. Metadata Management - Business & Technical with Data Lineage


Other key components of the Unified Enterprise Data Lifecycle Management approach include Metadata Management, which is, in turn, made up of Business Metadata captured in a shared Business Glossary, as well as Technical Metadata collected and centralized with context that gives it meaning across business units and in relation to the business process.
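The link between a glossary term and the technical assets it describes can be sketched with a couple of small data structures. This is a minimal illustration, not any particular product's API; the class and field names are assumptions chosen for readability.

```python
from dataclasses import dataclass, field

@dataclass
class TechnicalMetadata:
    """A physical column as harvested from a database catalog."""
    table: str
    column: str
    data_type: str

@dataclass
class GlossaryTerm:
    """A Business Glossary entry linked to the technical assets it governs."""
    name: str
    definition: str
    linked_assets: list = field(default_factory=list)

    def link(self, asset: TechnicalMetadata) -> None:
        # Attach a physical column to this business term.
        self.linked_assets.append(asset)

# Link the business term "Customer ID" to the columns that carry it.
term = GlossaryTerm("Customer ID", "Unique identifier assigned to a customer.")
term.link(TechnicalMetadata("crm.customers", "cust_id", "INTEGER"))
term.link(TechnicalMetadata("dw.dim_customer", "customer_key", "INTEGER"))

print([f"{a.table}.{a.column}" for a in term.linked_assets])
```

Once every term carries its linked assets, a single glossary lookup answers both the business question (what does this mean?) and the technical one (where does it physically live?).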

[Figure: Metadata Management - Automated Parsers/Scanners feed Business Metadata and Technical Metadata, with glossary terms linked to metadata to drive Data Lineage & Impact Analysis]

To enable proper Technical Metadata use that fits with an existing architecture, there
must be flexibility in integration, and interoperability with a broad set of data
management tools and technologies.
This ensures that as you work to instill better practices with purpose-built tools, the
people who currently maintain and benefit from your existing architecture, can
continue to do so as the organization evolves.


[Figure: Unified Enterprise Data Lifecycle Management - Data Lineage flow: Data Sources and SMEs → Ingest → Prepare → Automated Data Lineage → Review & Approval → Publish Data Lineage, with Scheduled Automated Refreshes; published lineage drives Impact Analysis, New Development Analysis, BI Report Requirements, and Regulatory Compliance]

The scanning and parsing of code to accurately harvest and create data lineage is made possible by integrating
with Metadata Connectors, ETL Parsers, Scripting Parsers, Metadata Tool Integrators, ETL Connectors, Big Data
Connectors, Modeling and Testing Tool Integrators with the ability to connect and share data freely inside the
enterprise architecture.
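The harvesting step described above boils down to parsing code for the tables a statement reads and writes. The sketch below uses a deliberately tiny regex-based parser on an INSERT ... SELECT statement to produce one lineage edge; a real parser would handle full SQL grammar, subqueries, and vendor dialects, so treat this strictly as an illustration.

```python
import re

def harvest_lineage(sql: str) -> dict:
    """Illustrative lineage harvester: extract the target table and the
    source tables from a simple INSERT ... SELECT statement."""
    target = re.search(r"INSERT\s+INTO\s+([\w.]+)", sql, re.IGNORECASE)
    sources = re.findall(r"(?:FROM|JOIN)\s+([\w.]+)", sql, re.IGNORECASE)
    return {"target": target.group(1), "sources": sources}

edge = harvest_lineage("""
    INSERT INTO dw.fact_sales
    SELECT o.order_id, c.customer_key
    FROM staging.orders o
    JOIN dw.dim_customer c ON o.cust_id = c.cust_id
""")
print(edge)
```

Feeding every scanned edge into a graph is what turns a pile of ETL scripts into browsable end-to-end lineage.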


The next step in maintaining accuracy across these systems is automated source-to-target mapping, which can be used to generate ETL jobs for a broad variety of ETL platforms, and which automatically updates lineage and impact analysis views as the source and target metadata change over time.
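Metadata-driven generation of this kind can be sketched in a few lines: each mapping row drives both the generated SQL and, implicitly, the lineage edge it documents. The mapping format and function below are illustrative assumptions, not a specific vendor's interface.

```python
def generate_etl(mappings, source_table, target_table):
    """Render a simple INSERT ... SELECT job from source-to-target mappings.
    Each mapping pairs a source expression with its target column."""
    select_list = ",\n  ".join(
        f"{m['source']} AS {m['target']}" for m in mappings
    )
    return (
        f"INSERT INTO {target_table}\n"
        f"SELECT\n  {select_list}\nFROM {source_table};"
    )

mappings = [
    {"source": "cust_id", "target": "customer_key"},
    {"source": "UPPER(cust_name)", "target": "customer_name"},
]
print(generate_etl(mappings, "staging.customers", "dw.dim_customer"))
```

Because the job is regenerated from the mappings, a change to source or target metadata updates the code and the documented lineage in one motion.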

3. Data Quality
Continuing further into the Unified Enterprise Data Lifecycle Management recipe for success, we come to Data
Quality, which also plays a key role in building trust with the business stakeholders.
Every report and decision point can lead to questions about the quality of the data. Where did it come from?
How was it transformed along the way? Can I trust this report and its precision or is it just directionally correct?
Data Quality can mean a broad variety of things from organization to organization. Ensuring ongoing
consistency in Data Quality is the next big challenge.
The ability to systematically scan, profile, assess, and fix the data plays an important part in the overall outcome, both in the short and the longer term.
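The scan-and-profile step can be made concrete with a small example. The function below computes a few standard profile statistics for a single column; the metric names are common data-profiling conventions, not a reference to any specific tool.

```python
def profile_column(values):
    """Profile one column: row count, completeness ratio, and distinct count.
    These are typical first-pass data quality assessment metrics."""
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "completeness": len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }

stats = profile_column(["a", "b", None, "a"])
print(stats)
```

Comparing these statistics against agreed thresholds is what turns raw profiling into an assessment that can be measured and monitored over time.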
Traditionally, this meant a long and arduous one-time effort, not repeated until it became necessary again. This creates a pendulum swinging from good to bad and back, frustrating business users and reinforcing the belief that the data cannot be trusted.
In order to put a solution in place that is initially tactical and becomes strategic, you need to have a robust set of
capabilities that go beyond identification of the problem, and dig deeper into solving the problem in a way that
can be replicated on any interval that may be necessary.
You may find yourself in a situation with no effective and integrated way to remediate, enhance, match and consolidate the results. The process will be, at best, incomplete. This is where workflow-enabled Data Quality Assessment Tools are changing the landscape for companies around the world.

WORKFLOW BASED DATA QUALITY ASSESSMENT MANAGER

[Figure: the DQAM cycle - Data Sourcing → Parse/Scan → Analyze (Data Profiling) → Standardize (Rules Management) → Measure → Enhance → Match & Consolidate → Auto Code Generate → Remediate → Report, wrapped in Continuous Monitoring and Issue Management; the cycle spans Data Quality Management and Data Quality Remediation]

An end-to-end data quality process tool with workflow capabilities and auto code generation for data remediation allows you to manage data quality for the business, with visibility, measurement, and a collaborative approach to maintenance. This process starts with the end in mind, and is uniquely comprehensive, complete and reusable.
Once the foundation is formed, you will have a continuous process to monitor and report on Data Quality through Issue Management, Visualization and Dashboards, enabling ongoing and accurate Data Quality that ensures precision reporting and decision-making that ultimately builds trust.
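The Standardize and Remediate steps of such a workflow can be sketched as rules applied field by field. The rule set and record shape below are illustrative assumptions, not an actual DQAM interface.

```python
import re

# Standardization rules keyed by field name; each rule fixes one defect class.
RULES = {
    "phone": lambda v: re.sub(r"\D", "", v),   # keep digits only
    "state": lambda v: v.strip().upper(),      # normalize spacing and case
}

def remediate(record, rules):
    """Apply each standardization rule to its matching field,
    passing unmatched fields through unchanged."""
    return {k: rules[k](v) if k in rules else v for k, v in record.items()}

fixed = remediate({"phone": "(555) 123-4567", "state": " ny ", "name": "Ann"}, RULES)
print(fixed)
```

Because the rules live in data rather than in one-off scripts, the same remediation can be re-run on any interval, which is exactly what breaks the good-to-bad pendulum described above.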

4. Enterprise Development Methodologies- Agile & SDLC/Waterfall


We must also consider the Development Methodologies that enable new development or maintenance of the existing repositories, with approaches like the Agile Framework and the traditional System Development Lifecycle (SDLC), or Waterfall.

[Figure: Data Lifecycle Management & Automation]


Each of these methodologies is a commonly used practice for Data Exchange, Integration, Transformation, Migration, Extraction and Conversion for Technology Modernization. These may include Database Platform Migration, ETL Platform Migration, Data Vault Methodology and Automation, Data Lake Automation, and New ETL Development, as well as Business Intelligence enablement.
They enable organizations to truly deliver competencies in Data Management Process Design, Data Transformation Process, Information Literacy, Operational Forensics, Data Compounding, Data Harmonization, Contextual Analytics, Outcome Analysis, Enterprise Data Inventory, and more.

With all of these critical process components being satisfied in a way that is sound, robust and repeatable, the
next natural step is Automation.

5. Code and Collaboration Automation


Traditionally these processes have been accomplished through practices that are inefficient, time-consuming, and often not reusable. But with advanced technology we can reuse, reorient, and redeploy nearly everything, writing the future of data management with the flexibility to be out front in every chapter!
The traditional one-off approach requires additional justification and funding each time new updates or
changes are required. It is a painful cycle that costs time and money.
The future of data warehousing, data integration, and data management is in future-proof, efficient automation
and reusability.
The acceleration and time savings simply eliminate half of the repetitive funding cycles and instead leverage the
existing automation built into the architecture year over year.
An integrated ecosystem delivers the ability to keep pace with data practices and business demands. Best of all,
the setup and learning curve for data management professionals and business users is nearly flat.

Key Areas of Automation


1. Enterprise Source-to-Target Mapping
2. Integrated Development Environment, known as Code Automation Template Frameworks. This enables you
to access metadata through a series of custom APIs with unlimited automation options and code-generation
capabilities. Purpose-built modules take automation to a new level and offer a wider variety of output
types than any product on the market, giving you the freedom to generate all brands of ETL Code, all types
of SQL Stored Procedures, as well as Big Data Scripting for Pig, Python, MapReduce and many more.
3. Automated Design, Development and Testing
4. Impact Assessments
5. Requirements-to-Delivery Traceability Matrix
6. Data Quality Standardization & Reporting
7. Data Lineage Mapping and Refreshes
8. Release Management
9. Collaboration, Work Queues & Workflows
10. Performance & Status Reporting
11. LDAP Integration to meet security requirements
12. And more
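The template-framework idea in item 2 above can be sketched as one metadata definition rendered through per-dialect templates. The registry and dialect names below are hypothetical, chosen only to show the one-definition, many-outputs pattern.

```python
from string import Template

# Hypothetical template registry: one set of column metadata,
# multiple generated output dialects.
TEMPLATES = {
    "ansi_sql": Template("CREATE TABLE $table (\n  $columns\n);"),
    "hive": Template(
        "CREATE EXTERNAL TABLE $table (\n  $columns\n) STORED AS PARQUET;"
    ),
}

def generate_ddl(dialect, table, columns):
    """Render DDL for the requested dialect from (name, type) column pairs."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return TEMPLATES[dialect].substitute(table=table, columns=cols)

ddl = generate_ddl("hive", "dw.dim_customer",
                   [("customer_key", "BIGINT"), ("customer_name", "STRING")])
print(ddl)
```

Adding a new output type means adding one template, not rewriting every job, which is the economic argument for template-driven automation.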
Leveraging a Metadata Driven platform and internal APIs, automation for all parts of the lifecycle, in agile or traditional methodologies, is a reality now.
This gives you a comprehensive approach to automating the overall Data Management process in a way that is highly collaborative, visible and reusable.
It ensures that key benefits are not "lost in translation" in siloed systems, and creates bridges with the Business Teams who seek to better leverage and understand the projects they are funding every day.
The Unified Enterprise Data Lifecycle Management process includes key features that make for a simplified
approach to standardization, True Data Governance, and fast adaptation to a broad variety of Regulatory and
Compliance requirements for just about every industry.
Through increased visibility and collaboration, we create an environment that inspires idea sharing and a best
practices approach naturally and seamlessly, for everyone in the enterprise.


The Unified EDM Platform Solution approach generates exponential value at a very low cost of entry. It gives you the freedom to implement in a way that truly fits your business today and well into the future. With LDAP Integration to meet security requirements, and Local install, Cloud, or Hybrid options available, there are no limits.
Alternative solutions that meet the combined value proposition of the Unified EDM Platform can be three or four times the cost, with additional risk, incomplete functionality, and no centralization.
Considering all of the advantages and efficiencies Unified EDM creates for both the Business Organizations and IT, it's no wonder it is the way of the future for Data Management.
Author: Gaurav Mangal
Dec 2016/Jan 2017
[email protected]
