Lila Rao-Graham, Maurice L. McNaughton, Gunjan Mansingh
Business Intelligence for Small and Medium-Sized Enterprises: An Agile Roadmap Toward Business Sustainability
Auerbach Publications
This book contains information obtained from authentic and highly regarded sources.
Reasonable efforts have been made to publish reliable data and information, but the
author and publisher cannot assume responsibility for the validity of all materials or the
consequences of their use. The authors and publishers have attempted to trace the copyright
holders of all material reproduced in this publication and apologize to copyright holders if
permission to publish in this form has not been obtained. If any copyright material has not
been acknowledged, please write and let us know so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted,
reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other
means, now known or hereafter invented, including photocopying, microfilming, and
recording, or in any information storage or retrieval system, without written permission
from the publishers.
For permission to photocopy or use material electronically from this work, please access
www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance
Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-
for-profit organization that provides licenses and registration for a variety of users. For
organizations that have been granted a photocopy license by the CCC, a separate system of
payment has been arranged.
Foreword xi
Authors xv
Introduction Conceptions of Agility and Business Intelligence for SMEs 1
Digital Business Trends 1
Conceptions of Agility 2
Business Intelligence as an Enabler of Agility 3
SMEs, Business Intelligence and Agility 4
About this Book 5
Part I BI Landscape – Opportunities for SMEs
Chapter 1 Barriers and Strategies for Enterprise ICT Adoption in SMEs 9
Introduction (Digital Economy – Implications for Business) 9
SMEs and ICT in Developing Economies 11
SMEs and ICT Adoption – Barriers & Challenges 11
BI – Value Opportunities for SMEs 13
Distribution and Retail 13
Credit and Micro-Finance Services 14
Chapter 2 An Agile Integrated Methodology for Strategic Business Intelligence (AIMS-BI) 17
Introduction 17
Why the Need for New Methodologies? 19
Part II Navigating the Agile BI Process
Chapter 3 Information Management (IM) Maturity Assessment: Evaluating Business Capabilities and Gaps 37
Introduction 37
Maturity Models in Organizations 39
Implementing the IM Maturity Model 40
Applying the AIMS-BI IM Maturity Assessment 43
Example of the Application of IM Maturity Assessment 43
Conclusion 48
Chapter 4 Creating BI Portfolios 49
Introduction 49
The Discovery of BI Opportunities 49
BI Portfolio Evaluation 51
Example: BI Opportunity Discovery and Portfolio Evaluation 53
Conclusion 56
Chapter 5 The Process and Value of Building Proof-of-Concept Prototypes 59
Introduction 59
Why Develop PoCs? 60
BI – The Different Layers of Analytics 61
Monitoring Layer 62
What Is Data Visualization? 62
Data Visualization Process Models (VPM) 62
Understanding the Business Context 63
Get Data 64
Visualization Design Considerations 65
How Will the User Respond to the Visualization? 65
How Will the Visualization Be Used? 65
How Should the Data Be Encoded? 66
Building Visualizations 67
Business Insights 67
Decisions and Actions 68
Prediction Layer 68
What Is Data Mining? 68
IKDDM Process Model for Data Mining 70
Business Understanding Phase 70
Data Understanding 70
Data Preparation 70
Modeling (Data Mining) 71
Evaluation 71
Deployment 72
Lessons Learned from Applying the Process Models to the
Development of the PoCs 72
Conclusion 73
Chapter 6 Data Governance and Data Quality Management 75
Introduction 75
Data Governance and Data Quality 76
Data Quality Management Process 79
Stage 1: Defining Data Standards Quality Metrics 79
Stage 2: Data Quality Assessment 81
Stage 3: Data Maintenance/Stewardship 85
Stage 4: Data Life-Cycle Process Review 88
Conclusion 88
Chapter 7 Data Integration: Managing Data Sources for Agility 91
Introduction 91
Opportunities through Data Integration 92
Key Deliverables from Realizing Integration Opportunities 93
Common Approach to Data Integration 94
Possible Integration Approaches 94
Examples of Data Integration Techniques 95
Physical Integration 95
Virtual Integration 96
Physical vs. Virtual Integration 96
Selecting a Data Integration Approach 98
Recommendations for SMEs 100
Conclusion 101
Chapter 9 Creating Business Value from Data Assets 117
Introduction 117
Measuring Value of your Data Assets 118
Method of Information Value Assessment 119
Managing the Value of your Data Assets 121
Benefits Realization Management (BRM) 122
Conclusion/Recommendations 124
Epilogue: Lessons Learned 125
Appendix: Case Studies 129
Glossary 135
Bibliography 139
Index 145
Foreword
In the new world, it is not the big fish which eats the small fish,
it’s the fast fish which eats the slow fish.
Klaus Schwab
fish, it's the fast fish which eats the slow fish”. In other words, size doesn’t
matter, agility does!
Conceptions of Agility
Being Agile has become a recurrent theme in modern business dis-
course, and many executives now see it as an increasingly important
organizational capability for competing effectively in the digital
economy. Business agility is generally defined as “the
capacity of firms to sense and respond readily to opportunities and threats in
a dynamic business environment”. However, there are several identifi-
able dimensions to Agility within the organizational context:
Firstly, Agility has its roots in the Software Development
discipline, where the Agile Manifesto prescribes the following key
principles to improve success rates in project implementation:
• Incremental delivery: Deliver functionality in increments with
the involved customer validating and specifying the require-
ments to be included in each increment.
• People not process: Empower the project team and allow team
members to develop their own ways of working without
prescriptive processes.
• Embrace change: Expect the business requirements to evolve
and design the solution to accommodate changes.
• Maintain simplicity: Focus on simplicity in both the product
being developed and in the development process. Wherever
possible, actively work to eliminate complexity from the system.
These practices are not exclusive to software development, but are
generally applicable to any kind of business activity.
Secondly, modern Human Resource practitioners increasingly
embrace Agile organizational behaviors such as:
• Encouraging a fail-fast, learn-fast culture.
• Talent recruitment through social media platforms such as
LinkedIn and Facebook.
• Flexi-work policies that allow employees to “work when, how,
where you want”.
• Building a workplace environment and culture to attract
millennials and digital natives.
For some time now, business analysts and practitioners have con-
templated the onset of the Knowledge Economy in which economic
growth and competitiveness become increasingly dependent on the
capacity to create, process, accumulate and disseminate knowledge.
This includes knowledge about customers, products, processes and
competitors, or even knowledge itself as a competitive, tradable asset.
Businesses that acquired knowledge by way of closed innovation pro-
cesses or through the accumulation of knowledge assets (e.g. patents)
were able to maintain sustained competitive differentiation.
The more recent emergence of the digital economy has accelerated
the onset of the knowledge economy and changed its competitive
dynamics by significantly lowering the cost of acquiring and
processing data, information and knowledge. This change has cre-
ated opportunities for greater participation by organizations with
fewer resources.
Among the disruptive digital technologies that are driving this
evolution are the following:
Cloud Computing: It allows businesses to outsource their com-
puting infrastructure and access high quality servers, net-
work and application resources on demand and without the
1 https://oecd.org/industry/smes/31919278.pdf.
Wilson (1999)
Introduction
2 https://gartner.com/doc/1209327/gartners-business-intelligence-analytics-performance.
3 http://umsl.edu/~sauterv/DSS4BI/links/pdf/BI/gartners_business_intelligen_142827.pdf.
Design of AIMS-BI
Description of AIMS-BI
Step 2: BI Opportunity Discovery
Step 3: BI Portfolio Evaluation
Step 4: Proof of Concept Prototypes
Conclusion
It is increasingly recognized that the traditional methods for BI are too
complex and rigid, costly, time-consuming and resource intensive for
many organizations; there is, thus, a need for lower-cost, less resource-
intensive, agile approaches to strategic BI. AIMS-BI provides an agile
methodology for strategic BI. The method integrates various existing
techniques to generate a strategic BI roadmap that reflects the key BI
opportunities, a visible program of implementation, and secures the
executive commitment and resources required to put the organization
on the path to becoming a serious “Analytics Competitor”.
The first step of AIMS-BI, the IM assessment, is critical as it identifies
a number of key areas in which the organization's score may be low but
that are essential for BI success (e.g. data quality). Without addressing
the issues in these areas, the business value that could be achieved
through BI investments would not be realized and senior management
would consider the project a failure. The second step provides
valuable insight into the current practices and opportunities within
the organization. In the third step, the subjective group decision-making
technique is extremely useful as it engages senior management
and ensures their commitment to the BI initiatives. The
final output, a strategic BI Roadmap, identifies the key BI oppor-
tunities, sets out a visible program of implementation, and positions
the organization to secure the executive commitment and resources
required to maximize the BI business value. AIMS-BI provides a
systematic and structured approach that is enterprise-wide in reach
yet agile in execution and focused on organizational needs and capa-
bilities. Although the primary output of AIMS-BI is the strategic
BI roadmap, an important output of Step 4 would be the working
prototypes that are at an advanced stage of development and will
not require a great deal more effort for full deployment. If followed,
AIMS-BI can ensure that an organization maximizes its investments
in strategic BI.
Although this methodology can be applied in any organization
looking to develop enterprise-wide analytic capabilities, it is particularly
applicable to organizations that may not be suited to traditional
BI methods due to limited resources and the need to demonstrate
value quickly, such as SMEs.
Part II
Navigating the Agile BI Process
1 https://learning.oreilly.com/library/view/organizational-project-management/9781628250305/.
2 https://cio.com/article/2437864/developer/process-improvement-capability-maturity-model-integration-cmmi-definition-and-solutions.html.
Figure 3.3 below shows the overall capability levels derived across
the six high-level dimensions, through the IM assessment. The
weighted result of this assessment is 1.6 out of a maximum of 5; the
highest scores being assessed for Technology (2.0) and the lowest for
Policy and Measurement (1.3).
Using the five levels of maturity for an organization, based on its
IM practices, as proposed by META Group (see Table 3.1), the over-
all score of 1.6 would be categorized as reactive – meaning that there
are minimal formalized Enterprise Information Management practices,
except in isolated silos, and the organization reacts to data quality
issues as they arise. More significantly, the higher levels of maturity
in the Technology and Compliance categories suggest the existence
of relatively sound traditional technology competencies and capabili-
ties. However, the lower maturity scores for Policy and Measurement
signal clear gaps in formal Data/Information Governance mechanisms
and the minimal use of best practice behaviors and techniques such
as data quality metrics and profiling/measurement. The overall need
for improved Data Governance practices in areas such as Data qual-
ity management, ownership and stewardship is also manifest in the
qualitative comments recorded from the various user group interviews.
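As a rough illustration of how such a weighted capability score could be derived, the sketch below averages per-dimension scores on the 1-5 maturity scale. Only Technology (2.0) and Policy and Measurement (1.3) are scored in the text; the remaining dimension names, their scores, and the equal weighting are assumptions for the sketch, not values from the assessment.

```python
def weighted_maturity(scores, weights):
    """Weighted overall IM maturity on the 1-5 capability scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[d] * weights[d] for d in scores)

# Illustrative values: Technology (2.0) and Policy/Measurement (1.3) come
# from the assessment above; the other dimensions, their scores, and the
# equal weighting are assumptions.
scores = {"Technology": 2.0, "Compliance": 1.8, "People": 1.5,
          "Process": 1.6, "Policy": 1.3, "Measurement": 1.3}
weights = {d: 1 / 6 for d in scores}
overall = round(weighted_maturity(scores, weights), 1)  # 1.6 -> "reactive"
```

Mapping the resulting score against the META Group maturity bands (Table 3.1) then yields the qualitative label, here "reactive".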
these technology functional areas are not improved the benefits will
not be realized.
The institution’s investment in transactional and operational systems
resulted in Technology being assessed the highest of the six
domains; however, the weaknesses in policy, people and practices
limit the effectiveness of the IT investments. This perspective will
enable business leadership to look beyond the technical issues to iden-
tify and address the critical areas of attention if it is to realize greater
value from its data assets and information resources.
Aggregating the same assessment results using the IBM-
DGCMM dimensions, illustrated in Figure 3.5, reflects very clearly
where the strengths and weaknesses of the organization lie within its
Enterprise Information Management capabilities. This perspective
reinforces the earlier insights derived from the IM Assessment. The institution
is not creating/realizing maximum value from its data/information
assets and requires greater attention to organizational enablers such as
improved Information Governance practices in Data ownership and
stewardship. Executive management needs to signal the importance
of BI through more effective messaging, policy and organizational
mechanisms (roles/responsibilities).
Conclusion
The second and the third steps of AIMS-BI involve discovering and
prioritizing BI opportunities in order to create a prioritized BI port-
folio of possible Proof-of-Concept (PoC) initiatives. This need for a
prioritized portfolio is based on the premise that it is unlikely that
organizations, especially SMEs, have the resources to address all BI
opportunities at once. AIMS-BI allows organizations to build and
evaluate a portfolio of BI initiatives to identify those that are perceived
as likely to provide the greatest strategic value to the organization.
This chapter describes how these initiatives can be identified and pri-
oritized, then, depending on the resources available and the resources
needed for each, the top initiatives can be selected for implementation
as PoCs.
The development of this prioritized portfolio requires understanding
stakeholder needs and identifying possible opportunities
for the organization in terms of strategic BI. It is important to
gain a thorough understanding of the organization’s business, assess
any current BI initiatives that the organizations have in place or are
planning to invest in, and discover BI opportunities that align with
the strategic initiatives of the organization.
BI Portfolio Evaluation
When a portfolio of PoC initiatives has been created using the guide-
lines above, it may be necessary to prioritize the initiatives given the
likelihood of limited resources. It may be the case that some of the
PoCs are mandatory as they are prerequisites to the success of the other
initiatives. For example, based on the IM Maturity Assessment (see
Chapter 3) and the discussions with the stakeholders, the issue of data
quality may keep reoccurring. Given that the success of the initiatives
is likely to be highly dependent on the quality of the data, it may have
to be agreed that a data quality PoC is mandatory. The remaining
PoCs would then be prioritized to help in determining which should
be developed into prototypes.
Different stakeholders will have their own preferences as to which
PoCs are most important and there are a number of criteria to consider in
this prioritization process. A subjective, multi-criteria decision-making
technique is suited to this process. Based on these needs, the multi-
criteria decision-making technique Analytic Hierarchy Process
(AHP) (Saaty, 1980) can be used as it is specially designed
Ranking by C3 – Efficiency

      A1     A2     A3     A4
A1    1.00   0.13   0.14   0.13
A2    8.00   1.00   7.00   5.00
A3    7.00   0.14   1.00   0.14
A4    8.00   0.20   7.00   1.00
Payments Analytics were the two chosen for the development of ana-
lytics prototypes.
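A common way to turn a pairwise comparison matrix like the Efficiency matrix above into priority weights is the row geometric-mean approximation of AHP's principal eigenvector. The sketch below applies this standard AHP calculation to that matrix; it is a generic illustration, not a procedure prescribed in this chapter.

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean,
    normalized so the weights sum to 1."""
    gmeans = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Pairwise comparisons of alternatives A1-A4 on the Efficiency criterion
efficiency = [
    [1.00, 0.13, 0.14, 0.13],
    [8.00, 1.00, 7.00, 5.00],
    [7.00, 0.14, 1.00, 0.14],
    [8.00, 0.20, 7.00, 1.00],
]
w = ahp_weights(efficiency)  # A2 receives by far the largest weight
```

Repeating this for each criterion, then combining the per-criterion weights with the criteria weights, yields the overall ranking of the candidate PoCs.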
Conclusion
If you can’t describe what you are doing as a process, you don’t
know what you’re doing.
W. Edwards Deming
Introduction
The various layers of analytics (see Figure 5.2) include reporting, analy-
sis, monitoring, prediction and prescription. The techniques available
in each layer answer specific types of questions. The lowest layer, the
Reporting layer, focuses on “What happened?” questions and so includes techniques
that facilitate querying and searching (e.g. SQL). The next layer up is
the Analysis layer which focuses on the question “Why did it happen?”
and allows the decision-makers to view the data from multiple dimen-
sions using Online Analytical Processing (OLAP) and data visualiza-
tion tools. The next layer, the Monitoring layer, focuses on the question
“What is happening now?” and uses tools such as dashboards and score-
cards. The Prediction layer looks into the future to answer the question
“What might happen?” which requires data mining techniques and tools.
Finally, the Prescription layer is about finding appropriate actions, based
on the results of the prediction combined with the decision-makers’
business insights, to ensure the benefits of BI initiatives.
The techniques and technologies recommended to answer the ques-
tion related to each layer should not be applied in an ad hoc way, but
should follow one of the process models that have been used to build
BI initiatives. Such models provide a systematic and focused approach
to the development of the BI PoCs. The following section discusses
the application of models to the Monitoring and the Prediction layers.
Monitoring Layer
the analyst needs to interact with the business user and identify the
story that needs to be told through the visualization. Numbers con-
vey a powerful message; therefore the important metrics will need to
be identified. A good starting point for identifying these metrics is
the organization’s Key Performance Indicators (KPIs) as these met-
rics align with strategic objectives and measure actual performance
against targets, thresholds and timelines. KPIs are critical to any
performance management system so they need to be incorporated
in the dashboards or stories of the visualization. Moreover, as many
leaders recognize that “What gets measured gets done”, including the
KPIs in the visualization will improve accountability. The business questions
should focus on the following aspects of the organization:
• What – e.g. what are the best-selling products?
• When – e.g. at what times of the year do the sales of a certain
product spike?
• Where – e.g. in which regions/countries is a particular product
doing best?
• As compared to – e.g. how do the sales of Brand X’s 50 mL
water compare to Brand Y’s over the last 3 months?
• By how much – e.g. what is the percentage increase in the sales
of Product X over the last 3 months?
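Questions of this kind map directly onto aggregate queries at the Reporting layer. The sketch below answers the "What are the best-selling products?" question with SQL over a small in-memory table; the table schema, product names and quantities are hypothetical, not drawn from the book's case studies.

```python
import sqlite3

# Hypothetical sales data; schema and values are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (product TEXT, region TEXT, qty INTEGER)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("X", "North", 30), ("Y", "North", 10),
     ("X", "South", 25), ("Y", "South", 40)],
)

# "What are the best-selling products?" -- a Reporting-layer question
best_sellers = con.execute(
    "SELECT product, SUM(qty) AS total FROM sales "
    "GROUP BY product ORDER BY total DESC"
).fetchall()
```

Grouping by `region` instead would answer the "Where" question from the same table, which is why identifying the metrics before the data is retrieved matters: one well-chosen table can serve several of the business questions above.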
Get Data
Having identified the key business questions and metrics for the visu-
alization, the next step is to identify and retrieve the data needed
to do the visualization. It is likely that this data resides in disparate
sources throughout the organization, and some may even be external.
Therefore, identified data items will need to be integrated. As a part
of this integration process, the issue of data preparation and cleansing
will have to be addressed. In terms of preparation, some visuals may
require the data to be in a different format from the one in which it
is stored at the source, so the data will have to be transformed into a
form suited to the visualization. Additionally, any quality issues with the source data
must be resolved before it is included in the data for the visualization.
Chapters 6 and 7 provide details of both data integration and data
cleaning and preparation.
red and big can be used to represent a highly dramatic narrative. The
behavioral experience focuses on issues such as readability and usability.
These visualizations are interactive in nature and allow users,
in effect, to have a conversation with the data. Analytical dashboards
will have a higher degree of interactivity whereas strategic dash-
boards, which present highly summarized data primarily for senior
management decision-makers, require little interactivity. A reflective
user experience occurs in visualizations when the graphics result in
decisions and/or actions. Operational dashboards should be action-
oriented and strategic dashboards should be decision-oriented. Using
size and color for key variables on strategic dashboards will tend to
focus the attention of the decision-makers on the key performance
indicators. All these aspects of the design need to be considered in
order to ensure that the visualization is fit for its purpose and also
geared toward the specific decision-makers.
Building Visualizations
In building visualizations, the data is encoded as a visual object with
multiple components. The analysts will have to select visualization
techniques and group the individual visualizations to form a strong
narrative. The business questions, the data and the design consider-
ations will be used to create the individual graphical encodings. For
example, if the focus of an organization is on customer satisfaction,
the analyst will need to graphically encode the current customer sat-
isfaction levels, customer satisfaction across branches/regions, the
trend of customer satisfaction over years and quarters, the relation-
ship between sales and customer satisfaction etc. For each graph the
analyst will have to determine the most appropriate representation
(e.g. bar, pie, line, heat map) and connect them so that the big picture
emerges.
Business Insights
The visualizations enable decision-makers to gain evidence-based
insights into the organization. They interact with the visualizations to
derive insights based on the compelling stories that emerge from the
graphics. The graphical, summarized and multiple views of the data
enable the decision-makers to understand the complete picture that is
embedded within data.
Prediction Layer
the target variable (i.e. the class it belongs to is not known), the model
is used to predict the class. Examples of predictive analytic techniques
are classification, regression and value prediction.
Classification is the most commonly applied data mining technique
in predictive modeling. It consists of three stages: model construc-
tion, model evaluation and model use. It uses a set of pre-classified
examples (i.e. training data set) to develop a model that can classify
the population in the future. In the model construction or learning
phase the training data is used by the selected classification algorithm
to build a classifier model. In the evaluation phase the generated
model is checked for accuracy using test or validation data and, if the
model is acceptable, it is used to classify new data. Common clas-
sification algorithms include decision trees, logistic regression, k-NN
and neural networks.
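As a minimal illustration of the construction and use stages, the sketch below implements k-NN, one of the algorithms named above, from scratch: the "model" is simply the stored set of pre-classified examples, and a new case is classified by majority vote among its k nearest neighbors. The labels and coordinates are hypothetical.

```python
import math
from collections import Counter

def knn_classify(training, query, k=3):
    """Predict the class of `query` by majority vote among the k
    training examples nearest to it (Euclidean distance)."""
    nearest = sorted(training, key=lambda ex: math.dist(ex[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Model "construction" for k-NN is just storing pre-classified examples.
training = [((0, 0), "low-risk"), ((0, 1), "low-risk"), ((1, 0), "low-risk"),
            ((5, 5), "high-risk"), ((5, 6), "high-risk"), ((6, 5), "high-risk")]

# Model "use": classify a new, unlabeled case.
prediction = knn_classify(training, (0.5, 0.5))
```

In the evaluation stage, the same function would be run against a held-out test or validation set and its accuracy checked before the classifier is accepted for use on new data.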
Descriptive techniques, on the other hand, are unsupervised learn-
ing methods. They do not have a test data set with a known target vari-
able, rather they focus on describing behavior. Common descriptive
techniques include sequence mining and association rule mining.
Association rule induction is a commonly used technique in data
analytics for finding recurring patterns in data of the form
X => Y (i.e. X implies Y), where X and Y are concepts, or sets of
concepts, which frequently occur together in the data set. Association
rules have been used successfully for market basket analysis, which is
based on the theory that if a customer buys a certain group of items,
they are more (or less) likely to buy another group of items. The tech-
nique has become extremely popular in the retail business for under-
standing customers’ purchase behaviors and identifying relationships
between the items that people buy. The generation of association
rules is a two-step process. First, all the itemsets (an itemset being the
set of items a customer buys in one transaction) that satisfy a user-
specified minimum support criterion are extracted from the data set.
The associations between the items that occur frequently together are
then identified using a user-specified minimum confidence criterion.
These two thresholds, minimum support and minimum confidence,
are significant because a data set can generate a very large number of
candidate association rules, and they help identify the most useful ones.
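The two-step process can be sketched in brute-force form: enumerate the itemsets that meet the minimum support, then keep the rules X => Y that meet the minimum confidence. Practical implementations such as Apriori prune the itemset search rather than enumerating every combination; the basket data below is hypothetical.

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.5, min_confidence=0.6):
    """Brute-force association rule mining: frequent itemsets first,
    then confident rules X => Y split out of each frequent itemset."""
    baskets = [frozenset(t) for t in transactions]
    n = len(baskets)

    def support(itemset):
        return sum(itemset <= b for b in baskets) / n

    items = sorted(frozenset().union(*baskets))
    frequent = [frozenset(c)
                for size in range(1, len(items) + 1)
                for c in combinations(items, size)
                if support(frozenset(c)) >= min_support]

    rules = []
    for itemset in frequent:
        for r in range(1, len(itemset)):  # singletons yield no rules
            for lhs in map(frozenset, combinations(sorted(itemset), r)):
                confidence = support(itemset) / support(lhs)
                if confidence >= min_confidence:
                    rules.append((set(lhs), set(itemset - lhs), confidence))
    return rules

baskets = [{"bread", "butter"}, {"bread", "butter", "milk"},
           {"bread", "jam"}, {"butter", "milk"}]
rules = mine_rules(baskets)  # e.g. milk => butter with confidence 1.0
```

Lowering the thresholds rapidly inflates the number of rules returned, which is exactly why the support and confidence criteria matter for isolating the useful patterns.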
As it is unlikely that there is only one possible data mining tech-
nique that can be applied to a given data set, choosing which are the
most appropriate for a given problem and a given data set is not a
trivial task. Therefore, it is important that this choice is guided by a
multiphase knowledge discovery process model such as IKDDM –
Integrated Knowledge Discovery and Data Mining.
The need for formal data mining process models that prescribe a path
from data to knowledge discovery has been recognized. These models
provide a framework that allows for the identification of all the inputs
and outputs associated with tasks as well as their dependencies within
and across the phases of the process. Following such a process model
provides a mechanism for applying data mining techniques in a sys-
tematic and structured manner thereby increasing the likelihood that
the results will be accurate and reliable and the process is repeatable.
One such process model is the IKDDM model (see Figure 5.4)
which consists of the following phases: Business (or Application
Domain) Understanding (which includes definition of business and
data mining goals), Data Understanding, Data Preparation, Data
Mining/Analytics (or Modeling), Evaluation (evaluation of results
based on Data Mining goals) and Deployment. Each of these phases
is described below:
Data Understanding
This phase starts with an initial data collection and proceeds to activities
which familiarize analysts with the data and allow them to identify
data quality problems, to discover first insights into the data or to
detect interesting subsets to form hypotheses for hidden information.
Data Preparation
This phase covers all activities associated with constructing the final
data set (data that will be fed into the modeling tool(s)) from the initial
Figure 5.4 Phases of the IKDDM model (Sharma and Osei-Bryson, 2010).
raw data. Data preparation tasks are likely to be performed multiple times
and not in any prescribed order. These activities include table, record and
attribute selection as well as the transformation and cleaning of data.
Evaluation
This phase of the project consists of thoroughly evaluating the model
and reviewing the steps executed to construct the model to be certain
that it properly achieves the business objectives. A key objective is to
determine if there is some important business issue that has not been
sufficiently considered. At the end of this phase, a decision on the use
of the Data Mining results should be reached.
Deployment
The creation of the model is not the end of the process, as simply
extracting the knowledge from the data is of little value if it is not
organized and presented in a way that the decision-makers can make
use of it. The deployment can be as simple as generating a report or as
complex as implementing a repeatable Data Mining process across the
enterprise. It is important that the decision-makers are clear about the
actions that they can take to make use of the models.
Conclusion
Introduction
Data Quality and Data Governance have been identified as two factors
that have constrained the efficiency and effectiveness of data-driven
analytics initiatives and decision-making processes. The Information
Management Maturity assessment, administered in the first step of
AIMS-BI, rates both Data Quality (DQ) and Data Governance and
highlights issues surrounding them. Beyond the obvious issue of accu-
racy (i.e. degree to which data values are correct), inconsistent data
formats, missing values, same field names with different meanings
from different data sources, and the identification of logical related-
ness between specific data instances based on values are all commonly
occurring issues that highlight the need for a systematic approach to
DQ. Additionally, even if current DQ issues are addressed, without
proper Data Governance and the accountability and ownership it pro-
vides, these errors will reoccur.
This chapter outlines what both Data Governance and DQ entail,
explains why they are critical to Business Intelligence (BI) and then
goes on to describe, in detail, a systematic approach to DQ stan-
dards definition and DQ measurement. The chapter will also dem-
onstrate the use of DQ Dashboards for empowering data stewards in
formulating and overseeing adherence to DQ standards and policies
across the organization. The emphasis will also be on the systemic
approaches to measurement, ranking and quantification of DQ and
asset value and how these can be maintained through proper Data
Governance practices.
1 http://download.101com.com/pub/tdwi/Files/DQReport.pdf.
2 https://pwc.fr/fr/assets/files/pdf/2016/05/pwc_a4_data_governance_results.pdf.
Stage 1: Defining Data Standards and Quality Metrics
The first stage of the process involves the definition of data standards
and quality metrics. It requires the engagement of key stakeholders
one such data item standard (First Name of the Customer) is shown
in Table 6.3.
The data standards defined in the catalog must have stakeholder
buy-in; therefore, they must be signed off by the members of the Data
Management Group. It is important to note that different stakehold-
ers may have different notions of acceptable quality standards. For
instance, 100% completeness of customer contact information is seen
as a required standard by the Compliance group, due to KYC (Know-
Your-Customer) Compliance obligations, while the Marketing/Sales
department may perceive a 75% completeness as an acceptable stan-
dard for effective customer targeting.
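The completeness standard above can be sketched as a simple check. This is an illustrative sketch only: the records, field names and the 100%/75% thresholds are stand-ins for the standards that the Data Management Group would actually sign off.

```python
# Illustrative completeness check against stakeholder-specific standards.
# Records, field names and thresholds are invented examples.
customers = [
    {"first_name": "Ann", "email": "ann@example.com", "phone": "876-555-0101"},
    {"first_name": "Bob", "email": None, "phone": "876-555-0102"},
    {"first_name": "Cal", "email": None, "phone": None},
    {"first_name": "Dee", "email": "dee@example.com", "phone": None},
]

def completeness(records, field):
    """Percentage of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return 100.0 * filled / len(records)

# Different stakeholder groups may accept different standards.
standards = {"Compliance": 100.0, "Marketing": 75.0}

for field in ("email", "phone"):
    pct = completeness(customers, field)
    for group, threshold in standards.items():
        status = "meets" if pct >= threshold else "fails"
        print(f"{field}: {pct:.0f}% complete -> {status} {group} standard ({threshold:.0f}%)")
```

The same field can thus pass one group's standard while failing another's, which is exactly why the catalog records the agreed threshold per stakeholder.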
Stage 2: Data Quality Assessment
The source of data under consideration should be identified and used for the
DQ assessment phase. A good starting point, in terms of which fields
in the tables to use for the profiling, is the unique identifier key (e.g.
Customer_ID in a Customer table or Employee_ID in an Employee
table). The quality of the unique identifier field is critical to the success
of many of the analytics Proof of Concepts as it is often the common
field used to join the tables needed for the analytics from across mul-
tiple systems. Errors, inconsistencies and missing values in these unique
identifier fields will severely limit the results (i.e. number of rows) real-
ized by joining this table with others and thus can significantly reduce
the size of the data set to which the analytics will be applied.
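The effect described above can be made concrete with a small sketch. The tables and identifiers below are invented; the point is only to show how mis-cased or missing keys silently shrink an inner join, and how standardizing the key recovers rows.

```python
# Sketch: missing/inconsistent unique identifiers shrink an inner join.
# Tables and IDs are invented illustrations.
customers = {"C001": "Ann", "C002": "Bob", "c003": "Cal", "C004": None}  # 'c003' mis-cased
accounts = [("C001", 1200.0), ("C002", 85.5), ("C003", 430.0), (None, 99.0)]

def inner_join(cust, accts):
    """Join account rows to customer names on the raw identifier."""
    return [(cid, cust[cid], bal) for cid, bal in accts if cid in cust]

raw = inner_join(customers, accounts)  # 'C003' fails to match 'c003'; None matches nothing

# Standardizing the key (upper-casing, dropping nulls) recovers rows.
cleaned_cust = {cid.upper(): name for cid, name in customers.items() if cid}
cleaned_accts = [(cid.upper(), bal) for cid, bal in accounts if cid]
clean = inner_join(cleaned_cust, cleaned_accts)

print(len(raw), "rows before cleansing;", len(clean), "rows after")
```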
The Data Profiling activity conducts an assessment of actual data
and data structures and helps to identify DQ issues at the individual
attribute level, at the table level, and between tables. Profiling also
captures metadata as a by-product of the process.
Two examples of types of profiling analyses that should be carried
out on the tables are column profiling and table profiling:
1. Column Profiling involves describing and analyzing the data
found in each single column/field of a data table. For example,
assume an organization wants to ensure that it can contact its
customers and therefore decides to focus on the data relating
to “Right Party Contact”. Table 6.4 shows the profile for key
contact fields in a possible set from aggregated tables. While
the column analysis should be performed across all the columns
Table 6.4 Example Profile of Contact Data
VARIABLE ROWS FILLED (%) MISSING (%) UNIQUE (%)
First name 69,884 100.00 0.00 26.44
Last name 69,884 100.00 0.00 18.34
Address line 1 69,818 99.91 0.09 60.33
Address line 2 56,519 80.88 19.12 27.95
Country of residency 69,664 99.69 0.31 0.05
Nationality 69,664 99.69 0.31 0.16
Branch 69,884 100.00 0.00 0.06
Primary email address 34,227 48.98 51.02 97.93
Primary contact 52,736 75.46 24.54 97.43
Secondary contact 41,974 60.06 39.94 59.53
Customer_TRN 68,630 98.21 1.79 99.83
ID type 69,884 100.00 0.00 0.01
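A column profile of the kind shown in Table 6.4 can be computed with a few lines of code. The sample records below are invented, and computing % unique over all rows is one plausible convention, not necessarily the one used to produce the table.

```python
# Minimal column-profiling sketch: for each field, report rows filled,
# % filled, % missing and % unique. Sample records are illustrative only.
records = [
    {"first_name": "Ann", "email": "ann@example.com"},
    {"first_name": "Bob", "email": "bob@example.com"},
    {"first_name": "Ann", "email": None},
    {"first_name": "Dee", "email": ""},
]

def profile_column(rows, field):
    values = [r.get(field) for r in rows]
    filled = [v for v in values if v not in (None, "")]
    n = len(rows)
    return {
        "rows": len(filled),
        "filled_pct": round(100.0 * len(filled) / n, 2),
        "missing_pct": round(100.0 * (n - len(filled)) / n, 2),
        # one plausible convention: distinct filled values over all rows
        "unique_pct": round(100.0 * len(set(filled)) / n, 2),
    }

for field in ("first_name", "email"):
    print(field, profile_column(records, field))
```

Run across every column of a table, output in this shape is exactly what feeds a profile report or a DQ dashboard.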
Stage 3: Data Maintenance/Stewardship
inputs as the steward must also ensure that standardized data element
definitions and formats are being adhered to and that the metadata
is being maintained. Stewards also manage the process of identify-
ing and resolving any emerging DQ issues and oversee periodic batch
cleansing activities.
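A stewardship-style format check of the kind described above might look as follows. The field name and the phone-number pattern are assumptions chosen for illustration; the agreed format would come from the organization's data standards catalog.

```python
import re

# Sketch: flag rows whose phone number violates a standardized format,
# producing a worklist for periodic batch cleansing. The pattern and
# field name are illustrative assumptions, not a prescribed standard.
PHONE_PATTERN = re.compile(r"^\d{3}-\d{3}-\d{4}$")

def flag_nonconforming(rows, field="phone"):
    """Return (row_index, value) pairs that violate the agreed format."""
    return [(i, r.get(field)) for i, r in enumerate(rows)
            if not (r.get(field) and PHONE_PATTERN.match(r[field]))]

rows = [{"phone": "876-555-0101"}, {"phone": "8765550102"}, {"phone": None}]
print(flag_nonconforming(rows))  # rows 1 and 2 need cleansing
```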
DQ dashboards have increasingly become an effective means of
creating the essential visibility needed to facilitate monitoring and
reporting on DQ metrics and compliance over time. They provide
an important tool for enabling and empowering data stewards to
assess the state of DQ, and address root cause process issues within
their designated jurisdiction. Key to the design of DQ dashboards is
understanding and identifying the key quality indicators (KQI) that
the organization needs to track, monitor and manage. Figure 6.6 dis-
plays an example DQ dashboard that emphasizes the quality status
for key “Unique Identifier” and “Right Party Contact” fields that
are essential for maintaining effective customer analytics and con-
tact management. As highlighted on the “Dashboard”, an organiza-
tion would be able to establish a baseline, then monitor progressive
improvements in these key data items over time, based on various
DQ interventions. These quality attributes can be stored as part of an
enterprise metadata repository, in order to facilitate historical trend-
ing of DQ improvements as part of a comprehensive data governance/
information life-cycle management process.
For DQ dashboards to be most effective, especially for Executives,
it is useful to combine the data profiling information from individ-
ual variables into aggregate business metrics. For example, in the
prototype dashboard in Figure 6.6, a simple business metric has been
created called RPC_ndx, which provides a quantitative indicator
of the organization’s “ability to contact” its customers. The metric is
computed from Name, Address, Telephone#, Email as follows:
In this case, it has been determined that having valid email contact
information for a customer contributes twice as much value to this
business metric as any other contact attribute.
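Since the published formula for RPC_ndx is not reproduced here, the following sketch only illustrates the general shape of such an index: a weighted share of filled contact fields with the email weight doubled, as described above. The field names, weights and normalization are assumptions.

```python
# Illustrative contactability index in the spirit of RPC_ndx: a weighted
# share of filled contact fields, with email weighted twice as heavily.
# The exact formula used in the text is not reproduced; weights and
# normalization here are assumptions for illustration.
WEIGHTS = {"name": 1, "address": 1, "telephone": 1, "email": 2}

def rpc_index(customer):
    """Share of attainable weight earned by filled contact fields."""
    earned = sum(w for f, w in WEIGHTS.items() if customer.get(f))
    return earned / sum(WEIGHTS.values())

c = {"name": "Ann", "address": "1 Main St", "telephone": None, "email": "a@x.com"}
print(round(rpc_index(c), 2))  # 4 of 5 weight units filled -> 0.8
```

Averaged across the customer base, an index like this gives executives a single trackable number for the organization's "ability to contact" its customers.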
Stage 4: Data Life-Cycle Process Review
Conclusion
It’s difficult to imagine the power that you’re going to have when
so many different sorts of data are available.
Tim Berners-Lee
Introduction
For many organizations, their primary data assets reside in several enter-
prise systems. For example, a financial institution may have some data
in the core banking application, some with the credit card application,
some with loan application and some with the Customer Relationship
Management System. These systems may be using, for example, Oracle
or SQL Server database platforms. It is likely that, in the existing setup,
various mechanisms (e.g. Redpoint, Microsoft SQL Server Integration
Services (SSIS) and native SQL queries) are being used to extract
data from the sources to serve business analytics and decision-support
requirements. In addition, end-user departments may be using a vari-
ety of their own customized tools (e.g. Excel spreadsheets) to access,
retrieve, clean and analyze data relevant to their needs. However, this
practice often constrains the flexibility and timeliness with which the
organization can respond to new requests for access to, and integration
of, data from multiple heterogeneous sources (internal and external).
Virtual Integration
Given the number of approaches that are now available for Data
Integration, an organization needs to be deliberate in determining
which option is most appropriate for its needs. An example of the
functionality for Data Integration is illustrated in Figure 7.2.
This target functionality can be achieved by:
1. Evaluating available Open Source platforms using an estab-
lished Open Source Maturity Model to select an appropriate
Data Integration technology solution.
Conclusion
8
Developing a Roadmap for Strategic Business Intelligence
If you don’t know where you are going, any road will get
you there.
Lewis Carroll
Introduction
Data Quality
Data Integration
1 https://gartner.com/imagesrv/summits/docs/na/business-intelligence/gartners_
business_analytics__219420.pdf.
Performance
People
Process
Platform
Conclusion
Introduction
Wilson (1999)
1 https://pwc.fr/fr/assets/files/pdf/2016/05/pwc_a4_data_governance_results.pdf.
Like any other business asset, estimating and ascribing a value to data
assets helps organizations to determine the appropriateness of mea-
sures for managing and protecting the asset. However, this activity
alone does not provide a basis for realizing the future economic bene-
fits from the utilization of the asset. Benefits Realization Management
(BRM) is a collective set of processes, practices and tools that can help
managers to increase the likelihood that benefits are realized from BI
and other ICT-enabled initiatives.
BRM is a continuous process that includes investment decision-
making using cost-benefit analysis, project management, implementa-
tion, monitoring and continuous adjustment. It is a disciplined approach
that is based on the central tenets that technology alone cannot deliver
business benefits, and benefits rarely happen completely according to
plans or intent. Thus BRM prescribes a systematic process of manage-
ment activities that consists of the following stages (Figure 9.2):
Conclusion/Recommendations
Think big and don’t listen to people who tell you it can’t be done.
Life’s too short to think small.
Tim Ferriss
Small and Medium-Sized Enterprises (SMEs) must not be intimidated
by the notion of becoming data-driven organizations; rather, they
should recognize the opportunity to become leaders in this space.
They should not be stymied by the perception that becoming data-driven
is extremely resource intensive. This book seeks to dispel
that perception by providing a methodology, AIMS-BI, that SMEs
can adopt and adapt, taking into account nuances that are specific to
their own enterprises.
There is a plethora of case studies that describe the effect that strategic
BI initiatives have had on organizations and the areas in which these
initiatives have been applied. These case studies identify a number of
factors that have been critical to the success of the BI initiatives, many
of which were supported by our own experiences working with SMEs.
12 9
Objectives
• The institution wanted to identify academic analytics opportunities.
• They needed to identify their barriers to strategic Business Intelligence (BI) adoption.

Approach
• The Information Management (IM) assessment was administered.
• Low maturity scores for policy and measurement signaled clear gaps in formal data governance mechanisms and the minimal use of best practice behaviors such as data quality metrics and profiling/measurement.
• The finding from the IM assessment was corroborated by the qualitative comments recorded from the various user group interviews.
• The main strategic objectives of the organization were also elicited. They were based on student experience, finance, efficiency and research.
• A portfolio of initiatives was developed that included the development of a student life-cycle dashboard, the establishment of data stewardship, a research productivity dashboard and data quality management processes.
• Senior management was asked to prioritize those Proof of Concepts (PoCs).
• The student life cycle was chosen as the prototype to be developed.
• This initiative was considered to be critical as it aligned with two key strategic objectives:
  • Improving academic and administrative process efficiency.
  • Improving student experience.
• The Knowledge Discovery and Data Mining (KDDM) process model was used to develop this prototype.
• A number of dashboards that focused on application processing and student throughput were developed.

Lessons Learned
• AIMS-BI is useful for educational institutions.
• AIMS-BI provides a systematic approach to academic analytics and in so doing can maximize the benefits analytics can provide.
• The PoCs must be aligned with the strategic objectives of the institution.
• Capability gaps that are essential for successful analytics are identified from the IM assessment; plans to address these become a part of the roadmap.
• A significant amount of time and effort was spent to get the data from the form in the heterogeneous databases to a form suitable for the various modeling techniques. This highlighted the need for data standardization policies.
• The primary output of this process is the roadmap; however, other important outputs are the prototypes that do not require much more effort for full deployment.
• The student life cycle provides a basis for managing and optimizing the student experience by tracking students as they progress through the institution.
Data-Driven Credit Risk Models for a Financial Institution
A leading financial institution in the Caribbean is interested in analyzing payments data to determine if this data can provide a further understanding of customer
behavior, e.g. can the size and/or frequency of utility bill payments or retail transactions provide additional predictor variables for customer risk profiling? This
institution is interested in reviewing its credit card scoring models to determine if payments data can be used to improve the quality of these models. They think reliable
proxies from non-traditional sources could be used to determine the customers' ability to repay and willingness to pay. Six sources were identified as reliable:
telecommunications providers, utilities, wholesale suppliers, retailers, government and the institution's own previously overlooked data.
AIMS-BI was applied in the institution and data-driven credit risk models were identified as one of the top-priority PoCs that needed to be prototyped.
Objectives
• The institution wants to consider other reliable and high-quality data it has access to, with the aim of improving the credit card scoring models currently being used.
• These additional sources can be used as reliable proxies for customers' ability to repay.
• They need to build new credit card scoring models using variables from payments data.

Approach
• AIMS-BI was applied and, in prioritizing the portfolio of PoCs, improving credit risk models was identified as a priority.
• The key stakeholders, including business analysts from the payments and credit risk divisions, were interviewed and their concerns about the existing scoring models were discussed.
• Key decision-makers identified potential sources of data that could be used as proxies.
• This data was profiled to determine its quality and transformed so that it could be integrated into the scoring models.
• The predictive modeling technique (decision trees) was used to build profiles based on existing data.
• These prototype models were developed and verified with existing data.

Lessons Learned
• These non-traditional sources proved to be important in determining the customers' ability to repay.
• Analysis of the payments data can provide a further understanding of customers.
• Payments data can be used to provide additional predictor variables to improve the performance of credit-risk scoring models.
• This data represents behavior, which can be a better predictor than the salary/profit data reported by persons.
• Derived variables were created to build these models, which requires a good understanding of the domain knowledge and data.
• The institution recognized that it had issues with organizational metadata and data quality, and this led to initiatives to address these being included in the BI roadmap.
• The prototype developed was near implementation-ready, so in the development of the BI roadmap it was included as a short-term activity.
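The case study names decision trees as the predictive modeling technique. As a self-contained illustration of the core idea, the sketch below (with invented payments data and a hypothetical derived variable) finds the single best split by Gini impurity, which is the first step a decision-tree learner takes; a production model would grow a full tree over many real derived variables.

```python
# Single-split decision "stump" chosen by Gini impurity -- the first step
# of decision-tree induction. Data and the payments-derived variable are
# invented for illustration.
def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 1.0 - p * p - (1.0 - p) * (1.0 - p)

def best_split(xs, ys):
    """Find the threshold on x minimizing weighted Gini of the two sides."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# x: monthly utility-payment count (hypothetical proxy); y: 1 = repaid on time
payments = [2, 3, 8, 9, 10, 1, 7, 12]
repaid = [0, 0, 1, 1, 1, 0, 1, 1]
threshold, impurity = best_split(payments, repaid)
print("split at <=", threshold, "with weighted Gini", round(impurity, 3))
```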
Objectives
• The institution wants to determine whether customers' bill payment patterns determine their propensity to be either a transactor or a revolver, and how to make their transactors into revolvers while ensuring that revolvers will not become delinquent.
• The institution needs to consider other reliable and high-quality data it has access to that can improve the understanding of the credit card portfolio.
• Institutions want to be able to determine how to

Approach
• AIMS-BI was applied to this institution and, in building the portfolio of PoCs, this need to understand the credit card portfolio was identified as one of the top priorities.
• Further discussions were held with the affected decision-makers to identify potential sources of data that could be used as proxies.
• This data was profiled to determine its quality and transformed so that it would be useful for integration in the scoring models.
• Prototype models were developed and verified with existing data.

Lessons Learned
• Non-traditional sources proved to be important in determining the customers' ability to repay.
• Credit card portfolio management was previously being done at the portfolio level; this model supports describing the portfolio at the customer level, as each customer can be profiled.
• The organization identified three groups – transactor, normal revolver and delinquent revolver.
• The models can be used to determine the creditworthiness of a particular customer/potential customer whose credit risk is not yet known.
Objectives
• The focus of this customer analytics initiative was to improve marketing strategies.
• Maximize the customer value to the organization by cross-selling/upselling various products to existing customers.
• This will ultimately contribute to the bank's revenue targets by improving the efficiency of sales initiatives.

Approach
• AIMS-BI was applied to this institution and this initiative was included in the portfolio of PoCs.
• The business leader and business analyst were interviewed to get a better business understanding.
• To improve the success rate of current targeted marketing campaigns, market basket analysis was employed as a customer analytic technique.
• This technique uses association rule mining to identify the products that customers currently purchase together, which can be used to identify bundles (i.e. those products that go well together) that should therefore be marketed accordingly.
• Customers don't necessarily buy financial products all at one time; therefore, a basket contains products bought over time.
• Sequential rule mining was used as it shows not only which products were bought together but also the sequence in which they were bought.
• Concept hierarchies for the domain were developed to enable analysis at different levels of granularity.
• Numeric variables were discretized in consultation with financial business analysts who provided the linguistic terms used to describe each of the data ranges (e.g. age values were discretized to young, middle-aged, etc.).

Lessons Learned
• Both association rules and sequential mining help to increase the effectiveness of sales campaign management and targeting processes.
• The concept hierarchies developed were extremely useful in classifying products into subgroups, which were then used in the development of the models.
• The inclusion of demographic and product data together in a basket was novel and facilitates the discovery of multidimensional rules and frequent patterns in buying products.
• The data preparation phase is reliant on the knowledge and experience of the data mining analysts and, importantly, their understanding of the business objectives and the corresponding data required.
• The prototype developed was near implementation-ready, so this could be included as a short-term activity in the development of the BI roadmap.
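The market basket idea in this case study can be sketched as follows. The products, baskets and thresholds are invented, and this pair-only counter only illustrates support and confidence; a real implementation would use a full Apriori (and, for ordering, sequential-pattern) algorithm over larger itemsets.

```python
from itertools import combinations

# Sketch of market basket analysis: count product pairs across customer
# baskets and report support and confidence. Products and baskets are
# invented for illustration.
baskets = [
    ["savings", "credit_card", "car_loan"],
    ["savings", "credit_card"],
    ["savings", "mortgage"],
    ["credit_card", "car_loan"],
]

def pair_rules(baskets, min_support=0.5):
    n = len(baskets)
    item_count, pair_count = {}, {}
    for b in baskets:
        for item in set(b):
            item_count[item] = item_count.get(item, 0) + 1
        for a, c in combinations(sorted(set(b)), 2):
            pair_count[(a, c)] = pair_count.get((a, c), 0) + 1
    rules = []
    for (a, c), cnt in pair_count.items():
        support = cnt / n
        if support >= min_support:
            # confidence of a -> c; a sequential-mining variant would
            # additionally require that a precede c in the basket order
            rules.append((a, c, support, cnt / item_count[a]))
    return rules

for a, c, sup, conf in pair_rules(baskets):
    print(f"{a} -> {c}: support={sup:.2f}, confidence={conf:.2f}")
```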
Glossary
Bibliography
Cios, K. J., Teresinska, A., Konieczna, S., Potocka, J., & Sharma, S. (2000).
Diagnosing myocardial perfusion from SPECT bull's-eye maps: A
knowledge discovery approach. IEEE Engineering in Medicine and
Biology Magazine, 19(4), 17–25.
Cockburn, A. & Highsmith, J. (2001). Agile software development, the peo-
ple factor. Computer, 34(11), 131–133.
Cooper, B., Watson, H., Wixom, B., & Goodhue, D. (2000). Data ware-
housing supports corporate strategy at first American corporation. MIS
Quarterly, 24(4), 547–567.
Davenport, T. H. & Harris, J. G. (2007). Competing on Analytics: The New
Science of Winning. Boston, MA: Harvard Business School.
De Bruin, T., Freeze, R., Kaulkarni, U., & Rosemann, M. (2005).
Understanding the main phases of developing a maturity assessment
model. Paper Presented at the 16th Australasian Conference on Information
Systems (ACIS), Australia, New South Wales, Sydney.
Demirkan, H. & Delen, D. (2013). Leveraging the capabilities of service-
oriented decision support systems: Putting analytics and big data in
cloud. Decision Support Systems, 55(1), 412–421.
Eck, A., Keidel, S., Uebernickel, F., Schneider, T., & Brenner, W. (2014).
Not all information systems are created equal: Exploring IT resources
for agile systems delivery.
Elzinga, J., Horak, T., Lee, C.-Y., & Bruner, C. (1995). Business process man-
agement: Survey and methodology. IEEE Transactions on Engineering
Management, 42(2), 119–128.
Evelson, B. (2011). Trends 2011 and Beyond: Business Intelligence, Vol. 31.
Cambridge MA: Forrester Research, Inc.
Evelson, B. (2014). The Forrester Wave™: Agile business intelligence plat-
forms, Q3 2014. Forrester Research Inc.
Fayyad, U., Piatetsky-Shapiro, G., & Smyth, P. (1996). From data mining to
knowledge discovery in databases. AI Magazine, 17(3), 37–54.
Gregor, S. & Hevner, A. R. (2013). Positioning and presenting design science
research for maximum impact. MIS Quarterly, 37(2), 337–356.
Hannula, M. & Pirttimaki, V. (2003). Business intelligence empirical
study on the top 50 Finnish companies. Journal of American Academy of
Business, 2(2), 593–599.
Hevner, A. R., March, S. T., Park, J., & Ram, S. (2004). Design science in
information systems research. MIS Quarterly, 28(1), 75–105.
Highsmith, J. (2009). Agile Project Management: Creating Innovative Products.
London, UK: Pearson Education.
Hostmann, B., Rayner, N., & Friedman, T. (2006). Gartner’s Business
Intelligence and Performance Management Framework. Stamford, CT:
Gartner.
IBM. (2007). The IBM Data Governance Council Maturity Model: Building a
Roadmap for Effective Data Governance. IBM Software Group. Retrieved
from https://studylib.net/doc/8219376/the-ibm-data-governance-council-
maturity-model--building-a
Inmon, W.H. (2005). Building the Data Warehouse. New York: John Wiley &
Sons.
Jacobs, A. (2009). The pathologies of big data. Communications of the ACM,
52(8), 36–44.
Jourdan, Z., Rainer, R. K., & Marshall, T. E. (2008). Business intelligence:
An analysis of the literature. Information Systems Management, 25(2),
121–131.
Jukic, N. (2006). Modeling strategies and alternatives for data warehousing
projects. Communications of the ACM, 49(4), 83–88.
Katal, A., Wazid, M., & Goudar, R. H. (2013, August). Big data: Issues,
challenges, tools and good practices. 2013 Sixth International Conference
on Contemporary Computing (IC3), Noida, New Delhi (pp. 404–409).
IEEE.
Kimball, R. & Ross, M. (2011). The Data Warehouse Toolkit: The Complete
Guide to Dimensional Modeling. New York: John Wiley & Sons.
Knabke, T. & Olbrich, S. (2013). Understanding information system
agility—the example of business intelligence. Paper Presented at the 46th
Hawaii International Conference on System Sciences, Hawaii.
Kotelnikov, V. (2007). Small and medium enterprises and ICT, United
Nations Development Program-Asia Pacific Development Information
Program. Asian and Pacific Training Center for Information and
Communication Technology for Development.
Kurgan, L. A. & Musilek, P. (2006). A survey of knowledge discovery and
data mining process models. The Knowledge Engineering Review, 21(1),
1–24.
Lee, Y. & Koza, K. A. (2006). Investigating the effect of website quality on
e-business Success: An Analytic Hierarchy Process (AHP) approach.
Decision Support Systems, 42, 1383–1401.
Loveman, G. (2003). Diamonds in the data mine. Harvard Business Review,
81(5), 109–113.
Mansingh, G., Osei-Bryson, K.-M., & Asnani, M. (2015). Exploring the
antecedents of the quality of life of patients with sickle cell disease:
Using a knowledge discovery and data mining process model based
framework. Health Systems, 5(1), 52–65.
Mansingh, G., Osei-Bryson, K.-M., & Reichgelt, H. (2010). Using ontolo-
gies to facilitate post-processing of association rules by domain experts.
Information Sciences, 181(3), 419–434.
Mansingh, G. & Rao, L. (2014). Enhancing the decision making process: An
ontology based approach. Paper Presented at the International Conference
on Information Resources Management (Conf-IRM), Ho Chi Minh City,
Vietnam.
Mansingh, G., Rao, L., Osei-Bryson, K.-M., & Mills, A. (2015). Profiling
internet banking users: A knowledge discovery in data mining process
model based approach. Information Systems Frontiers, 17(1), 193–215.
March, S. T. & Smith, G. F. (1995). Design and natural science research on
information technology. Decision Support Systems, 15(4), 251–266.
Mariscal, G., Marban, O., & Fernandez, C. (2010). A survey of data min-
ing and knowledge discovery process models and methodologies. The
Knowledge Engineering Review, 25(2), 137–166.
Melon, M. G., Beltran, P. A., & Cruz, M. C. G. (2008). An AHP-based
evaluation procedure for innovative educational projects: A face-to-face
vs. computer-mediated case study. Omega, 36(5), 754–765.
Meredith, R., Remington, S., O’Donnell, P., & Sharma, N. (2012).
Organisational transformation through Business Intelligence: Theory,
the vendor perspective and a research agenda. Journal of Decision Systems,
21(3), 187–201.
Mettler, T. (2009). A design science research perspective on maturity models
in information systems tech. report BE IWI/HNE/03: Universität St.
Gallen.
Moin, K. I. & Ahmed, Q. B. (2012). Use of data mining in banking.
International Journal of Engineering Research and Applications, 2(2),
738–742.
Mooney, J., Beath, C., Fitzgerald, G., Ross, J., & Weill, P. (2003). Managing
information technology for strategic flexibility and agility: Rethinking
conceptual models, architecture, development, and governance. Paper
Presented at the International Conference on Information Systems (ICIS),
Seattle, Washington.
Muntean, M. & Surcel, T. (2013). Agile BI–the future of BI. Revista
Informatica Economică, 17(3), 114–124.
Negash, S. (2004). Business intelligence. Communications of the Association for
Information Systems, 13, 177–195.
Negash, S. & Gray, P. (2008). Business intelligence. In Handbook on Decision
Support Systems, Vol. 2, pp. 175–193. Berlin, Heidelberg: Springer.
Ngai, E. W. T. (2003). Selection of web sites for online advertising using the
AHP. Information and Management, 40, 1–10.
Olson, D. L., Dursun D., & Yanyan M. (2012). Comparative analysis of data
mining methods for bankruptcy prediction. Decision Support Systems,
52(2), 464–473.
Osei-Bryson, K. M., Mansingh, G., & Rao, L. (Eds.). (2014). Knowledge
Management for Development: Domains, Strategies and Technologies for
Developing Countries, Vol. 35. Springer Science & Business Media.
Overby, E., Bharadwaj, A., & Sambamurthy, V. (2006). Enterprise agility
and the enabling role of information technology. European Journal of
Information Systems, 15(2), 120–131.
Paranjape-Voditel, P. & Deshpande, U. (2013). A stock market portfolio
recommender system based on association rule mining. Applied Soft
Computing, 13, 1055–1063.
Popovič, A., Hackney, R., Coelho, P. S., & Jaklič, J. (2012). Towards business
intelligence systems success: Effects of maturity and culture on analyti-
cal decision making. Decision Support Systems, 54, 729–739.
Rouse, W. B. (2007). Agile information systems for agile decision making.
In K. C. Desouza (Ed.), Agile Information Systems (pp. 16–30). New York: Elsevier.
Rygielski, C., Jyun-Cheng, W., & Yen, D. C. (2002). Data mining tech-
niques for customer relationship management. Technology in Society, 24,
483–502.
Saaty, T. (1980). The Analytic Hierarchy Process: Planning, Priority Setting,
Resource Allocation. New York: McGraw-Hill.
Sambamurthy, V., Bharadwaj, A., & Grover, V. (2003). Shaping agility
through digital options: Reconceptualizing the role of information tech-
nology in contemporary firms. MIS Quarterly, 27, 237–263.
Sein, M. K., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011).
Action design research. MIS Quarterly, 35, 37–56.
Sharma, S. & Osei-Bryson, K.-M. (2010). Toward an integrated knowledge
discovery and data mining process model. The Knowledge Engineering
Review, 25(1), 49–67.
Sharma, S., Osei-Bryson, K.-M., & Kasper, G. M. (2012). Evaluation of an
integrated knowledge discovery and data mining process model. Expert
Systems with Applications, 39, 11335–11348.
Steinmueller, W. E. (2001). ICTs and the possibilities for leapfrogging by
developing countries. International Labour Review, 140(2), 193–210.
Thompson, S. & Brown, D. (2008). Change agents intervention in e-business
adoption by SMEs: evidence from a developing country. AMCIS 2008
Proceedings, Paper# 242.
Thomsen, C. & Pedersen, T. B. (2009). A survey of open source tools for busi-
ness intelligence. International Journal of Data Warehousing and Mining
(IJDWM), 5(3), 56–75.
Trkman, P., McCormack, K., Valadares de Oliveira, M. P., & Ladeira, M. B.
(2010). The impact of business analytics on supply chain performance.
Decision Support Systems, 49, 318–327.
Van der Lans, R. (2012). Data Virtualization for Business Intelligence Systems:
Revolutionizing Data Integration for Data Warehouses. Waltham, MA:
Elsevier.
van Steenbergen, M., Bos, R., Brinkkemper, S., van de Weerd, I., &
Bekkers, W. (2010). The design of focus area maturity models. In:
Winter, R., Zhao, J. L., Aier, S. (eds.) Global Perspectives on Design Science
Research. DESRIST 2010. Lecture Notes in Computer Science, Vol.
6105. Berlin, Heidelberg: Springer.
Watson, H. J. (2009). Tutorial: Business intelligence: Past, present, and future.
Communications of the Association for Information Systems, 25(1), 39.
Watson, H. J., Fuller, C., & Ariyachandra, T. (2004). Data warehouse gov-
ernance: Best practices at Blue Cross and Blue Shield of North Carolina.
Decision Support Systems, 38(3), 435–450.
Watson, H. J. & Wixom, B. H. (2007). The current state of business intel-
ligence. Computer, 40(9), 96–99.
Williams, S. & Williams, N. (2006). The Profit Impact of Business Intelligence.
San Francisco, CA: Morgan Kaufmann.
Wilson, E. O. (1999). Consilience: The Unity of Knowledge, Vol. 31. New York:
Vintage.
Wilson, N., Summers, B., & Hope, R. (2000). Using payment behaviour
data for credit risk modelling. International Journal of the Economics of
Business, 7(3), 333–346.
Witter, M. & Kirton, C. (1990). The informal economy in Jamaica: Some
empirical exercises (No. 36). Institute of Social and Economic Research,
The University of the West Indies.
Index