Management
Abstract
In order to ensure that technology supports business needs and that IT investments deliver the
desired value, it is fundamental to define an Information System Architecture (ISA) and to measure
how well it fits the business model and the existing technologies. Thus, in this paper we are
concerned with evaluating an ISA by measuring its qualities (those relevant at the enterprise level).
Since software architecture (SA) is part of the information system architecture, and since evaluation
is a quite mature topic in the software engineering domain, we enumerate and classify several
software evaluation approaches in order to assess their applicability to ISA evaluation. Therefore, in
this paper, we present and classify the most significant software evaluation issues, namely: software
qualities, software evaluation approaches, and software metrics.
Our preliminary findings indicate that: the quality attributes relevant for SA evaluation are generally
applicable to ISA evaluation; the SA evaluation approaches are also useful for ISA evaluation; and
the SA metrics are not applicable to ISA evaluation.
In this paper we propose a set of metrics for ISA evaluation, derived from the most established and
well-tested software engineering metrics. We apply the ISA evaluation approach, qualities and
metrics to a health-care project case study.
1. Introduction
Nowadays, organizations' business environment presents new challenges, where information,
innovation and the agility to continuously rethink the business are key success factors
(Nilsson_et_al._1998). In order to address these new business needs, organizations usually turn to
IT for the solution. Despite significant technological progress, organizations' investments in IT
frequently do not provide the expected returns (Boar_1999).
In this paper we argue that, in order to ensure that technology supports business needs and that IT
investments deliver the desired value, it is fundamental to define an Information System Architecture
(ISA) and to measure how well it fits the business model and the existing technologies. Thus, in this
paper we are concerned with evaluating an ISA by measuring its qualities.
The authors believe that ISA, a concept distinct from Software Architecture (SA), has a vital role in
the development of Enterprise Information Systems that are capable of staying fully aligned with the
organization's strategy and business needs.
However, having an ISA does not, by itself, make clear to the architect how much an IS (or its
components) will respect some business properties/qualities; which qualities an architectural
decision might affect (and how); or which ISA decision is more adequate (Vasconcelos_et_al._2004c).
ISA evaluation is concerned with inferring how well the ISA fits a business model and the existing
technologies; for example, the alignment between business and IT is an issue that should be
considered when determining ISA quality.
In order to address the ISA evaluation topic, and considering both the common roots of the
Information Systems and Software Engineering areas and the fact that evaluation is a quite mature
topic in the Software Engineering domain, in this paper we enumerate and classify several software
evaluation approaches and analyze their suitability for ISA evaluation.
The next section of this paper presents the major concepts relevant for ISA evaluation, namely the
enterprise architecture definitions and some evaluation definitions. Section 3 describes the Software
Engineering approach to SA evaluation (SA evaluation methodologies, software qualities and
software metrics). In section 4 we propose our ISA evaluation approach. Section 5 presents a case
study where we apply the evaluation metrics to an ISA in the Portuguese health-care system. Finally,
section 6 draws some conclusions and presents future work.
2. Key Concepts
In this section we introduce the main notions, definitions and problems on enterprise, business,
information system and software architectures, which will support the remaining sections of this paper.
architecture level, IS are considered simple resources used in business (like people, equipment,
material, etc.), e.g., Eriksson_et_al._(2000) and Marshall_(2000).
Finally, ISA addresses the representation of the structure of the IS components, their relationships,
principles and directives (Garlan_et_al._1995), with the main purpose of supporting business
(Maes_et_al._2000).
According to the IEEE Architecture Working Group (1998), the ISA abstraction level should be high.
Thus, ISA is distinguished from software representation and analysis methods (such as E-R diagrams
and DFDs), presenting an abstraction of internal system details and supporting organization business
processes (Zijden_et_al._2000).
An ISA usually distinguishes three aspects, defining three sub-architectures (Spewak_et_al._1992):
- Informational Architecture, or Data Architecture, represents the main data types that support
business (Spewak_et_al._1992), (DeBoever_1997).
- Application Architecture defines the applications needed for data management and business
support.
- Technological Architecture represents the main technologies used in application implementation
and the infrastructures that provide an environment for IS deployment, such as network,
communication, distributed computation, etc. (Spewak_et_al._1992), (Open_2003).
The ISA description is a key step in ensuring that IT provides access to data when, where and how it
is required at the business level (Spewak_et_al._1992).
However, having an ISA does not ensure these benefits just by existing; the representation of the
information systems and their dependencies on business is a necessary, but not sufficient, step
towards addressing key problems such as IS integrity and flexibility, IS ROI, and IS/business
alignment, among others.
2.2. Evaluation
Clements_et_al._(2002) ask: "How can you be sure whether the architecture chosen for your software
is the right one? How can you be sure that it won't lead to calamity but instead will pave the way
through a smooth development and successful product?" The architecture is the foundation for
deducing the system quality attributes (such as modifiability, performance, availability and reliability).
Evaluation's main concern is the process of analyzing and deducing the architecture's potential for
implementing a system capable of supporting the major business requirements, and of identifying the
major risks and trade-offs.
The evaluation process considers the architectural attributes, the properties that characterize the
system, e.g., CPU speed (in a technological architecture) or the development language (in a SA).
The characteristics that we intend to verify in the architecture are defined as quality attributes (or
quality requirements, or external attributes), such as modifiability, performance, availability and
reliability. In the evaluation process, the quantitative interpretations of the observable architectural
attributes are defined as metrics, e.g., number of lines of code, function points.
Methods | Activities                            | Method's goals                                         | Quality attributes
SAAM    | 6 activities                          | Risk identification, suitability analysis              | Mainly modifiability
ATAM    | 9 activities in 4 phases              | Sensitivity & trade-off analysis                       | Multiple attributes
ALMA    | 5 activities carried out sequentially | Change impact analysis, predicting maintenance effort  | Maintainability
ARID    | 9 activities in 2 phases              | Validating designs' viability for insights             | Suitability of the designs
SBAR    | 3 activities carried out iteratively  | Evaluate ability of SA to achieve quality attributes   | Multiple attributes
utility tree, and analyze architectural approaches), Testing (brainstorm and prioritize scenarios, and
analyze architectural approaches) and Reporting.
One of ATAM's most important steps is the generation of a quality attribute utility tree. The quality
attributes that comprise system utility (performance, availability, security, modifiability, usability, and
so on) are elicited and specified down to the level of scenarios. An example of an ATAM utility tree is
shown in Figure 1 (Clements_et_al._2002).
Utility is the root node (the parent of all the quality attributes); the second level represents the quality
attributes (such as performance, modifiability, availability and security); the third level is a refinement
of the quality attributes (such as hardware failure and COTS software failures for availability, or data
confidentiality and data integrity for security); the leaves of the utility tree are concrete scenarios,
which are then prioritized along two dimensions: the importance of each scenario to the success of
the system, and the degree of difficulty posed by the achievement of the scenario (the letters H, M, L
next to the scenario descriptions in Figure 1).
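As an illustration, a utility tree of this shape can be represented as a simple nested structure and its leaf scenarios ranked by the two dimensions just described. The tree contents below are a hypothetical sketch following the structure of Figure 1, not the actual case-study tree:

```python
# Hypothetical ATAM utility tree: quality attribute -> refinement -> scenarios.
# Each scenario carries (importance, difficulty) ratings on an H/M/L scale.
utility_tree = {
    "performance": {
        "data latency": [("Deliver patient record in under 5 s", "H", "M")],
    },
    "availability": {
        "hardware failure": [("Recover from server crash in under 1 h", "M", "H")],
    },
    "security": {
        "data confidentiality": [("Only clinicians read clinical data", "H", "L")],
    },
}

RANK = {"H": 3, "M": 2, "L": 1}

def prioritized_scenarios(tree):
    """Flatten the tree and sort scenarios by importance, then difficulty."""
    rows = [
        (attr, refinement, text, imp, diff)
        for attr, refinements in tree.items()
        for refinement, scenarios in refinements.items()
        for text, imp, diff in scenarios
    ]
    return sorted(rows, key=lambda r: (RANK[r[3]], RANK[r[4]]), reverse=True)

for row in prioritized_scenarios(utility_tree):
    print(row)
```

High-importance, high-difficulty scenarios surface first, which is where the evaluation effort is usually concentrated.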
For further detail on the rest of the ATAM approach, please refer to the Clements_et_al._(2002) book.
McCabe_(1976), considering that the higher the number of paths in a program, the higher its control
flow complexity is likely to be, proposed the Cyclomatic Complexity metric. McCabe's metric counts
the paths from the start point to the end point of the program whose linear combinations provide all
the paths in the program. Based on graph theory, the number of base paths in a program is computed
as v(G) = e - n + 2, where e and n are the number of edges and nodes in the control flow graph,
respectively.
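A minimal sketch of this computation, assuming the control flow graph is given as a plain edge list (the node names are arbitrary):

```python
def cyclomatic_complexity(edges):
    """McCabe's v(G) = e - n + 2 for a single connected control flow graph."""
    nodes = {v for edge in edges for v in edge}
    return len(edges) - len(nodes) + 2

# A simple if/else: start branches to A or B, both rejoin at end.
edges = [("start", "A"), ("start", "B"), ("A", "end"), ("B", "end")]
print(cyclomatic_complexity(edges))  # e=4, n=4 -> v(G) = 2
```

A straight-line program yields v(G) = 1; each additional decision point adds one independent path.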
With the emergence of the Object-Oriented (OO) paradigm, new concepts and abstractions appeared,
such as classes, methods, messages, inheritance, polymorphism, overloading and encapsulation,
which were not addressed by previous metrics. Chidamber_and_Kemerer_(1995) and Basili_(1996)
proposed and tested a set of software metrics for OO development. These metrics are:
- Weighted Methods per Class (WMC) measures the complexity of an individual class.
- Coupling Between Object classes (CBO): a class is coupled to another one if it uses its member
functions and/or instance variables.
- Response For a Class (RFC) is the number of methods that can potentially be executed in
response to a message received.
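These definitions can be sketched on a toy class model. The class names and call relations below are hypothetical, and WMC is simplified by weighting every method as 1 (one common convention):

```python
# Toy model: each class lists its methods and, per method, the remote methods it calls
# (written as "Class.method").
classes = {
    "Patient": {"get_record": ["Storage.read"], "update": ["Storage.write"]},
    "Storage": {"read": [], "write": []},
}

def wmc(cls):
    """Weighted Methods per Class, with all method weights set to 1."""
    return len(classes[cls])

def cbo(cls):
    """Coupling Between Object classes: distinct other classes whose members are used."""
    return len({call.split(".")[0] for calls in classes[cls].values() for call in calls})

def rfc(cls):
    """Response For a Class: own methods plus remote methods potentially invoked."""
    remote = {call for calls in classes[cls].values() for call in calls}
    return len(classes[cls]) + len(remote)

print(wmc("Patient"), cbo("Patient"), rfc("Patient"))  # 2 1 4
```

The same counting style reappears later at the ISA level, with applications and services taking the place of classes and methods.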
Several other metrics for OO development exist, such as the average size of methods, the average
number of methods per class, the number of executable statements, the number of classes, the total
number of methods, the number of times a method is inherited, the number of times a method is
overloaded, and the mean time between (consecutive) failures (for further OO metrics see
Abreu_and_Carapuca_(1994) or Briand_et_al._(1998)).
[Figure: UML metamodel relating the framework primitives: a Goal achieves or contradicts a Goal; a
Process controls or executes a Process; a Process produces, consumes, uses or refines Resources;
a Block is a Resource that supports Processes and sends/receives.]
In order to model the ISA key concepts, the Block component was specialized. The key concepts for
the ISA are:
- Information Entity: a person, place, physical thing or concept that is relevant in the business
context;
- IT Platform Block: stands for the implementation of the services used in IT application
deployment;
- Operation: the abstract description of an action supported by a service (the lowest-level concept
in an ISA).
Figure 3 describes how these high-level primitives are related, in a UML profile for ISA. For further
detail please refer to Vasconcelos_et_al._(2003).
In order to describe these ISA primitives we are currently working on consolidating a set of attributes
(the architectural attributes) that will characterize the properties of the Information System; for
example, at the ISA integration level, we have proposed a set of architectural attributes in
Vasconcelos_et_al._(2004b).
- The quality attributes that are relevant for SA evaluation are generally applicable (and relevant)
to ISA evaluation, such as performance, reliability, availability, modifiability, portability, security,
etc.;
- The SA evaluation approaches (namely ATAM) are reasonably independent of the software
engineering domain, so their steps might be used for ISA evaluation;
- SA metrics are not applicable to ISA evaluation, since SA metrics deal directly with software
attributes (classes, lines of code, variables, etc.).
4.2.1. Qualities
The software engineering domain has a set of qualities that are commonly used when evaluating a
software program (such as the ones described in section 3.1). In the information system domain
(specifically in the enterprise information system architecture) this consensus does not exist, mostly
because the research on this subject is younger (and, in the authors' opinion, as a consequence of
some confusion, until recently, between the information system and software engineering domains).
Nevertheless, for the information system technological (or technical) architecture,
Khaddaj_and_Horgan_(2004) present an effort to identify quality attributes for information systems
(still from a software point of view), such as performance, scalability, cost/benefit, usability,
portability, robustness, correctness, and reliability. The TOGAF framework (Open_2003) also
presents important research concerning information system qualities from a technical viewpoint,
namely: availability, manageability, performance, reliability, recoverability, locatability, security,
integrity, credibility, usability, adaptability, interoperability, scalability, portability, and extensibility.
If we compare these IS technical qualities to the software qualities (presented in section 3.1), we can
conclude that the quality attributes that are important in SA evaluation are also significant in ISA
evaluation.
Although at the information system technological level the software qualities are applicable to ISA
evaluation, the application and information sub-architectures (and their relations to the business
level) are not directly addressed by software qualities. For instance: which quality attributes are
pertinent to assess business support and alignment (in an ISA)? Or, which qualities are relevant in
order to verify the alignment between the Information System strategy and the ISA?
4.2.2. Metrics
As described before, metrics are quantitative interpretations of the observable architectural attributes.
Since the architectural attributes are distinct in information systems and in software, the ISA metrics
and the SA metrics cannot be the same.
As presented in section 4.1, at the ISA level the key concepts (primitives) relevant for the application,
information and technological architectures (such as IS Block, Information Entity, IT Block, Service,
Business Service, IT Infrastructure Block, and Server) diverge from the SA attributes, which focus
more on how programs or application components are internally built in order to implement a
software program (such as lines of code, objects, classes, methods, variables, software algorithms,
etc.).
Therefore, despite the fact that the key ISA qualities (or requirements) are similar to the SA qualities
(such as performance, reliability, etc.), the ISA metrics and the SA metrics are distinct.
Thus, software metrics such as the Number of Lines Of Code or McCabe's Cyclomatic Complexity
are not useful at the ISA level (since the lines of code or the program algorithm are not enterprise
information system attributes). However, the authors believe that some of the SA metrics might be
extended/adapted to the ISA domain (this issue is discussed in section 4.3).
Some metrics have already been proposed for evaluating the characteristics of an ISA, namely:
- For the information/data architecture there are some metrics focused on entity-relationship (ER)
diagrams; for further detail see the research of Gray_et_al._(1991), Kesh_(1995), Moody_(1998) and
Si-Said_Cherfi_et_al._(2002). Genero_et_al._(2003), building on previous research, present a set of
metrics for measuring the structural complexity of ER diagrams (e.g., the number of entities within an
ER diagram, the number of composite attributes, the number of relationships within an ER diagram,
the number of binary relationships, among others);
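Metrics of this kind reduce to simple counts over the diagram. A sketch, assuming the ER diagram is encoded as entity and relationship lists (the schema below is hypothetical):

```python
# Hypothetical ER diagram: entities with attributes, and named relationships with
# the entities they connect. A tuple attribute denotes a composite attribute.
entities = {
    "Patient": ["id", "name", ("address", ["street", "city"])],
    "Physician": ["id", "name"],
    "Exam": ["id", "date"],
}
relationships = [("requests", ["Patient", "Physician"]),
                 ("produces", ["Physician", "Exam"])]

num_entities = len(entities)
num_relationships = len(relationships)
num_binary_relationships = sum(1 for _, ends in relationships if len(ends) == 2)
num_composite_attrs = sum(1 for attrs in entities.values()
                          for a in attrs if isinstance(a, tuple))

print(num_entities, num_relationships, num_binary_relationships, num_composite_attrs)
```

Such counts give a rough, comparable measure of the structural complexity of alternative data models.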
- The quantification of the alignment between the architectural levels is also an issue with few
metrics and little empirical research. Pereira_and_Sousa_(2003), based on some practical
observations and on a literature review, propose a set of metrics to analyse the misalignment
between the business, information and application architectures, by counting: the information entities
created by only one process (versus the total number of entities), the processes that create, update
and/or delete at least one information entity (versus the total number of processes), and the
information entities read by at least one process (versus the total number of entities).
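These three ratios can be computed directly from a CRUD-style matrix relating processes to information entities. The matrix below is a hypothetical illustration of the Pereira_and_Sousa_(2003) counts, not data from the case study:

```python
# Hypothetical process-to-entity matrix: the CRUD operations each process
# performs on each information entity.
crud = {
    "Admit Patient": {"Patient": "CRU", "Episode": "C"},
    "Transfer":      {"Episode": "CU"},          # Episode created by two processes
    "Prescribe":     {"Prescription": "CR", "Patient": "R"},
    "Bill":          {"Invoice": "CU", "Episode": "R"},
}

entities = {e for ops in crud.values() for e in ops}

def creators(entity):
    return [p for p, ops in crud.items() if "C" in ops.get(entity, "")]

# Ratio 1: entities created by exactly one process / total entities.
single_creator = sum(1 for e in entities if len(creators(e)) == 1) / len(entities)
# Ratio 2: processes that create, update or delete at least one entity / total processes.
writers = sum(1 for ops in crud.values()
              if any(set("CUD") & set(v) for v in ops.values())) / len(crud)
# Ratio 3: entities read by at least one process / total entities.
read = sum(1 for e in entities
           if any("R" in ops.get(e, "") for ops in crud.values())) / len(entities)

print(single_creator, writers, read)
```

Here the Episode entity is created by two processes and the Invoice entity is never read, so the first and third ratios fall below 1, flagging potential misalignment.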
- The analysis of the alignment between an information system architecture and a reference ISA is
also an issue without much empirical research. Vasconcelos_et_al._(2004) propose a set of metrics
for verifying the alignment between the Portuguese Healthcare Information System reference model
and new information system projects; the metrics include: a Functional Overlapping indicator (which
considers the functions implemented by the proposed project and the ones that already exist in other
systems in the organization), a Technology Change indicator (which considers the new technology
introduced by the project that is not used in other existing IS of the organization), an Informational
Entity Model Compatibility indicator, a System Overlapping indicator, and an Interface Disregarding
indicator, among others.
4.2.3. Approaches
The main aim of the approaches previously presented for SA evaluation is to predict the quality of a
system before it has been built, and "not to establish precise estimates but the principal effects of an
architecture" (Kazman_et_al._1993). These approaches present a sequence of steps that guide the
evaluation team. These steps are generally independent of the software or enterprise information
system domain; the major decisions, such as defining which attributes are relevant (and how much),
which scenarios to consider, and identifying the sensitivity points and trade-offs, are left to the project
team (or the architect). Thus, considering that ISA evaluation has similar goals, there is no reason not
to use the approaches presented for SA evaluation (namely ATAM) in ISA evaluation. Furthermore,
the authors believe that using tested and mature evaluation approaches could improve the ISA
evaluation results.
However, some steps of the SA evaluation approaches provide support in the evaluation process by
presenting examples and patterns at the software level. For instance, in the ATAM approach, in order
to help the definition of scenarios, Clements_et_al._(2002) propose a characterization of the quality
attributes (in terms of stimulus, environment and response). On these points the SA evaluation
approaches must be adapted for ISA evaluation.
SA metric                                              | ISA metric                                               | ISA arch.             | ISA qualities
Number of classes                                      | Number of Applications                                   | App. Arch.            | Cost, Time-to-market
Total Number of instance variables                     | Number of Information Entities                           | Inf. Arch.            | Adaptability, Modifiability
Lack of COhesion in Methods (LCOM)                     | Average Lack of COhesion in Application blocks           | App. Arch., Inf. Arch.| Adaptability, Modifiability
Average number of methods per class                    | Average number of operations per application block       | App. Arch.            | Cost, Adaptability, Modifiability
Cyclomatic Complexity                                  | Average Service Cyclomatic Complexity                    | App. Arch.            | Cost, Adaptability, Modifiability
Coupling Between Object classes (CBO)                  | Average Coupling Between Applications                    | App. Arch.            | Modifiability, Adaptability
Response For a Class (RFC)                             | Response For a Service                                   | App. Arch.            | Testability, Modifiability, Security
System Mean Time Between (consecutive) Failures (MTBF) | Weighted Service Mean Time Between (consecutive) Failures| App. Arch., Tech. Arch.| Reliability, Availability
In the previous table, by considering the enterprise information system concerns and attributes, and
by adapting the software attributes inherent to the software metrics, we propose a set of ISA metrics.
The proposed metrics are based on the most widely accepted and used software metrics.
Although adapting the software metrics to the ISA domain provides a more solid basis for ISA
evaluation, it does not ensure that all the ISA concerns are considered in the evaluation. For
instance, one of the most important issues in an ISA is the degree of business support and
automation; thus, a simple and important metric when analysing an ISA (and its support to the
business) is the ratio between the number of business services required by the business processes
and the business services (Business Service) actually provided by the information systems (IS
Block). This metric is not directly derived from a software metric.
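A sketch of the business-support ratio just described, assuming the required and provided services are available as plain sets (the service names are hypothetical, not taken from the case study):

```python
# Business services required by the business processes (hypothetical names).
required = {"Patient Registration", "Prescription", "Clinical Record", "Billing"}
# Business services actually provided by the IS Blocks of the ISA.
provided = {"Patient Registration", "Clinical Record", "Billing"}

coverage = len(required & provided) / len(required)
missing = required - provided

print(f"business support ratio: {coverage:.2f}")
print("missing services:", missing)
```

Besides the ratio itself, the set difference pinpoints exactly which required services the ISA fails to provide.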
In order to evaluate an ISA one should use an evaluation approach; as described in section 4.2.3, the
software evaluation approaches can be used for ISA evaluation. In the case study described in the
next section we use some of the ATAM steps for an ISA evaluation.
The facts presented here stand for a hypothetical project proposal (all names, brands and facts are
fictitious, for confidentiality reasons); however, this case study is based on our experience and
participation in the evaluation of real IS/IT health-care projects, where analogous proposals were
evaluated.
In order to ensure the independence between the mobile part of the system and the mobile phone
suppliers and mobile operators, Portability was identified as an important system quality, along with
security, modifiability, performance, availability and cost (among others).
In Figure 5 the ratio between the number of business services required by the business processes
and the business services actually provided by the application blocks is computed; Prescription did
not have the corresponding business service. This gap was reported to the project proponent, who
corrected the project proposal. The global ISA, at the application level (after this short correction), is
presented in Figure 6.
A simple metric to estimate time-to-market, by comparison with other ISAs, is counting the number of
applications. In this ISA we have 4 applications (plus two already existing in the health-care
organizations).
In order to verify the performance scenario described in Figure 4 (system response for mobile users),
the team detailed the business service (presenting its operations) in Figure 7. In this figure we can
see that the business operations of the service are: Get Patient Record, Update Patient Information,
and Search Patients.
For the Get Patient Record operation, the sequence of applications (and services) and messages of
a possible scenario is described in Figure 8.
In the previous figure the total time of the Get Patient Record operation is 4 seconds; however, this
scenario describes an average situation, since, in worst-case scenarios, the communications might
be slower, or the patient record might not exist in the CRM application and it might be necessary to
invoke the services provided by the hospital or by the primary health-care unit.
Figure 9 presents the behaviour diagram for the patient clinical management business service. From
this diagram we can compute this service's Cyclomatic Complexity, which is v(G) = e - n + 2 =
8 - 5 + 2 = 5. The response for the Patient Clinical Management Service metric is 4, since four
applications might be used to support this service.
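The two figures quoted above can be reproduced mechanically. The application names below are hypothetical stand-ins (the case study only fixes their count), and any behaviour diagram with 8 edges and 5 nodes yields the same v(G):

```python
def cyclomatic_complexity(num_edges, num_nodes):
    """McCabe's v(G) = e - n + 2, applied here at the business-service level."""
    return num_edges - num_nodes + 2

# Figure 9 behaviour diagram: 8 edges, 5 nodes.
service_vg = cyclomatic_complexity(8, 5)

# Response for the service: distinct applications that may support it
# (hypothetical names; the case study only states there are four).
supporting_apps = {"CRM", "Hospital IS", "Primary Care IS", "Mobile Gateway"}
service_response = len(supporting_apps)

print("v(G) =", service_vg, "| service response =", service_response)  # v(G) = 5 | service response = 4
```

Both numbers match the values reported for the Patient Clinical Management Service.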
The evaluation team, aided by these (and other) metrics and supported by the ATAM approach,
presented a set of architectural recommendations in order to better accommodate the qualities
identified (in the utility tree), for example, ensuring the deployment of the mobile application on the
major mobile platforms (Symbian, Windows Mobile and Java Mobile).
Figure 10 describes the technological architecture of the Call Doctor Information System after the
recommendations of the evaluation team.
7. Acknowledgements
The research presented in this paper was possible thanks to the support of Saúde XXI.
8. References
Abreu, F., and R. Carapuca, "Candidate metrics for object-oriented software within a taxonomy
framework", J. Syst. and Software 23, 87-96, 1994.
Babar, M., L. Zhu, and R. Jeffery, "Framework for Classifying Software Architecture Evaluation
Methods", Australian Software Engineering Conference (ASWEC'04), 2004.
Clements, P., R. Kazman, and M. Klein, Evaluating Software Architectures: Methods and Case
Studies, Addison-Wesley Professional, ISBN 02017048, 2002.
Dobrica, L., and E. Niemela, "A Survey on Software Architecture Analysis Methods", IEEE
Transactions on Software Engineering, vol. 28, no. 7, 2002.
Genero, M., G. Poels, and M. Piattini, "Defining and Validating Metrics for Assessing the
Maintainability of Entity-Relationship Diagrams", Working Paper, Faculteit Economie En
Bedrijfskunde, 2003.
Kazman, R., J. Asundi, and M. Klein, "Making Architecture Design Decisions: An Economic
Approach", Technical Report, Carnegie Mellon Software Engineering Institute, 2002.
Maes, R., D. Rijsenbrij, O. Truijens, and H. Goedvolk, "Redefining Business-IT Alignment Through a
Unified Framework", White Paper, May 2000. http://www.cs.vu.nl/~daan/
McCabe, T. J., "A Complexity Measure", IEEE Trans. Software Eng. 2, 1976.
Open Group, The Open Group Architectural Framework (TOGAF) Version 8.1, The Open Group,
December 2003.
Pereira, C., and P. Sousa, "Getting Into The Misalignment Between Business And Information
Systems", 10th European Conference On Information Technology Evaluation, 2003.
Si-Said Cherfi, S., J. Akoka, and I. Comyn-Wattiau, "Conceptual Modelling Quality from EER to UML
Schemas Evaluation", 21st International Conference on Conceptual Modeling (ER 2002), Tampere,
Finland, 499-512, 2002.
Spewak, S., and S. Hill, Enterprise Architecture Planning: Developing a Blueprint for Data,
Applications and Technology, Wiley-QED, ISBN 0-471-599859, 1992.
Vasconcelos, A., A. Caetano, J. Neves, P. Sinogas, R. Mendes, and J. Tribolet, "A Framework for
Modeling Strategy, Business Processes and Information Systems", 5th International Enterprise
Distributed Object Computing Conference (EDOC), Seattle, USA, 2001.
Vasconcelos, A., C. Pereira, P. Sousa, and J. Tribolet, "Open Issues On Information System
Architecture Research Domain: The Vision", Proceedings of the 6th International Conference on
Enterprise Information Systems (ICEIS 2004), Portugal, 2004c.
Vasconcelos, A., M. Silva, A. Fernandes, and J. Tribolet, "An Information System Architectural
Framework for Enterprise Application Integration", Proceedings of the 37th Annual Hawaii
International Conference On System Sciences (HICSS37), Hawaii, USA, 2004b.
Vasconcelos, A., P. Sousa, and J. Tribolet, "Information System Architectures", Proceedings of
Business Excellence 2003, International Conference on Performance Measures, Benchmarking and
Best Practices in New Economy, Portugal, 2003.
Vasconcelos, A., R. Mendes, and J. Tribolet, "Using Organizational Modeling to Evaluate Health Care
IS/IT Projects", Proceedings of the 37th Annual Hawaii International Conference On System Sciences
(HICSS37), Hawaii, USA, 2004.
Full list of references available at http://ceo.inesc.pt/andre/papers/ecite05/referencesISAEval.doc