
Evaluating an Architecture

[ATAM]

By Destaye A.
Contents
 Introduction
 Types of Architecture Evaluation
 ATAM
 For What Qualities can we Evaluate?
 Participants in the ATAM
 Outputs of the ATAM
 Phases of ATAM
 Phase 0: Presentation
 Phase 1: Investigation & Analysis
 Phase 2: Testing
 Phase 3: Reporting
 Steps of the Evaluation Phase(s)
 Case study: The Nightingale System (Reading Assignment)
Introduction
 Software architecture analysis and evaluation is a well-established
practice within the software architecture community.
 Benefits of architecture evaluation include:
 Prioritization of conflicting goals
 A forced, clear explanation of the architecture
 Improved quality of architecture documentation
 Uncovered opportunities for cross-project reuse
 Improved architecture practices
 To assess a system’s quality against the requirements of its
customers,
 architects and developers need methods and tools to support
them during the evaluation process.
Types of Architecture Evaluation
 Technical: evaluation against system quality attributes (e.g.,
performance, security, modifiability) and the suitability of design
decisions
 E.g., the Architecture Tradeoff Analysis Method (ATAM)
 Economic: Biggest tradeoffs in large complex systems usually
have to do with economics, cost and benefits associated with
architectural design decisions
 E.g. Cost Benefit Analysis Method (CBAM)
 Software Architecture Evaluation Methods
 ATAM, Architecture Trade-off Analysis Method.
 CBAM, Cost Benefit Analysis Method.
 SAAM, Software Architecture Analysis Method.
 ALMA, Architecture Level Modifiability Analysis.
 FAAM, Family-Architecture Analysis Method.
Architecture Tradeoff Analysis Method (ATAM)
 Evaluating an architecture for a large system is a complicated task
 Large architecture will be difficult to understand in a limited amount of
time
 a large system usually has multiple stakeholders
 need to make connections between business goals and the technical decisions
 because architecture is intended to support business goals
 ATAM is a thorough and comprehensive scenario-based way to evaluate
a software architecture .
 ATAM
 Has its origins in SAAM (Software Architecture Analysis Method) from
the early 1990s
 Reveals how well an architecture satisfies particular goals
 Elicits the rationale behind design decisions (traceability)
 Provides insight into how quality goals interact and how they trade off
 Discovers tradeoffs: decisions affecting more than one quality attribute
ATAM…
 Discovers risks: alternatives that might create (future) problems
in some quality attribute
 Discovers sensitivity points: alternatives for which a slight change
makes a significant difference in a quality attribute
 ATAM Benefits
 clarified quality attribute requirements
 improved architecture documentation
 documented basis for architectural decisions
 identified risks early in the life-cycle
 increased communication among stakeholders
 ATAM Weaknesses
 Subjective judgment that depends on the experience of the participants
 A check-list style of thinking for risk detection
For What Qualities can we Evaluate?
 From the architecture we can’t tell if the resulting system will
meet all of its quality goals. Why?
 E.g., usability is largely determined by the user interface rather than by the SA
 ATAM concentrates on evaluating a software architecture
based on certain quality attributes
 Performance, reliability, availability, security, modifiability,
portability, functionality, variability [how well the architecture can be
expanded to produce new SAs in preplanned ways (important for
product lines)], …
 Quality attributes are evaluated in some context
 A system is modifiable with respect to a specific kind of change.
Participants in the ATAM
 The evaluation team
 Team leader, Evaluation leader, Scenario scribe, Proceedings scribe,
Timekeeper, Process observer, Process enforcer, Questioner
 Project Decision makers
 people empowered to speak for the development project
 Architecture stakeholders
 including developers, testers, users…, builders of systems interacting
with this one
 Evaluation team roles and responsibilities
 Team leader: sets up the evaluation (sets the evaluation contract), forms the team,
interfaces with the client, oversees the writing of the evaluation report, …
 Evaluation leader
 Runs the evaluation
 Facilitates elicitation of scenarios
 Administers the scenario selection and prioritization process
 Facilitates evaluation of scenarios against architecture
Participants in the ATAM…
 Scenario scribe
 Records scenarios on a flip chart
 Captures (and insists on) agreed wording
 Proceedings scribe
 Captures proceedings in electronic form
 Raw scenarios with motivating issues
 Resolution of each scenario when applied to the architecture
 Generates a list of adopted scenarios
 Timekeeper
 Helps evaluation leader stay on schedule
 Controls the time devoted to each scenario
 Process observer
 Keeps notes on how the evaluation process could be improved
 Process enforcer – helps the evaluation leader stay “on process”
 Questioner - raises issues of architectural interest that stakeholders may not
have thought of.
Outputs of the ATAM
 A concise presentation of the architecture
 The ATAM forces a roughly one-hour presentation of the architecture,
which must therefore be concise and clear
 Articulation of the business goals
 Quality requirements in terms of a collection of scenarios
 Mapping of architectural decisions to quality requirements
 A set of identified sensitivity and tradeoff points
 E.g. A backup database positively affects reliability (sensitivity point with
respect to reliability), however it negatively affects performance (tradeoff)
 A set of risks and non-risks
 Identified risks can form the basis of an “architectural risk mitigation” plan
 Other secondary outputs
 more documentation, rationale, sense of community between
stakeholders, architect, …
Conceptual Flow of the ATAM
 [Diagram omitted: conceptual flow of the ATAM]
Phases of the ATAM
Step 1: Present the ATAM
 Evaluation Team presents an overview of the ATAM to all
participants
 To explain the process and answer questions
 To set the context and expectations for the process
 Using a standard presentation, discuss:
 ATAM steps in brief,
 Techniques, and
utility tree generation
 architecture elicitation and analysis
 scenario brainstorming/mapping
 Outputs
 architectural approaches
 Scenarios & utility tree
 risks and “non-risks”
 sensitivity points and tradeoffs
Step 2: Present Business drivers
 The customer representative describes the system’s business drivers,
including the following, for the project representatives as well as the
evaluation team members:
 The system’s most important functions
 Any relevant technical, managerial, economic, or political constraints
 The business goals and context as they relate to the project
 Time to market
 The major stakeholders
 Most important quality attribute requirements
 Architectural drivers:
 Quality attributes that “shape” the architecture
 Critical requirements:
 Quality attributes most central to the system’s success
o High availability, high security, …
Step 3: Present the Architecture
 The architect presents at the appropriate level of detail
 Driving architectural requirements, measurable quantities associated
with these, standards/models/approaches for meeting these
 Important architectural information: Context diagram, Module or layer
view, Component and connector view, Deployment view
 Architectural approaches, patterns, tactics employed, what quality
attributes they address and how they address those attributes
 Use of COTS and their integration
 Most important use case scenarios and change scenarios
 Issues/risks with respect to meeting the driving requirements
 The "appropriate level" depends on several factors:
 how much of the architecture has been designed and documented;
 how much time is available;
 the nature of the behavioral and quality requirements.
 Evaluation team begins probing for and capturing risks
Step 4: Identify Architectural Approaches
 The ATAM focuses on analyzing an architecture by understanding its
architectural approaches.
 Start to identify parts of the architecture that are key for realizing
quality attribute goals
 Identify any predominant architectural styles, guidelines & principles
 For example: client-server, 3-tier, proxy, publish-subscribe,
redundant hardware
 Architectural patterns are useful because of the known ways in which each one
affects particular quality attributes.
 A layered pattern tends to bring portability to a system, possibly at the
expense of performance.
 A data repository pattern is usually scalable in the number of producers
and consumers of data. And so forth.
 In this short step, the evaluation team simply catalogs the patterns and
approaches that are evident, which serve as the basis for later analysis
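The known effects of patterns on quality attributes, as described above, can be kept as a simple catalog to support this step. The sketch below is illustrative only; the pattern names and attribute effects are assumptions for demonstration, not a standard reference:

```python
# Minimal catalog of architectural patterns and the quality attributes they
# typically promote or hinder (illustrative entries, not an authoritative list).
PATTERN_EFFECTS = {
    "layered":           {"promotes": ["portability", "modifiability"],
                          "hinders":  ["performance"]},
    "data repository":   {"promotes": ["scalability"],
                          "hinders":  ["modifiability of the schema"]},
    "publish-subscribe": {"promotes": ["modifiability"],
                          "hinders":  ["predictable latency"]},
}

def patterns_affecting(attribute):
    """Return the cataloged patterns whose known effects touch an attribute."""
    return sorted(
        name for name, effects in PATTERN_EFFECTS.items()
        if any(attribute in qa for qa in effects["promotes"] + effects["hinders"])
    )

print(patterns_affecting("portability"))  # -> ['layered']
```

Such a catalog lets the evaluation team quickly shortlist which identified approaches deserve probing questions for a given quality attribute.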
Step 5: Generate Quality Attribute Utility Tree
 Identify, prioritize, and refine the most important quality
attribute goals by building a utility tree
 A utility tree is a top-down vehicle for characterizing the “driving”
attribute-specific requirements
 Select the most important quality goals to be the high-level nodes (e.g.
performance, modifiability, security, availability, …)
 Scenarios are the leaves of the utility tree
 Scenarios are prioritized
 Depending on how important they are and
 Depending on how difficult it will be for the architecture to satisfy a
scenario
 Output: A characterization and a prioritization of specific
quality attribute requirements
Step 5: Generate Quality Attribute Utility Tree…
 [Example utility tree diagram omitted]

Some scenarios might express more than one quality attribute and so might appear in
more than one place in the tree.
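The shape of a utility tree (quality attributes at the top, refinements below, and prioritized scenario leaves) can be sketched as nested dictionaries. Every attribute name, scenario, and (importance, difficulty) rating below is hypothetical:

```python
# A utility tree as nested dictionaries: attribute -> refinement -> list of
# (scenario, importance, difficulty) leaves. All entries are illustrative.
utility_tree = {
    "performance": {
        "transaction response time": [
            ("A user updates a patient record; response within 2 s", "H", "M"),
        ],
    },
    "modifiability": {
        "COTS replaceability": [
            ("Replace the commercial database with another vendor's", "H", "H"),
            ("Replace the operating system", "M", "L"),
        ],
    },
}

def scenarios_by_rating(tree, importance, difficulty):
    """Collect leaf scenarios matching a given (importance, difficulty) rating."""
    matches = []
    for refinements in tree.values():
        for leaves in refinements.values():
            matches += [s for s, imp, diff in leaves
                        if (imp, diff) == (importance, difficulty)]
    return matches

print(scenarios_by_rating(utility_tree, "H", "H"))
```

Querying the tree by rating mirrors how the evaluation team selects the highest-priority scenarios for analysis in step 6.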
Step 6: Analyze Architectural Approaches
 The evaluation team examines the highest priority scenarios one at a time
and the architect is asked how the architecture supports each one.
 Identify the approaches that pertain to the highest priority quality attribute
requirements
 Generate quality-attribute specific questions for highest priority quality
attribute requirement
 Ask quality-attribute specific questions
 Identify and record risks and non-risks, sensitivity points and tradeoffs
 Quality attribute questions probe styles to elicit architectural
decisions which bear on quality attribute requirements.
 Examples
 Performance: How are priorities assigned to processes?, What are the
message arrival rates?, What are transaction processing times?
 Modifiability: Are there any places where layers/facades are circumvented?
What components rely on detailed knowledge of message formats? What
components are connected asynchronously?
Step 6: Analyze Architectural Approaches…
 Sensitivity & Tradeoffs
 Sensitivity - a property of a component that is critical to the success of the
system.
 E.g., the strength of encryption (security) is sensitive to the number of bits in the key
 Tradeoff point - a property that affects more than one attribute and is a
sensitivity point for more than one attribute
 E.g. In order to achieve the required level of performance in the discrete
event generation component, assembly language had to be used thereby
reducing the portability of this component.
 Risk & Non-Risk
 Risk - the decision to keep a backup is a risk if the performance cost is
excessive
 Non-risk - the decision to keep a backup is a non-risk if the performance
cost is not excessive
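The findings of step 6 can be recorded as simple typed records, so that sensitivity points, tradeoffs, risks, and non-risks stay traceable to the decisions that produced them. The record fields and entries below are an illustrative sketch, not a prescribed ATAM artifact:

```python
# Step 6 findings as simple typed records (illustrative field names/entries).
from dataclasses import dataclass

@dataclass
class Finding:
    kind: str          # "sensitivity", "tradeoff", "risk", or "non-risk"
    decision: str      # the architectural decision or property involved
    attributes: list   # quality attributes affected

findings = [
    Finding("sensitivity", "number of key bits used for encryption",
            ["security"]),
    Finding("tradeoff", "assembly language in the event-generation component",
            ["performance", "portability"]),
    Finding("risk", "keeping a backup database", ["performance"]),
]

# A tradeoff point, by definition, touches more than one quality attribute.
tradeoffs = [f for f in findings if len(f.attributes) > 1]
print([f.decision for f in tradeoffs])
```

Keeping findings in this form makes the step 9 summary (counts of risks, tradeoffs, and sensitivity points) a matter of filtering the list.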
Step 6: Example of Architectural Approach Analysis
 [Example analysis omitted]
Steps 7, 8 & 9
 Step 7: Brainstorm and Prioritize Scenarios
 Stakeholders generate scenarios using a facilitated brainstorming process.
 Scenarios at the leaves of the utility tree serve as examples to facilitate
the step.
 The new scenarios are added to the utility tree
 Each stakeholder is allocated a number of votes roughly equal to 0.3 x
#scenarios.
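The vote allocation above is a small calculation. The sketch below assumes rounding up, which is a common convention but not mandated by the slides:

```python
# Each stakeholder gets roughly 0.3 x (number of scenarios) votes.
# Rounding up here is an assumption, not prescribed by the ATAM slides.
import math

def votes_per_stakeholder(num_scenarios):
    return math.ceil(0.3 * num_scenarios)

print(votes_per_stakeholder(72))  # 72 scenarios, as in the Nightingale case study
```

With 72 brainstormed scenarios, each stakeholder would receive about 22 votes to distribute across the scenarios they consider most important.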
 Step 8: Analyze Architectural Approaches
 Identify the architectural approaches impacted by the scenarios generated
in the previous step.
 This step continues the analysis started in step 6 using the new scenarios.
 Continue identifying risks and non-risks.
 Continue annotating architectural information.
 Step 9: Present Results
 The architectural approaches, scenarios and their prioritization, utility tree,
risks, non-risks, sensitivities & tradeoffs are presented.
Case Study: The Nightingale System
 The system would
 serve as the information backbone for the health care institutions in which it was
installed.
 provide data about patients' treatment history as well as track their insurance and
other payments.
 provide a data-warehousing capability to help spot trends (such as predictors for
relapses of certain diseases).
 produce a large number of on- demand and periodic reports, each tailored to the
institution's specific needs.
 manage the work flow associated with initiating and servicing what amounts to a
loan throughout its entire life for those patients making payments on their own.
 since it would either run (or at least be accessible) at all of the health care
institution's facilities, it had to be able to respond to a specific office's
configuration needs.
 Different offices might run different hardware configurations, for instance, or require different
reports.
 A user might travel from one site to another, and the system would have to recognize that user
and his or her specific information needs, no matter the location.
The Nightingale System…
 The system is large: it was expected to comprise several million lines of
code, and it was well into implementation.
 Though the system was already well on its way to being fielded and sold,
the client was interested in an architecture evaluation
 To discover any architectural flaws sooner rather than later
 The client had strong ambitions to sell the system to many other customers
 The client wanted to make sure that it was sufficiently robust and
modifiable to serve as the basis for an entire product family of health care
management systems.
 To start the work, the evaluation team was formed and roles were assigned
 A one-day kickoff meeting was attended by the evaluation team, the project
manager, and the lead architect.
 Before the start of the presentation phase, the team met for two hours.
Step 1: Present ATAM
 Step 1: Present ATAM
 The evaluation leader explains the method in an hour-long presentation
 the method's steps and phases
 describes the conceptual foundations underlying the ATAM (such as
scenarios, architectural approaches, sensitivity points, and the like)
 lists the outputs that will be produced by the end of the exercise.
 The decision makers were likely already largely familiar with the ATAM
 Step 2: Present Business Drivers
 Replace the multiple existing legacy systems, which were
 old (>25 yrs),
 based on aging languages and technology (e.g., COBOL and IBM
assembler), difficult to maintain,
 unresponsive to the current and projected business needs of the health
care sites.
Step 2: Business Drivers
 Business requirements included
 the ability to deal with diverse cultural and regional differences.
 the ability to deal with multiple languages (especially English and Spanish) and
currencies (especially the U.S. dollar and Mexican peso).
 a new system at least as fast as any legacy system being replaced.
 a new single system combining distinct legacy financial management systems.
 Business constraints for the system included
 a commitment to employees of no lost jobs via retraining of existing employees.
 the adoption of a "buy rather than build" approach to software.
 recognition that the customer's marketplace (i.e., number of competitors) had
shrunk.
 Technical constraints for the system included
 use of off-the-shelf software components whenever possible.
 a two-year time frame to implement the system with the replacement of
physical hardware occurring every 26 weeks.
Step 2: Business Drivers...
 Quality attributes were identified as high priority
 Performance: The 5-second transaction response time of the legacy system was too
slow.
 Usability: The new system had to be easy to learn and use.
 Maintainability: The system had to be maintainable, configurable, and extensible
to support new markets (e.g., managing doctors' offices), new customer
requirements, changes in state laws and regulations, and the needs of the
different regions and cultures.
 Other quality attributes were important, but of somewhat lower priority
 Security: normal commercial level of security (e.g., confidentiality and data
integrity) required by financial systems.
 Availability: The system had to be highly available during normal business hours.
 Scalability: The system had to scale up to meet the needs of the largest hospital
customers and down to meet the needs of the smallest walk-in clinics.
 Modularity: The system and its individual components had to be modular.
 Testability and supportability: The system had to be understandable by the
customer's technical staff since employee training and retention was an issue.
Step 3: Present Architecture
 Several views of the architecture and the architectural approaches are
presented
 Two major subsystems:
 OnLine Transaction Manager (OLTM) - carries interactive performance
requirements
 Decision Support and Report Generation Manager (DSRGM) - is more of a batch
processing system whose tasks are initiated periodically.
 Built to be highly configurable.
 OLTM subsystem was strongly layered.
 Is a repository-based system; a large commercial database lay at its heart.
Relies heavily on COTS software, including the central database, a rules
engine, a work flow engine, CORBA, a Web engine, a software
distribution tool, and many others.
 Is heavily object oriented, relying on object frameworks to achieve much
of its configurability.
Step 3: Present Architecture…
 [Figures omitted: a layered view of the OLTM in the architect's informal
notation, and a view showing communication, data flow, and processors of
the OLTM]
Step 3: Present Architecture…
 [Figure omitted: data flow architectural view of the OLTM]
Step 4: Catalog Architectural Approaches
 The evaluation team listed the architectural approaches they had
heard, plus those they had learned about during their pre-evaluation
review of the documentation.
 The main ones included
 layering, especially in OLTM.
 object orientation.
 use of configuration files to achieve modifiability without recoding or
recompiling.
 client-server transaction processing.
 a data-centric architectural pattern, with a large commercial database at
its heart.
 These and other approaches gave the evaluation team a conceptual
footing from which to begin asking probing questions when scenario
analysis began.
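The case study's use of configuration files to achieve modifiability without recoding or recompiling can be sketched as selecting behavior from data at startup. The file contents, keys, and report formats below are hypothetical, invented only to illustrate the approach:

```python
# Configuration-driven behavior: site-specific choices come from a config
# file, not from code changes. All keys and values here are hypothetical.
import io
import json

# Stand-in for a per-office configuration file.
config_file = io.StringIO(json.dumps({
    "report_format": "summary",
    "language": "es",
    "currency": "MXN",
}))
config = json.load(config_file)

# Behaviors the configuration can select among.
REPORTERS = {
    "summary":  lambda records: f"{len(records)} records",
    "detailed": lambda records: "\n".join(map(str, records)),
}

# The office's report style is chosen by data; no recompilation is needed.
report = REPORTERS[config["report_format"]](["rec1", "rec2", "rec3"])
print(report)  # -> "3 records"
```

Swapping the office's behavior then means editing the configuration file, which is exactly the kind of modifiability the Nightingale architects were after.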
Step 5: Generate Quality Attribute Utility Tree
 The utility tree generated during the ATAM exercise is shown in the
form of tables in the next slides
 All of the quality attributes identified during step 2 appear, and
each is refined into one or more specific meanings.

 Notice that
 some of the scenarios are well formed according to our earlier
discussion, others have no stimulus, and still others have no responses.
 At this stage, the imprecision in scenario specification is permissible as
long as the stakeholders understand the meaning.
 If the scenarios are selected for analysis, then the stimulus and response
must be made explicit.
Step 5: Generate Quality Attribute Utility Tree…
 [Utility tree tables omitted]
Step 6: Analyze Architectural Approaches
 The utility tree produced no scenarios ranked (H,H),
 the rating that indicates high-importance, high-difficulty scenarios meriting
high analytical priority.
 So the team looked for (H,M) scenarios, a cluster of which appeared under
"Modularity," hypothesizing the replacement of various COTS products in
the system.
 Extensive use of COTS
 reduces development risk
 But it was worrisome to the project's management because of the large
number of COTS vendors involved.
 Therefore, achieving architectural flexibility to swap out COTS
products was of keen interest.
 After spending a few minutes on each scenario asking about the range and
impact of the changes, the team learned the following:
Step 6: Analyze Architectural Approaches…
 Because the system uses vendor-specific tools, components, and an
SQL dialect not supported by or compatible with databases supplied
by other vendors, replacing the database would be extremely difficult
and expensive, requiring several staff-years of effort.
 Because operating system dependencies have been localized or
eliminated from OLTM and DSRGM, replacing the operating system
with another one would require only a small modification.
 Changing the rules engine raised several issues of concern.
 One option: remove the rules engine and implement the rules directly in C++.
 It would likely improve performance
 It would obviate the need for personnel trained in the rules language
and knowledgeable about the rules engine.
 It would deprive the development team of a useful rules development
and simulation environment.
 In all, the analysis produced six sensitivity points, one tradeoff point, four
risks, and five non-risks.
Step 7: Brainstorm and Prioritize Scenarios
 A discussion was held with the stakeholders, and a total of 72 scenarios were
identified.
 More than a dozen of those scenarios were found at the leaves of step 5's
utility tree
 The following table contained a selection of some of the more
interesting scenarios that emerged during step 7. [Table omitted]
Step 8 & 9
 Step 8: Analyze Architectural Approaches
 Seven additional scenarios are analyzed, a number slightly above average for an
ATAM exercise.
 Step 9: Present Results
 A summary of the results and findings of the exercise is presented
 In addition to risks, non-risks, sensitivity points, and tradeoff points, risk
themes are presented
 Catalog of architectural approaches used, the utility tree and brainstormed
scenarios, and the record of analysis of each selected scenario are also
presented
 Three risk themes were identified
 Over-reliance on specific COTS products - difficulties in swapping out the
database, in removing the rules engine, and in relying on an old and
possibly no-longer-supported version of the database portability layer
 Error recovery processes were not fully defined, and the customer's
knowledge of available tools was incomplete
 Documentation issues - the state of the documentation was inadequate
Summary
 The ATAM is a robust method for evaluating software architectures
but
 The ATAM is not an evaluation of requirements - it cannot tell anyone whether all
of the requirements for a system will be met
 The ATAM is not a code evaluation - it makes no assumptions about the
existence of code and has no provision for code inspection
 The ATAM does not include actual system testing - it makes no assumptions about
the existence of a system and has no provisions for any type of actual
testing.
 The ATAM is not a precise instrument, but it identifies possible areas of risk within
the architecture - these risks are embodied in the sensitivity points and the
tradeoffs. The ATAM relies on the knowledge of the architect, so it is
possible that some risks will remain undetected, and the risks that are
detected are not quantified
 Even so, the ATAM has proven itself to be a useful tool.
