CMMI-Dev V1.3 Training
November 2010
SCAMPI and SCAMPI Lead Appraiser are service marks of Carnegie Mellon University.
CMMI and Carnegie Mellon are registered in the US Patent and Trademark Office by Carnegie Mellon University. For more information on CMU/SEI Trademark use, please visit http://www.sei.cmu.edu/legal/marks/index.cfm
2010 Carnegie Mellon University
2010 Carnegie Mellon University This material is distributed by the SEI only to course attendees for their own individual study. Except for the U.S. government purposes described below, this material SHALL NOT be reproduced or used in any other manner without requesting formal permission from the Software Engineering Institute at [email protected]. This material was created in the performance of Federal Government Contract Number FA8721-05C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The U.S. Government's rights to use, modify, reproduce, release, perform, display, or disclose this material are restricted by the Rights in Technical Data-Noncommercial Items clauses (DFAR 252-227.7013 and DFAR 252-227.7013 Alternate I) contained in the above identified contract. Any reproduction of this material or portions thereof marked with this legend must also reproduce the disclaimers contained on this slide. Although the rights granted by contract do not require course attendance to use this material for U.S. Government purposes, the SEI recommends attendance to ensure proper understanding. THE MATERIAL IS PROVIDED ON AN AS IS BASIS, AND CARNEGIE MELLON DISCLAIMS ANY AND ALL WARRANTIES, IMPLIED OR OTHERWISE (INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR A PARTICULAR PURPOSE, RESULTS OBTAINED FROM USE OF THE MATERIAL, MERCHANTABILITY, AND/OR NON-INFRINGEMENT).
Purpose
The purpose of this training is to help you to do the following:
- understand the changes and improvements made in CMMI V1.3
- successfully make the transition from CMMI V1.2 to CMMI V1.3 of the CMMI Product Suite
- satisfy a requirement for the following:
  - to participate in a SCAMPI High Maturity appraisal using CMMI model V1.3
  - to attend further CMMI V1.3 training courses
  - to become a CMMI-DEV V1.3 or CMMI-ACQ V1.3 instructor
  - to become a V1.3 SCAMPI Lead Appraiser
Materials
To successfully complete this online training, it is useful to follow along and see the actual changes in the models. Copies of all three models are available on the SEI website at http://www.sei.cmu.edu/cmmi/tools/cmmiv1-3/. Comparison documents that show the redline changes between V1.2 and V1.3 are also available at that website. We recommend that you familiarize yourself with the comparison documents and refer to them as you complete this training. Also, many acronyms are used throughout these materials. If you are not familiar with an acronym, please refer to the acronym list in Appendix B of the model documents.
Course Objectives
The objectives of this training course are to enable you to
- understand the differences between CMMI V1.2 and V1.3
  - General changes
  - Process area changes
  - High maturity changes
Depicting Changes
In these materials, strikethrough indicates deletions (rendered here as ~~strikethrough~~); insertions, shown in red font in the original slides, appear as plain new text. Editorial changes are seldom depicted.
The core and shared PAs covered in this module do not include the high maturity core and shared PAs.
Completion Criteria
Successful completion of CMMI V1.3 Model Upgrade Training requires that you complete the following:
- Open and review all modules
- Complete the confirmation at the end of the course that shows that you have reviewed and understand the material
Summary
This training provides you with material to help you understand the differences between CMMI V1.2 and CMMI V1.3. If you have any questions about upgrade training, please send email to [email protected]
Purpose
The purpose of this module is to describe the general changes to the CMMI models (ACQ, DEV, SVC) for Version 1.3. These changes usually affect more than one place in the model and are often discussed further in later modules of this training.
Topics
- CMMI Framework Terminology
- Updating Model Architecture
- Harmonizing V1.2 Models
- Glossary
- Teaming Concepts
- The Term "Project"
- Process Related Experiences
- Addressing Agile
- Causal Analysis at Lower Levels of Maturity
- Customer Satisfaction
- Modernizing Development Practices
- Prioritized Customer Requirements
- Organization Level Contracts
- Providing Appropriate Phrasing in Practice Statements
- Easing Translation
- Front Matter
- Generic Practices
- Lifecycle Needs and Standards
Overview of Solution
With the expansion of CMMI to address Acquisition and Services in V1.2, the terminology used to describe CMMI models was formalized and improved. Updates to the glossary include the following:
CMMI Framework
The basic structure that organizes CMMI components, including common elements of the current CMMI models as well as rules and methods for generating models, appraisal methods (including associated artifacts), and training materials. (See also CMMI model and CMMI Product Suite.) The framework enables new ~~disciplines~~ areas of interest to be added to CMMI so that ~~the new disciplines~~ they will integrate with the existing ones. [from Glossary]
To allow the use of multiple models within the CMMI Framework, model components are classified as either common to all CMMI models or applicable to a specific model. The common material is called the CMMI Model Foundation or CMF. Those components are combined with material applicable to an area of interest (e.g., acquisition, development, services) to produce a model. [from section 1 Introduction]
All CMMI models contain 16 core process areas. These process areas cover basic concepts that are fundamental to process improvement in any area of interest (i.e., acquisition, development, services). Some of the material in the core process areas is the same in all constellations. Other material may be adjusted to address a specific area of interest. Consequently, the material in the core process areas may not be exactly the same. [from section 2 Process Area Components]
Shared PAs (There is only one shared PA [SAM].)
A shared PA is shared by at least two CMMI models, but not all of them. [from section 1 Introduction]
Overview of Solution
- Revised the model component descriptions (If defined in the glossary, the redundant text in the description was minimized, inconsistent text was eliminated, and a reference to the glossary was added.)
- Removed any text that made the component inconsistent with its assignment as a required, expected, or informative model component
Example
Generic Practice Elaborations
~~A generic practice elaboration appears after a generic practice in a process area~~ Generic practice elaborations appear after generic practices to provide guidance on how the ~~generic practice should~~ generic practices can be applied uniquely to the ~~process area~~ process areas. A generic practice elaboration is an informative model component. (See the definition of generic practice elaboration in the glossary.) ... [from section 2 Process Area Components]
For another example, see next slide.
Overview of Solution
- Renamed "typical work product" to "example work product"
- Reviewed typical work product lists to ensure that individually they are recognizable and collectively they are sufficiently broad in scope
- In ACQ, renamed "typical supplier deliverable" to "example supplier deliverable"
- Revised the glossary definition to eliminate the following sentence: "These examples are called typical work products because there are often other work products that are just as effective but are not listed."
Overview of Solution
- Introduced continuous and staged representation concepts agnostically with more balance in the front matter
- Reduced the amount of model material that discusses the representations and their differences, thereby deemphasizing the importance of representations in the model
Overview of Solution
- Removed the amplification model component
- Converted amplifications having value into examples or notes
Overview of Solution
- Revised references so that users can easily find the information that the reference points to by searching for a goal or practice title or purpose statement in the destination PA
- Introduced a standard sequence to references when there are multiple adjacent references:
  - Constellation-unique PAs appear first
  - Within each PA reference grouping, the references are listed alphabetically by the destination PA
Overview of Solution
- Analyzed differences among the three models (ACQ, DEV, SVC) to identify opportunities to improve commonality
- Examples of improvements include the following:
  - GGs, GPs, and GP elaborations consolidated into one location (DEV)
  - Improved measurement, supplier, and agreement terminology (DEV)
  - Improved definitions of terms related to products and services (DEV, ACQ)
  - Increased emphasis on customer satisfaction (all three)
  - Improved examples, example work products, and notes (all three)
Many of the changes made to the model are due to harmonization and are covered in the PA Changes modules of the upgrade training.
Glossary
Glossary1
The Problem
Some definitions of terms in the glossary had the following problems:
- They were not consistent with how the same terms were described in other parts of the model (e.g., expected CMMI components, functional configuration audit, higher level management, version control).
- They did not accurately reflect their relationships with other terms in the glossary (e.g., process, process element, subprocess).
- They were not consistent with ISO standard definitions (e.g., quality, process, subprocess).
- They were incorrect or misleading (e.g., base measure, development, end user, natural bounds, process performance baseline, process performance model, quantitative objective).
- It wasn't clear which part of the glossary entry was the definition of the term and which part was usage notes.
Glossary2
Overview of Solution
- For terms that were inconsistently described, changed the glossary definition, changed the description in another part of the model, or replaced a description with a reference to the glossary
- Ensured that related terms had consistent definitions where possible
- Updated glossary definitions for some terms to be more consistent with ISO standard definitions
- Corrected errors and improved the clarity of some definitions
- Established three distinct parts for glossary entries that have different formats, which allows their easy identification:
  1. The term
  2. The definition
  3. Usage notes
Glossary3
Example: Definition Inconsistent with Other Model Material

Chapter 2
Generic Practice Elaborations
~~A generic practice elaboration appears~~ Generic practice elaborations appear after ~~a generic practice in a process area~~ generic practices to provide guidance on how the generic practice ~~should~~ can be applied uniquely to the process ~~area~~ areas. A generic practice elaboration is an informative model component. (See the definition of generic practice elaboration in the glossary.)
Glossary
generic practice elaboration
An informative model component that appears after a generic practice to provide guidance on how the generic practice ~~should~~ could be applied uniquely to ~~the~~ a process area. (This model component is not present in all CMMI models.)
Glossary4
Example Three-Part Glossary Entries
Two-part definition from DEV V1.2
product baseline In configuration management, the initial approved technical data package (including, for software, the source code listing) defining a configuration item during the production, operation, maintenance, and logistic support of its lifecycle. (See also configuration item and configuration management.)
Glossary5
Example: Definition Inconsistent with ISO

Glossary
quality
The degree to which ~~ability of~~ a set of inherent characteristics fulfills ~~of a product, product component, or process to fulfill~~ requirements ~~of customers~~.
Glossary6
Example: Definition Inconsistent with Intent

Glossary
establish and maintain
Create, document, use, and revise . . . as necessary to ensure ~~it remains~~ they remain useful.
The phrase establish and maintain means more than a combination of its component terms; . . . plays a special role in communicating a deeper principle in CMMI: work products that have a central or key role in work group, project, and organizational performance should be given attention to ensure they are used and useful in that role. This phrase has particular significance in CMMI because it often appears in goal and practice statements . . . and should be taken as shorthand for applying the principle to whatever work product is the object of the phrase.
Teaming Concepts
Teaming Concepts1
The Problem
Teams are clearly relevant to product development. How teams are established in an organization has a lot to do with whether or not they are successful. However, there are no specific practices addressing rules for establishing and operating teams in DEV; instead, there is the Integrated Product and Process Development (IPPD) addition, which is optional. Fewer than 5% of recent appraisals have included IPPD. For the acquisition of complex systems, integrated teaming is not an option but a necessity. Thus, ACQ has, instead of an addition for IPPD, expected material that covers integrated teaming, derived from generalizing and simplifying the IPPD material from DEV. SVC adopted the ACQ approach, but in many service contexts integrated teams were not the key differentiator for success, and the concept also proved to be problematic in some contexts. Thus, to harmonize the models, a different approach was needed.
Teaming Concepts2
Overview of Solution
- Replaced the concepts of integrated teaming and IPPD with a more general concept of teaming, thereby eliminating the IPPD addition and making the approach to teaming consistent in all three models (By making the three constellations common, teaming can be part of the CMF.)
- Replaced the glossary definition of "integrated team" with a definition of "team"
- In the glossary definition, placed emphasis on what enables superior team performance: a team establishes and maintains a process that identifies roles, responsibilities, and interfaces; is sufficiently precise to enable the team to measure, manage, and improve their work performance; and enables the team to make and defend their commitments.
Teaming Concepts3
Example
IPPD Addition Integrated processes that emphasize parallel rather than serial development are a cornerstone of IPPD implementation. The processes for developing the product and for developing product-related lifecycle processes, such as the manufacturing process . . .
ACQ IPM SP 1.6 Establish Teams Establish and maintain integrated teams.
SVC IWM SP 1.6 Establish Teams Establish and maintain integrated teams.
Overview of Solution
Replaced these compound expressions, itemizing types of process related experiences, with the simpler phrase "process related experiences":

(OPF) SP 3.4 Incorporate Experiences into Organizational Process Assets
Incorporate process related ~~work products, measures, and improvement information~~ experiences derived from planning and performing the process into organizational process assets.

". . . Examples of process related experiences include work products, measures, measurement results, lessons learned, and process improvement suggestions. . . ." [from GP 3.2 informative material]
Examples
(PPQA) SP 1.1 Objectively Evaluate Processes
Objectively evaluate ~~designated~~ selected performed processes against applicable process descriptions, standards, and procedures.

(PI) SP 3.4 Package and Deliver the Product or Product Component
Package the assembled product or product component and deliver it to the appropriate customer.

(IRP) SP 2.2 Analyze Individual Incident Data
Analyze individual incident data to determine ~~the best~~ a course of action.

(Also see the GP 2.6 statement in the GP section of this presentation.)
Generic Practices
Generic Practices1
The Problem
There are inconsistencies among GP descriptions, glossary definitions, associated PAs, and GP elaborations. The wording in some GP descriptions (notably GP 2.8 and GP 3.2) confuses users about the intent of the GP and leads to incorrect interpretation (e.g., a measure for every PA).
Generic Practices2
Solutions
Revised GG 1 so that it is in the correct voice (i.e., passive voice):

GG 1 Achieve Specific Goals
~~The process supports and enables achievement of~~ The specific goals of the process area ~~by~~ are supported by the process by transforming identifiable input work products ~~to produce~~ into identifiable output work products. [Note: "process of" may be replaced by "process by".]

Revised the GP 2.6 title to align with the statement and intent of the GP:
GP 2.6 ~~Manage Configurations~~ Control Work Products
Place ~~designated~~ selected work products of the process under appropriate levels of control.
Generic Practices3
Clarified the GP 2.8 informative material (These clarifications are intended to 1) counter the impression that a measurement is required for every PA and 2) differentiate GP 2.8 and GP 2.10 relative to a) the frequency of monitoring and controlling and b) the level of management involved.)

Subpractices
1. ~~Measure~~ Evaluate actual progress and performance against the plan for performing the process. The ~~measures~~ evaluations are of the process, its work products, and its services.
3. Review activities, status, and results of the process with the immediate level of management responsible for the process and identify issues. ~~The~~ These reviews are intended to provide the immediate level of management with appropriate visibility into the process. The reviews can be both periodic and event driven based on the day-to-day monitoring and controlling of the process, and are supplemented by periodic and event-driven reviews with higher level management as described in GP 2.10.
Generic Practices4
Expanded the scope of GP 2.9 to align with the expectations set in PPQA:

GP 2.9 Objectively Evaluate Adherence
Objectively evaluate adherence of the process and selected work products against ~~its~~ the process description, standards, and procedures, and address noncompliance.

The purpose of this generic practice is to provide credible assurance that the process ~~is~~ and selected work products are implemented as planned and ~~adheres~~ adhere to ~~its~~ the process description, standards, and procedures. ~~This generic practice is implemented, in part, by evaluating selected work products of the process.~~
Generic Practices5
- Revised the GP 2.10, Review Status with Higher Level Management, informative material to clarify that higher level management can include, but is not required to include, senior management: ". . . In particular, higher level management ~~includes~~ can include senior management."
- Streamlined GP 3.2 and its elaborations with "process related experiences":

GP 3.2 Collect ~~Improvement Information~~ Process Related Experiences
Collect ~~work products, measures, measurement results, and improvement information~~ process related experiences derived from planning and performing the process to support the future use and improvement of the organization's processes and process assets.

- Updated GP elaborations to 1) reflect changes in the content of the PA they are associated with and 2) remove references to the GP-PA relationships table that is now in the same section
Generic Practices6
Examples
Examples of updates to GP elaborations that reflect PA changes:

(DEV GP 2.3) QPM Elaboration
"Special expertise in statistics . . . may be needed to define the analytic techniques ~~for statistical~~ used in quantitative management . . . however, teams need sufficient expertise to support a basic understanding of their process performance as they perform their daily work."
"Examples of other resources provided include the following: . . . Scripts and tools that assist teams in analyzing their own process performance with minimal need for additional expert assistance"

(DEV GP 2.4) TS Elaboration
"Appointing a lead or chief architect that oversees the technical solution and has authority over design decisions helps to maintain consistency in product design and evolution."

Deleted capability levels 4 and 5, generic goals 4 and 5, and all level 4 and 5 generic practices as part of high maturity changes
Overview of Solution
- Added examples from DEV Part 1 to the SVC model for lifecycles such as manufacturing and maintenance
- In OPF SP 1.1, added an example box that provides examples of standards that could be used to reflect the organization's process needs and objectives, including lifecycle related standards
- Added standards as entries to the References in the Appendix (e.g., ISO/IEC 15288:2008, ISO/IEC 27001:2005, NDIA Engineering for System Assurance Guidebook)
Addressing Agile
Addressing Agile1
The Problem
Developers who use Agile methods sometimes resist using CMMI because they can't see how CMMI practices are applicable to, and can improve the effectiveness of, organizations using Agile methods.
Overview of Solution
Added guidance to appropriate PAs to do the following:
- Help users interpret the practices in a context where Agile methods are used
- Reinforce the applicability of the practices in an Agile environment
- Send the message that CMMI is a robust best practice framework meant to be used in Agile environments as well as other development environments
Addressing Agile2
- Added a new section to DEV Chapter 5 entitled "Interpreting CMMI When Using Agile Approaches"
  - This section describes how CMMI practices can apply in a variety of development environments. It also provides interpretive guidance in selected PAs that explains how the PA can be used in Agile environments.
  - A reference to this new section appears in the SSD intro notes of SVC.
- Added interpretive guidance to the following PAs:
  - In DEV: CM, REQM, PP, RD, TS, PI, VER, PPQA, and RSKM
  - In ACQ: AM, ATM, PMC, and PP
  - In SVC: SSD
- Added in DEV and SVC (SSD only) Agile-related examples (as bullets)
Addressing Agile3
Example
An example of a note added to DEV is the following one for PP: In Agile environments . . . Teams plan, monitor, and adjust plans during each iteration as often as it takes (e.g., daily). Commitments to plans are demonstrated when tasks are assigned and accepted during iteration planning, user stories are elaborated or estimated, and iterations are populated with tasks from a maintained backlog of work. (See Interpreting CMMI When Using Agile Approaches in Part I.)
Customer Satisfaction
Customer Satisfaction
The Problem
Especially in DEV, customer satisfaction was rarely mentioned in V1.2. All three models have this issue to some degree.
Overview of Solution
Added informative material, such as customer satisfaction related measures and examples, to the following PAs:
- In ACQ: ARD and OPF
- In DEV: MA, OPF, PMC, and RD
- In SVC: MA
architecture The set of structures needed to reason about a product. These structures are comprised of elements, relations among them, and properties of both.
In a service context, the architecture is often applied to the service system. Note that functionality is only one aspect of the product. Quality attributes, such as responsiveness, reliability, and security, are also important to reason about. Structures provide the means for highlighting different portions of the architecture. (See also functional architecture.)
Overview of Solution
Made the prioritization of requirements more explicit in RD SP 1.2 and SSD SP 1.1. This change will also result in RD, SSD, and ARD being more in sync (harmonized).

SP 1.2 ~~Develop the~~ Transform Stakeholder Needs into Customer Requirements
Transform stakeholder needs, expectations, constraints, and interfaces into prioritized customer requirements.

Added informative material to the following PAs:
- In ACQ: PP
- In DEV: OPD, PP, RD
- In SVC: PP, SSD
Overview of Solution
Added minimal informative material to the following:
- SSAD SP 1.1 Identify Potential Suppliers (ACQ only)
- OPD SP 1.1 Establish Standard Processes (all 3 models)
- SAM SP 1.1 Determine Acquisition Type (DEV and SVC only)
Easing Translation
Easing Translation1
The Problem
The models are translated into multiple languages because CMMI has a worldwide user base. However, most of the developers who create and update the models are not familiar with issues encountered in translation and may use idioms (e.g., "one size doesn't fit all"). Further, because of the large number of developers, the work of multiple authors often means inconsistent use of phrasing (e.g., "organizational process assets" vs. "organization's process assets") and even some terminology.
Overview of Solution
Reviewed the V1.3 model drafts to identify terminology and phrasing issues that can create translation problems (The CMMI Translation Team, consisting of members from different countries and representing different languages, conducted the review. Many of the problems were consistency problems [e.g., percent vs. percentage] or overloaded terms [e.g., performance].)
Easing Translation2
CMMI V1.2 foreign language translation status

Status (for DEV V1.2):
- Japanese: Completed August 2007; Intro course translated October 2007
- French: Completed August 2008
- German: Completed April 2009; Intro course translated October 2009
- Spanish: Completed June 2009
- Portuguese: Completed May 2010

Status (for ACQ V1.2):
- Chinese (trad.): Completed April 2009

Status (for SVC V1.2):
- Arabic: To start, pending agreement
Front Matter
Front Matter
The Problem
Many improvements were made to the models, and the front matter needed to describe and reflect those changes.

Overview of Solution
- Added the history of CMMI to accompany Figure 1.2
- With a few exceptions, removed mention of source models
- Removed any biases favoring maturity levels or capability levels
- Clarified that CMMI models are not processes or process descriptions
- Added information on selecting the right CMMI model for use
- Clarified that section 2 Process Area Components contains descriptions and not definitions
- Mentioned (DEV only) that recursion among PAs can also apply to Project Management PAs (In V1.2, some inferred the idea of recursion might apply to Engineering PAs only.)
Summary
There were many changes to the model in many areas. The following V1.3 change criteria was used to guide these changes:
Improve clarity of high maturity material and its alignment across required, expected, informative Provide more effective GPs Improve appraisal efficiency Increase the commonality across the constellations: harmonize the constellations, including incorporation of improvements introduced in later V1.2 releases Reduce model complexity and size Correct identified model defects or provide enhancements
Purpose
Improvements were made to all PAs; some PAs changed more than others. This module provides a summary of changes for each PA. The summaries of changes are presented in alphabetical order by PA abbreviation.
Summary of Changes in AM
- Made no substantive changes to specific goals, specific practices, or terminology
- Adjusted the focus of an example work product to help ensure that the specific practice (SP 1.1 Execute the Supplier Agreement) is interpreted to cover responsibilities of the acquirer as well as the supplier
- Added guidance for determining the levels to which the selected supplier processes should be monitored
- Moved this process area from the Acquisition process area category to the Project Management category
Summary of Changes in CM
- Made no substantive changes to specific goals or specific practices
- Removed functional configuration audits (FCA) and physical configuration audits (PCA) from the glossary (which are used only in SP 3.2 Perform Configuration Audits)
- Added guidance at multiple locations on how CM applies to hardware, equipment, and other tangible assets in addition to software and documentation
- Addressed how CM applies in Agile environments in the introductory notes
- Added subpractices about specifying relationships among configuration items, providing access control to the CM system, and categorizing and prioritizing change requests
- Added the need to check for consistency among configuration items as part of assuring baseline integrity
Summary of Changes in MA
- Made no substantive changes to specific goals, specific practices, or terminology
- Distinguished more clearly between information needs and objectives, measurement objectives, and business/project objectives in the informative material
- Added informative material to acknowledge the importance of customer satisfaction
- Added a table similar to the one for CMMI-ACQ, which provides some common examples of measures, measurement information categories, base measures, derived measures, and measurement relationships
Summary of Changes in OT
- Revised the PA to eliminate technical training where it may not apply to all model constellations
- Removed the subjective term "necessary" (as in "Provide Necessary Training") from the SG 2 title and statement
- Made no substantive changes to specific practices or terminology
- Added examples and guidance to the informative material
Summary of Changes in PI
- Made no substantive changes to specific goals
- Changed SP 1.1 to focus on an integration strategy rather than an integration sequence to reflect the complexity of the activity (This change caused additional revisions to SP 3.2 and informative material throughout the process area.)
- Added material to incorporate modern engineering practices, such as quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation (This change resulted in further changes to SP 3.1 and informative material throughout the process area.)
- Added information on how Product Integration works with Agile methodologies
- Added examples and guidance to the informative material
Summary of Changes in PP
- Made no substantive changes to specific goals or specific practices
- Revised the definitions of the following terms in the glossary: project and project startup
- Deleted the term program from the glossary
- Added information on how PP applies to product lines and Agile environments in the introductory notes
- Removed the material that addressed IPPD
- Added subpractices to specific practices 2.3 and 2.4
- Modified the informative material so that the development of the WBS can be based on other considerations in addition to the product or product architecture
Summary of Changes in RD
- Revised a specific goal (i.e., SG 3 Analyze and Validate Requirements), a specific practice (i.e., SP 3.2 Establish a Definition of Required Functionality and Quality Attributes), informative material, and terminology by adding material throughout the process area to incorporate modern engineering practices involving quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation
- Modified SP 1.2 Transform Stakeholder Needs into Customer Requirements to emphasize that the result is prioritized customer requirements
- Added information on how Requirements Development works with Agile methodologies
- Added and revised the following glossary terminology: quality attributes, architecture, definition of required functionality and quality attributes, allocated requirement, functional analysis, product line
Summary of Changes in SD
- Made no substantive changes to specific goals, specific practices, or terminology
- Provided guidance to maintain traceability to requirements when the resolution to a service request results in changes to the service system
- Added material that reminds service providers to give customer and end user training and orientation as needed
- Added guidance on managing and controlling operationally oriented quality attributes associated with service delivery
Summary of Changes in TS
- Made no substantive changes to specific goals, specific practices, or terminology
- Added material throughout the process area to incorporate modern engineering practices, such as quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation
- Added information on how Technical Solution works with Agile methodologies
- Added examples and guidance to the informative material
Purpose
This module provides a description of the changes for each of the core and shared process areas at maturity levels 2 and 3. All but Supplier Agreement Management are core process areas. SAM is a shared PA that appears in DEV and SVC models.
Topics
- Configuration Management (CM)
- Decision Analysis and Resolution (DAR)
- Integrated Project Management (IPM)
- Measurement and Analysis (MA)
- Organizational Process Definition (OPD)
- Organizational Process Focus (OPF)
- Organizational Training (OT)
- Project Monitoring and Control (PMC)
- Project Planning (PP)
- Process and Product Quality Assurance (PPQA)
- Requirements Management (REQM)
- Risk Management (RSKM)
- Supplier Agreement Management (SAM)
Summary of Changes in CM
- Made no substantive changes to specific goals or specific practices
- Removed functional configuration audits (FCA) and physical configuration audits (PCA) from the glossary (which are used only in SP 3.2 Perform Configuration Audits)
- Added guidance at multiple locations on how CM applies to hardware, equipment, and other tangible assets in addition to software and documentation
- Addressed how CM applies in Agile environments in the introductory notes
- Added subpractices about specifying relationships among configuration items, providing access control to the CM system, and categorizing and prioritizing change requests
- Added the need to check for consistency among configuration items as part of assuring baseline integrity
Introductory Notes
- Added a bullet that addressed teams when describing what this PA involves
- Reduced the use of the phrase "product and product component" throughout the PA to ensure broader applicability of IPM SPs in all three constellations
- Deleted the IPPD notes here and throughout the PA
Summary of Changes in MA
- Made no substantive changes to specific goals, specific practices, or terminology
- Distinguished more clearly between information needs and objectives, measurement objectives, and business/project objectives in the informative material
- Added informative material to acknowledge the importance of customer satisfaction
- Added a table similar to the one for CMMI-ACQ, which provides some common examples of measures, measurement information categories, base measures, derived measures, and measurement relationships
CMMI-SVC
- Made no substantive changes beyond harmonization changes and those described in CMMI-DEV
- Due to the elimination of the IPPD addition, demoted SP 2.1, Establish Empowerment Mechanisms, to a subpractice in SP 1.7
- Due to the elimination of the IPPD addition, deleted SP 2.3, Balance Team and Home Organization Responsibilities
Introductory Notes
- Added guidance that the organization's set of standard processes (OSSP) can also describe standard interactions with suppliers
- Added guidance on how rules and guidelines can be used to deploy teams
Summary of Changes in OT
- Revised the PA to eliminate technical training where it may not apply to all model constellations
- Removed the subjective term "necessary" (as in "Provide Necessary Training") from the SG 2 title and statement
- Made no substantive changes to specific practices or terminology
- Added examples and guidance to the informative material
Changed the SG 2 title and statement to remove the unnecessary and potentially confusing qualification
SG 2 Provide ~~Necessary~~ Training
Training ~~necessary~~ for individuals to perform their roles effectively is provided.
CMMI-SVC
- Renamed this PA to be Work Monitoring and Control
- Made no substantive changes beyond harmonization changes and those described in CMMI-DEV
Summary of Changes in PP
- Made no substantive changes to specific goals or specific practices
- Revised the definitions of the following terms in the glossary: project and project startup
- Deleted the term program from the glossary
- Added information on how PP applies to product lines and Agile environments in the introductory notes
- Removed the material that addressed IPPD
- Added subpractices to specific practices 2.3 and 2.4
- Modified the informative material so that the development of the WBS can be based on other considerations in addition to the product or product architecture
- Demoted what were SP 2.2, Monitor Selected Supplier Processes, and SP 2.3, Evaluate Selected Supplier Work Products, to subpractices of SP 2.1, Execute the Supplier Agreement
- Renumbered SP 2.4, Accept the Acquired Product, and SP 2.5, Ensure Transition of Products, to reflect this change
Introductory Notes
- Clarified the scope of SAM to include services, service systems, and modified commercial off-the-shelf (COTS) components
- Clarified when SAM should be used when acquiring COTS: it applies in cases where there are modifications to COTS components, government off-the-shelf components, or freeware that are of significant value to the project or that represent significant project risk
- Replaced "formal agreement" with "supplier agreement" since the definition of supplier agreement addresses formality
- Added information to describe that the PA typically is not implemented when the supplier is also the project's customer
OPD
~~SG 2 Enable IPPD Management~~
OT
SG 1 Establish an Organizational Training Capability
A training capability, which supports the roles in the organization ~~organization's management and technical roles~~, is established and maintained.
SG 2 Provide ~~Necessary~~ Training
Training ~~necessary~~ for individuals to perform their roles effectively is provided.
OPD
SP 1.7 Establish Rules and Guidelines for Teams Establish and maintain organizational rules and guidelines for the structure, formation, and operation of teams.
REQM
SP 1.5 Ensure Alignment ~~Identify Inconsistencies~~ Between Project Work and Requirements
Ensure that ~~Identify inconsistencies between the~~ project plans and work products remain aligned ~~and~~ with the requirements.
RSKM
SP 3.1 Develop Risk Mitigation Plans
Develop a risk mitigation plan in accordance with ~~for the most important risks to the project as defined by~~ the risk management strategy.
Demoted what were SP 2.2, Monitor Selected Supplier Processes, and SP 2.3, Evaluate Selected Supplier Work Products, to subpractices in new SP 2.1, Execute the Supplier Agreement
SP 2.~~5~~3 Ensure Transition of Products
~~Ensure~~ Transition the ~~transition of~~ acquired products ~~acquired~~ from the supplier to the project.
Purpose
This module provides a description of the changes for each of the process areas that are unique to CMMI-ACQ.
Topics
- Agreement Management (AM)
- Acquisition Requirements Development (ARD)
- Acquisition Technical Management (ATM)
- Acquisition Validation (AVAL)
- Acquisition Verification (AVER)
- Solicitation and Supplier Agreement Development (SSAD)
Summary of Changes in AM
- Made no substantive changes to specific goals, specific practices, or terminology
- Adjusted the focus of an example work product to help ensure that the specific practice (SP 1.1 Execute the Supplier Agreement) is interpreted to cover responsibilities of the acquirer as well as the supplier
- Added guidance for determining the levels to which the selected supplier processes should be monitored
- Moved this process area from the Acquisition process area category to the Project Management category
These concepts are mentioned in example boxes, examples provided in the notes, and discussion that mentions various approaches that can be used. When functional requirements are discussed, mention of quality attributes is added to balance the view of requirements.
These concepts are mentioned in example boxes, in examples provided in the notes, and in discussion that mentions various approaches that can be used. When functional requirements are discussed, mention of quality attributes is added to balance the view of requirements.
Purpose
This module provides a description of the changes for each of the process areas that are unique to CMMI-DEV.
Topics
- Requirements Development (RD)
- Technical Solution (TS)
- Product Integration (PI)
- Verification (VER)
- Validation (VAL)
Summary of Changes in RD
- Revised a specific goal (i.e., SG 3 Analyze and Validate Requirements), a specific practice (i.e., SP 3.2 Establish a Definition of Required Functionality and Quality Attributes), informative material, and terminology by adding material throughout the process area to incorporate modern engineering practices involving quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation
- Modified SP 1.2 Transform Stakeholder Needs into Customer Requirements to emphasize that the result is prioritized customer requirements
- Added information on how Requirements Development works with Agile methodologies
- Added and revised the following glossary terminology: quality attributes, architecture, definition of required functionality and quality attributes, allocated requirement, functional analysis, product line
Changed the SP 3.2 title and statement to add an equal emphasis on non-functional requirements
SP 3.2 Establish a Definition of Required Functionality and Quality Attributes Establish and maintain a definition of required functionality and quality attributes.
These concepts are mentioned in example boxes, in examples provided in the notes, and in discussion that mentions various approaches that can be used. When functional requirements are discussed, mention of quality attributes is added to balance the view of requirements.
Summary of Changes in TS
- Made no substantive changes to specific goals, specific practices, or terminology
- Added material throughout the process area to incorporate modern engineering practices, such as quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation
- Added information on how Technical Solution works with Agile methodologies
- Added examples and guidance to the informative material
These concepts are mentioned in example boxes, in examples provided in the notes, and in discussion that mentions various approaches that can be used. When functional requirements are discussed, mention of quality attributes is added to balance the view of requirements.
Summary of Changes in PI
Made no substantive changes to specific goals
Changed SP 1.1 to focus on an integration strategy rather than an integration sequence to reflect the complexity of the activity (This change caused additional revisions to SP 3.2 and informative material throughout the process area.)
Added material to incorporate modern engineering practices, such as quality attributes, product lines, system of systems, architecture-centric practices, allocation of product capabilities to release increments, and technology maturation (This change resulted in further changes to SP 3.1 and informative material throughout the process area.)
Added information on how Product Integration works with Agile methodologies
Added examples and guidance to the informative material
Terminology
Revised the terminology used from an emphasis on integration sequence to an emphasis on integration strategy
The product integration strategy describes the approach for receiving, assembling, and evaluating the product components that comprise the product.
Established a new phrase, "integration strategy, procedures, and criteria," to use for clarity throughout the process area
Changed the SP 3.1 statement to replace function with behavior in order to encompass non-functional requirements
SP 3.1 Confirm Readiness of Product Components for Integration
Confirm, prior to assembly, that each product component required to assemble the product has been properly identified, behaves according to its description, and that the product component interfaces comply with the interface descriptions.
Verification (VER)
Validation (VAL)
TS
SP 1.2 Select the product component solutions based on selection criteria. (Previously: "Select the product component solutions that best satisfy the criteria established.")
Purpose
This module provides a description of the changes for each of the process areas that are unique to CMMI-SVC.
Topics
Capacity and Availability Management (CAM)
Incident Resolution and Prevention (IRP)
Service Continuity (SCON)
Service Delivery (SD)
Service System Development (SSD)
Service System Transition (SST)
Strategic Service Management (STSM)
Terminology
Explicitly defined the term workaround
A workaround is a less-than-optimal solution for a certain type of incident that is nevertheless effective enough to use until a better solution can be developed and deployed.
Changed the SG 3 title and statement to focus on the analysis of selected incidents to define how to address similar incidents in the future
SG 3 Analyze and Address Causes and Impacts of Selected Incidents (previously "Define Approaches to Address Selected Incidents")
Causes and impacts of selected incidents are analyzed and addressed.
Removed SP 2.4, Address Underlying Causes of Selected Incidents, because it is already covered by SP 3.3 subpractice 2
Changed the statement of what was SP 2.5 (now SP 2.4) to emphasize management of incidents and remove the implication of "if necessary"
SP 2.4 Monitor the Status of Incidents to Closure
Manage the status of incidents to closure. (Previously SP 2.5: "Monitor the status of incidents to closure and escalate if necessary.")
Changed the SP 3.2 title and statement to focus on solutions for responding to future incidents
SP 3.2 Establish Solutions to Respond to Future Incidents (previously "Plan Actions to Address Underlying Causes of Selected Incidents")
Establish and maintain solutions to respond to future incidents.
Changed the SP 3.3 title and statement to focus on reducing future incident occurrence
SP 3.3 Establish and Apply Solutions to Reduce Incident Occurrence (previously "Establish Workarounds for Selected Incidents")
Establish and apply solutions to reduce the occurrence of selected incidents.
Terminology
Explicitly defined the term essential function
Essential functions are those functions that must be sustained in an emergency or significant disruption of services.
Summary of Changes in SD
Made no substantive changes to specific goals, specific practices, or terminology
Provided guidance to maintain traceability to requirements when the resolution to a service request results in changes to the service system
Added material that reminds service providers to give customer and end user training and orientation as needed
Added guidance on managing and controlling operationally oriented quality attributes associated with service delivery
Changed the SP 1.3 statement to include quality attributes as part of analyzing and validating requirements
SP 1.3 Analyze and Validate Requirements
Analyze and validate requirements, and define required service system functionality and quality attributes.
SCON
SP 3.3
SST
SP 1.1 Analyze the functionality, quality attributes, and compatibility of the current and future service systems to minimize impact on service delivery.
STSM
SP 1.1 Gather and Analyze Relevant Data
Purpose
The purpose of this module is to do the following:
Review the high maturity concepts
Summarize high maturity changes for CMMI V1.3
Topics
High Maturity Overview
Key Terminology
Organizational Process Performance (OPP)
Quantitative Project Management (QPM)
Causal Analysis and Resolution (CAR)
Organizational Performance Management (OPM)
Summary
High maturity concepts were present in ALL constellations, but in much of the material the examples focused only on development.
[Figure labels: Improvement Proposals; Management; Customer; Organizational quality and process performance objectives]
Key Terminology
Key Terminology 1
causal analysis: The analysis of outcomes (previously "defects") to determine their causes.
process performance: A measure of actual results achieved by following a process. (See also "measure.")
Process performance is characterized by both process measures (e.g., effort, cycle time, defect removal efficiency) and product or service measures (e.g., reliability, defect density, response time).
process performance baseline: A documented characterization of process performance (previously "of the actual results achieved by following a process"), which may include central tendency and variation. (See also "process performance.")
A process performance baseline can be used as a benchmark for comparing actual process performance against expected process performance.
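To give these terms a concrete feel, the following is a minimal illustrative sketch in Python; the data and metric are entirely hypothetical and it is not part of the CMMI model or this course material. It shows how a baseline's central tendency and variation might be computed from historical results and then used as a benchmark for a current project.

```python
# Illustrative sketch only (hypothetical data, not CMMI model content):
# deriving a simple process performance baseline and using it as a benchmark.
import statistics

# Hypothetical peer review defect removal efficiency from past projects (%)
historical_results = [78, 82, 75, 80, 85, 79, 81, 77, 83, 80]

# Central tendency and variation characterize the baseline
mean = statistics.mean(historical_results)
stdev = statistics.stdev(historical_results)
print(f"Baseline central tendency (mean): {mean:.1f}%")
print(f"Baseline variation (std dev):     {stdev:.1f}%")

# Benchmark comparison: how far is a current project's result from the baseline?
current_project_result = 71
z = (current_project_result - mean) / stdev
print(f"Current project is {z:.1f} standard deviations from the baseline mean")
```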
Key Terminology 2
process performance model: A description of relationships among the measurable attributes of one or more processes or work products that is developed from historical process performance data and calibrated using collected process and product measures from the project and that is used to predict future performance. (See also "measure.")
One or more of the measurable attributes represent controllable inputs tied to a subprocess to enable performance of what-if analyses for planning, dynamic replanning, and problem resolution. Process performance models include statistical, probabilistic, and simulation-based models that predict interim or final results by connecting past performance with future outcomes. They model the variation of the factors and provide insight into the expected range and variation of predicted results. A process performance model can be a collection of models that (when combined) meet the criteria of a process performance model.
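Purely as an illustration outside the model text, the sketch below (invented data and variable names) conveys the flavor of a very simple process performance model: a relationship fitted from historical data, a controllable input that supports what-if analysis, and a rough indication of the expected range of predicted results.

```python
# Illustrative sketch only (hypothetical data, not CMMI model content):
# a minimal process performance model relating a controllable input
# (review effort per KLOC) to a predicted outcome (escaped defect density).
import numpy as np

review_effort = np.array([2.0, 3.5, 1.5, 4.0, 2.5, 3.0, 1.0, 4.5])    # hours/KLOC
escaped_defects = np.array([1.8, 1.1, 2.2, 0.9, 1.5, 1.3, 2.6, 0.8])  # defects/KLOC

# Fit a simple linear model to the historical data
slope, intercept = np.polyfit(review_effort, escaped_defects, deg=1)

# Residual spread gives a rough sense of prediction uncertainty
fitted = slope * review_effort + intercept
residual_std = np.std(escaped_defects - fitted, ddof=2)

# What-if analysis: vary the controllable input and predict the outcome
for planned_effort in (1.5, 3.0, 4.5):
    predicted = slope * planned_effort + intercept
    print(f"effort={planned_effort:.1f} h/KLOC -> "
          f"predicted escaped defects={predicted:.2f} +/- {2 * residual_std:.2f} /KLOC")
```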
Key Terminology 3
quality and process performance objectives: Quantitative objectives and requirements for product quality, service quality, and process performance.
Quantitative process performance objectives include quality; however, to emphasize the importance of quality in the CMMI Product Suite, the phrase "quality and process performance objectives" is used. Process performance objectives are referenced in maturity level 3; the term "quality and process performance objectives" implies the use of quantitative data and is only used in maturity levels 4 and 5.
quantitative management: Managing a project or work group using statistical and other quantitative techniques to build an understanding of the performance or predicted performance of processes in comparison to the project's or work group's quality and process performance objectives, and identifying corrective action that may need to be taken. (See also "statistical techniques.")
Statistical techniques used in quantitative management include analysis, creation, or use of process performance models; analysis, creation, or use of process performance baselines; use of control charts; analysis of variance; regression analysis; use of confidence intervals or prediction intervals; sensitivity analysis; simulations; and tests of hypotheses.
Key Terminology 4
stable process: The state in which all special causes of process variation have been removed and prevented from recurring so that only the common causes of process variation remain. (See also "capable process," "common cause of variation," "special cause of variation," and "standard process.")
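As an informal illustration of separating special causes from common causes (not a prescribed CMMI technique, and using invented data), an individuals (XmR) control chart calculation might look like the sketch below. Points outside the computed limits would be investigated as potential special causes before the process is considered stable.

```python
# Illustrative sketch only (hypothetical data, not CMMI model content):
# individuals (XmR) chart limits used to flag potential special causes.
import numpy as np

observations = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 13.4, 10.0, 9.7, 10.4])

center = observations.mean()
moving_ranges = np.abs(np.diff(observations))
mr_bar = moving_ranges.mean()

# Standard XmR constant: natural process limits at center +/- 2.66 * average moving range
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

for i, x in enumerate(observations, start=1):
    flag = "  <-- potential special cause" if (x > ucl or x < lcl) else ""
    print(f"obs {i}: {x:5.1f}  (limits {lcl:.1f}..{ucl:.1f}){flag}")
```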
Key Terminology 5
statistical and other quantitative techniques: Analytic techniques that enable accomplishing an activity by quantifying parameters of the task (e.g., inputs, size, effort, and performance). (See also "statistical techniques" and "quantitative management.")
This term is used in the high maturity process areas where the use of statistical and other quantitative techniques to improve understanding of project, work, and organizational processes is described. Examples of non-statistical quantitative techniques include trend analysis, run charts, Pareto analysis, bar charts, radar charts, and data averaging. The reason for using the compound term statistical and other quantitative techniques in CMMI is to acknowledge that while statistical techniques are expected, other quantitative techniques can also be used effectively.
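To make one of the non-statistical quantitative techniques listed above concrete, here is a small illustrative example of a Pareto analysis over hypothetical incident categories; the category names and counts are invented for illustration only.

```python
# Illustrative sketch only (hypothetical data, not CMMI model content):
# a simple Pareto analysis of incident categories.
from collections import Counter

incident_categories = ["config", "requirements", "config", "coding", "config",
                       "requirements", "interface", "config", "coding", "config"]

counts = Counter(incident_categories)
total = sum(counts.values())

cumulative = 0
for category, count in counts.most_common():
    cumulative += count
    print(f"{category:<13} {count:>2}  cumulative {cumulative / total:5.1%}")
```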
Key Terminology 6
statistical techniques: An analytic technique that employs statistical methods (e.g., statistical process control, confidence intervals, and prediction intervals). Techniques adapted from the field of mathematical statistics used for activities such as characterizing process performance, understanding process variation, and predicting outcomes.
Examples of statistical techniques include sampling techniques, analysis of variance, chi-squared tests, and process control charts.
SP 1.5
SP 1.2
SP 1.3
SP 2.1 (previously SP 2.3)
SP 2.2
SP 2.4
SG 2 Select Improvements (previously SG 1)
Improvements are proactively identified, evaluated using statistical and other quantitative techniques, and selected for deployment based on their contribution to meeting quality and process performance objectives. (Previously: "Process and technology improvements, which contribute to meeting quality and process-performance objectives, are selected.")
SG 3 Deploy Improvements (previously SG 2)
Measurable improvements to the organization's processes and technologies are deployed and evaluated using statistical and other quantitative techniques. (Previously: "Measurable improvements to the organization's processes and technologies are continually and systematically deployed.")
SP 2.3 Validate Improvements
Validate selected improvements.
Summary
Organizational Innovation and Deployment was expanded and renamed Organizational Performance Management to emphasize its business performance focus.
Specific goal 1, Manage Business Performance, was added:
SP 1.1 Maintain Business Objectives
SP 1.2 Analyze Process Performance Data
SP 1.3 Identify Potential Areas for Improvement
Specific goal 2, Select Improvements, was modified: incremental and innovative improvement threads were merged into a common improvement thread through analysis, validation, and selection for deployment (with a focus on performance improvement).