
Unit 1

The document outlines the Software Process Maturity Framework, detailing its principles, assessment methods, and levels of maturity from Initial to Optimizing. It emphasizes the importance of structured processes in software development, including specification, design, verification, and maintenance, while also addressing common misconceptions and the need for continuous improvement. Key objectives include assessing process maturity, identifying weaknesses, enhancing product quality, and increasing predictability and efficiency in software projects.


Software Process & Project Management
by
M. Archana
Asst. Prof.
UNIT-1 SOFTWARE PROCESS MATURITY
 Software Process Maturity Framework
 Principles of Software Process Changes
 Software Process Assessment
 The Initial Process
 The Repeatable Process
 The Defined Process
 The Managed Process
 The Optimizing Process
 Process Reference Models
 Capability Maturity Models (CMM, CMMI, PCMM, PSP, TSP)
SOFTWARE PROCESS
 A software process is a set of related activities
that leads to the production of software.
 These activities may involve developing the
software from scratch or modifying an
existing system.
SOFTWARE PROCESS MUST INCLUDE
THE FOLLOWING FOUR ACTIVITIES:
 Software specification (or requirements
engineering): Define the main functionalities of the
software and the constraints around them.
 Software design and implementation: The software is
designed and programmed.
 Software verification and validation: The software
must conform to its specification and meet the
customer's needs.
 Software evolution (software maintenance): The
software is modified to meet changing customer and
market requirements.
 When we talk about a process, we usually talk
about the activities in it.
 Products: The outcomes of an activity.
 For example, the outcome of architectural design may be a
model for the software architecture.
 Roles: The responsibilities of the people involved
in the process.
 For example, the project manager, programmer, etc.
 Pre- and post-conditions: The conditions that
must be true before and after an activity.
 For example, the precondition of architectural design is
that the requirements have been approved by the customer,
while the postcondition is that the diagrams describing the
architecture have been reviewed.
WHAT IS SOFTWARE MATURITY
FRAMEWORK
 A Software Maturity Framework is a
structured approach used to assess and improve
the capabilities, processes, and quality of a
software development organization.
 It helps in enabling continuous improvement.
KEY OBJECTIVES OF SOFTWARE MATURITY
FRAMEWORKS:
 Assess Process Maturity:
 Evaluate how well-established and efficient the software
development processes are.
 Identify Weaknesses:
 Highlight areas where the organization or team needs
improvement.
 Provide a Roadmap for Improvement:
 Offer a structured path to improving software development
processes and practices.
 Enhance Product Quality:
 Ensure the software produced is reliable, scalable, and
maintains high quality.
 Increase Predictability and Efficiency:
 Help reduce risks, improve estimation, and foster a culture of
continuous improvement.
BENEFITS OF USING A SOFTWARE
MATURITY FRAMEWORK:

 Improved Process Efficiency
 Higher Product Quality
 Risk Reduction
 Predictability
 Better Customer Satisfaction
SOFTWARE PROCESS IMPROVEMENT
 An important first step in addressing software
problems is to treat the entire software task as a
process that can be controlled, measured and
improved.
 To improve software capabilities, organizations must
take five steps:
 Understand the current status of their development
process or processes.
 Develop a vision of the desired process.
 Establish a list of required process improvement actions
in order of priority.
 Produce a plan to accomplish the required actions.
 Commit the resources to execute the plan.
PROCESS MATURITY LEVELS
 Process maturity levels are different maturity states of a
process.
 Initially created by the Software Engineering Institute,
they serve as a helpful tool to reference the maturity of a
particular process and the next level of maturity for a
process.
 The 5 levels of process maturity are:
1. Initial
2. Repeatable
3. Defined
4. Managed
5. Optimizing
1. Initial – Processes are unpredictable, poorly
controlled, and reactive.
2. Repeatable – Basic project management
processes are established to track cost, schedule,
and functionality.
3. Defined – Processes are well-characterized and
standardized across projects.
4. Managed – Metrics are used to manage
processes and predict performance.
5. Optimizing – Focus on continuous improvement
through innovation and process optimization.
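The five levels and their characterizations can be sketched as a simple lookup table, which also makes the idea of "the next level of maturity for a process" concrete. This is a minimal illustration; the enum and function names are our own, not part of the SEI framework:

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """The five SEI process maturity levels, ordered lowest to highest."""
    INITIAL = 1
    REPEATABLE = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZING = 5

# One-line characterization of each level, paraphrased from the list above.
CHARACTERISTICS = {
    MaturityLevel.INITIAL:    "unpredictable, poorly controlled, reactive",
    MaturityLevel.REPEATABLE: "basic project management tracks cost, schedule, functionality",
    MaturityLevel.DEFINED:    "well-characterized, standardized across projects",
    MaturityLevel.MANAGED:    "metrics used to manage processes and predict performance",
    MaturityLevel.OPTIMIZING: "continuous improvement through innovation",
}

def next_level(current):
    """Return the next maturity level to aim for, or None at the top."""
    if current < MaturityLevel.OPTIMIZING:
        return MaturityLevel(current + 1)
    return None
```

For example, an organization assessed at the Initial level would target the Repeatable level next.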
PROCESS MATURITY LEVELS
THE PRINCIPLES OF SOFTWARE PROCESS
CHANGE
 Major changes to the software process must
start at the top.
 Ultimately, everyone must be involved.
 Effective change requires a goal and knowledge
of the current process.
 Change is continuous.
 Software process changes will not be retained
without conscious effort and periodic
reinforcement.
 Software process improvement requires
investment.
COMMON MISCONCEPTIONS
ABOUT SOFTWARE PROCESS
 We Must Start With Firm Requirements
 If It Passes Test, It Must Be OK
 Software Quality Can't Be Measured
 The Problems Are Technical
 We Need Better People
 Software Management Is Different
SOFTWARE PROCESS ASSESSMENT
 Process assessments help software organizations improve themselves
by identifying their critical problems and establishing
improvement priorities.
 Key Objectives of the Assessment Process
 Evaluate Software Quality
 Meets functional and non-functional requirements.
 Ensure Process Compliance
 Aligned with organizational and industry standards.
 Risk Management
 Identify potential risks related to software, processes, or
project management.
 Continuous Improvement
 Provide feedback and recommendations.
 Stakeholder Alignment
ASSESSMENT PHASES:

Assessments are conducted in three phases:
 Preparation: Phase one concludes with a brief one- or
two-day training program for the assessment team.
 Assessment: Phase two is the on-site assessment
period. This activity typically takes several days, although
it can take two or more weeks, depending on the size of
the organization and the assessment technique used.
 Recommendations: In phase three, the findings and action
recommendations are presented to the local managers.
TYPES OF SOFTWARE ASSESSMENT

 Self-Assessment: This is conducted internally by the
people of the organization itself.
 Second-Party Assessment: This is conducted by an
external team, or people of the organization are
supervised by an external team.
 Third-Party Assessment: In an ideal case, a Software
Process Assessment should be performed in a
transparent, open, and collaborative environment.
 The results of the Software Process Assessment are confidential
and are only accessible to the company.
 The assessment team must contain at least one person from the
organization being assessed.
FIVE ASSESSMENT
PRINCIPLES
 The need for a process model as a basis for the
assessment
 The requirement for confidentiality
 Senior management involvement
 An attitude of respect for the views of the people of the
organization being assessed
 An action orientation
THE SOFTWARE PROCESS AND PROJECT
MANAGEMENT ASSESSMENT PROCESS
 1. Planning the Assessment
 Define Objectives and Scope: e.g., code
quality, process adherence, risk management.
 Identify Key Stakeholders: Determine who
will be involved in the assessment.
 Assessment Methodology: Choose
appropriate assessment techniques.
 2. Preparation for the Assessment
 Data Collection: Gather relevant
documentation, metrics, and data.
 Set Benchmarks and Standards: Define the
metrics, benchmarks, or standards. These could
be industry standards (ISO, CMMI) or internal
best practices.
 3. Execution of the Assessment: This involves conducting
actual reviews and evaluations across both the software and
project management processes.
 For Software Process Assessment:
 Code Reviews: Analyze the codebase for maintainability,
consistency with coding standards, performance, and security.
 Process Audits: Review adherence to established processes
such as continuous integration/continuous delivery (CI/CD), unit
testing, and documentation.
 Testing and Validation: Evaluate the adequacy of testing (e.g.,
unit, integration, performance testing), as well as defect tracking and
resolution processes.
 Configuration and Version Control: Assess how well version
control practices (e.g., branching strategies, commits) are managed.


 For Project Management Assessment:
 Schedule Adherence: Check if the project is on track
in terms of timelines and milestones.
 Resource Allocation: Evaluate how effectively
resources (people, tools, budget) are managed.
 Risk and Issue Management: Review the process of
risk identification, mitigation, and issue tracking.
 Stakeholder Communication: Assess how well
stakeholders are being informed of progress, changes,
risks, and issues.
 Change Management: Review how change requests
are handled, and whether they follow formal procedures.


 4. Analysis of Findings:
 After collecting data from reviews and assessments,
analyze the findings to identify:
 Strengths
 Weaknesses
 Opportunities for Improvement
 Risks
 5. Reporting:
 Document Findings
 Review with Stakeholders
 Prioritize Actions
 6. Implement Recommendations:
 Action Plan
 Monitor Implementation
 7. Re-Assessment and Continuous Monitoring:
 Follow-Up Reviews
 Continuous Improvement
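The analysis and prioritization steps above can be sketched in a few lines. The process-area names, the compliance scores, and the 50-point threshold are all hypothetical illustrations, not values from any assessment standard:

```python
# Hypothetical compliance scores (0-100) gathered during the execution phase.
findings = {
    "code reviews": 72,
    "process audits": 45,
    "testing and validation": 60,
    "version control": 88,
    "risk management": 38,
}

THRESHOLD = 50  # assumed cut-off separating strengths from weaknesses

strengths = {area: s for area, s in findings.items() if s >= THRESHOLD}
weaknesses = {area: s for area, s in findings.items() if s < THRESHOLD}

# Prioritize actions: address the weakest areas first, as the
# reporting step ("Prioritize Actions") recommends.
priorities = sorted(weaknesses, key=weaknesses.get)
```

With these illustrative numbers, `priorities` lists risk management before process audits, which is the ordering the action plan would follow.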
IMPLEMENTING STEPS FOR ASSESSMENT

 Forming an Assessment Team
 Self-Assessment Considerations
 Assessment Ground Rules
 Assessment Team Training
FORMING AN ASSESSMENT TEAM:

 The assessment team members should be
experienced software developers, and one or more
should have experience in each phase of the
software process.
 Four to six persons form a team, although
more can be used if desired. Since larger
teams cost more money and are harder to
manage, an upper limit of 8 to 10 participants
is usually wise.
SELF ASSESSMENT CONSIDERATIONS
 While it is possible for organizations to assess
themselves, they should be aware of several
potential problems.
 Site managers who desire a self assessment
should thus reach agreement with their line
managers on its importance and their commitment
to support it.
 They must plan to assign people where needed,
commit to attend the necessary meetings, and
agree to participate in developing action plans to
address the recommendations.
ASSESSMENT GROUND RULES
 The assessment results will be kept confidential by the assessment
team members.
 The site manager personally agrees to participate in the opening and
closing assessment meetings.
 In addition to the regular members, the site manager agrees to assign
one or two local professionals to handle the assessment arrangements
and to lead the follow-up action plan work. They will be full assessment
team members.
 The site manager commits to developing and implementing appropriate
action plans in response to the assessment recommendations.
 The site manager agrees to designate a person responsible for
developing the action plans. If possible, this person should be a member
of the assessment team.
ASSESSMENT TEAM TRAINING

 Understanding Assessment Objectives
 Clarify the goals and objectives of the assessment (e.g., quality evaluation,
process improvement, risk assessment).
 Ensure the team knows what success looks like and what specific outcomes
are expected.
 Methodologies and Standards
 Introduce relevant assessment methodologies, frameworks, or standards (such
as CMMI, ISO, or Agile for software projects).
 Train the team on how to apply these frameworks consistently throughout the
assessment process.
 Tools and Techniques
 Provide training on the use of software tools, data collection instruments, or
evaluation techniques that will be used in the assessment.
 Teach methods such as surveys, interviews, observations, checklists, and
process walkthroughs.
 Roles and Responsibilities
 Clearly define the roles of each team member within the assessment
process.
 Train team members on specific responsibilities, such as data collection,
analysis, and reporting.
 Communication and Reporting
 Train the team on how to effectively communicate findings to stakeholders.
 Focus on both oral and written reporting skills, ensuring that the assessment
results are presented clearly and professionally.
 Ethical and Confidentiality Considerations
 Emphasize the importance of ethical standards, integrity, and confidentiality
during the assessment.
 Provide guidance on how to handle sensitive information or proprietary data.
 Risk and Issue Management
 Train the team on how to identify potential risks or issues that may affect the
assessment.
ASSESSMENT CONDUCT
1. Probing Questions:
In conducting assessments it is hard to obtain really
accurate information. The reasons are:
 Questions are misunderstood.
 The respondents may have a different
understanding of some common terms.
 The respondents may not be broadly aware of the
work in their own organization.
 Occasionally people are unwilling to risk the truth.
2. Assessment Conclusion:
 At the assessment conclusion, the team
prepares a report on its initial findings. The
report should be a composite of site status,
together with more detailed findings in key
areas.
 Prior to reviewing this material with the site
manager, the team should review it with the
project managers. This should identify any
overlooked problems.
3. Assessment Report:
The final assessment team action is the presentation of a
written final report and recommendations to the site
manager and staff. A written assessment report should
always be prepared because:
 Writing the actual recommendations helps the assessment
team understand precisely what it is recommending.
 Since presentations are generally verbal, their interpretation
is highly dependent on the listeners' backgrounds and biases.
 A written report provides an ideal vehicle for informing the
professionals about what was found and recommended.
4. Action Plans:
The action plans are next prepared by the local site
organization, generally under the guidance of the
team member named for this purpose. If properly
chosen, this member is now fully knowledgeable on
the issues and is able to start quickly.
5. Reassessments:
Organizations should generally conduct follow-up
assessments one to two years after the initial action
plans have been developed and approved. This is
important for several reasons:
 To assess the progress that has been made.
 To provide a visible future milestone for completion
of the actions from the prior assessment.
 To establish new priorities for continued
improvements.
INITIAL PROCESS
Why are organizations chaotic?
While lack of a commitment discipline is the most common
reason for chaotic behavior, there are other forces:
1. Under extreme pressure, software managers often
make a guess rather than a plan. When the
guess is drastically low, chaos ensues.
2. When the process gets rough, there is a strong
temptation to believe in magic.
 Some savior may appear, or a new technology
may be the answer.
 Since this hope is often an excuse for not planning, it merely
postpones the problem.
3. The scale of software projects follows an escalating cycle:
 Programs generally take far more code
than expected.
 As the programs become larger,
new technical and management
issues come up.
 Since these are unlike previous
experience, they are a surprise.
 As the scale increases, the surprises
continue, but at dramatically increased cost.
CHAOTIC FORCES-UNPLANNED COMMITMENTS

 The most difficult software management problem is the unplanned
commitment.
 Minor Commitments:
 When a new requirement seems simple, the manager is tempted to
merely commit to do it. Since software is a matter of detail, simple-
looking functions often have hidden traps.
 When these are not found until after commitment, the manager is in
an impossible situation. The normal result is an inadequately staffed
crash effort to meet the date, regardless of job size.
 The unplanned commitment trap:
 Lack of prioritization, overloading resources, poor decision
making, difficulty in saying no.
CHAOTIC FORCES-GURUS AND MAGIC
 The technical wizard can be a powerful
asset. Unfortunately, gurus sometimes
believe they can do no wrong.
 After they have led a small team, they believe they
can do anything. During a time of crisis there is no
way to recover, as most gurus run projects out of
their heads.
 While there are many cases in which improved
technology can help, there are many more that
need effective management. It is both
understandable and desirable to search for better
tools and methods.
CHAOTIC FORCES-PROBLEMS OF SCALE
As software systems become larger, they are much more difficult to understand:
 One person knows the details of the program.
 One person understands it but can't remember it all, so the design must be
documented.
 One person understands the overall design of a program, but the details
of its component modules are each understood by separate experts.
 A large software product is defined and understood at the product
management level, but a separate team understands the characteristics
and design of each of its component programs.
 With software systems, the high-level design may be well defined and
understood by the system management team, but each of its
component products is only understood by the respective product
management organizations.
 When the system is very large and has evolved through many versions,
there may be no one who understands it.
SOFTWARE PROCESS ENTROPY:
 There are many forces on the software process that push
us toward disorganization.
 Dynamic requirements, increasing system size, and
human nature are the three classes of forces that tend
to disrupt it.
 The evolutionary process is driven by requirements
dynamics.
 Each new development uncovers new issues that lead to
changes. Changes are highly error-prone and thus disruptive.
 Change disruption is compounded by increasing project
scale.
 In large software systems, the most severe problems are not
obvious until testing gets into trouble.
 Without a plan, a response-driven management system
treats testing as a problem. Such irrational priorities produce
increasingly chaotic behavior.
THE WAY OUT:
 The general solution for the chaos of the initial process is:
 Apply systematic project management - the work
must be estimated, planned, and managed.
 Adhere to careful change management - changes
must be controlled, including requirements, design,
implementation, and test.
 Utilize independent software assurance - an
independent technical means is required to assure that
all essential project activities are properly performed.
INITIAL PROCESS-BASIC PRINCIPLES
 Plan the work.
 Track and maintain the plan.
 Divide the work into independent parts.
 Precisely define the requirements for each part.
 Rigorously control the relationships among the
parts.
 Treat software development as a learning
process.
 Recognize what you don't know.
 When the gap between your knowledge and the
task is severe, fix it before proceeding.
 Manage, audit, and review the work to ensure it is
done as planned.
 Commit to your work and work to meet your
commitments.
 Refine the plan as your knowledge of the job
improves.
REPEATABLE PROCESS
 Managing the Software Organization
 Basic Principles of Project Management
 Making Commitments
 Elements of Effective Commitments
 The Software Commitment Process
 The Management System
 The Project Plan
 Project Planning Principles
 Planning Cycle
 Planning Contents
REPEATABLE PROCESS-BASIC PRINCIPLES

 The Repeatable Process includes commitments, planning,
configuration management, and quality assurance.
 The basic principles of project management are:
 Plans are based on a hierarchy of commitments.
 A management system resolves the natural
conflicts between projects and between the
line and staff organizations.
 An oversight and review system audits and tracks
progress against the plans.
MAKING COMMITMENT
 What is a commitment?
 A commitment is an agreement by one person to do
something for another. Typically this involves a planned
completion date and some consideration or payment.
 For example, when two programmers share a task, they
must divide up the work.
THE ELEMENTS OF EFFECTIVE
COMMITMENTS ARE:
 Willingness:
 The person making the commitment does so willingly.
 Dedication:
 The commitment is not made lightly.
 The person responsible tries to meet the commitment, even if
help is needed.
 Contract:
 The agreement between the parties covers what is to be done,
by whom, and when.
 Declaration:
 The commitment is openly and publicly stated.
 Preemptive:
 Prior to the committed date, if it is clear that it cannot be met,
advance notice is given and a new commitment is negotiated.
THE SOFTWARE COMMITMENT
PROCESS:
To be effective, the software commitment process must
reach to the top of the organization.
 This requires:
 All commitments for future software delivery are personally
made by the organization's senior executive.
 These commitments are made only after successful
completion of a formal review and concurrence process.
 An enforcement mechanism ensures proper
conduct of reviews and concurrences.
To do this, the senior manager should require
evidence that the following work has been done prior to
approving a commitment:
 The work is defined and agreed upon
between the developers and the customer, ensuring
mutual understanding and alignment.
 A documented project plan has been
created, outlining the resource estimate,
schedule, and cost estimate.
 All parties directly involved have formally
agreed to the plan in writing.
 Sufficient business and technical preparation
has been completed to ensure that the
commitment poses a reasonable risk.
 An independent review has been
conducted to verify that the planning work
adheres to the organization's standards and
procedures.
 The participating groups have access to, or
can acquire, the necessary resources for the
work.
THE MANAGEMENT SYSTEM
 While every organization will have unique objectives, there
are generally four components:
 To have a technical and business strategy that aims at such long-
term goals as growth rate or market position
 To provide quality products that meet customers' needs in a
timely and effective way
 To perform the assigned mission competitively and economically
 To improve continually the organization's ability to handle more
challenging work
 These general objectives, applicable to most organizations,
demonstrate the inherent conflicts common to most
software groups.
PROJECT PLAN-PRINCIPLES
 While requirements are initially vague
and incomplete, a quality program can
only be built from an accurate and
precise understanding of the users' needs.
 A conceptual design is then developed
as a basis for planning.
 At each subsequent requirements
refinement, resource projections, size
estimates, and schedules are also refined.
CONT…
 When the requirements are clear, a
detailed design and implementation
strategy is developed and incorporated in
the plan
 Implementation details are established and
documented in further plan refinements
 The plan provides the framework for
negotiating the time and resources to do
the job.
THE PLANNING CYCLE:
 The cycle starts with the initial requirements.
 The response to every demand for a commitment
must be to produce a plan with objectives.
 The plan is produced by first breaking the work
into key elements, called a work breakdown
structure (WBS).
 The size of each product element is estimated.
 The resource needs are projected.
 The schedule is produced.
PROJECT PLAN CONTENTS:

 The elements of a software plan are:
 Goals and objectives: These describe what
is to be done, for whom, and by when.
 Work Breakdown Structure (WBS): The
WBS subdivides the project into tasks
that are each defined, estimated, and
tracked.
 Product size estimates: These are
quantitative assessments of the code
required for each product element.
CONTD..
 Resource estimates: Based on prior
experience, known productivity factors are
applied to yield reasonable estimates of the
resources required for each WBS element.
 Project schedule: Based on the available
project staffing and resource estimates, a
schedule for the key tasks and deliverable
items is produced.
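The estimating chain described above, from WBS size estimates through a productivity factor to a schedule, can be shown with a tiny numeric sketch. The WBS element names, sizes in KLOC, productivity factor, and team size are all illustrative assumptions:

```python
# Estimated size of each WBS element, in KLOC (illustrative values).
wbs_sizes = {"parser": 4.0, "database layer": 6.5, "user interface": 3.5}

PRODUCTIVITY = 0.5  # assumed productivity factor: KLOC per person-month
TEAM_SIZE = 4       # assumed available project staffing

def resource_estimate(sizes):
    """Resource estimate: total size divided by the productivity factor."""
    return sum(sizes.values()) / PRODUCTIVITY

def schedule_months(effort, staff):
    """Elapsed schedule, assuming the staff work on the tasks in parallel."""
    return effort / staff

effort = resource_estimate(wbs_sizes)          # 14.0 KLOC / 0.5 = 28 person-months
schedule = schedule_months(effort, TEAM_SIZE)  # 28 / 4 = 7 months
```

Real estimates are of course refined at each requirements refinement, as the planning principles above state; the point of the sketch is only the arithmetic relationship among size, productivity, and schedule.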
DEFINED PROCESS
 A standard is a rule or basis for comparison that
is used to assess the size, content, value, or
quality of an object or activity.
Definitions:
 Policy: A governing principle, typically used as a
basis for regulations, procedures, or standards
and generally stated by the highest authority in
the organization.
 Regulation: A rule, law, or instruction, typically
established by some legislative or regulatory body.
 Specification: The precise and verifiable
description of the characteristics of a product.
 Guideline: A suggested practice, method, or
procedure, typically issued by some authority.
 Procedure: A defined way to do something,
generally embodied in a procedures manual.
 Standard: A rule or basis for comparison that
is used to assess size, content, or value,
typically established by common practice or by
a designated standards body.
ESTABLISHING SOFTWARE STANDARDS
 Management and planning standards and
procedures
 Configuration management
 Estimating and costing
 Software quality assurance
 Status reporting
 Development process standards and methods
 Requirements specification
 Design
 Documentation
 Coding
 Integration and testing
 Reviews, walkthroughs, and inspections
 Tool and process standards
 Product naming
 Size and cost measures
 Defect counting and recording
 Code entry and editing tools
 Documentation systems
 Languages
 Library systems
STANDARD DEVELOPMENT PROCESS

 Standards development involves the following
steps:
 Establish a standards strategy that defines
priorities and recognizes prior work.
 Distribute, review, and maintain the strategy.
 Select individuals or small working groups to
develop the top-priority standards.
 The development effort should build on prior work,
define applicable areas, specify the introduction
strategy, and propose an enforcement plan.
 The draft standards should be widely distributed
and reviewed.
 The standards should be revised to
incorporate the review comments and then
re-reviewed if the changes are extensive.
 The standards should initially be implemented
in a limited test environment.
 Based on this test experience, the standards
should again be reviewed and revised.
 Implement and enforce the standards across
the defined areas of applicability.
 Evaluate the effectiveness of the standards in
actual practice.
MAINTAINING STANDARDS
The key responsibilities of standards
maintenance are:
 Stay aware of standards implementation
problems.
 Be a source of guidance and advice on using the
standards.
 Maintain the standards baseline and control
changes.
 Periodically review the standards to ensure their
effectiveness and pertinence.
 Standards maintenance should not involve a great deal
of work.
 If a standard needs frequent changes, it probably covers
a subject that is not ready for standardization.
 The standards and procedures should also be reviewed
at least once every three to five years to ensure they are
current and needed.
 E.g., corrective maintenance, preventive maintenance.
SOFTWARE INSPECTIONS
 Software inspections provide a powerful way to
improve the quality and productivity of the
software process.
 The basic objectives of inspections are:
 To find errors at the earliest possible point in the
development cycle
 To ensure that the appropriate parties technically agree
on the work
 To verify that the work meets predefined criteria
 To formally complete a technical task
 To provide data on the product and the inspection
process.
BASIC INSPECTION PRINCIPLES

 Inspection is a formal, structured process.

 Utilizes a system of checklists.

 Defined roles for all participants.

 Ensures an orderly approach to implementing

software engineering standards.

 Generic checklists and standards are developed

for each inspection type. These checklists and

standards are tailored to specific project needs

where appropriate.
 Promotes software engineering excellence

across the organization.

 Reviewers are prepared in advance and

identify concerns and questions prior to the

inspection.

 The focus of the inspection is on identifying

problems, not resolving them

 An inspection is conducted by technical

people for technical people


CONDUCT OF INSPECTIONS
 The inspection participants include the following people:
 The moderator (or inspection leader):
 The person responsible for leading the inspection
expeditiously and efficiently to a successful conclusion.
 The producers:
 The person or persons responsible for doing the work
being inspected.
 The reviewers (or inspectors):
 Generally people who are directly concerned with and
aware of the work being inspected.
 The recorder (or scribe):
 Someone who records the significant inspection
results.
PREPARING FOR THE INSPECTION
 Preparation for Inspection:
 The producers and their manager determine
that the product is ready for inspection.
 They agree on the inspection objectives.
 Identification of Participants and
Materials:
 Inspection participants are identified.
 Inspection entry criteria are prepared.
 Supporting materials are produced for the
opening meeting.
 Opening Meeting:
 The entire inspection group convenes.
 The moderator checks if all participants are
prepared.
 Preparation reports are collected if not already
submitted.
 Review of Errors:
 The producer reviews each major error identified
during the inspection.
 Post-Inspection Actions:
 The producer fixes the identified problems.
 Corrections are either reviewed with the
moderator or addressed in a re-inspection.
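The data an inspection generates, including who inspected what, the errors found, and whether a re-inspection is warranted, can be captured in a small record type. A minimal sketch: the field names, the two-level major/minor severity scale, and the "any major error forces re-inspection" rule are assumptions for illustration, not part of a defined inspection standard:

```python
from dataclasses import dataclass, field

@dataclass
class InspectionRecord:
    """One inspection of one product, as logged by the recorder (scribe)."""
    product: str
    moderator: str
    # (description, severity) pairs; severity is "major" or "minor",
    # an assumed two-level scale.
    findings: list = field(default_factory=list)

    def log(self, description, severity):
        """Record a problem found during the inspection."""
        self.findings.append((description, severity))

    def majors(self):
        """Count the major errors the producer must review and fix."""
        return sum(1 for _, sev in self.findings if sev == "major")

    def needs_reinspection(self):
        # Assumed rule: any major error triggers a re-inspection rather
        # than a simple review of corrections with the moderator.
        return self.majors() > 0
```

A record like this also serves the inspection objective of providing data on the product and the inspection process, since the logged findings can later be counted and analyzed.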
SOFTWARE TESTING

 Software testing is defined as the execution of a
program to find its faults.
 Testing: The process of executing a program with
the intention of finding errors.
 Verification: An attempt to find errors by
executing a program in a test or simulated
environment.
 Validation: An attempt to find errors by executing
a program in a real environment.
 Debugging: Diagnosing the precise nature of a
known error and then correcting it.
AXIOMS OF TESTING
The testing axioms stated by Myers are:
 A good test case is one that has a high probability of
detecting a previously undiscovered defect
 The difficulty in testing is knowing when to stop
 It is impossible to test your own program
 A necessary part of every test case is a description of
the expected output
 The design of a system should be such that each
module is integrated into the system only once
 Avoid non-reproducible testing
 Thoroughly inspect the results
 Ensure that testability is a key objective in
your software design
 Never alter the program to make testing easier
 Testing must start with objectives
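The axiom that every test case must include a description of its expected output can be sketched as follows. The function under test (`absolute_value`) and the test-case table are purely illustrative, not from the text:

```python
# Hypothetical function under test, used only to illustrate the axiom.
def absolute_value(x):
    """Return |x| for a numeric input."""
    return -x if x < 0 else x

# Each test case pairs an input with its expected output, written down
# BEFORE the test is run, so results can be checked objectively.
test_cases = [
    (5, 5),    # positive input, expected output 5
    (-3, 3),   # negative input, expected output 3
    (0, 0),    # boundary value, expected output 0
]

def run_tests():
    """Return the list of failing cases: (input, expected, actual)."""
    failures = []
    for given, expected in test_cases:
        actual = absolute_value(given)
        if actual != expected:
            failures.append((given, expected, actual))
    return failures
```

Because the expected outputs are recorded in the table itself, a reviewer can inspect the results thoroughly rather than trusting a pass/fail flag.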


TEST PLANNING

Major test plan elements are:
 Establish objectives for each test phase
 Establish schedules and responsibilities for
each test activity
 Check the availability of tools, facilities, and
test libraries.
 Define procedures and standards for test
planning, execution, and reporting results.
 Set criteria for test completion and success.
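The last element, setting criteria for test completion and success, can be sketched as a simple exit check. The thresholds and field names here are assumptions for illustration, not values from the text:

```python
def phase_complete(planned_cases, executed_cases, open_major_defects,
                   min_execution_ratio=1.0, max_open_major=0):
    """Return True when a test phase meets its (illustrative) exit criteria:
    all planned cases executed and no major defects left open."""
    if planned_cases == 0:
        return False  # an empty plan means the plan itself is incomplete
    executed_ratio = executed_cases / planned_cases
    return (executed_ratio >= min_execution_ratio
            and open_major_defects <= max_open_major)
```

In practice the ratio and defect thresholds would come from the project's own test plan rather than the defaults shown here.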
REAL TIME TESTING

 Find ways to test each system state in real time
or through simulation
 Create simulations for any missing modules so
real-time tests can run.
 Check for potential deadlocks, thrashing, or
sensitivity to timing issues.
 Use tests that simulate hardware faults or special
system conditions
 Use hardware simulation to stress the software design
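The second point, creating simulations for missing modules, can be sketched with a stub for an unavailable real-time clock. All class and function names here are illustrative assumptions:

```python
class SimulatedClock:
    """Stand-in for a missing real-time clock module; time advances
    only when tick() is called, making timing tests reproducible."""
    def __init__(self):
        self.now = 0.0

    def tick(self, seconds):
        self.now += seconds

def detect_timeout(clock, deadline, work_duration):
    """Report whether a task taking work_duration misses its deadline."""
    start = clock.now
    clock.tick(work_duration)   # simulate the task running
    return (clock.now - start) > deadline
```

Because the simulated clock advances deterministically, the same timing scenario can be replayed exactly, which supports the earlier axiom about avoiding non-reproducible testing.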
DEFINING THE SOFTWARE PROCESS

Process Standards
 Process standardization helps to reduce the
problems of training, review, and tool support
 Each project's experience can contribute to
overall improvement
 Process standards provide the basis for process
and quality measurement
LEVELS OF S/W PROCESS MODELS

Three levels:
 U or Universal process model,
 W or Worldly process model, and
 A or Atomic Process model
U PROCESS MODEL
The waterfall model is widely used in s/w development, but it
has several shortcomings:
 It does not adequately address changes
 It assumes a relatively uniform and orderly sequence of
development steps
 It does not provide for such methods as rapid prototyping
or advanced languages
 To address these concerns, Boehm proposed Spiral Model.
 This U level model provides a high level overview of
some of the issues.
A PROCESS MODELS
 Atomic level process models are enormously
detailed.
 Precise data definitions, algorithmic
specifications, information flows and user
procedures are essential at this level.
 Atomic process definitions are often
embodied in process standards and
conventions.
W PROCESS MODELS

 The Worldly process level is of most direct
interest to practicing s/w engineers.
 It guides the sequence of their working tasks
and defines task prerequisites and results.
 When reduced to operational form, these
models look like procedures.
CRITICAL S/W PROCESS ISSUES
Common issues are:
 Quality:
 Humans are error prone.
 Each error, when found, is a surprise whose correction is both
expensive and disruptive.
 Product Technology:
 New developments may not clearly show how to implement an
algorithm, meet performance goals, or fit functions into limited
configurations.
 Planning processes in advance is wise to avoid time-consuming
and disruptive fixes for unsuccessful attempts.

 Proactive planning allows subsequent development to be more
productive.
 Requirements Instability: to design, build, and test
a program, the required functions, interfaces, and
environments must be stable. While these may change
during development, the changes must be temporarily
frozen while development proceeds.
 There are three basic types of requirement changes:
 Unknown requirements
 Unstable requirements
 Misunderstood requirements
 Complexity: Application programs are often easier to
develop than complete systems because their
environment is generally more stable.


THE SOFTWARE ENGINEERING PROCESS GROUP

SEPG Sustaining Role: The role of the
SEPG can be divided into six categories:
 Establish process standards
 Maintain the process database
 Serve as the focal point for technology
insertion
 Provide key process education
 Provide project consultation
 Make periodic assessments and status reports.
THE MANAGED PROCESS

The Principles of Data Gathering:
 The data is gathered in accordance with specific
objectives and a plan.
 The selection of data to be collected is grounded
in a model or hypothesis regarding the process
under investigation.
 The data gathering process should take into
account how it affects the entire organization.
 The data gathering plan must have management
support.
THE OBJECTIVES OF DATA GATHERING:

There are four basic reasons for collecting software data:
 Understanding
 As part of a research or development study,
data can be gathered to learn about some item
or process
 Evaluation
 The data can be used to study some product to
see if it meets acceptance criteria
 Control
 The data can be used to control some activity
 Prediction
 The data can be used to develop rate or trend
indicators.
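The prediction objective can be sketched with a simple rate indicator: turning cumulative monthly defect counts into per-month discovery rates and reading off the trend. The numbers are invented for illustration:

```python
def monthly_rates(cumulative_defects):
    """Convert cumulative monthly defect counts into per-month
    discovery rates (a simple rate indicator)."""
    rates = []
    previous = 0
    for total in cumulative_defects:
        rates.append(total - previous)
        previous = total
    return rates

def trend(rates):
    """Crude trend indicator: compare the latest rate with the first."""
    if rates[-1] < rates[0]:
        return "falling"
    if rates[-1] > rates[0]:
        return "rising"
    return "flat"
```

A falling defect-discovery rate late in testing is one common (if rough) signal that a product is stabilizing; a real process would use more careful models.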
THE DATA GATHERING PROCESS

1. Managing the Data Gathering Process
 Data gathering is expensive, so the process
must be carefully planned and managed.
 The data must be precisely defined to ensure that
the right information is obtained, and it must be
validated to ensure that it accurately represents the
process.
2.Data Gathering Plan
The data gathering plan should be produced by the SEPG, and it
should cover the following topics:
a) What data is needed, by whom, and for what purpose?
 The plan must clearly and precisely state how the data is to
be used.
b) What is the data specification?
 This includes the definition of the initial data to be gathered
and the plan for defining any additional data that will likely be
needed.
c) Who will gather the data?
 The people who will gather the data are identified.
d) Who will support data gathering?
 A training program is provided, and a support group
is available to answer questions.
e) How will the data be gathered?
 Appropriate forms and data entry facilities are made
available, and suitable procedures are established.
f) How will the data be validated?
 Since data is error prone, it should be validated as
rapidly as practical.
g) How will the data be managed?
 One can gather enormous amounts of data on
software projects.
3. Data Validation
 The way data is validated depends on the data
involved. With time measures, the total elapsed
time can be compared to the total of the elements.
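The time-measure check described above can be sketched directly: the reported total is compared against the sum of its component times, and a mismatch beyond a tolerance flags the record for review. Field names and the tolerance are assumptions:

```python
def validate_time_record(element_minutes, reported_total, tolerance=1):
    """Return True when the component times add up (within tolerance)
    to the reported total elapsed time; False flags the record."""
    return abs(sum(element_minutes) - reported_total) <= tolerance
```

Records failing this check would go back to the person who gathered the data, ideally as soon as possible after collection, since errors become harder to resolve with time.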
SOFTWARE MEASURES:

Data characteristics:
 The measures should be robust
 The measures should suggest a norm
 The measures should relate to specific product or
process properties
 The measures should suggest improvement
strategy
 They should be a natural result of the process
 The measure should be simple
 They should be both predictable and trackable
S/W MEASUREMENTS CAN BE CLASSIFIED IN SEVERAL WAYS:

 Objective/subjective:
 This distinguishes between measures that count
things and those involving human judgment
 Absolute/relative:
 Absolute measures are typically invariant to the
addition of new items.
 The size of one program is an absolute measure,
independent of the sizes of other programs.
 Relative measures change, as with an average or a
grading curve.
 Explicit/derived:
 Explicit measures are taken directly,
 derived measures are computed from other explicit or
derived measures.
 Dynamic/static:
 Dynamic measures have a time dimension, as with errors
found per month.
 Static measures remain invariant, as with total effort
expended or total defects found during development.
 Predictive/explanatory:
 Predictive measures can be obtained or generated in
advance;
 explanatory measures are produced after the fact.
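The explicit/derived distinction can be sketched in a few lines: program size and defect count are explicit measures (counted directly), while defect density is derived from them. The measure chosen here is a common example, not one the text mandates:

```python
def defect_density(defects_found, size_kloc):
    """Derived measure: defects per thousand lines of code (KLOC),
    computed from two explicit measures."""
    if size_kloc <= 0:
        raise ValueError("program size must be positive")
    return defects_found / size_kloc
```

Because it is relative to size, defect density lets programs of different sizes be compared, which the raw (absolute) defect count does not.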
CLASSES OF DEFECT MEASURES:

 Severity:
 Measures the actual or anticipated impact of a defect on the
user's operational environment

 Symptoms:
 refers to observed system behavior when the defect is found

 Where found:
 by the system location where they were identified

 When found:
 when they are found in the s/w life cycle, as in unit test,
function test, system test etc.

 How found:
 operations being performed when the defect was found.
 Where caused
 When caused:
 identifies the phase in which the defect was introduced, e.g., during high-level design
 How caused:
 counts of the errors that caused the defects can be
most useful in improving the s/w process.
 Where fixed:
 record where changes are made when installing fixes.
 When fixed:
 helpful in evaluating and improving the maintenance
and enhancement process.
 How fixed:
 how change is designed and applied
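Once defects carry classifications like those above, simple tallies show where defects cluster. A minimal sketch using `collections.Counter`; the sample records are invented for illustration:

```python
from collections import Counter

# Illustrative defect records, each tagged with two of the
# classification dimensions described above.
defect_log = [
    {"severity": "major", "when_found": "unit test"},
    {"severity": "minor", "when_found": "unit test"},
    {"severity": "major", "when_found": "system test"},
]

def tally(records, field):
    """Count defects by one classification field, e.g. 'when_found'."""
    return Counter(r[field] for r in records)
```

Tallies by "when caused" and "how caused" are the ones the text singles out as most useful for process improvement, since they point at the step that injected the error rather than the step that caught it.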
MANAGING SOFTWARE QUALITY
Quality Measures fall into following general classes:
 Development : Defects found, change activity
 Product: Defects found, s/w structure, information
structure, controlled tests
 Acceptance: Problems, effort to install, effort to use
 Usage: problems, availability, effort to install, effort to
use, user opinions
 Repair: Defects, resources expended
MAKING A SOFTWARE QUALITY ESTIMATE

 Benchmarking:
 Recent programs completed by the organization are
reviewed to identify the ones most similar to the
proposed product.
 Assessment:
 Available quality data on these programs is examined to
establish a basis for the quality estimate.
 Evaluation:
 The significant product and process differences are then
determined and their potential effects estimated.
 Forecast:
 Based on these historical data and the planned process changes, a
projection is made of the anticipated quality for the new product
development process.
 Alignment:
 This projection is then compared with goals, and needed process
improvements are devised to meet the goals.
 Refinement:
 The projected quality profile is then examined to determine the areas
for potential improvement, and a desired quality profile is produced.
 Roadmap:
 A development plan is produced that specifies the process to be
used to achieve this quality profile.
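The benchmark-then-forecast steps above can be sketched as a toy calculation: the defect density of the most similar past program is scaled by an assumed process-improvement factor to project the new product's quality. Both the measure and the factor are illustrative assumptions:

```python
def forecast_defects(benchmark_density, new_size_kloc,
                     improvement_factor=1.0):
    """Forecast total defects for a new program of new_size_kloc KLOC.

    benchmark_density: defects/KLOC from the most similar past program.
    improvement_factor: < 1.0 models planned process improvements; the
    factor itself would come from the evaluation and alignment steps.
    """
    return benchmark_density * improvement_factor * new_size_kloc
```

For example, a 20 KLOC product benchmarked at 5 defects/KLOC, with improvements expected to remove 20% of defects, would be forecast at 80 defects; the alignment step then compares this figure with the quality goal.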


TRACKING AND CONTROLLING S/W QUALITY:
 A responsible authority is named to own the quality
data and the tracking and reporting system.
 Quality performance is tracked and reported to this
authority during both development and maintenance.
 Resources are established for validating the reported
data and retaining it in the process database.
 Actual product and organizational performance data
is periodically reviewed against plan.
 Results are initially reviewed with responsible
line management, and any discrepancies are resolved.
 Performance against targets is determined and,
if worse than plan, line management prepares an
action plan for review with higher management.
 Quality performance is published, and a highlight
report is provided to senior management.
 Overall performance is periodically reviewed with
senior management.
THE OPTIMIZING PROCESS
The principles of s/w defect prevention:
 The fundamental objective of s/w defect prevention
is to make sure that errors, once identified
and addressed, do not occur again.

The principles are:


 The programmers must evaluate their own errors.
 Feedback is an essential part of defect prevention
 There is no single cure-all that will solve all the
problems
 Process improvement must be an integral part of the
process
 Process improvement takes time to learn
AUTOMATING THE S/W PROCESS
 The reason for automating the s/w process is to
improve the quality and productivity of our work.
An organization needs an automation strategy to
guide its use of new methods and techniques.
THE LONG TERM AUTOMATION PLAN:
 Establish an automation focal point:
 this involves naming an individual to lead the study and planning
efforts.
 Next, define the current automation status, establish
immediate priority needs, and gather information on
pertinent available tools and methods.
 An orderly assessment is next made of the most promising
available environments and tools.
 Establish a strategic plan for environmental support.
 Define a migration plan to take the organization from its
current support systems to the strategic environment.
 This is needed to provide a clear framework for selecting suitable
tools.
IMPLEMENTING AN AUTOMATION PLAN
 An implementation support group is established to
handle the ordering, vendor negotiations, installation,
conversion, operation and support for the environment.
 A more detailed automation plan is developed.
 Next specific project automation objectives and plans
are developed, consistent with this overall plan.
 Each project then develops a specific migration plan
with agreed responsibilities and milestones for each
key transition step.
 The new environment will require the programmers to
change the ways they work.
CONTRACTING FOR SOFTWARE
Software contracting conditions:
 Start with a very precise statement of what
was wanted
 Insist on rigid standards and detailed
documentation of every process step
 Require that each step be completed and
approved before the next was initiated
 Demand a firm commitment at the outset
THE PRINCIPLES OF EFFECTIVE
SOFTWARE CONTRACT MANAGEMENT:

 The vendor must be trustworthy and technically
competent
 The buyer must be capable of identifying technically
competent vendors
 The contract presumes mutual trust
 SQA and audit ensure honest and disciplined
performance
 Initially, the highest priority task is to arrive at
agreement on what is to be produced, a plan to
produce it, and acceptance criteria


PROCESS REFERENCE
MODELS
 Capability Maturity Model(CMM)
 Capability Maturity Model
Integration(CMMI)
 People- Capability Maturity Model(PCMM)
 Personal software Process(PSP)
 Team Software Process(TSP)
CMM
 CMM can be used to assess an organization against a scale of five
process maturity levels based on certain Key Process Areas (KPA).
 It describes the maturity of the company based upon the project
the company is dealing with and the clients.
 A Maturity model provides:
 A place to start
 The benefit of a community’s prior experiences
 A common language and a shared vision
 A framework for prioritizing actions
 A way to define what improvement means for your
organization
CMM FIVE MATURITY LEVELS
 Initial
 Managed
 Defined
 Quantitatively Managed
 Optimizing
 Maturity Level 1 – Initial:
 Company has no standard process for software development.
Nor does it have a project-tracking system that enables
developers to predict costs or finish dates with any accuracy.

 Maturity Level 2 – Managed:


 Company has installed basic software management processes
and controls. But there is no consistency or coordination among
different groups.

 Maturity Level 3 – Defined:


 Company has pulled together a standard set of processes and
controls for the entire organization so that developers can move
between projects more easily and customers can begin to get
consistency from different groups.
 Maturity Level 4 – Quantitatively Managed:
 In addition to implementing standard processes,
company has installed systems to measure the
quality of those processes across all projects.

 Maturity Level 5 – Optimizing:


 Company has accomplished all of the above and can
now begin to see patterns in performance over time,
so it can tweak its processes in order to improve
productivity and reduce defects in software
development across the entire organization.
CMMI
 The Capability Maturity Model Integration (CMMI)
helps organizations streamline process improvement,
encouraging a productive, efficient culture that
decreases risks in software, product and service
development.
 It starts with an appraisal process that evaluates
three specific areas:
 process and service development
 service establishment and management
 product and service acquisition.
STANDARD CMMI APPRAISAL METHOD FOR
PROCESS IMPROVEMENT (SCAMPI):

SCAMPI is the official appraisal method used by the CMMI Institute.
There are three appraisal classes: Class A, B, and C.
 SCAMPI A: The most rigorous appraisal method, SCAMPI A is
most useful after multiple processes have been implemented.
 It provides a benchmark for businesses and is the only class that
results in an official rating.
 It must be performed by a certified lead appraiser, who is part of
the on-site appraisal team.
 SCAMPI B: This appraisal is less formal than SCAMPI A;
 it helps find a target CMMI maturity level, predict success for
evaluated practices, and give the business a better idea of where
it stands in the maturity process.


 SCAMPI C:
 This appraisal method is shorter, more flexible
and less expensive than Class A or B.
 It’s designed to quickly assess a business’s
established practices and how those will
integrate or align with CMMI practices.
 It can be used at a high-level or micro-level, to
address organizational issues or smaller
process or departmental issues.
 It involves more risk than Class A or B, but it’s
more cost-effective.
CMMI’S FIVE MATURITY
LEVELS ARE:
 Initial: Processes are viewed as unpredictable and reactive.
At this stage, “work gets completed but it’s often
delayed and over budget.”
 Managed: There’s a level of project management achieved.
Projects are “planned, performed, measured and
controlled” at this level, but there are still a lot of issues to
address.
 Defined: At this stage, organizations are more proactive
than reactive. There’s a set of “organization-wide
standards” to “provide guidance across projects,
programs and portfolios.”
 Quantitatively managed: This stage is more
measured and controlled.
 The organization is working off quantitative data to
determine predictable processes that align with stakeholder
needs.
 The business is ahead of risks, with more data-driven
insight into process deficiencies.
 Optimizing:
 Here, an organization’s processes are stable and flexible.
 At this final stage, an organization will be in a constant state
of improving and responding to changes or other
opportunities.
PCMM (PEOPLE CAPABILITY MATURITY MODEL):

 PCMM is a maturity structure that focuses on continuously
improving the management and development of the human
assets of an organization.
 The People Capability Maturity Model (PCMM) is a
framework that helps the organization successfully address
their critical people issues.
 The PCMM guides organizations in improving their steps
for managing and developing their workforces.
 The People CMM defines an evolutionary improvement
path from ad hoc, inconsistently performed workforce
practices to a mature infrastructure of practices for
continuously elevating workforce capability.
 The PCMM consists of five maturity levels that lay
successive foundations for continuously improving talent,
developing effective methods, and successfully directing
the people assets of the organization.
THE FIVE STEPS OF THE PEOPLE CMM
FRAMEWORK ARE:
 Initial Level: The Initial Level of maturity
includes no process areas.
 Managed Level:
 In this level, the managers start to perform necessary
people management practices such as staffing,
operating performance, and adjusting compensation
as a repeatable management discipline
 The process areas at Maturity Level 2 are Staffing,
Communication and Coordination, Work Environment,
Performance Management, Training and
Development, and Compensation.
 Defined Level:
 This level helps an organization gain a
competitive benefit from developing the different
competencies that must be combined in its
workforce to accomplish its business activities.

 Predictable Level:
 In this level, the organization handles and exploits
the capability developed by its framework of
workforce competencies.

 The organization is now able to handle its capacity
and performance quantitatively.
 Optimizing Level:
 In this level, the integrated organization is
focused on continual improvement.
 These improvements are made to the efficiency
of individuals and workgroups, to the act of
competency-based processes, and workforce
practices and activities.
PERSONAL SOFTWARE PROCESS (PSP)
PSP is the skeleton or structure that assists engineers
in measuring and improving the way they work to a
great extent.
It helps them develop their skills at a personal level
and improve the way they plan and estimate against
those plans.
OBJECTIVES OF PSP:
The aim of PSP is to provide software engineers with
disciplined methods for the betterment of their
personal software development processes.

The PSP helps software engineers to:
 Improve their estimating and planning skills.
 Make commitments that can be fulfilled.
 Manage the quality of their projects.
 Reduce the number of faults and imperfections
in their work.
LEVELS OF PERSONAL SOFTWARE PROCESS
 PSP 0
The first level of the Personal Software Process, PSP 0 includes
personal measurement, basic size measures, and coding
standards.
 PSP 1
This level includes the planning of time and scheduling.
 PSP 2
This level introduces personal quality management,
design and code reviews.
 PSP 3
The last level of the Personal Software Process is for
personal process evolution.
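PSP 0's "personal measurement" can be sketched as a minimal time log recording minutes per development phase, the kind of data later PSP levels build plans and estimates on. Phase names and values are illustrative:

```python
# A minimal personal time log, as a PSP 0-style measurement sketch.
time_log = []

def log_time(phase, minutes):
    """Record minutes spent in one development phase."""
    time_log.append({"phase": phase, "minutes": minutes})

def minutes_in(phase):
    """Total minutes logged against one phase."""
    return sum(e["minutes"] for e in time_log if e["phase"] == phase)

# Illustrative entries for one small task.
log_time("design", 40)
log_time("code", 90)
log_time("code", 30)
```

Even this crude log supports the PSP 1 goal of planning: comparing logged phase times against estimates is how an engineer's estimating skill improves.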
TSP
 TSP (Team Software Process) is a guideline
for software product development teams.
 TSP focuses on helping development teams to
improve their quality and productivity to better
meet goals of cost and progress.
 TSP is designed for groups ranging from 2
persons to 20 persons.
 TSP can be applied to large multiple-group
processes for up to 150 persons.
OBJECTIVES OF TSP

 To build self-directed teams that plan and track their
work, establish goals, and own their process and plans.
 These can be pure software teams or integrated
teams.
 Shows managers how to coach and motivate team
members so that they can give peak (high)
performance.
 It accelerates software process development.
ACTIVITIES OF TSP

1) Project Launch:
 It reviews the project objectives and describes the TSP structure
and content.
 It assigns needs and roles to the team members and describes
the customer's need statement.
2) High-Level Design:
 It creates the high-level design, specifies the design, inspects the
design, and develops the integration plan.
3) Implementation:
 This uses the PSP to implement the modules and the functions.
4) Integration and Testing:
 Testing builds and integrates the system.
5) Postmortem:
 Writes the cycle report and produces peer and team reviews.
OTHER REFERENCE MODELS ARE:
 Business Process Maturity Model
(BPMM)
 Supply Chain Operations Reference
(SCOR) Model
 ITIL (Information Technology
Infrastructure Library)
 COBIT (Control Objectives for
Information and Related
Technologies)
