Murthy
Key Facts About Tata Consultancy Services (TCS)
“TCS has the size and reach unlike any other Indian software company.”
The Business Case for Process Improvement @ TCS – WHY
• Productivity Improvement
• Cost Competitive Advantage
• Time
• Global Delivery Capability
• Process Agility
• Reusability (6σ)
• High-Quality Delivery
• Quality, Reliability, Maintainability
The Critical Decision: Organization Process Framework
*Source: A Taxonomy to Compare Software Process Improvement Frameworks, 12th International Conference on Software Quality
The best of people and technology cannot guarantee the best products and services unless the processes are effective.
Why CMMI works for TCS
Capability Maturity Model, CMM®, and CMMI® are registered in the US Patent and Trademark Office by Carnegie Mellon University. PCMM, SCAMPI, and SEI are service marks of Carnegie Mellon University.
CMMI Maturity Levels
TCS was assessed enterprise-wide to be operating at CMMI Level 5 in September 2004.
Level 5 (Optimizing): focus on process improvement
Level 4 (Quantitatively Managed): process measured and controlled quantitatively
Level 3 (Defined): process characterized for the organization and is proactive
Level 2 (Managed): process characterized for projects and is often reactive
Level 1 (Initial): process unpredictable, poorly controlled, and reactive
Six Sigma Sickle: The Rigor for Success
Level 5 (Optimizing): Causal Analysis Tools, TRIZ, Regression Analysis
Level 4 (Quantitatively Managed): Process Capability Analysis, Statistical Process Control
Level 3 (Defined): Affinity Diagram, QFD, CTQ Drill-Down Tree, Benchmarking, FMEA, CBM, Hypothesis Testing, Cost-Benefit Analysis
Level 2 (Managed): Project Charter, Process Mapping
Level 1 (Initial)
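Process Capability Analysis, one of the Level-4 tools named above, reduces to the standard Cp/Cpk indices. A minimal sketch; the spec limits and cycle-time samples below are illustrative, not figures from the deck:

```python
# Process Capability Analysis sketch: Cp (potential) and Cpk (actual,
# penalizing an off-center mean) against lower/upper specification limits.
from statistics import mean, stdev

def capability(samples, lsl, usl):
    """Return (Cp, Cpk) for a sample against spec limits."""
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)                # spread of spec vs. process
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # distance of mean to nearest limit
    return cp, cpk

# Illustrative defect-fix cycle times (days) against a 0-10 day spec window.
times = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.0]
cp, cpk = capability(times, lsl=0.0, usl=10.0)
```

By construction Cpk never exceeds Cp; the gap between them measures how far the process mean has drifted from the center of the specification window.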
Strategic Challenges @ TCS
1. Accelerating customer acquisition
2. Building a culture of ownership and empowerment
3. Seamlessly integrating organizational processes
4. Accelerating revenue growth and sustaining profitability in the long term
Focus areas:
• Bring in a customer-driven excellence mindset
• Customer Satisfaction Level
• Customer Referral
• Quality in deliverables: do it right the first time, every time
• Cost of Quality
CMMI know “WHAT”
CMMI process areas by maturity level (examples):
Level 4 (Quantitatively Managed): Quantitative Project Management
Level 3 (Defined): Organizational Process Definition, Decision Analysis and Resolution, Requirements Development
Level 2 (Managed): Risk Management
Level 1 (Initial)
CMMI know “WHAT” → Six Sigma know “HOW”
CMMI tells you what to do, but do you know how to do it? Six Sigma tells you how.
How is it relevant? Example Pareto analysis of review mistakes:
Type of Mistake | Punctuation | Grammar | Spelling | Typing
Count           | 22          | 15      | 10       | 3
Percent         | 44.0        | 30.0    | 20.0     | 6.0
Cum %           | 44.0        | 74.0    | 94.0     | 100.0
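The percent and cumulative-percent rows of the Pareto table above can be recomputed directly from the raw counts; a minimal sketch, using the counts from the table:

```python
# Recompute the Pareto table from raw mistake counts.
counts = {"Punctuation": 22, "Grammar": 15, "Spelling": 10, "Typing": 3}

total = sum(counts.values())
cum = 0.0
rows = []
for kind, n in sorted(counts.items(), key=lambda kv: -kv[1]):  # descending, Pareto order
    pct = 100.0 * n / total
    cum += pct
    rows.append((kind, n, round(pct, 1), round(cum, 1)))

# rows -> [('Punctuation', 22, 44.0, 44.0), ('Grammar', 15, 30.0, 74.0),
#          ('Spelling', 10, 20.0, 94.0), ('Typing', 3, 6.0, 100.0)]
```

The cumulative column is what makes the chart actionable: here the top two mistake types already account for 74% of all defects.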
CMMI know “WHAT” → Six Sigma know “HOW”: Example
[Table: CMMI Process Areas (Example) and their Specific Goals mapped to Applicable Six Sigma Tools (Example)]
Integrating Six Sigma with iQMS: Project Start-Up
Phases: Project Start-Up → Requirement Analysis → System Design → Detailed Design → Build & Test → Verification & Validation
Flow: Pre-Start-Up → Project Start-Up (Project Planning and Management) → Exit Criteria → Go
Deliverables:
• Task Order
• Project Plan
Exit Criteria:
✓ Project Plan released
✓ Work Order authorized
✓ Signed Task Order
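Each iQMS phase above exits through a Go gate only when all of its exit criteria are satisfied. A minimal sketch of that phase-gate check; the criteria names come from the slide, while the code structure itself is illustrative:

```python
# Illustrative iQMS phase-gate check: a phase gets "Go" only when
# every one of its exit criteria is satisfied.
exit_criteria = {
    "Project Start-Up": {
        "Project Plan released": True,
        "Work Order authorized": True,
        "Signed Task Order": False,   # still pending in this example
    },
}

def gate(phase):
    """Return 'Go' if all exit criteria for the phase are met, else 'Hold'."""
    return "Go" if all(exit_criteria[phase].values()) else "Hold"
```

With one criterion pending, the gate holds; once the Task Order is signed, the same check returns "Go" and the project may enter Requirement Analysis.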
Integrating Six Sigma with iQMS: Requirement Analysis
Flow: Business Needs → Requirements Gathering → Requirements Analysis → Exit Criteria → Go
Deliverables:
• System Requirement Specification (SRS)
• Updated Project Plan
Exit Criteria:
✓ SRS released
✓ Updated Project Plan released
Integrating Six Sigma with iQMS: System Design
Flow: High-Level Design Concepts → Design Specifications → Design Capability → Exit Criteria → Go
Deliverables:
• High-Level Design (HLD)
• Usability Plan
• Prototype
Exit Criteria:
✓ HLD released
✓ Usability Plan released
✓ Reviewed prototype
Integrating Six Sigma with iQMS: Detailed Design, Build & Test
Flow: Develop Detailed Design → Prepare and Review Test Plans (STP, UTP) → Construction and Testing → Exit Criteria → Go
Deliverables:
• Low-Level Design (LLD)
• Unit Test Plan (UTP), Unit Test Specifications (UTS)
• System Test Plan (STP), System Test Specifications (STS)
• Code
Exit Criteria:
✓ LLD released
✓ UTP, UTS released
✓ STP, STS available
✓ Reviewed and unit-tested code
Integrating Six Sigma with iQMS: Verification & Validation
Deliverables:
• Acceptance letter from Client
• Client Feedback
• Project Wind-up Note
Six Sigma Sickle: Harvesting for Success
Level 3 (Defined), Risk Management: Business Continuity and Disaster Recovery
Level 2 (Managed), Requirements Management: Requirement Management process
Level 1 (Initial)
Requirements Process: Requirements Management
Define
Business Case / Project Needs:
The Data-Dynamics group at offshore receives requirements and modification details about Informatica mappings/SPs through email and telecons. Whenever a new change comes in for an existing mapping or SP, the developer first gathers all the design documents and modification history. This requirement-gathering process is extremely time-consuming and prone to errors. With the increasing number of mappings/SPs developed at offshore, keeping track of all requirements, design documents, and modification histories is getting more difficult by the day.
Problem Statement:
In the last six months there have been continuous changes in all the deliverables (Informatica Mappings, SPs, Functions) provided by the Data-Dynamics team of the CPQ Selectica project at offshore. On average, a developer takes 30 minutes to 2 hours to gather all requirements related to any deliverable. This leads to delayed output from the offshore team, resulting in customer dissatisfaction.
Measurable Project Y:
The time taken to retrieve information about a deliverable.
Goal Statement:
To reduce the time taken to retrieve all information for any deliverable at offshore to less than 10 minutes by FW44.
MGPP:
Phase 1: 1. Specify CTQs 2. Develop Process Flow
Phase 2: 1. Design New Process 2. Coding & Testing 3. Implementation
CTQ Measures:
• Ensuring 100% execution of the improved Requirement Management process.
Analyze
High-level process map: Develop Requirements Management Process → Perform FMEA → Develop Scripts & Oracle DB on Server → Implement the Process → Monitor Each Requirement
Design
1. Detailed design for improvement of the Requirement Management process.
2. Risk analysis with FMEA.
Alternative Architectures:
Two different architectures were available:
1. Client-Server Architecture
2. Distributed Architecture
Highlights:
• The Data-Dynamics group at offsite was receiving modification details about Informatica mappings/SPs via emails and telecons; keeping track of requirements, design documents, and modification histories was a huge challenge.
• Six Sigma DMADV rigor was used to identify stakeholder needs.
• A Requirements Management system was developed as part of the improvement; actual results were drawn from data collected after its implementation.
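The risk analysis with FMEA mentioned above ranks failure modes by a Risk Priority Number (RPN = Severity × Occurrence × Detection, each rated 1-10). A minimal sketch; the failure modes and ratings below are illustrative, not taken from the project:

```python
# FMEA risk-prioritization sketch: RPN = Severity x Occurrence x Detection.
# Failure modes and ratings are illustrative examples only.
failure_modes = [
    # (failure mode, severity, occurrence, detection)
    ("Requirement change lost in email thread", 8, 6, 7),
    ("Stale design document retrieved",         7, 5, 5),
    ("Modification history incomplete",         6, 4, 6),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode."""
    return severity * occurrence * detection

# Highest RPN first: the team addresses the riskiest mode before the others.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
# The top-ranked mode here is "Requirement change lost in email thread",
# with RPN 8 * 6 * 7 = 336.
```

After an improvement is implemented, the same three ratings are re-scored and the RPN recomputed to verify that the risk actually dropped.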
Business Continuity and Disaster Recovery: Risk Management
Business situation:
• After September 11, existing Business Continuity Plans (BCP) and Disaster Recovery Plans (DRP) were inadequate to cater to increasing customer concerns on security and continuity.
• There was a need to re-engineer current processes for continuing support services during and after a disaster situation.
Approach and results:
• Six Sigma DMAIC rigor was used to enhance the Business Continuity Plan.
• Mission-critical applications (MCAs) were identified, and backup support personnel at alternate locations for MCAs were created.
• Processes for role transition of backup personnel were documented.
• Mock drills were conducted to assess readiness for disasters.
[Chart: support level vs. time, marking the disaster, time for BCP, time for DRP, and the point of complete recovery]
Reduction of Defects: Causal Analysis and Resolution
PA (Causal Analysis and Resolution): to identify causes of defects and other problems, and take action to prevent them from occurring in the future.
Define: Business Case
A high-level study of operational metrics in Defect Tracker for a business website indicated that roughly 50% of the cases reported were either defects in delivery or production problems. Further, the cycle time for fixing defects, when reported, was unpredictable.
Goal Statement:
• To reduce the number of defects and production problems reported by 50%.
• To fix 100% of defects reported within an acceptable and agreed-upon time frame, with a tolerance of 5%.
Measure: CTQs and Baseline
CTQ | Segment | Mean | Standard Deviation | Process Sigma
Number of Defects (%) | All | 45.12 | 11.65 | 0
Cycle Time – Response Time (Days) | Severity 1 | 4.84 | 4.49 | 0.699
Cycle Time – Response Time (Days) | Severity 2 | 6.22 | 8.29 | 0.899
Analyze & Improve: Vital X's and Solutions
CTQ | Vital X (In Control) | Improvement Plan
Reduction of Defects | 1. Inconsistency in Testing Process | 1. Dedicated Quality Assurance Team 2. New Testing Process
Cycle Time – Response | 1. Time to Assign Change Request 2. Time for Clarification 3. Time for Estimation | 1. Separate Team for Change Requests 2. Remove Estimation and Approval Cycle
Cycle Time – Resolution | 1. Response Time 2. Time to Schedule Defect Development 3. Time for Deployment | 1. Separate Team for Change Requests 2. Quality Assurance Build from Offshore
Control: Process Improvements
CTQ | Segment | Mean | Standard Deviation | Process Sigma
Number of Defects (%) | All | 13.67 | 9.29 | 2.2
Cycle Time – Response | Severity 1 | 0.5 | 0.707 | 2.2
Outcome:
• Causal analysis techniques were used to identify vital causes, and solutions to address them were identified using Six Sigma rigor.
• A dedicated Quality Assurance group and a separate group for change requests were set up within the project team.
• The customer signed off savings of USD 125,000 achieved through the improvements.
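A Process Sigma figure like the ones in these tables can be derived from a defect percentage via the normal quantile. A minimal sketch of one common convention (long-term data with the conventional 1.5-sigma shift); the deck's own figures may follow a different convention, so the numbers here are illustrative rather than a reproduction of the table:

```python
# Sigma level from a defect rate, under the conventional 1.5-sigma shift.
from statistics import NormalDist

def process_sigma(defect_fraction, shift=1.5):
    """Sigma level corresponding to a long-term defect fraction."""
    yield_fraction = 1.0 - defect_fraction
    return NormalDist().inv_cdf(yield_fraction) + shift

baseline = process_sigma(0.4512)   # 45.12% defects (baseline defect rate above)
improved = process_sigma(0.1367)   # 13.67% defects (after improvement)
# The sigma level rises as the defect rate falls.
```

Whatever convention is used, the direction is what matters for the Control phase: cutting the defect percentage from 45.12% to 13.67% necessarily raises the computed sigma level.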
Problem Statement (Baseline):
In the past, due to a shortage of Siebel-skilled resources, capacity was added at GDC by the Customer on a “Time and Material” basis alongside the prior Customer Siebel team.
Design:
• Using Six Sigma DMADV rigor, a Siebel Center of Excellence (CoE) was developed, using a “Fixed Price” pricing model and achieving high offshore leverage.
• Phase 1: setting up the CoE at onsite, with cross-training within the team and GECIS.
Measures of Success:
[Table: planned vs. actual percentages of Business Requirements, Functional, Design, Development, and Documentation work, and resource counts, split across GE, CoE-Onsite (CoE-ON), and CoE-Offshore (CoE-OFF)]
Verify:
• Achieved First Time Right and On-Time Delivery by the Center of Excellence.
[Flowchart: defect logging and resolution workflow in PVCS for the Technology/Siebel Support team, from SS logging a defect through QA migration and integration/assurance testing. Key: BL = Business Manager, SS = Siebel Support.]
For questions contact
Nidhi Srivastava,
Email: [email protected]
Website: www.tcs.com