Function Point Estimation
www.davidconsultinggroup.com
Topics
- The User Perspective
- Program Start-up
- Characteristics of an Effective Sizing Metric
- Use of Function Points
- Project Estimation
- Quantitative & Qualitative Assessments
- Establishing and Using Baseline Data
- Modeling Improved Performance
The User Perspective
- Functional description and accountability (user)
- Management of delivery expectations
- Credibility in project estimation
Program Start-up
Planning
- How will the information be used (objectives)?
- Who is counting?
- What is being counted?
- History versus industry?
Culture
- Internal versus external
- Pilot/rollout versus organization-wide
External
- Economies of scale
- Awareness & orientation
- Internal resources required
When to Size
[Diagram: sizing occurs at three points in the lifecycle: (1) DEFINE, (2) DESIGN/BUILD/TEST, (3) IMPLEMENT]
1) Initial sizing during or after the Requirements Phase
2) Subsequent sizing after System Design or when change occurs
3) Final sizing after install
Characteristics of an Effective Sizing Metric
- Consistent (methodology)
- Easy to learn and apply
- Accurate, statistically based
Use of Function Points
- A vehicle to estimate cost and resources required for software development, enhancements and/or maintenance
- A tool to quantify performance levels and to monitor progress made from software process improvement initiatives
- A tool to determine the benefit of an application to an organization by counting functions that specifically match requirements
- A tool to size or evaluate purchased application packages
A function point count is performed to produce a functional size measure.
The size can be used to generate project estimates.
Estimates should be based upon delivery rates.
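The size-to-estimate step can be sketched in a few lines. This is a minimal illustration of the delivery-rate approach, not DCG's model; the project size, delivery rate, and cost per staff month are all hypothetical numbers.

```python
# Minimal sketch: functional size plus a historical delivery rate
# yields an effort estimate, and effort yields a cost estimate.

def estimate_effort(size_fp: float, delivery_rate: float) -> float:
    """Effort in staff months = functional size / delivery rate (FP per staff month)."""
    return size_fp / delivery_rate

def estimate_cost(effort_sm: float, cost_per_sm: float) -> float:
    """Cost = effort * fully loaded cost per staff month."""
    return effort_sm * cost_per_sm

# Hypothetical project: 500 FP at 10 FP per staff month, $12,000 per staff month.
effort = estimate_effort(500, 10)     # 50.0 staff months
cost = estimate_cost(effort, 12_000)  # 600000.0
```

The delivery rate is the lever: the same 500 FP project at 20 FP per staff month would need half the effort, which is why rates should come from an organization's own history rather than industry averages alone.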
- Review the available documentation
- Meet with SMEs (subject matter experts) to gain a thorough understanding of the functionality
- Apply the function point methodology and compute a functional size
[Diagram: function point component types (Inputs, Outputs, Inquiries, Interfaces) counted at the Application boundary]
[Diagram: example application: a VENDOR SUPPLY interface, a PARTS LISTING output, and USER ORDER PARTS / USER CHANGE BILL inputs]
Data Relationships
Total Unadjusted FPs: 21
Value Adjustment Factor: 1.01
Total Adjusted FPs: 21
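The adjustment step above is a simple multiply-and-round. As a point of reference, under IFPUG rules the Value Adjustment Factor itself is derived as 0.65 + 0.01 times the total degrees of influence across 14 general system characteristics; the sketch below just applies a given VAF to an unadjusted count.

```python
# Value adjustment step: adjusted FP = round(unadjusted FP * VAF).

def adjusted_fp(unadjusted: int, vaf: float) -> int:
    return round(unadjusted * vaf)

print(adjusted_fp(21, 1.01))  # 21  (21 * 1.01 = 21.21, rounded)
```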
Analysis:
Component    Count 1   Count 2   Count 3
Inputs          75        80        95
Outputs         10        25        40
Inquiries       15        17        17
Interfaces       7         7         7
Files           13        15        15
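Component counts like these convert to an unadjusted function point total by weighting each component type. The sketch below assumes IFPUG average-complexity weights (a real count rates each component low, average, or high); the weights and the resulting total are illustrative, not taken from the slides.

```python
# Unadjusted FP from component counts, assuming IFPUG average weights.
AVERAGE_WEIGHTS = {
    "Inputs": 4,
    "Outputs": 5,
    "Inquiries": 4,
    "Interfaces": 7,   # external interface files
    "Files": 10,       # internal logical files
}

def unadjusted_fp(counts: dict) -> int:
    return sum(AVERAGE_WEIGHTS[name] * n for name, n in counts.items())

# First column of counts from the analysis above.
count_1 = {"Inputs": 75, "Outputs": 10, "Inquiries": 15,
           "Interfaces": 7, "Files": 13}
print(unadjusted_fp(count_1))  # 589
```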
Copyright 2005. The David Consulting Group, Inc.
Project Estimation
[Diagram: DEFINITION (REQUIREMENT, PROJECT SIZE, PROJECT COMPLEXITY, RISK FACTORS) combines with CAPABILITY to produce an ESTIMATE of schedule, effort, and costs]
Capability Analysis
Collect project data
- Project metrics (e.g., effort, size, cost, duration, defects)
- Project characteristics
- Project attributes (e.g., skill levels, tools, process, etc.)
- Project complexity variables
Analyze data
- Performance comparisons (identification of process strengths and weaknesses)
- Industry averages and best practices
- Performance modeling (identify high-impact areas)
Complexity Variables
- Logical Algorithms
- Mathematical Algorithms
- Data Relationships
- Functional Size
- Reuse
- Code Structure
- Performance
- Memory
- Security
- Warranty

Metrics
- Size
- Cost
- Effort
- Duration
- Defects

Attributes
- Management
- Definition
- Design
- Build
- Test
- Environment
Analysis
[Diagram: PROFILES feed into PERFORMANCE LEVELS]
Results:
- Correlate performance levels to characteristics
- Substantiate impact of characteristics
- Identify best practices
[Diagram: DEFINITION (REQUIREMENT) combines with CAPABILITY (RATE OF DELIVERY) to produce an ESTIMATE of schedule, effort, and costs]
Analysis of Results
Analyze estimating accuracy
- Plan vs. actual comparisons
- Effectiveness of delivery rates
Recommend improvements
- Improve the level of documentation for more accurate sizing
- Establish a more effective estimating practice
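One common way to quantify the plan-vs.-actual comparison, though the slides do not name a specific measure, is the mean magnitude of relative error (MMRE) across completed projects. The effort pairs below are hypothetical.

```python
# MMRE: average of |actual - estimate| / actual over completed projects.
# Lower is better; a frequently cited target is 0.25 or less.

def mre(actual: float, estimate: float) -> float:
    return abs(actual - estimate) / actual

def mmre(pairs) -> float:
    return sum(mre(a, e) for a, e in pairs) / len(pairs)

# Hypothetical (actual, estimated) effort pairs in staff months.
history = [(50, 40), (30, 33), (80, 60)]
print(round(mmre(history), 3))  # 0.183
```

Tracking this measure over successive baselines shows whether the recommended improvements (better documentation, a more effective estimating practice) are actually tightening the estimates.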
Performance
[Chart: PROJECT MEASURES and PROFILES plotted against SIZE for projects A through D; data points at 136, 276, 435, 558, 759]
Establish A Baseline
Size is expressed in terms of functionality delivered to the user (software size).
A representative selection of projects is measured.
[Chart: Organizational Baseline: software size (0 to 2200 FPs) plotted against rate of delivery, in function points per person month]
Analyze Results
Collection: quantitative data and qualitative data are collected
Results: baseline performance, opportunities for improvement, best practices
Model Performance
- Develop parametric models that utilize historical data points for purposes of analyzing the impact of selected process improvements
- Provide a knowledge base for improved decision making
Quantitative Assessment
- Perform functional sizing on all selected projects
- Collect data on project level of effort, cost, duration and quality
- Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered
Results
Baseline Productivity
Average Project Size: 133
Average FP/SM: 10.7
Average Time-To-Market (Months): 6.9
Average Cost/FP: $939
Delivered Defects/FP: 0.0301
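The per-project rate calculations behind these averages can be sketched directly. The project record below is hypothetical, with numbers chosen so the derived rates line up with the baseline figures above.

```python
# Productivity rates for one project: FP per staff month, cost per FP,
# time to market, and delivered defects per FP.

def productivity_rates(size_fp, staff_months, cost, ttm_months, defects):
    return {
        "FP/SM": size_fp / staff_months,
        "Cost/FP": cost / size_fp,
        "Time-To-Market (Months)": ttm_months,
        "Defects/FP": defects / size_fp,
    }

# Hypothetical 133 FP project record.
rates = productivity_rates(size_fp=133, staff_months=12.4,
                           cost=124_887, ttm_months=6.9, defects=4)
print(round(rates["FP/SM"], 1),       # 10.7
      round(rates["Cost/FP"]),        # 939
      round(rates["Defects/FP"], 4))  # 0.0301
```

Averaging these rates across the selected projects yields the baseline table; because every rate is normalized by functional size, projects of different sizes can be compared directly.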
Qualitative Assessment
- Conduct interviews with members of each project team
- Collect project profile information
- Develop performance profiles to display strengths and weaknesses among the selected projects
Results
Project Name           Profile Score  Management  Definition  Design  Build  Test   Environment
Accounts Payable            55.3        47.73       82.05     50.00   46.15  43.75     50.00
Priority One                27.6        50.00       48.72     11.36   38.46   0.00     42.31
HR Enhancements             32.3        29.55       48.72      0.00   42.31  37.50     42.31
Client Accounts             29.5        31.82       43.59      0.00   30.77  37.50     42.31
ABC Release                 44.1        31.82       53.85     34.09   38.46  53.13     42.31
Screen Redesign             17.0        22.73       43.59      0.00   15.38   0.00     30.77
Customer Web                40.2        45.45       23.08     38.64   53.85  50.00     34.62
Whole Life                  29.2        56.82       28.21     22.73   26.92  18.75     53.85
Regional - East             22.7        36.36       43.59      0.00   30.77   9.38     30.77
Regional - West             17.6        43.18       23.08      0.00   26.92   9.38     26.92
Cashflow                    40.6        56.82       71.79      0.00   38.46  43.75     38.46
Credit Automation           23.5        29.55       48.72      0.00   38.46   6.25     26.92
NISE                        49.0        38.64       56.41     52.27   30.77  53.13     53.85
Help Desk Automation        49.3        54.55       74.36     20.45   53.85  50.00     38.46
Formula One Upgrade         22.8        31.82       38.46      0.00   11.54  25.00     46.15
Modeled Improvements
Process Improvements:
- Code Reviews and Inspections
- Requirements Management
- Defect Tracking
- Configuration Management

Performance Improvements:
- Productivity ~ +131%
- Time to Market ~ -49%
- Defect Ratio ~ -75%

Modeled profile scores (baseline scores are shown in the Qualitative Assessment table above):

Project Name           Profile Score  Management  Definition  Design  Build  Test   Environment
Accounts Payable            75.3        61.73       82.05     60.00   60.15  53.75     50.00
Priority One                57.6        57.00       55.72     18.36   45.46  22.00     49.31
HR Enhancements             52.3        32.55       51.72     23.00   42.31  57.50     49.31
Client Accounts             69.5        53.82       65.59     12.00   50.77  67.50     49.31
ABC Release                 74.1        55.82       69.85     49.09   52.46  63.13     49.31
Screen Redesign             67.0        43.73       63.59     21.00   36.38  20.00     51.77
Customer Web                59.2        49.45       27.08     58.64   53.85  54.00     49.62
Whole Life                  50.2        49.82       32.21     27.73   31.92  24.75     53.85
Regional - East             57.7        59.36       49.59      0.00   30.77   9.38     50.77
Regional - West             52.6        55.18       30.08      0.00   33.92  19.38     26.92
Cashflow                    67.6        66.82       71.79      0.00   49.46  53.75     49.46
Credit Automation           60.5        41.55       78.72      0.00   50.46  26.25     46.92
NISE                        79.0        68.64       76.41     62.27   65.77  53.13     53.85
Help Desk Automation        79.3        64.55       74.36     47.45   63.85  54.00     58.46
Formula One Upgrade         52.8        49.82       52.46      0.00   31.54  25.00     56.15

[Table: baseline versus modeled Average Project Size, Average FP/SM, Average Time-To-Market (Months), Average Cost/FP, and Delivered Defects/FP]
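The modeled performance improvements can be applied directly to the baseline productivity figures reported earlier. This is an illustrative calculation using only numbers from the slides (10.7 FP/SM, 6.9 months, 0.0301 defects/FP, and the ~+131% / ~-49% / ~-75% changes).

```python
# Apply the modeled improvement percentages to the baseline figures.
baseline = {"FP/SM": 10.7, "Time to Market": 6.9, "Defects/FP": 0.0301}
change = {"FP/SM": +1.31, "Time to Market": -0.49, "Defects/FP": -0.75}

modeled = {k: round(v * (1 + change[k]), 4) for k, v in baseline.items()}
print(modeled["FP/SM"],          # 24.717
      modeled["Time to Market"], # 3.519
      modeled["Defects/FP"])     # 0.0075
```

In other words, the model projects roughly 24.7 FP per staff month, a 3.5-month time to market, and 0.0075 delivered defects per FP after the process improvements take hold.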
Conclusions
- Project management can be successful
- Requirements can be managed
- Projects can be sized
- Performance can be successfully estimated
- Process improvement can be modeled
- Measurement can be accomplished
Contact Information
- International Function Point Users Group (IFPUG): www.ifpug.org
- Practical Software and Systems Measurement (PSM): www.psmsc.com
- Software Engineering Institute (SEI): www.sei.cmu.edu
- Software Quality Engineering (SQE): www.sqe.com
- Quality Assurance Institute (QAI): www.qaiusa.com