INFORMATION
Thank you for downloading this template. We hope you enjoy it and find it useful for your purposes.
If you do, we would appreciate it if you bought us a beer through the website, or visited one of our
sponsors through the advertisements on the website. That way we can continue to develop new templates
and keep offering them for free.
HOW TO USE THIS TEMPLATE?
1) Fill out the Reference and Trial values.
Fill out the reference values.
If the test sample is OK, enter 1.
If the test sample is NOK, enter 0.
Fill out the actual trial results from the appraisers.
2) Fill out the Gauge, Part and Appraiser information.
All the other cells contain formulas and will automatically calculate the kappa values, false alarm rates, etc.
These cells are protected, but you can easily unprotect the workbook as it doesn't contain a password.
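As a rough illustration of what those protected cells compute, the per-appraiser agreement counts can be derived from the 1/0 grid like this (a minimal sketch; the function and variable names are illustrative, not the workbook's actual cell formulas):

```python
# Sketch of the per-appraiser agreement counts behind "# Matched".
# Layout assumption: one reference value per part, and a list of the
# appraiser's trial results (e.g. three trials) per part.

def agreement_counts(reference, trials):
    """Return (parts self-consistent across trials, parts matching the standard)."""
    consistent = sum(1 for t in trials if len(set(t)) == 1)
    vs_standard = sum(1 for ref, t in zip(reference, trials)
                      if all(r == ref for r in t))
    return consistent, vs_standard

# Toy data: 4 parts, 3 trials each (1 = accept, 0 = reject).
reference = [1, 0, 1, 1]
trials_a = [[1, 1, 1], [0, 0, 0], [1, 0, 1], [1, 1, 1]]
print(agreement_counts(reference, trials_a))  # (3, 3)
```

In the report itself this is tallied over 50 parts, giving # Matched of 42, 45 and 40 for appraisers A, B and C.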
Logo
Format No :
Appraiser Reference A-1 A-2 A-3 B-1 B-2 B-3 C-1 C-2 C-3 Code Rev No & Date :
Part 1 = Acc. ABC EFG HIJ
+ All Acc
- All Rej
GAUGE REPEATABILITY AND REPRODUCIBILITY REPORT
0 = Rej.
x Mismatched
1 1 1 1 1 1 1 1 1 1 1 +
2 1 1 1 1 1 1 1 1 1 1 + Part Number ABCD 1234 Appraiser A ABC
3 0 0 0 0 0 0 0 0 0 0 - Part Name Air duct Appraiser B EFG
4 0 0 0 0 0 0 0 0 0 0 - Project PJ123 Appraiser C HIJ
5 0 0 0 0 0 0 0 0 0 0 - Gauge ID 123456 No. of Parts 50
6 1 1 1 0 1 1 0 1 0 0 x Gauge Name Contour Gauge No. of Appraisers 3
7 1 1 1 1 1 1 1 1 0 1 x Gauge Type Go - NoGo No. of Trials 3
8 1 1 1 1 1 1 1 1 1 1 + Measured Characteristic Contour Date performed 30-Nov-23
9 0 0 0 0 0 0 0 0 0 0 -
10 1 1 1 1 1 1 1 1 1 1 + System % Effective Score
11 1 1 1 1 1 1 1 1 1 1 + System % Effective Score vs. Reference
12 0 0 0 0 0 0 0 0 1 0 x
GAUGE PICTURE
All appraisers agreed within and between themselves
All appraisers agreed within and between themselves AND agreed with standard
13 1 1 1 1 1 1 1 1 1 1 +
14 1 1 1 0 1 1 1 1 0 0 x Total Inspected 50 50
15 1 1 1 1 1 1 1 1 1 1 + # in Agreement 39 39
16 1 1 1 1 1 1 1 1 1 1 + 95% UCI 88.47% 88.47%
17 1 1 1 1 1 1 1 1 1 1 + Calculated Score 78.00% 78.00%
18 1 1 1 1 1 1 1 1 1 1 + 95% LCI 64.04% 64.04%
19 1 1 1 1 1 1 1 1 1 1 +
20 1 1 1 1 1 1 1 1 1 1 + % Appraiser KAPPA Appraisers
21 1 1 1 0 1 0 1 0 1 0 x % Appraiser % Score vs. Attribute A B C
22 0 0 0 1 0 1 0 1 1 0 x Appraiser agreed with him/herself on all trials Appraiser agreed on all trials with standard A - 0.86 0.78 Rule of thumb:
23 1 1 1 1 1 1 1 1 1 1 + Source A B C A B C B 0.86 - 0.79 Kappa values > 0.75 indicate good to excellent agreement;
24 1 1 1 1 1 1 1 1 1 1 + Total Inspected 50 50 50 50 50 50 C 0.78 0.79 - values less than 0.40 indicate poor agreement.
25 0 0 0 0 0 0 0 0 0 0 - # Matched 42 45 40 42 45 40 KAPPA Appraisers To Reference
26 0 0 1 0 0 0 0 0 0 1 x False Negative / Under Reject 0 0 0 A B C
27 1 1 1 1 1 1 1 1 1 1 + False Positive / Over Reject 0 0 0 Kappa 0.88 0.89 0.77
28 1 1 1 1 1 1 1 1 1 1 + Mixed 8 5 10
29 1 1 1 1 1 1 1 1 1 1 + 95% UCI 92.83% 96.67% 89.97% 92.83% 96.67% 89.97% Effectiveness Summary
30 0 0 0 0 0 0 1 0 0 0 x Calc. Score 84.00% 90.00% 80.00% 84.00% 90.00% 80.00% Appraiser A B C
31 1 1 1 1 1 1 1 1 1 1 + 95% LCI 70.89% 78.19% 66.28% 70.89% 78.19% 66.28% Count of Under Reject 3 3 6
32 1 1 1 1 1 1 1 1 1 1 + Total Reject opportunities 48 48 48
33 1 1 1 1 1 1 1 1 1 1 + Count of Over Reject 5 2 9
34 0 0 0 1 0 0 1 0 1 1 x Total Accept opportunities 102 68 102
35 1 1 1 1 1 1 1 1 1 1 + Effectiveness 84.0% 90.0% 80.0%
36 1 1 1 0 1 1 1 1 0 1 x Miss Rate (Under Reject) 6.3% 6.3% 12.5%
37 0 0 0 0 0 0 0 0 0 0 - False Alarm Rate (Over Reject) 4.9% 2.9% 8.8%
[Chart: Effectiveness (percent) by appraiser A, B, C, with upper and lower confidence limits]
38 1 1 1 1 1 1 1 1 1 1 +
39 0 0 0 0 0 0 0 0 0 0 -
Decision Guidelines: Calculated Effectiveness Score / Miss Rate / False Alarm Rate
40 1 1 1 1 1 1 1 1 1 1 +
41 1 1 1 1 1 1 1 1 1 1 +
Acceptable for the appraiser: ≥ 90% / ≤ 2% / ≤ 5%
42 0 0 0 0 0 0 0 0 0 0 -
43 1 1 0 1 1 1 1 1 1 0 x Marginally acceptable for the appraiser, may need improvement: ≥ 80% / ≤ 5% / ≤ 10%
44 1 1 1 1 1 1 1 1 1 1 +
45 0 0 0 0 0 0 0 0 0 0 -
Unacceptable for the appraiser, needs improvement: < 80% / > 5% / > 10%
46 1 1 1 1 1 1 1 1 1 1 +
47 1 1 1 1 1 1 1 1 1 1 +
48 0 0 0 0 0 0 0 0 0 0 -
49 1 1 1 1 1 1 1 1 1 1 + Gauge acceptable? Remarks:
50 0 0 0 0 0 0 0 0 0 0 - ✘
Yes - Accept Gauge Authorized Signature: Mr. There are no theory-based decision criteria on acceptable risk.
Total Acc 34 100 103 99 11 No - Reject Gauge Date: 30-Nov-23
Total Rej 16 50 47 51 39 Management approves the gauge and accepts the risk as the process capability studies show that the process is stable.
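The effectiveness summary above uses the standard attribute-study ratios: effectiveness = parts matched on all trials / parts inspected, miss rate = under rejects / reject opportunities, and false alarm rate = over rejects / accept opportunities. A minimal sketch with appraiser A's counts (the helper and argument names are illustrative, not formulas from the workbook):

```python
# Effectiveness, miss rate and false alarm rate from raw counts.
# The figures below are appraiser A's counts from the summary table.

def appraiser_rates(matched, inspected, misses, reject_opps,
                    false_alarms, accept_opps):
    effectiveness = matched / inspected            # agreed on all trials
    miss_rate = misses / reject_opps               # bad parts accepted
    false_alarm_rate = false_alarms / accept_opps  # good parts rejected
    return effectiveness, miss_rate, false_alarm_rate

# Appraiser A: 42/50 matched, 3 misses in 48 reject opportunities,
# 5 false alarms in 102 accept opportunities.
eff, miss, fa = appraiser_rates(42, 50, 3, 48, 5, 102)
print(eff, miss, round(fa, 3))  # 0.84 0.0625 0.049
```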
A * B Crosstabulation (rows: Inspector A; columns: Inspector B)
            # Rej   # Acc   Total
# Rej          44       6      50
  Expect     15.7    34.3
# Acc           3      97     100
  Expect     31.3    68.7
Total          47     103     150

B * C Crosstabulation (rows: Inspector B; columns: Inspector C)
            # Rej   # Acc   Total
# Rej          42       5      47
  Expect     16.0    31.0
# Acc           9      94     103
  Expect     35.0    68.0
Total          51      99     150

A * C Crosstabulation (rows: Inspector A; columns: Inspector C)
            # Rej   # Acc   Total
# Rej          43       7      50
  Expect     17.0    33.0
# Acc           8      92     100
  Expect     34.0    66.0
Total          51      99     150
Observed p value
A * B (rows: Inspector A; columns: Inspector B)
            # Rej   # Acc   Total
# Rej       0.293   0.040   0.333
# Acc       0.020   0.647   0.667
Total       0.313   0.687   1

B * C (rows: Inspector B; columns: Inspector C)
            # Rej   # Acc   Total
# Rej       0.280   0.033   0.313
# Acc       0.060   0.627   0.687
Total       0.340   0.660   1

A * C (rows: Inspector A; columns: Inspector C)
            # Rej   # Acc   Total
# Rej       0.287   0.047   0.333
# Acc       0.053   0.613   0.667
Total       0.340   0.660   1
Expected p value
A * B (rows: Inspector A; columns: Inspector B)
            # Rej   # Acc   Total
# Rej       0.104   0.229   0.333
# Acc       0.209   0.458   0.667
Total       0.313   0.687   1

B * C (rows: Inspector B; columns: Inspector C)
            # Rej   # Acc   Total
# Rej       0.107   0.207   0.313
# Acc       0.233   0.453   0.687
Total       0.340   0.660   1

A * C (rows: Inspector A; columns: Inspector C)
            # Rej   # Acc   Total
# Rej       0.113   0.220   0.333
# Acc       0.227   0.440   0.667
Total       0.340   0.660   1
Appraisers Crosstabulation Results (n = 150)
A * B: pe = 0.562, po = 0.940, kappa = 0.863
B * C: pe = 0.560, po = 0.907, kappa = 0.788
A * C: pe = 0.553, po = 0.900, kappa = 0.776
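The pe/po/kappa triples above are the standard Cohen's kappa on each 2x2 crosstabulation: po is the observed agreement (the diagonal proportion), pe the agreement expected by chance from the marginal totals, and kappa = (po - pe) / (1 - pe). A sketch using the A * B counts from the table above (the function name is illustrative):

```python
# Cohen's kappa from a 2x2 crosstabulation:
#   a = both reject, b = first rejects / second accepts,
#   c = first accepts / second rejects, d = both accept.

def cohens_kappa(a, b, c, d):
    n = a + b + c + d
    po = (a + d) / n                                      # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# A * B crosstabulation: 44 joint rejects, 97 joint accepts, n = 150.
print(round(cohens_kappa(44, 6, 3, 97), 3))  # 0.863
```

The B * C table (42, 5, 9, 94) gives 0.788 the same way, matching the results above.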
A * Reference Crosstabulation (rows: Inspector A; columns: Reference)
            # Rej   # Acc   Total
# Rej          45       5      50
  Expect     16.0    34.0
# Acc           3      97     100
  Expect     32.0    68.0
Total          48     102     150

B * Reference Crosstabulation (rows: Inspector B; columns: Reference)
            # Rej   # Acc   Total
# Rej          45       2      47
  Expect     11.6    24.7
# Acc           3     100     103
  Expect     25.5    54.2
Total          48      68     116

C * Reference Crosstabulation (rows: Inspector C; columns: Reference)
            # Rej   # Acc   Total
# Rej          42       9      51
  Expect     16.3    34.7
# Acc           6      93      99
  Expect     31.7    67.3
Total          48     102     150
Observed p value
A * Reference (rows: Inspector A; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.300   0.033   0.333
# Acc       0.020   0.647   0.667
Total       0.320   0.680   1

B * Reference (rows: Inspector B; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.300   0.013   0.313
# Acc       0.020   0.667   0.687
Total       0.320   0.680   1

C * Reference (rows: Inspector C; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.280   0.060   0.340
# Acc       0.040   0.620   0.660
Total       0.320   0.680   1
Expected p value
A * Reference (rows: Inspector A; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.107   0.227   0.333
# Acc       0.213   0.453   0.667
Total       0.320   0.680   1

B * Reference (rows: Inspector B; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.168   0.238   0.405
# Acc       0.367   0.521   0.888
Total       0.414   0.586   1

C * Reference (rows: Inspector C; columns: Reference)
            # Rej   # Acc   Total
# Rej       0.109   0.231   0.340
# Acc       0.211   0.449   0.660
Total       0.320   0.680   1
Appraisers Crosstabulation Results (vs. Reference)
A * Reference: pe = 0.560, po = 0.947, kappa = 0.879
B * Reference: pe = 0.688, po = 0.967, kappa = 0.893
C * Reference: pe = 0.558, po = 0.900, kappa = 0.774
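The 95% UCI/LCI figures in the effectiveness section (64.04% to 88.47% around the 78.00% system score, i.e. 39 of 50 parts in agreement) are consistent with the exact (Clopper-Pearson) binomial confidence interval. A pure-stdlib sketch, assuming that is indeed the interval the workbook uses:

```python
# Exact (Clopper-Pearson) 95% confidence interval for k agreements out
# of n parts, via bisection on the binomial tail (standard library only).
# Assumption: the template's UCI/LCI cells use this exact interval.
from math import comb

def binom_tail_ge(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p); increasing in p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def clopper_pearson(k, n, alpha=0.05):
    def solve(target, kk):
        lo, hi = 0.0, 1.0
        for _ in range(60):            # bisection on the tail probability
            mid = (lo + hi) / 2
            if binom_tail_ge(kk, n, mid) < target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else solve(alpha / 2, k)
    upper = 1.0 if k == n else solve(1 - alpha / 2, k + 1)
    return lower, upper

# System score: 39 of 50 parts in agreement.
lo, hi = clopper_pearson(39, 50)
print(round(lo, 4), round(hi, 4))
```

With 39 of 50 this lands close to the report's 64.04% and 88.47% bounds; the per-appraiser UCI/LCI rows (e.g. 42/50 for appraiser A) follow the same pattern.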