
ATTRIBUTE GAGE

R&R
An Overview
Definition
• An attribute gage study examines the bias and repeatability of an attribute measurement system.
• It is applicable to Go/No-Go gages.
Definitions:
Kappa Value: Cohen's kappa coefficient is a statistic used to measure inter-rater reliability for qualitative (categorical) items. It is generally considered more robust than a simple percent-agreement calculation because κ accounts for the possibility of agreement occurring by chance (see the formula below).

• Kappa ranges from -1 to 1.
• A general rule of thumb is that kappa values greater than 0.75 indicate good to excellent agreement (with a maximum kappa of 1).
• Values less than 0.40 indicate poor agreement.
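
For reference, the standard definition of Cohen's kappa (the formula is not spelled out in the original slides) is:

κ = (p_o − p_e) / (1 − p_e)

where p_o is the observed proportion of agreement between the two raters and p_e is the proportion of agreement expected by chance. For example, if two appraisers agree on 90% of parts (p_o = 0.90) and chance agreement is 50% (p_e = 0.50), then κ = (0.90 − 0.50) / (1 − 0.50) = 0.80.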
• Effectiveness: the ability to accurately detect both conforming and nonconforming parts, expressed on a scale from 0 to 1, where 1 is perfect.
• Miss Rate: the rate at which a nonconforming (reject) part is accepted as OK.
• False Alarm Rate: the rate at which a conforming (OK) part is rejected.
(Common formulas for these three metrics are given below.)
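
The formulas are not given in the original slides, but under the usual attribute MSA conventions the metrics are computed as:

Effectiveness = (number of correct decisions) / (total number of decisions)
Miss Rate = (number of times a nonconforming part was accepted) / (total decisions on nonconforming parts)
False Alarm Rate = (number of times a conforming part was rejected) / (total decisions on conforming parts)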
Acceptance Guideline

Decision                                    Effectiveness   Miss Rate   False Alarm Rate
Acceptable for the appraiser                > 90%           < 2%        < 5%
Marginally acceptable for the appraiser     80% to 90%      2% to 5%    5% to 10%
Unacceptable for the appraiser              < 80%           > 5%        > 10%
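
A minimal Python sketch (not from the original slides) of how this acceptance guideline could be applied. The function name is hypothetical, and the convention that the worst individual grade decides the overall verdict is an assumption for illustration:

def classify_appraiser(effectiveness, miss_rate, false_alarm_rate):
    """Grade one appraiser against the acceptance guideline.
    All inputs are percentages, e.g. 96.7 for 96.7%."""
    def grade(value, good_limit, marginal_limit, higher_is_better):
        # Compare a single metric against its acceptable / marginal limits.
        if higher_is_better:
            if value > good_limit:
                return "acceptable"
            return "marginal" if value >= marginal_limit else "unacceptable"
        if value < good_limit:
            return "acceptable"
        return "marginal" if value <= marginal_limit else "unacceptable"

    grades = [
        grade(effectiveness, 90, 80, higher_is_better=True),    # > 90% acceptable, 80-90% marginal
        grade(miss_rate, 2, 5, higher_is_better=False),          # < 2% acceptable, 2-5% marginal
        grade(false_alarm_rate, 5, 10, higher_is_better=False),  # < 5% acceptable, 5-10% marginal
    ]
    # Assumed convention: the worst individual grade decides the overall verdict.
    for level in ("unacceptable", "marginal"):
        if level in grades:
            return level
    return "acceptable"

# Hypothetical values: high effectiveness, low miss rate, 7.4% false alarm rate.
print(classify_appraiser(96.7, 1.6, 7.4))  # -> "marginal"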
Procedure
• Select the Go/No-Go gage whose attribute R&R is to be studied.
• Select at least 30 samples and measure their values with a variable gage.
• If a value is within the specification limits, the sample is termed "OK"; otherwise it is "NOK".
• Select at least 2 appraisers; each appraiser measures every part 3 times.
• Randomize the measurement order to avoid bias.
(A short computation sketch follows this list.)
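
A minimal Python sketch (not part of the original procedure) showing how effectiveness, miss rate, and false alarm rate could be computed for one appraiser once the reference (expert) decisions and the appraiser's decisions have been recorded; the data below are made up:

def attribute_metrics(reference, appraiser):
    """Return (effectiveness, miss_rate, false_alarm_rate) as proportions."""
    assert len(reference) == len(appraiser)
    correct = sum(r == a for r, a in zip(reference, appraiser))
    nok_total = sum(r == "NOK" for r in reference)
    ok_total = sum(r == "OK" for r in reference)
    # Miss: a nonconforming (NOK) part accepted as OK by the appraiser.
    misses = sum(r == "NOK" and a == "OK" for r, a in zip(reference, appraiser))
    # False alarm: a conforming (OK) part rejected by the appraiser.
    false_alarms = sum(r == "OK" and a == "NOK" for r, a in zip(reference, appraiser))
    effectiveness = correct / len(reference)
    miss_rate = misses / nok_total if nok_total else 0.0
    false_alarm_rate = false_alarms / ok_total if ok_total else 0.0
    return effectiveness, miss_rate, false_alarm_rate

# Hypothetical decisions for 10 parts (reference vs. one appraiser).
ref = ["OK", "OK", "NOK", "OK", "NOK", "OK", "OK", "NOK", "OK", "OK"]
app = ["OK", "NOK", "NOK", "OK", "OK", "OK", "OK", "NOK", "OK", "OK"]
print(attribute_metrics(ref, app))  # -> approximately (0.80, 0.33, 0.14)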
Examples

Appraiser      Kappa Value (Appraiser vs Expert)   Effectiveness   Miss Rate   False Alarm Rate
Moinul         0.92                                96.7            1.6         7.4
Arif           0.89                                95.5            1.6         11.1
Rozob          0.95                                97.8            1.6         3.7
Total System   -                                   96.7            1.6         7.4

(Effectiveness, miss rate and false alarm rate are given in percent.)

Kappa Values (Between Appraisers)

          Moinul   Arif   Rozob
Moinul    -        0.86   0.87
Arif      0.86     -      0.84
Rozob     0.87     0.84   -
Comment:
1. Effectiveness and miss rates look satisfactory.
2. Moinul's false alarm rate of 7.4% is only marginally acceptable, and Arif's 11.1% is unacceptable.
3. The total system false alarm rate is 7.4%, which is marginally acceptable.
4. Additional training is required, especially for Moinul and Arif.
Assessment Agreement
[Chart: "Within Appraisers" and "Appraiser vs Standard" panels showing percent agreement for each appraiser (Moinul, Arif, Rozob) with 95.0% confidence intervals; header fields: Date of study, Reported by, Name of product, Misc.]
• Shows the agreement of each appraiser.
• Because the sample size is small, confidence intervals are shown.
• The "Appraiser vs Standard" panel indicates the actual performance of each appraiser: Moinul agreed 90%, Arif 83%, and Rozob 93% with the expert opinion.
• Another training session would be beneficial, especially for Arif.
