Improved Process and Systems Performance
INTRODUCTION
Hoerl and Snee (2010) have called for the establishment of a new formal
discipline called statistical engineering.1 They see this discipline as a formalization of what many practicing applied statisticians have been doing for
years as they bridge the gap between statistical thinking and statistical methods. When successfully developed, they believe that statistical engineering
will ensure that statistical projects have high impact; integrate concepts of
statistical thinking with statistical methods and tools; and provide statisticians an opportunity to become true leaders in their organizations. They
have further proposed standards that should apply to successful statistical
engineering projects.
There is a small complication: the discipline that Hoerl and Snee (2010)
are proposing already exists.2 It has been evolving for more than six decades. Not only does the existing discipline share the same name, but it also meets
the goals and standards they suggest.
Statistical engineering as defined by R. D. Shainin (1993) is a rigorous
discipline for performance improvement in manufacturing, engineering,
and business processes. Statistical engineers solve and prevent problems
1. Churchill Eisenhart established a statistical engineering laboratory within the Bureau of
Standards in 1947. According to the NIST website, much of the organization's current focus is
on the application of statistical methods to metrology.
2. Dorian Shainin taught courses in statistical engineering at the University of Connecticut from 1950 through 1983. He received the E. L. Grant Medal in 1981 in recognition of that program.
STATISTICAL PERSPECTIVES
Hoerl and Snee (2010) wrote: "The term statistical
engineering has been used before, perhaps most
notably by consultant Dorian Shainin, who generally
used it to indicate the application of statistical
approaches that were ad hoc (but generally worked)
rather than based on formal statistical theory" (p. 52).
The American Society for Quality's (ASQ) website
(http://asq.org/about-asq/who-we-are/bio_shainin.
html, accessed 27 December 2011) notes that Shainin
"developed a discipline called statistical engineering"
and that he specialized in creating strategies to enable engineers to "talk to the parts" and solve "unsolvable"
problems. The discipline has been used successfully
for product development, quality improvement,
analytical problem solving, manufacturing cost
reduction, product reliability, product liability prevention, and research and development.
Shainin's methods are in fact based on sound
statistical theory and are absolutely rigorous. Just
prior to Dorian Shainin's retirement, Shainin Problem
Solving and Prevention engaged the services of Carl
Bennett, a distinguished statistician, to ensure that
the methods remained statistically sound.
Steiner et al. (2008) wrote: "The Shainin System, as
reflected by the genesis of the methodology in
manufacturing, is best suited for medium to high volume production. Most of the tools implicitly assume
that many parts are available for study" (p. 18).
Bovenzi et al. (2010) described the use of statistical engineering in risk reducing the development
of a new ammonia sensor for diesel engine applications. Abrahamian et al. (2010) described a Shainin
system for ensuring product reliability. Both of
these papers develop profound knowledge from
extremely small sample sizes during the development process. Lloyd Nelson (1989) noted: "The
Shainin approach thrives on the tiniest of sample
sizes, plus of course, Mr. Shainin's immense talent"
(p. 78).
DIFFERENT WORLDVIEWS
Different worldviews lead to different problem-solving approaches.
Some problem solvers address problems by figuring out what is wrong. They see the problem from
the perspective of a mechanic. They study symptoms
Dr. Joiner uses a quincunx as the model for common cause variation. A quincunx is a mechanical
device invented by Francis Galton that demonstrates
the approximation of the binomial distribution to the
normal distribution when the probabilities p and q
are each 0.5. Beads are dropped over an array of equally
spaced, offset pins. As each bead hits the first pin, it
must bounce either left or right. At the next row it
again must go either left or right. Each row is a
source of variation, and the sum of several
independent sources of variation, each making an
equal contribution, closely matches a normal distribution. The quincunx illustrated in Dr. Joiner's
Fourth Generation Management has 10 rows.
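The quincunx's behavior is easy to reproduce in simulation. The following sketch (plain Python, standard library only; the function name and the bead and row counts are illustrative choices, not from the article) drops beads over 10 rows of pins and tallies the final bins. Each bin count reflects the number of rightward bounces, so the counts follow a binomial(10, 0.5) distribution and take on the familiar near-normal bell shape.

```python
import random
from collections import Counter

def drop_beads(n_beads=10000, n_rows=10, p_right=0.5, seed=1):
    """Simulate a quincunx: each bead bounces left or right at each of
    n_rows pins; its final bin is the number of rightward bounces."""
    rng = random.Random(seed)
    bins = Counter()
    for _ in range(n_beads):
        bin_index = sum(rng.random() < p_right for _ in range(n_rows))
        bins[bin_index] += 1
    return bins

counts = drop_beads()
for k in range(11):
    # crude text histogram: one '#' per 50 beads in the bin
    print(f"bin {k:2d}: {'#' * (counts[k] // 50)}")
```

With 10 rows the central bins (4, 5, 6) collect most of the beads, which is the visual point Galton's device makes about many small, equal, independent sources of variation.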
These differences in understanding about the nature of variation drive very different strategies and
approaches to problem solving.
PROBLEM SOLVING IN A
QUINCUNX WORLD
Under the quincunx paradigm, all factors contribute equally to the change in Y. This means that
β1σx1 = β2σx2 = β3σx3 = … = βmσxm = E. In order for this to
be true, the amount of change for each input must
balance precisely with its coefficient so the products
are all equal. Because the variation in each input is a
function of operating conditions in the process or in
upstream processes, and the coefficients are functions
of chemistry, physics, or geometry, the quincunx paradigm represents a remarkable set of coincidences.
This model is the foundation for the fifth of
Dr. Deming's (1986) famous 14 points: "Improve
constantly and forever the system of production and
service" (p. 23). Once special causes and easy-to-find
common causes have been eliminated, the best
you can achieve are small incremental reductions in
variation with each improvement to the system.
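The practical consequence of the two paradigms can be shown numerically. In the hypothetical sketch below (invented standard deviations, not data from the article), independent sources add in quadrature. Eliminating one of many equal contributors barely moves the total standard deviation of Y, while eliminating a single dominant contributor collapses it, which is why only incremental gains are available in a quincunx world.

```python
import math

def total_sigma(contribs):
    """Independent sources add in quadrature: sigma_Y = sqrt(sum of sigma_i^2)."""
    return math.sqrt(sum(c * c for c in contribs))

# Quincunx paradigm: ten equal contributors (illustrative values).
equal = [1.0] * 10
# Dominant-cause paradigm: one large contributor among small ones.
pareto = [3.0] + [0.3] * 9

for name, c in [("equal", equal), ("pareto", pareto)]:
    before = total_sigma(c)
    after = total_sigma(c[1:])          # eliminate the first source
    reduction = 100 * (before - after) / before
    print(f"{name}: {before:.2f} -> {after:.2f} ({reduction:.0f}% reduction)")
```

In the equal case the reduction is about 5%; in the dominant-cause case, removing that one source cuts the total by well over half.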
PROBLEM SOLVING IN A
RED X WORLD
The Red X paradigm is consistent with statistical
thinking as defined by Britz et al. (1996, 2000): a
philosophy of learning and action based on these
fundamental principles:

- All work occurs in a system of interconnected processes.
- Variation exists in all processes.
- Understanding and reducing variation are keys to success.
Following Shainin's principles, an effective statistical engineering investigation uses a series of iterative
steps to converge on the identity of the Red X. The
following steps can be found in every successful
project:
1. Narrow the focus of the project to a single
problem.
2. Develop a clue generation strategy depending
on the nature of the problem and the nature of
the system or process. Initial strategies never
include a list of variables.
3. Execute the strategy to eliminate broad categories of potential variation sources. Clue generators are statistical tools that leverage
stratification and disaggregation to discover the
largest sources of variation. They are more
hands-on than observational studies but less
detailed than designed experiments.
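To make step 3 concrete: a multivari-style clue generator disaggregates the observed variation into families (for example within-piece, piece-to-piece, and time-to-time) and compares their magnitudes, eliminating the families that cannot contain the largest source. The sketch below is a minimal, hypothetical illustration with invented measurements; it is not a procedure quoted from the article.

```python
# data[time][piece] = list of within-piece measurements (invented numbers)
data = {
    "8am":  {"p1": [10.1, 10.2], "p2": [10.0, 10.1]},
    "noon": {"p1": [10.4, 10.5], "p2": [10.3, 10.5]},
    "4pm":  {"p1": [11.0, 11.1], "p2": [10.9, 11.1]},
}

def mean(xs):
    return sum(xs) / len(xs)

# Family 1: within-piece variation (largest range inside any one piece)
within = max(max(v) - min(v) for t in data.values() for v in t.values())

# Family 2: piece-to-piece variation (largest spread of piece means at one time)
piece = max(
    max(mean(v) for v in t.values()) - min(mean(v) for v in t.values())
    for t in data.values()
)

# Family 3: time-to-time variation (range of the time-period means)
time_means = [mean([x for v in t.values() for x in v]) for t in data.values()]
time_to_time = max(time_means) - min(time_means)

print(f"within-piece:   {within:.2f}")
print(f"piece-to-piece: {piece:.2f}")
print(f"time-to-time:   {time_to_time:.2f}")
# In this invented data time-to-time dominates, so positional and
# within-piece causes are eliminated and the search converges on
# factors that change over time.
```

The point of the comparison is the process of elimination: whole families of variables drop out of consideration before any single variable is named.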
R. D. Shainin
remaining system inputs to vary within their tolerances. At this point the problem could be contained
at the motor supplier and the wiper manufacturer
could reliably produce good systems.
Containment is not an effective solution. It should
be treated as a tourniquet and used temporarily until
the Red X is found in the supply chain. A component
search at the motor supplier eliminated the motor
assembly process and all components except for
component Q. A paired comparison of component
Q revealed that feature K was the Red X. An examination of the process that created feature K revealed
the critical process parameter that would need to
be more tightly controlled in the future. Feature K
had a manufacturing tolerance but not an engineering tolerance. Its impact on wiper sweep angle was
missed completely. It had not been identified in the
detailed model of wiper system performance. New
standardized work was developed for the production
of component Q and the problem was killed.
Though a sophisticated math model of wiper system performance with 23 contributing factors was
used to design the system, it was ineffective as a
problem-solving tool. When those 23 factors were
held as close to nominal as possible, the wiper build
still produced the full range of sweep angles. Clearly,
there was something missing. The answer was not
going to be found by studying the model. It was
found by talking to the parts. It is valuable to note
that feature K was meeting the initial manufacturing
tolerance and all motors were meeting the original
engineering specifications. The sweep angle problem could only be observed at the wiper system
level.
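The paired-comparison step described above rests on a simple contrast: measure the suspect feature on best and worst units and check whether the two groups separate. One common way to score that separation is a Tukey-style end count. The sketch below uses invented measurements and a simplified scoring function (the names, data, and the threshold noted in the comment are assumptions for illustration, not the article's data).

```python
def end_count(best, worst):
    """Tukey-style end count: the number of 'best' values below every
    'worst' value plus the number of 'worst' values above every 'best'
    value. Assumes 'best' sits on the low side; returns 0 when either
    end shows no exclusive values (i.e., no separation on that end)."""
    low = sum(1 for b in best if b < min(worst))
    high = sum(1 for w in worst if w > max(best))
    if low == 0 or high == 0:
        return 0
    return low + high

# Invented feature measurements on good and bad assemblies.
good = [4.1, 4.3, 4.2, 4.4, 4.0, 4.2]
bad = [5.0, 4.9, 5.2, 4.5, 5.1, 4.8]

ec = end_count(good, bad)
# A total end count of about 7 or more is a commonly cited threshold
# for roughly 95% confidence that the groups differ.
print(f"end count = {ec}")
```

Here the two groups separate completely, so every value is exclusive and the end count equals the total sample size, which is exactly the kind of strong contrast that points to a feature like K.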
The statistical engineering approach used a progressive search based on a process of elimination.
Each subsequent step could only be determined
based on the results of the previous steps. Step by
step every factor that could not be the Red X was
eliminated until finally only one variable still fit the
clues. Once feature K on component Q was found
to be the Red X, the model for wiper system performance was updated.
Figure 2 illustrates the relative strength of the
high-level contributors to wiper sweep angle variation. Controlling the motor feature was the only
way to achieve a substantial reduction.
Statistical engineering, as developed by Shainin,
is a contrast-based approach to problem solving.
These are a small subset of successfully completed projects. Individuals who are already certified do not submit subsequent projects.
In addition, many students have conducted successful investigations without seeking certification.
5. A Green Y® is a system output that has been developed or selected to provide engineering insight. Its patterns of change guide
the investigator to the Red X.
Following are a few examples of statistical engineering projects where the Red X was a complete
surprise:
1. Engine block burn-in: An automotive foundry had
been tolerating high scrap and rework costs associated with burn-in on cast iron engine blocks.
Burn-in is a condition where sand from the mold
becomes embedded in the skin of the casting.
The problem had persisted for more than 40 years
and the plant metallurgists believed that it was
inherent in the process (common cause). A
contrast-based convergent investigation started
with the development of a new response to create
variable data and a measurement system assessment to confirm the new measurement systems
ability to discriminate. These steps were followed
by a multivari (R. D. Shainin 2008b) study to
identify the largest family of variation; a group
comparison to isolate variables that were at different levels during low and high burn-in times; and
a full-factorial experiment that identified a
three-factor chemical interaction as the Red X. A
response surface map identified the optimum
factor settings to eliminate the burn-in. Even after
the answer had been found and demonstrated,
the lead metallurgist resisted changing procedures
because the answer did not conform to his vision
of how the process should be working (Deland
and Meyer 1990).
2. Risk reducing the development of a new
ammonia sensor: Bovenzi et al. (2010) described
the use of Isoplot®, Function Models™, Variable
CONCLUSIONS
The path that Hoerl and Snee (2010) are proposing to formalize has already been blazed. Statistical
engineering as developed by Shainin fills an important position between strategic and statistical thinking and the application of both statistical and
nonstatistical tools. Sound statistical engineers are
rigorous in their use of statistics. They understand
when to take random samples and when to take
stratified samples. They understand the balance
between alpha risk and beta risk and the dangers
of making unsupported inferences from samples.
They use engineering insight to select stratified samples or disaggregate process or system responses in
order to see the patterns of variation that lead to convergence. Their approaches often mirror the procedures suggested by Ellis Ott (1975).
REFERENCES
Abrahamian, J., Hell, R., Hysong, C. (2010). The Red X® System for product reliability. Available at: https://shainin.com/library/reliability_white_paper (accessed 28 December 2011).
Balestracci, D. (2008). Shainin system discussion. Quality Engineering,
20:31–35.
Bovenzi, P., Bender, D., Bloink, R., Conklin, M., Abrahamian, J. (2010).
Risk reducing product and process design during new product