Chapter 11: The Seven Quality Control Tools and Introduction to Statistics
Members:
May 2, 2019
Mrs. Weddie Mae Villariza
Introduction
There are seven basic quality tools that can assist an organization in problem solving
and process improvement: 1) process flow chart, 2) cause and effect diagram, 3) check
sheet, 4) scatter diagram, 5) Pareto chart, 6) histogram, and 7) control chart. These are
among the most useful tools for day-to-day quality work. All of them are widely used in
manufacturing to monitor overall operations and to drive continuous process
improvement. They are used to find root causes of defects and eliminate them, so that
the manufacturing process can be improved. The modes of defects on a production line
are investigated through direct observation of the line and through statistical tools.
Tool 1: Process Flow Chart
A process flow chart is a diagrammatic view of the various steps, in sequential order,
that form an overall process in an organization. Flowcharts are used in quality
management to depict processes in an easily understandable form. A process flow
chart should indicate every step in the process, which means all the sub-processes and
their inputs and outputs are documented in one diagram. This helps not only the
employees but also the quality system auditors to understand how the processes
function in the organization. As a prerequisite for ISO 9000 certification, process flow
charts are insisted upon and organizations are asked to document their processes.
A process flow chart offers a number of benefits to management. It enables employees
to understand the processes in the organization, which motivates them to act against
poor quality; it motivates each process owner to improve the quality of his or her own
process; it helps management analyze and eliminate unnecessary processes; it
improves communication; and it helps in training new employees, who can see how
they take part in the overall process.
Example of a Process Flow Chart
Tool 2: Cause and Effect Diagram
Prof. Ishikawa of the University of Tokyo developed this tool in 1943. It is known as the
Ishikawa diagram and is also called the fishbone diagram. The head of the fish is the
effect to be achieved, and the possible causes that help in achieving the effect are
written along the bones. The causes can be grouped under a number of main causes,
and each main cause can have a number of level-one causes related to it. Formulating
the cause and effect diagram involves two main steps: first, determine the quality
characteristic to be improved; second, let the team generate as many ideas as possible
without interruption, typically through brainstorming. The brainstorming produces
recommendations that management should implement, and the exercise also helps the
team members understand the process better.
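The structure of a fishbone diagram maps naturally onto a nested dictionary. Below is a minimal Python sketch; the main-cause categories (Man, Machine, Material, Method) are the classic defaults assumed for illustration, and the causes are hypothetical, since the text does not prescribe specific ones.

```python
# A cause-and-effect diagram as a nested mapping: each main cause (a big
# "bone") maps to its level-one causes. Category and cause names here are
# assumed for illustration only.
fishbone = {
    "effect": "High defect rate",  # the "head" of the fish
    "main_causes": {
        "Man":      ["insufficient training", "operator fatigue"],
        "Machine":  ["worn tooling", "poor calibration"],
        "Material": ["supplier variation", "wrong grade"],
        "Method":   ["unclear work instructions"],
    },
}

# Print the diagram as an indented outline.
print(fishbone["effect"])
for bone, causes in fishbone["main_causes"].items():
    print(f"  {bone}")
    for cause in causes:
        print(f"    - {cause}")
```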
Tool 3: Check Sheet
The check sheet is one of the seven basic tools of quality control and was made
popular by Dr. Kaoru Ishikawa. A check sheet is a structured, prepared form for
collecting and analyzing data, a generic tool that can be adapted for a wide variety of
purposes. The document is typically a blank form designed for quick, easy, and efficient
recording of the desired information. When the information collected is quantitative, the
check sheet can also be called a tally sheet.
The purpose of a check sheet is to list the important checkpoints or events in a tabular
format and keep updating or marking their status as they occur, which helps in
understanding progress, defect patterns, and even causes of defects.
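As an illustration of how a tally sheet turns raw tick marks into counts, here is a minimal Python sketch using collections.Counter; the defect names and observations are hypothetical.

```python
from collections import Counter

# Hypothetical defect observations recorded during one shift; each entry
# corresponds to one tick mark on the tally sheet.
observations = [
    "scratch", "dent", "scratch", "misalignment",
    "scratch", "dent", "scratch",
]

# Counter turns the raw ticks into a tally: defect type -> count.
tally = Counter(observations)
for defect, count in tally.most_common():
    print(f"{defect:<14} {'|' * count}  ({count})")
```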
Tool 4: Scatter Diagram
A scatter diagram, also called a scatter plot or X-Y graph, helps in analyzing the
relationship between two variables. The input variable is plotted on the x-axis and the
effect of that variable on the y-axis. If the variables are correlated, the points will fall
along a line or curve; the better the correlation, the more tightly the points will hug the
line.
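The tightness of that "hug" can be quantified with the Pearson correlation coefficient r. The sketch below computes r from first principles on hypothetical paired data; r near +1 or -1 indicates a strong linear relationship, while r near 0 indicates little linear relationship.

```python
import math

# Hypothetical paired measurements: x is the input variable, y the effect.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]

# Pearson r = covariance / (spread of x * spread of y).
mx, my = sum(x) / len(x), sum(y) / len(y)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
r = cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
print(f"r = {r:.3f}")  # close to 1.0: strong positive correlation
```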
Tool 5: Pareto Chart
Juran formulated the Pareto principle in 1950. The principle essentially suggests that
80% of the problems are due to 20% of the causes, i.e., a few machines, raw materials,
or operators. Correspondingly, a Pareto chart helps in identifying the problems in the
organization that cause the greatest loss of profit. It is therefore important that the
organization find the vital few problems and eliminate them so that success can be
achieved.
The Pareto chart is also a two-dimensional picture with two axes, X and Y, typically with
the causes along the X-axis and the magnitude of the loss on the Y-axis.
Pareto analysis can be used to identify significant quality costs, and it can be used in
diverse applications such as formulating specifications.
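A minimal sketch of the tabulation behind a Pareto chart: sort the causes by loss, largest first, and accumulate the percentage of total loss to expose the vital few. The cause names and loss figures are hypothetical.

```python
# Hypothetical loss (e.g., scrap cost) per cause over one month.
losses = {
    "Machine A misfeed": 4200,
    "Supplier B raw material": 2600,
    "Operator setup error": 900,
    "Labeling": 200,
    "Other": 100,
}

# Sort causes by loss, largest first, and accumulate the percentage of
# total loss: the "vital few" cover roughly 80% of the loss.
total = sum(losses.values())
running = 0
for cause, loss in sorted(losses.items(), key=lambda kv: kv[1], reverse=True):
    running += loss
    print(f"{cause:<26} {loss:>5}  cum {100 * running / total:5.1f}%")
```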
Tool 6: Histogram
A histogram is a powerful tool for elementary analysis of data that contain variation.
Statistics is concerned with information that varies: no two items will be identical.
Machined parts, whatever the superiority of the machinery, operator, or materials, will
show variation. The histogram was developed by A.M. Guerry, a French statistician, in
1833. It is also a graph; in the graph, for example, measured resistance values are
plotted on the X-axis and their frequency of occurrence on the Y-axis.
The frequency of occurrence is the number of times a value falling between the given
limits was measured. To build a histogram, divide the values into a number of groups
called class intervals. One criterion is to divide the range into a number of class
intervals equal to the square root of the number of readings, data, or measurement
results.
Look at the table above, which lists nine data points. Since the square root of 9 is 3,
three groups may be sufficient, with the class interval chosen as a convenient number.
The lowest point is 99.7 and the highest is 100.25. If instead we divide the data into five
groups, dividing the range (0.55) by 5 gives a class interval of about 0.1. The number of
cells or class intervals is ultimately decided by the user.
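The square-root rule and class-interval calculation can be sketched as follows. Since the table itself is not reproduced here, the nine readings are assumed; only the minimum (99.70) and maximum (100.25) match the values stated above.

```python
import math

# Nine hypothetical resistance readings; only the minimum and maximum
# match values stated in the text, the rest are assumed.
data = [99.70, 99.85, 99.90, 100.00, 100.05, 100.10, 100.15, 100.20, 100.25]

k = round(math.sqrt(len(data)))       # square-root rule: sqrt(9) = 3 classes
width = (max(data) - min(data)) / k   # class interval = range / k
print(f"classes = {k}, class interval = {width:.3f}")

# Tally how many readings fall in each class interval.
counts = [0] * k
for value in data:
    index = min(int((value - min(data)) / width), k - 1)
    counts[index] += 1
print(counts)
```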
If the midpoints of the columns are joined by a smooth curve, the result is a bell-shaped
curve. Histograms are thus shown first as column graphs of frequency and then as the
bell-shaped curve of a normal distribution.
Fundamentals of Statistics
Statistics has been defined as the collection, organization, analysis, interpretation, and
presentation of data. It provides logical analysis and decision-making ability using
sample data. The degree of confidence varies depending on many factors, but statistics
has been accepted as an effective tool when used correctly. Daily milk consumption is
a good example: it is impossible to survey all the people to collect this data, because
doing so would be time consuming and expensive. Instead, a few people across various
spectrums of living can be contacted to find out their daily need, and the average
consumption can then be calculated.
Definition of Terms
1. Statistic - A numerical data measurement taken from a sample that may be used to
make an inference about a population. A numerical data measurement means
measuring an entity and expressing the result as a number; the measurement in the
milk example above is numerical data. A statistic is used to make an inference about
a population.
2. Population and Sample - A population is a large collection of items of the same
type; for instance, all the bottles manufactured in a shift form a population. A sample
is a smaller set of items drawn from the population for measurement.
Three quality gurus, Shewhart, Deming, and Juran, helped the Japanese use statistics
to control quality and improve the destiny of the nation.
The type of frequency distribution shown in the histogram examples is called a normal
distribution. It occurs when there is a concentration of observations about the average,
with progressively fewer observations occurring above and below the average. A
frequency distribution may be defined as a "tabulation or tally of the number of times a
given quality characteristic measurement occurs within the sample of product being
checked."
Measures of Central Tendency
a) Arithmetic Mean
b) Median (Midpoint)
c) Mode
a) Arithmetic Mean - the average of a sample, obtained by dividing the sum of the
values by the number of readings.
b) Median - the middle value when the readings are arranged in order of magnitude.
c) Mode - the value that occurs most frequently in the sample.
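These three measures are available directly in Python's statistics module, as the short sketch below shows on a hypothetical sample.

```python
import statistics

# Hypothetical sample of measured weights (grams).
sample = [100, 101, 99, 100, 102, 100, 98]

print(statistics.mean(sample))    # arithmetic mean: sum / count -> 100.0
print(statistics.median(sample))  # median: middle value when sorted -> 100
print(statistics.mode(sample))    # mode: most frequent value -> 100
```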
Measures of Dispersion
1. Range - The range is the difference between the lowest and highest values.
Formula: Range = Xmax - Xmin
Example: In {4, 6, 9, 3, 7} the lowest value is 3 and the highest is 9, so
Range = 9 - 3 = 6
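A quick sketch of both dispersion measures on the example data: the range from above, and the standard deviation introduced later in this chapter.

```python
import statistics

sample = [4, 6, 9, 3, 7]  # the example data from the text

print(max(sample) - min(sample))  # range: 9 - 3 = 6
print(statistics.stdev(sample))   # sample standard deviation, about 2.39
```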
The normal distribution is a probability function that describes how the values of a
variable are distributed. It is a symmetric distribution where most of the observations
cluster around the central peak and the probabilities for values further away from the
mean taper off equally in both directions.
1. The frequency is highest at the population mean.
2. 68.26% of all values in the distribution occur within plus or minus one population
standard deviation of the mean.
3. 95.45% of all values occur within plus or minus two population standard deviations
of the mean.
4. 99.73% of all values occur within plus or minus three population standard deviations
of the mean.
5. The curve never touches the x-axis; it extends from minus infinity to plus infinity.
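These coverage percentages follow from the standard normal distribution, where P(|Z| <= k) = erf(k / sqrt(2)). The sketch below reproduces the 1-, 2-, and 3-sigma figures quoted above.

```python
import math

# P(|Z| <= k) for a standard normal variable equals erf(k / sqrt(2)).
for k in (1, 2, 3):
    p = math.erf(k / math.sqrt(2))
    print(f"within +/-{k} sigma: {100 * p:.2f}%")
# within +/-1 sigma: 68.27%
# within +/-2 sigma: 95.45%
# within +/-3 sigma: 99.73%
```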
Variations in Distribution
A distribution may be skewed to the left or right, which can happen due to a malfunction
in the equipment or poor judgment by the operator. If samples fabricated on two
machines, or by two operators on one machine, are put together as one lot, the
histogram can depict a bi-modal distribution.
Tool 7: Control Chart
Assume that we are manufacturing peanut packets. The marked weight of the packets
shipped is 100 grams; we pick up a few sample packets, weigh them, and record the
results. The central horizontal line of the chart indicates the mean weight of the
packets, which is 100 grams. The days (time) run along the X-axis and the measured
average values along the Y-axis. There are two more horizontal lines, one above the
mean and the other below it, called the upper control limit (UCL) and the lower control
limit (LCL) respectively. These are statistical limits, in this case 103 grams and
97 grams. If the process is under control, the variation is due only to common causes.
Any point lying outside the limits is due to a special-cause variation.
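A minimal sketch of computing conventional 3-sigma control limits from sample data; the daily mean weights are hypothetical, and the 103 g / 97 g limits in the text correspond to a process standard deviation of about 1 gram.

```python
import statistics

# Hypothetical daily mean weights (grams) of sampled peanut packets.
daily_means = [100.2, 99.5, 100.8, 99.9, 100.1, 100.4, 99.7]

center = statistics.mean(daily_means)
sigma = statistics.stdev(daily_means)

# Conventional 3-sigma control limits around the center line.
ucl = center + 3 * sigma
lcl = center - 3 * sigma
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")

# Flag points that signal a special (assignable) cause.
out = [m for m in daily_means if not lcl <= m <= ucl]
print("out-of-control points:", out)
```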
In such cases, the organization should make an effort to find the root cause of the
problem and eliminate it.
Specification limits are set by the customer.
Statistical limits are decided by the process performance.
Process capability analysis involves comparing the process variation with the
specification tolerance. This is carried out to confirm the suitability of the process.
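One common way to make this comparison, not named explicitly in the text, is the capability index Cp = (USL - LSL) / (6 * sigma). A sketch with assumed numbers:

```python
# Capability index Cp = (USL - LSL) / (6 * sigma); Cp >= 1.33 is a common
# rule of thumb for a capable process. All numbers here are assumed.
usl, lsl = 103.0, 97.0  # specification limits set by the customer
sigma = 0.8             # estimated process standard deviation

cp = (usl - lsl) / (6 * sigma)
print(f"Cp = {cp:.2f}")  # (103 - 97) / 4.8 = 1.25
```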
There are two types of variation, described below.
SPC (Statistical Process Control) provides a set of tools for building process capability
in any organization.
Random cause is also known as common cause. This can be due to random variations
in inputs to the process such as temperature changes, voltage changes, sudden
emotional change of the operator, etc.
Special cause is also known as assignable cause. This may be due to a product
supplied by a particular vendor, or to a particular machine or operator, etc.
Accuracy and precision are terms used to describe the quality of measuring
instruments, and they can also be used to describe process capability. Accuracy refers
to the ability of a process to keep its performance centered within the tolerance.
Precision refers to the degree of variation in the measured values.
Measures of Accuracy
Measures of accuracy give the location of the central value of the normal distribution.
Precision - Measures of Spread of Values
Range - It is the difference between the upper and lower values in a sample. The mean
range is the average of the sample ranges.
Standard Deviation- It should be used to find the precision when the sample size is
more than 10.
This is the percentage of products that the customer is prepared to accept outside the
tolerance. We can use the standard normal distribution to find the maximum standard
deviation that is acceptable to meet the customer's requirement.
Example
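The worked example itself is not reproduced here; the following hypothetical sketch illustrates the calculation, assuming the customer accepts 0.27% of product outside a symmetric ±3 g tolerance around the 100 g target.

```python
from statistics import NormalDist

# Hypothetical numbers: at most 0.27% of product may fall outside a
# symmetric tolerance of +/-3 g around the target weight.
accept_outside = 0.0027  # fraction allowed outside tolerance
half_tolerance = 3.0     # grams

# Two-sided case: the tolerance edge must sit z standard deviations from
# the mean, where z is the (1 - p/2) quantile of the standard normal.
z = NormalDist().inv_cdf(1 - accept_outside / 2)
max_sigma = half_tolerance / z
print(f"z = {z:.2f}, maximum acceptable sigma = {max_sigma:.3f} g")
# z = 3.00, so the process standard deviation must not exceed about 1 g.
```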