UNIT I
BASICS OF METROLOGY
Syllabus: Introduction to Metrology – Need – Elements – Work piece, Instruments, Persons, Environment and their effect on Precision and Accuracy – Errors – Errors in Measurements – Types – Control – Types of standards
Definition
Metrology is the name given to the science of pure measurement.
Engineering metrology is restricted to measurements of length and angle.
Measurement is defined as the process of numerical evaluation of a dimension, or the process of comparison with standard measuring instruments.
The Importance of Metrology to Society
Measurements have been carried out for as
long as civilization has existed.
Metrology is basic to the economic and social
development of a country.
It is concerned with providing accurate
measurements which impact our economy,
health, safety and general well-being.
METROLOGY
Metrology is a wide-reaching field, but can be summarized through three basic activities:
1. the definition of internationally accepted units of measurement,
2. the realisation of these units of measurement in practice, and
3. the application of chains of traceability (linking measurements to reference standards).
Types of Metrology
1. Scientific Metrology
2. Industrial Metrology
3. Legal Metrology
4. Dynamic Metrology
5. Deterministic Metrology
Types of Metrology
1. Scientific metrology deals with the establishment of measurement units and standards, and with their development and maintenance at the highest level.
2. Industrial metrology ensures the adequate functioning of measurement instruments used in industry and in production and testing processes.
3. Legal metrology is concerned with measurements that affect trade, health and safety, and that are subject to statutory regulation.
Types of Metrology
4. Dynamic metrology is the technique of measuring small variations of a continuous nature. The technique has proved very valuable, and a record of continuous measurement over a surface, for instance, has obvious advantages over individual measurements of an isolated character.
Types of Metrology
5. Deterministic metrology is a new philosophy in which part measurement is replaced by process measurement. New techniques, such as 3D error compensation by CNC (Computer Numerical Control) systems and expert systems, are applied, leading to fully adaptive control. This technology is used for very high precision manufacturing machinery and control systems to achieve micro technology and nanotechnology accuracies.
Important Terms in Metrology
Span – the difference between the maximum and minimum scale values of an instrument.
For example, a thermometer whose scale goes from −40°C to 100°C has a span of 140°C.
Range – the region between the lowest and highest readings the instrument can measure.
The same thermometer, with a scale from −40°C to 100°C, has a range of −40°C to 100°C.
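A minimal sketch of the distinction in Python, using the thermometer limits from the example above:

```python
# Span vs. range for the thermometer example above.
scale_min = -40.0  # lowest scale value, in deg C
scale_max = 100.0  # highest scale value, in deg C

span = scale_max - scale_min              # a single number: 140.0 deg C
measuring_range = (scale_min, scale_max)  # a pair of limits: (-40, 100) deg C

print(f"Span  = {span} degC")
print(f"Range = {measuring_range[0]} degC to {measuring_range[1]} degC")
```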
Important Terms in Metrology
Instrument error refers to the error of a measuring instrument, or the difference between the actual value and the value indicated by the instrument. There can be errors of various types, and the overall error is the sum of the individual errors.
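As an illustration (the gauge reading and the error budget below are hypothetical, not from the source), the overall error can be accumulated as the algebraic sum of the individual components:

```python
# Hypothetical error budget for a gauge reading.
true_value = 25.000   # mm, assumed known for illustration
indicated  = 25.012   # mm, what the instrument shows

instrument_error = indicated - true_value  # +0.012 mm

# If the individual contributions are known, the overall error
# is their algebraic sum:
error_components = {"scale error": 0.007, "zero drift": 0.003, "parallax": 0.002}
overall_error = sum(error_components.values())  # 0.012 mm
```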
Accuracy is the ability of the instrument to measure the true value. In other words, it is the closeness of the measured value to a standard or true value.
PRECISION
Precision is defined as the ability of the instrument to reproduce a certain set of readings within a given accuracy, or the degree of exactness for which an instrument is designed or intended to perform.
It is composed of two characteristics:
Conformity – the error created by the limitation of the scale reading.
Significant figures – they convey the actual information regarding the magnitude and the measurement precision of a quantity.
Accuracy VS Precision
Accuracy relates to the quality of a result.
Precision relates to the quality of the operation by which the result is obtained.
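A small illustrative sketch (the true value and repeated readings are hypothetical): accuracy is judged by how close the mean of repeated readings lies to the true value, precision by how tightly the readings cluster:

```python
import statistics

true_value = 10.00  # mm, assumed for illustration
readings = [10.11, 10.12, 10.10, 10.11, 10.13]  # hypothetical repeated readings

mean = statistics.mean(readings)
accuracy_error = mean - true_value       # bias ~ +0.114 mm -> poor accuracy
precision = statistics.stdev(readings)   # spread ~ 0.011 mm -> high precision

print(f"bias = {accuracy_error:+.3f} mm, spread = {precision:.3f} mm")
```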
Important Terms in Metrology
Calibration – is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy.
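A minimal sketch of that comparison (the standard's reference points and the device readings below are hypothetical):

```python
# Compare a device under test against a calibration standard.
# Each pair is (reference value from the standard, device reading).
calibration_points = [(0.0, 0.1), (25.0, 25.3), (50.0, 50.4), (100.0, 100.9)]

for reference, reading in calibration_points:
    error = reading - reference
    print(f"at {reference:6.1f}: device reads {reading:6.1f}, error {error:+.1f}")
```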
Important Terms in Metrology
Tolerance – refers to the total allowable error within an item. This is typically represented as a +/- value off of a nominal specification. Products can become deformed due to changes in temperature and humidity, which lead to material expansion and contraction, or due to improper feedback from a process control device.
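A simple acceptance check against a nominal +/- tolerance (the dimension and limits are hypothetical):

```python
nominal = 50.00   # mm, hypothetical nominal dimension
tolerance = 0.05  # mm, allowable +/- deviation

def within_tolerance(measured: float) -> bool:
    """True if the measured value lies inside nominal +/- tolerance."""
    return abs(measured - nominal) <= tolerance

print(within_tolerance(50.03))  # True  (deviation +0.03 mm)
print(within_tolerance(50.08))  # False (deviation +0.08 mm)
```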
Overshoot – when an input is applied to an instrument, the pointer does not immediately come to rest at its steady state (or final deflected) position but goes beyond it, or in other words overshoots its steady position. The overshoot is evaluated as the maximum amount by which the moving system moves beyond the steady state position.
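A sketch of that evaluation over a recorded pointer trace (the sampled values are hypothetical):

```python
# Pointer positions sampled after a step input (hypothetical trace).
trace = [0.0, 6.2, 9.8, 11.4, 10.6, 10.1, 10.0, 10.0]
steady_state = trace[-1]               # final deflected position

# Maximum amount the moving system goes beyond the steady state position.
overshoot = max(trace) - steady_state  # 11.4 - 10.0 = 1.4
print(f"overshoot = {overshoot:.1f}")
```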
Accuracy of measurement
Accuracy is the ability of an instrument to respond to a true value of a measured variable under reference conditions.
In other words, accuracy is defined as the degree to which the measured value agrees with the true value.
Accuracy of measurement depends upon the following factors:
- Ability of the operator
- Variation of temperature
- Method adopted for measurement
- Deformation of the instrument
Factors affecting the accuracy of measurement:
1. Factors affecting the standard:
- e.g. coefficient of thermal expansion, stability with time, elastic properties
2. Factors affecting the workpiece:
- e.g. cleanliness, surface finish, elastic properties, supporting arrangement
3. Factors affecting the inherent characteristics of the instrument:
- scale error
- effect of friction, hysteresis, zero drift
- calibration errors
- repeatability and readability
4. Factors affecting the person:
- training and skill
- ability to select the measuring instruments and standards
5. Factors affecting the environment:
- temperature, humidity etc.
- clean surroundings and minimum vibration enhance precision
- temperature equalization between standard, workpiece and instrument
Sensitivity
Sensitivity is defined as the ratio of the magnitude of the response (output signal) to the magnitude of the quantity being measured (input signal).
If the calibration curve is linear, the sensitivity of the instrument is constant and equal to the slope of the calibration curve. If the calibration curve is not linear, the sensitivity varies with the input.
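A minimal sketch for the linear case, where sensitivity is the slope of the calibration curve (the calibration points are hypothetical):

```python
# Hypothetical calibration points: (input, output).
points = [(0.0, 0.0), (10.0, 2.0), (20.0, 4.0), (30.0, 6.0)]

# For a linear curve the sensitivity is the constant slope.
(x0, y0), (x1, y1) = points[0], points[-1]
sensitivity = (y1 - y0) / (x1 - x0)  # 6.0 / 30.0 = 0.2 output units per input unit
print(f"sensitivity = {sensitivity}")
```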
READABILITY
Readability refers to the ease with which the readings of a measuring instrument can be read.
DEFLECTION AND NULL TYPE
Deflection type: the measurement is produced by a physical effect generated by the measured quantity.
Null type: an equivalent opposing effect is produced to nullify the physical effect caused by the quantity.
ANALOG AND DIGITAL INSTRUMENTS
Analog: the physical variables of interest are represented in the form of continuous or stepless variations.
Digital: the physical variables are represented by digital quantities.
ACTIVE AND PASSIVE INSTRUMENTS
Active: instruments that require some source of auxiliary power.
Passive: instruments whose energy requirements are met entirely from the input signal.
Automatic and Manually Operated Instruments
Manually operated: requires the services of a human operator.
Automatic: does not require a human operator.
Contacting and Non-Contacting Instruments
Contacting: the instrument is in contact with the measuring medium.
Non-contacting: the instrument measures the desired input even though it is not in close contact with the measuring medium.
Absolute and Secondary Instruments
Absolute: these instruments give the value of the electrical quantity in terms of absolute quantities.
Secondary: the deflection of the instrument can be read directly as the measured value.
Intelligent Instruments
Microprocessors are incorporated into the measuring instrument.
Characteristics of Measuring Instrument
Sensitivity
Stability
Readability
Range of accuracy
Precision
Accuracy
Accuracy is the extent to which a measured value agrees with the true value.
The difference between the measured value and the true value is known as the 'error of measurement'.
Accuracy is the quality of conformity.
Precision
The precision of a measurement depends on the instrument used to measure it.
Performance of Instruments
All instrumentation systems are characterized by their system characteristics or system response.
There are two basic kinds of characteristics of measuring instruments:
- Static characteristics
- Dynamic characteristics
Static Characteristics
The static characteristics of an instrument include:
- Accuracy
- Precision
- True value
- Sensitivity
- Resolution
- Threshold
- Dead zone
- Backlash
- Hysteresis
- Linearity
Dynamic Characteristics
- Steady state periodic
- Transient
- Speed of response
- Measuring lag
- Fidelity
- Dynamic error

Steady state periodic – the magnitude of the quantity has a definite repeating time cycle.
Speed of response – the rapidity with which the system responds to changes in the measured quantity.
Measuring lag – the delay in the response of an instrument to a change in the measured quantity. It is of two types:
Retardation type: the response begins immediately after the change in the measured quantity.
Time delay type: the response begins after a dead time following the application of the input.
Fidelity – the degree to which a measurement system indicates changes in the measured quantity without error.
Dynamic error – the difference between the true value of the quantity changing with time and the value indicated by the measurement system.
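A small sketch of dynamic error evaluated at each sampling instant (the true and indicated values are hypothetical):

```python
# Hypothetical time series: true value vs. value indicated by the system.
true_values      = [20.0, 21.0, 22.0, 23.0, 24.0]
indicated_values = [20.0, 20.6, 21.5, 22.5, 23.6]

# Dynamic error at each instant: true value minus indicated value.
dynamic_error = [round(t - i, 2) for t, i in zip(true_values, indicated_values)]
print(dynamic_error)  # [0.0, 0.4, 0.5, 0.5, 0.4]
```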
Correction
Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error.
Example: vernier caliper, micrometer.
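A one-line worked sketch (the raw reading and the correction value are hypothetical):

```python
uncorrected = 25.012  # mm, raw instrument reading (hypothetical)
correction  = -0.012  # mm, determined during calibration (hypothetical)

corrected = uncorrected + correction  # algebraic addition -> 25.000 mm
print(f"corrected result = {corrected:.3f} mm")
```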
Least Count
The smallest value that can be measured by a measuring instrument is called its least count.
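For instance (a common vernier arrangement, used here as an assumed illustration), the least count of a vernier caliper follows from its scale divisions:

```python
# Least count of a typical vernier caliper (assumed illustrative values):
# one main scale division = 1 mm, and the vernier carries 50 divisions.
main_scale_division = 1.0  # mm
vernier_divisions   = 50

# Least count = one main scale division / number of vernier divisions.
least_count = main_scale_division / vernier_divisions  # 0.02 mm
print(f"least count = {least_count} mm")
```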
Thank you