Unit 1

Basics of Metrology

UNIT-1
BASICS OF METROLOGY

Introduction to Metrology – Need – Elements – Work piece, Instruments, Persons, Environment – their effect on Precision and Accuracy – Errors – Errors in Measurements – Types – Control – Types of standards.
Definition
Metrology is the name given to the science of pure measurement.
Engineering metrology is restricted to measurements of length and angle.
Measurement is defined as the process of numerical evaluation of a dimension, or the process of comparison with standard measuring instruments.
Importance of Metrology
Measurements have been carried out for as long as civilization has existed.
Metrology is basic to the economic and social development of a country.
It is concerned with providing accurate measurements which impact our economy, health, safety and general well-being.
METROLOGY
Metrology is a wide-reaching field, but can be summarized through three basic activities:
1. the definition of internationally accepted units of measurement,
2. the realisation of these units of measurement in practice, and
3. the application of chains of traceability (linking measurements to reference standards).
Types of Metrology
1. Scientific Metrology
2. Industrial Metrology
3. Legal Metrology
4. Dynamic Metrology
5. Deterministic Metrology
Types of Metrology
1. Scientific metrology deals with the establishment and maintenance of measurement standards and units of measurement, and the transfer of traceability to users.
2. Industrial metrology ensures the adequate functioning of measurement instruments used in industry, in production and in testing processes.
3. Legal metrology is concerned with measurements that affect the transparency of trade, health and safety, i.e., measurements subject to regulatory requirements.
Types of Metrology
4. Dynamic metrology is the technique of measuring small variations of a continuous nature.
The technique has proved very valuable, and a record of continuous measurement, over a surface for instance, has obvious advantages over individual measurements of an isolated character.
Types of Metrology
5. Deterministic metrology is a new philosophy in which part measurement is replaced by process measurement.
New techniques such as 3D error compensation by CNC (Computer Numerical Control) systems and expert systems are applied, leading to fully adaptive control.
This technology is used for very high precision manufacturing machinery and control systems to achieve micro technology and nanotechnology accuracies.
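
To make the idea behind 3D error compensation concrete, here is a minimal illustrative sketch in Python (not any particular CNC vendor's method; the error map values are invented): the controller stores position errors calibrated at grid points of the working volume and subtracts the interpolated error from each commanded position.

import numpy as np

# Hypothetical error map for one axis: X-position errors (mm) measured at the
# eight corners of a 1 m cube working volume (invented numbers).
error_map_x = np.array([[[0.000, 0.004],
                         [0.002, 0.006]],
                        [[0.001, 0.005],
                         [0.003, 0.008]]])

def compensate_x(x, y, z):
    """Return an X command corrected by trilinear interpolation of the map."""
    fx, fy, fz = x / 1000.0, y / 1000.0, z / 1000.0  # fractional position in cube
    e = 0.0
    for i, wi in ((0, 1 - fx), (1, fx)):
        for j, wj in ((0, 1 - fy), (1, fy)):
            for k, wk in ((0, 1 - fz), (1, fz)):
                e += wi * wj * wk * error_map_x[i, j, k]
    return x - e  # command slightly off-nominal so the real axis lands on target

print(compensate_x(500.0, 500.0, 500.0))  # 499.996375 mm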
Important Terms in Metrology
Span – the difference between the maximum and minimum scale values of an instrument.
In the case of a thermometer whose scale goes from −40°C to 100°C, the span is 140°C.
Range – the region between the lowest and highest readings the instrument can measure.
For the same thermometer, the range is −40°C to 100°C.
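
A one-line check of the distinction, in illustrative Python using the thermometer example above:

# Range endpoints of the thermometer scale (deg C).
low, high = -40.0, 100.0
span = high - low          # span is a single number: the scale extent
print(f"range: {low} to {high} degC, span: {span} degC")  # span: 140.0 degC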
Important Terms in Metrology
Instrument error refers to the error of a measuring instrument, or the difference between the actual value and the value indicated by the instrument. There can be errors of various types, and the overall error is the sum of the individual errors.
Accuracy is the ability of the instrument to measure the accurate value. In other words, it is the closeness of the measured value to a standard or true value.
Important Terms in Metrology
Sensitivity of an instrument is the change of output divided by the change of the measurand (the quantity being measured).
Stability deals with the degree to which instrument characteristics remain constant over time.
Scale interval – the difference between successive scale marks.
Important Terms in Metrology
Hysteresis – the difference between two separate measurements taken at the same point, the first taken during a series of increasing measurement values and the other during a series of decreasing measurement values.
Threshold – of a measuring instrument, the minimum value of input signal required to produce a detectable change in the output starting from zero. This is the minimum value below which no output change can be detected when the input is gradually increased from zero.
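
To illustrate how hysteresis can be quantified (a minimal sketch with invented up-scale and down-scale readings, not a standardized test procedure): record the indicated output at each input point once while increasing the input and once while decreasing it, then take the difference at each point.

# Up-scale and down-scale readings (invented values) at the same input points.
inputs     = [0, 25, 50, 75, 100]                  # applied input, % of full scale
up_scale   = [0.0, 24.6, 49.5, 74.4, 100.0]        # indicated while increasing
down_scale = [0.0, 25.4, 50.6, 75.5, 100.0]        # indicated while decreasing

# Hysteresis at each point is the up/down difference; the instrument's
# hysteresis error is usually reported as the largest of these.
hysteresis = [abs(d - u) for u, d in zip(up_scale, down_scale)]
print(max(hysteresis))   # 1.1 (% of full scale)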
Important Terms in Metrology
Backlash – in mechanical engineering, backlash, sometimes called lash or play, is a clearance or lost motion in a mechanism caused by gaps between the parts.
Important Terms in Metrology
Tolerance refers to the total allowable error within an item. This is typically represented as a +/− value off of a nominal specification. Products can become deformed due to changes in temperature and humidity, which lead to material expansion and contraction, or due to improper feedback from a process control device.
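
For instance, a simple acceptance check against a toleranced dimension (illustrative Python; the nominal and tolerance values are invented):

# Toleranced dimension: 25.00 mm +/- 0.05 mm (invented example values).
nominal   = 25.00   # mm
tolerance = 0.05    # mm, allowable deviation either side of nominal

def within_tolerance(measured_mm: float) -> bool:
    """True if the measured dimension lies inside nominal +/- tolerance."""
    return abs(measured_mm - nominal) <= tolerance

print(within_tolerance(25.03))  # True  - deviation 0.03 mm
print(within_tolerance(25.07))  # False - deviation 0.07 mm exceeds 0.05 mm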
Overshoot: When an input is applied to an instrument, the pointer does not immediately come to rest at its steady-state (or final deflected) position but goes beyond it, or in other words overshoots its steady position. The overshoot is evaluated as the maximum amount by which the moving system moves beyond the steady-state position.
Accuracy of measurement
• Accuracy is the ability of an instrument to respond to a true value of a measured variable under reference conditions, OR
• Accuracy is defined as the degree to which the measured value agrees with the true value.
• Practically it is very difficult to measure the true value, and therefore a set of observations is made whose mean value is taken as the true value of the quantity measured.
Accuracy of measurement depends upon the following factors:
- Ability of the operator
- Variation of temperature
- Method adopted for measurement
- Deformation of the instrument
PRECISION
Precision is defined as the ability of the instrument to reproduce a certain set of readings within a given accuracy, or it is the degree of exactness for which an instrument is designed or intended to perform.
It is composed of two characteristics:
Conformity – the error created by the limitation of the scale reading.
Significant figures – they convey the actual information regarding the magnitude and the measurement precision of a quantity.
Accuracy VS Precision
The figure shows the difference between the concepts of accuracy versus precision using a dartboard analogy with four different scenarios that contrast the two terms.
A: Three darts hit the target center and are very close together = high accuracy and precision.
B: Three darts hit the target center but are not very close together = high accuracy, low precision.
C: Three darts do not hit the target center but are very close together = low accuracy, high precision.
D: Three darts do not hit the target center and are not close together = low accuracy and precision.
Difference between accuracy and precision

Accuracy:
- Measure of rightness.
- How closely a measured value agrees with the correct value.
- Relates to the quality of a result.
- If the temperature outside is 28°C and a temperature sensor reads 28°C, then the sensor is accurate.

Precision:
- Measure of exactness.
- How closely individual measurements agree with each other.
- Relates to the quality of the operation by which the result is obtained.
- If, on several tests, the temperature sensor matches the actual temperature while the actual temperature is held constant, then the sensor is precise.
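
In numerical terms, accuracy is often summarized by how far the mean of repeated readings sits from the true value, and precision by the scatter (standard deviation) of those readings. A minimal illustration in Python with invented sensor readings:

import statistics

true_value = 28.0                                    # degC, reference temperature
readings   = [28.2, 27.9, 28.1, 28.0, 28.3]          # invented repeated readings

mean_reading = statistics.mean(readings)
accuracy_err = mean_reading - true_value             # closeness to the true value
precision    = statistics.stdev(readings)            # scatter of the readings

print(f"mean = {mean_reading:.2f} degC")             # 28.10
print(f"accuracy error = {accuracy_err:+.2f} degC")  # +0.10 (close to true value)
print(f"precision (std dev) = {precision:.2f} degC") # 0.16 (readings agree closely)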
EFFECTS OF ELEMENTS OF METROLOGY ON PRECISION AND ACCURACY
1. Factors affecting the standard:
- coefficient of thermal expansion
- calibration interval
- stability with time
- elastic properties
- geometric compatibility
2. Factors affecting the work piece:
- cleanliness, surface finish, surface defects etc.
- elastic properties
- hidden properties
- arrangement of supporting the workpiece
3. Factors affecting the inherent characteristics of the instrument:
- scale error
- effect of friction, hysteresis, zero drift
- calibration errors
- repeatability and readability
4. Factors affecting the person:
- training and skill
- ability to select the measuring instruments and standards
5. Factors affecting the environment:
- temperature, humidity etc.
- clean surroundings and minimum vibration enhance precision
- temperature equalization between standard, workpiece and instrument
Sensitivity
Sensitivity is defined as the ratio of the magnitude of the response (output signal) to the magnitude of the quantity being measured (input signal).
If the calibration curve is linear, the sensitivity of the instrument is the slope of the calibration curve.
If the calibration curve is not linear, then the sensitivity varies with the input.
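
For a linear instrument, the sensitivity can therefore be estimated as the slope of a straight line fitted to calibration data. A minimal Python sketch with invented calibration points:

import numpy as np

# Invented calibration data: applied pressure (kPa) vs. instrument output (mV).
inputs  = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])   # measurand
outputs = np.array([0.1, 10.2, 20.0, 30.3, 40.1, 50.2])    # indicated output

# Least-squares straight-line fit: slope = sensitivity, intercept = zero offset.
sensitivity, offset = np.polyfit(inputs, outputs, 1)
print(f"sensitivity ~= {sensitivity:.3f} mV/kPa")  # ~0.50 mV per kPa
print(f"zero offset  ~= {offset:.3f} mV")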
READABILITY
• Readability is defined as the ease with which readings may be taken with an instrument.
• It is the closeness with which the scale of an analog instrument can be read.
• Readability difficulties may often occur due to parallax errors when an observer is noting the position of a pointer on a calibrated scale.
REPEATABILITY
• It is the ability of the measuring instrument to repeat the same results for measurements of the same quantity, when the measurements are carried out
- by the same observer,
- with the same instrument,
- under the same conditions,
- without any change in location,
- without change in the method of measurement,
- and within short intervals of time.
• It may be expressed in terms of the dispersion of the results.
REPRODUCIBILITY
• Reproducibility is the closeness of the agreement between the results of measurements of the same quantity, when the individual measurements are carried out:
- by different observers,
- by different methods,
- using different instruments,
- under different conditions, locations, times etc.
• It may be expressed in terms of the dispersion of the results.
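
Since both repeatability and reproducibility are expressed in terms of the dispersion of the results, each can be quantified as the standard deviation of the corresponding set of readings. An illustrative Python sketch with invented data:

import statistics

# Invented repeated readings of the same 10.00 mm gauge block.
same_conditions  = [10.01, 10.00, 10.02, 10.01, 10.00]  # one observer, one
                                                         # instrument, short time
varied_conditions = [10.03, 9.98, 10.04, 9.97, 10.02]   # different observers,
                                                         # instruments, days

repeatability   = statistics.stdev(same_conditions)     # small:  ~0.008 mm
reproducibility = statistics.stdev(varied_conditions)   # larger: ~0.031 mm
print(repeatability, reproducibility)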
Calibration
Calibration is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy.
The outcome of the comparison can result in one of the following:
- no significant error being noted on the device under test
- a significant error being noted but no adjustment made
- an adjustment made to correct the error to an acceptable level
Calibration may be required for the following reasons:
- a new instrument
- after an instrument has been repaired or modified
- moving from one location to another
- when a specified time period has elapsed
- when a specified usage (operating hours) has elapsed
- before and/or after a critical measurement
- after an event, for example exposure to shock, vibration, or physical damage which might potentially have compromised the integrity of its calibration, or sudden changes in weather
- whenever observations appear questionable or instrument indications do not match the output of surrogate instruments
- as specified by a requirement, e.g., customer specification or instrument manufacturer
CALIBRATION
• The calibration of any measuring instrument is necessary to measure the quantity in terms of standard units.
• It is carried out by making adjustments such that the read-out device produces zero output for zero input.
Reasons for having instruments calibrated:
• To ensure readings from an instrument are consistent with other measurements
• To determine the accuracy of the instrument readings
• To establish the reliability of the instrument
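
A minimal sketch of the comparison step (illustrative Python; the standard values, acceptance limit, and readings are all invented): each reading of the device under test is compared with the known value of the calibration standard, and the result is classified according to the outcomes listed above.

# Invented calibration points: known standard value -> device-under-test reading.
calibration_points = [
    (10.000, 10.002),   # (standard value, DUT reading), mm
    (20.000, 20.006),
    (30.000, 30.011),
]
max_permissible_error = 0.005   # invented acceptance limit, mm

for standard, reading in calibration_points:
    error = reading - standard
    if abs(error) <= max_permissible_error:
        verdict = "no significant error"
    else:
        verdict = "significant error - adjust or correct"
    print(f"standard {standard:.3f}: error {error:+.3f} -> {verdict}")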
Interchangeability
A part which can be substituted for a component manufactured to the same shape and dimensions is known as an interchangeable part.
The operation of substituting a part for similar manufactured components of the same shape and dimensions is known as interchangeability.
Types of Measuring Instruments
- Deflection and null type instruments
- Analog and digital instruments
- Active and passive instruments
- Automatic and manually operated instruments
- Contacting and non-contacting instruments
- Absolute and secondary instruments
- Intelligent instruments
DEFLECTION AND NULL TYPE
Deflection type – the measured quantity generates a physical effect (such as a pointer deflection) that indicates its value.
Null type – an equivalent opposing effect is applied to nullify the physical effect caused by the quantity.
ANALOG AND DIGITAL INSTRUMENTS
Analog – the physical variable of interest is presented as a continuous or stepless variation.
Digital – the physical variable is represented by digital (discrete) quantities.
ACTIVE AND PASSIVE INSTRUMENTS
Active – instruments that require some source of auxiliary power.
Passive – the energy requirements of the instrument are met entirely from the input signal.
Automatic and Manually Operated Instruments
Manually operated – requires the service of a human operator.
Automatic – does not require a human operator.
Contacting and Non-Contacting Instruments
Contacting – the sensing element is in contact with the measuring medium.
Non-contacting – measure the desired input without being in close contact with the measuring medium.
Absolute and Secondary Instruments
Absolute – give the value of the electrical quantity in terms of absolute quantities.
Secondary – the deflection of the instrument can be read directly as the measured value.
Intelligent Instruments
Microprocessors are incorporated into the measuring instrument.
Characteristics of Measuring Instruments
- Sensitivity
- Stability
- Readability
- Range of accuracy
- Precision
Accuracy
Accuracy = the extent to which a measured value agrees with a true value.
The difference between the measured value and the true value is known as the 'error of measurement'.
Accuracy is the quality of conformity.
Precision
The precision of a measurement depends on the instrument used to measure it. For example, the length of the same block can be stated more precisely when it is read with a more finely graduated instrument.
Performance of Instruments
All instrumentation systems are characterized by the system characteristics or system response.
There are two basic characteristics of measuring instruments:
- Static characteristics
- Dynamic characteristics
Static Characteristics
The criteria defined for instruments which are used to measure quantities that are slowly varying with time, or mostly constant (i.e., do not vary with time), are called 'static characteristics'.
STATIC CHARACTERISTICS OF AN INSTRUMENT
- Accuracy
- Precision
- True value
- Sensitivity
- Resolution
- Threshold
- Drift
- Error
- Repeatability
- Reproducibility
- Dead zone
- Backlash
- Hysteresis
- Linearity
- Range or Span
- Bias
- Tolerance
- Stability
Dynamic Characteristics
The set of criteria defined for instruments which measure quantities that change rapidly with time is called 'dynamic characteristics'.
Dynamic Characteristics
- Steady state periodic
- Transient
- Speed of response
- Measuring lag
- Fidelity
- Dynamic error
Steady state periodic – the magnitude has a definite repeating time cycle.
Transient – the magnitude of the output does not have a definite repeating time cycle.
Speed of response – how quickly the system responds to changes in the measured quantity.
Measuring lag
Retardation type – the response begins immediately after the change in measured quantity.
Time delay type – the response begins after a dead time following the application of the input.
Fidelity – the degree to which a measurement system indicates changes in the measured quantity without error.
Dynamic error – the difference between the true value of the quantity changing with time and the value indicated by the measurement system.
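
To see measuring lag and dynamic error concretely, the following Python sketch simulates a hypothetical first-order instrument (a common simplified model, assumed here) tracking a ramping temperature: the indicated value trails the true value, and the gap between them is the dynamic error.

# First-order instrument model (an assumed simplification):
#   d(indicated)/dt = (true - indicated) / tau
tau = 2.0          # s, instrument time constant (invented)
dt  = 0.01         # s, simulation step
indicated = 0.0

for step in range(int(5.0 / dt)):          # simulate 5 seconds
    t = step * dt
    true_value = 10.0 * t                  # degC, true input ramps at 10 degC/s
    indicated += dt * (true_value - indicated) / tau

dynamic_error = true_value - indicated     # lag of the reading behind the input
print(f"t = 5 s: true {true_value:.1f}, indicated {indicated:.1f}, "
      f"dynamic error {dynamic_error:.1f} degC")
# For a ramp input, the dynamic error approaches rate * tau = 20 degC.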
Correction
Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error.
Ex: Vernier caliper, micrometer.
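
For example, a micrometer with a known zero error: the correction is the equal-and-opposite value added algebraically to each raw reading (illustrative Python, invented numbers):

# A micrometer that reads +0.02 mm with its anvils closed has a zero error
# of +0.02 mm; the correction is therefore -0.02 mm (invented values).
zero_error = +0.02            # mm, systematic error found at zero
correction = -zero_error      # mm, added algebraically to each reading

uncorrected = 12.37           # mm, raw reading on a workpiece
corrected = uncorrected + correction
print(f"{corrected:.2f} mm")  # 12.35 mm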
Least Count
The smallest value that can be measured by a measuring instrument is called its least count.
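
As a worked example (standard vernier arithmetic; the scale values are typical but invented here): the least count of a vernier caliper is one main-scale division minus one vernier-scale division.

# Typical metric vernier caliper (invented but common values):
main_scale_div = 1.0                      # mm, value of one main scale division
vernier_divisions = 50                    # vernier divides 49 mm into 50 parts
vernier_div = 49.0 / vernier_divisions    # mm, value of one vernier division

least_count = main_scale_div - vernier_div
print(f"least count = {least_count:.2f} mm")   # 0.02 mm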
Thank you
