
METROLOGY AND MEASUREMENTS

OBJECTIVE:

• To understand the basic principles of measurements.
• To learn the various linear and angular measuring equipment, their principles of operation and applications.
• To learn about various methods of measuring mechanical parameters.
ENGINEER
• Engineers are people who design, construct,
and maintain structures, materials and
systems while considering the limitations
imposed by practicality, regulation, safety, and
cost.
• The word engineer (Latin ingeniator) is
derived from the Latin words ingeniare ("to
contrive, devise") and ingenium
("cleverness").
ENGINEERING
• Engineering is the application of mathematics, as well as scientific, economic, social, and practical knowledge in order to invent, innovate, design, build, maintain, research, and improve structures, machines, tools, systems, components, materials, processes, solutions, and organizations.
MECHANICAL ENGINEERING
• Mechanical engineering is a diverse subject
that derives its breadth from the need to
design and manufacture everything from
small individual parts and devices (e.g.,
microscale sensors and inkjet printer nozzles)
to large systems (e.g., spacecraft and machine
tools).
INDEX

UNIT I : CONCEPT OF MEASUREMENT
UNIT II : LINEAR AND ANGULAR MEASUREMENT
UNIT III : FORM MEASUREMENT
UNIT IV : LASER AND ADVANCES IN METROLOGY
UNIT V : MEASUREMENT OF MECHANICAL PARAMETERS
UNIT I : CONCEPT OF MEASUREMENT
 General concept
 Generalized measurement system
 Units and standards
 Measuring instruments
 Sensitivity, readability, range of accuracy, precision
 Static and dynamic response, repeatability
 Systematic and random errors, correction
 Introduction to GD&T
 Calibration, interchangeability
General concept – Generalized measurement system:

Measurement is a comparison of a given quantity with one of its predetermined standard values adopted as a unit.

There are two important requirements of measurement:

The standards used for comparison must be accurate and internationally accepted.

The apparatus or instrument and the process used for comparison must be provable.
Need for measurement:

 To ensure that the part to be measured conforms to the established standard.
 To meet the interchangeability of manufacture.
 To provide customer satisfaction by ensuring that no faulty product reaches the customers.
 To coordinate the functions of quality control, production, procurement & other departments of the organization.
 To judge the possibility of making some of the defective parts acceptable after minor repairs.
Methods of Measurement:

Types of measurements: Primary, Secondary and Tertiary measurements.

1. Method of direct measurement:
The value of the quantity to be measured is obtained directly, without the necessity of carrying out supplementary calculations based on a functional dependence of the quantity to be measured in relation to the quantities actually measured.
Example: Weight of a substance is measured directly using a physical balance.

2. Method of indirect measurement:
The value of the quantity is obtained from measurements carried out by direct methods of measurement of other quantities, connected with the quantity to be measured by a known relationship.
Example: Weight of a substance is measured by measuring the length, breadth & height of the substance directly and then by using the relation Weight = Length x Breadth x Height x Density.
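The indirect method above can be sketched numerically. The Python snippet below is only an illustration; the dimensions and the density value are assumptions, and "Density" in the relation is taken here as weight density so that the product comes out directly in newtons.

# Indirect measurement: weight obtained from directly measured dimensions.
# All numbers are illustrative assumptions, not data from the text.
length = 0.20              # m, measured directly
breadth = 0.10             # m, measured directly
height = 0.05              # m, measured directly
weight_density = 76500.0   # N/m^3, assumed (approx. weight density of steel)

volume = length * breadth * height        # derived from the direct measurements
weight = volume * weight_density          # Weight = Length x Breadth x Height x Density
print(f"Indirectly measured weight: {weight:.2f} N")   # about 76.5 N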

3. Method of measurement by comparison:
Based on the comparison of the value of a quantity to be measured with a known value of the same quantity (direct comparison), or a known value of another quantity which is a function of the quantity to be measured (indirect comparison).

5. Method of fundamental measurement:
Based on the measurements of the base quantities entering into the definition of the quantity.

7. Method of measurement by transposition:
The value of the quantity to be measured is in the beginning balanced by a first known value A of the same quantity; then the value of the quantity to be measured is put in place of this known value and is again balanced by another known value B.
If the position of the element indicating equilibrium is the same in both cases, the value of the quantity measured is √(A · B); when A and B are equal, it is simply equal to that common value.
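A short derivation shows where this result comes from, assuming a balance whose two arm lengths l1 and l2 may differ slightly:

X * l1 = A * l2        (quantity X first balanced by the known value A)
B * l1 = X * l2        (X transposed to the other pan and balanced by the known value B)

Multiplying the two equations gives X^2 * l1 * l2 = A * B * l1 * l2, so X = sqrt(A * B). The arm-length ratio cancels, which is the point of the transposition.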

8. Method of measurement by complement:
The value of the quantity to be measured is complemented by a known value of the same quantity, selected in such a way that the sum of these two values is equal to a certain value of comparison fixed in advance.
Generalized Measurement System

• Sensor or transducer stage to detect the measurand and convert the input to a form suitable for processing, e.g. temperature to voltage, force to distance.
• Signal conditioning stage to modify the transduced signal, e.g. amplification, attenuation, filtering, encoding.
• Terminating readout stage to present the desired output in analog or digital form.
Generalized measuring system:

The generalized measuring system consists of the following common elements:

1. Primary sensing element
2. Variable conversion element
3. Variable manipulation element
4. Data transmission element
5. Data processing element
6. Data presentation element
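A minimal Python sketch of such a chain is given below; the element names follow the list above, but the thermocouple-style numbers (0.04 mV/degC sensor output, amplifier gain of 100) are assumptions chosen only for illustration.

# Hypothetical generalized measurement chain (temperature -> voltage -> display).
def primary_sensing(temperature_c):
    # Sensor/transducer: convert temperature (degC) into a small voltage (mV); 0.04 mV/degC assumed.
    return 0.04 * temperature_c

def signal_conditioning(millivolts):
    # Variable conversion / manipulation: amplify the transduced signal (assumed gain of 100).
    return 100.0 * millivolts

def data_presentation(conditioned_mv):
    # Data processing / presentation: scale back to engineering units and display.
    indicated_c = conditioned_mv / (0.04 * 100.0)
    print(f"Indicated temperature: {indicated_c:.1f} degC")

signal = primary_sensing(85.0)        # measurand: 85 degC (assumed)
signal = signal_conditioning(signal)  # data transmission is implicit in this sketch
data_presentation(signal)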
Classification of Standards:

 Line & End Standards: In a line standard, the length is the distance between the centres of two engraved lines, whereas in an end standard it is the distance between the end faces of the standard.
Example: a measuring scale is a line standard; a slip (block) gauge is an end standard.

Units:
Fundamental units, e.g. metre, kilogram
Supplementary units, e.g. radian, steradian
Derived units, e.g. area, volume
Standards:

 Primary
 Secondary
 Tertiary
 Working standards

 Primary standard:
It is the only one material standard. It is preserved under the most careful conditions and is used only for comparison with the secondary standard.

 Secondary standard:
It is similar to the primary standard as nearly as possible. It is distributed to a number of places for safe custody and is used for occasional comparison with tertiary standards.

 Tertiary standard:
It is used for reference purposes in laboratories and workshops and is used for comparison with working standards.

 Working standard:
It is used daily in laboratories and workshops. Lower grades of material may be used.
Measuring instruments:

A broad classification of instruments, based on the application, mode of operation, manner of energy conversion and nature of the output signal, is given below:

1. Deflection and null type instruments
2. Analog and digital instruments
3. Active and passive instruments
4. Automatic and manually operated instruments
5. Contacting and non-contacting instruments
6. Absolute and secondary instruments
7. Intelligent instruments
1. Deflection and null type instruments

2. Analog and digital instruments
Analog: the output varies continuously as the quantity being measured varies, e.g. a pressure gauge.
Digital: the output varies in discrete steps.

3. Active and passive instruments

4. Automatic and manually operated instruments
Important terms
• Sensitivity: change in output signal / change in input signal.
• Readability: closeness with which the scale of an instrument can be read.
• Range of accuracy: may be expressed as a percentage of the full-scale reading or as a percentage of the true value.
• Precision.
• Characteristics: conformity, significant figures.
• Accuracy vs precision.
Performance of instruments:

All instrumentation systems are characterized by their system characteristics or system response. These are of two basic kinds, static and dynamic: if the instrument is required to measure a condition that does not vary with time, the static characteristics apply, while for the measurement of a time-varying process variable the dynamic characteristics are more important.

Static response:

The static characteristics of an instrument are those considered when the instrument is used to measure an unvarying process condition.
Dynamic response:

The behaviour of an instrument under time-varying input and output conditions is called the dynamic response of the instrument. The analysis of such dynamic response is called dynamic analysis of the measurement system.

Dynamic quantities are of two types:

 Steady-state periodic
 Transient
Types of inputs
• Step input: the input rises to its full value at the start and remains constant throughout the analysis.
• Ramp input: the input varies linearly with time.
• Impulse input: the input is zero everywhere except at t = 0, where it is maximum.
• Sinusoidal input: the input is in the form of a sine wave.
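These standard test inputs can be generated numerically; the NumPy sketch below uses an arbitrary amplitude, time base and sine frequency, purely for illustration.

import numpy as np

t = np.linspace(0.0, 1.0, 1001)             # time base: 0 to 1 s (assumed)
A = 1.0                                     # assumed amplitude

step = A * np.ones_like(t)                  # step: full value from t = 0 onwards, constant
ramp = A * t                                # ramp: varies linearly with time
impulse = np.zeros_like(t)                  # impulse: zero everywhere ...
impulse[0] = A                              # ... except at t = 0, where it is maximum
sinusoid = A * np.sin(2 * np.pi * 5.0 * t)  # sinusoidal: 5 Hz sine wave (assumed frequency)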

• Uncertainty:
A doubt about the exactness of the measurement result.
Terms in Measurement:

Sensitivity:

Sensitivity of an instrument is defined as the ratio of the magnitude of the output signal to the magnitude of the input signal. It denotes the smallest change in the measured variable to which the instrument responds.

Sensitivity has no unique unit; its units depend on the instrument or measuring system.
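For example, if a hypothetical dial indicator deflects 25 scale divisions for a 0.05 mm change in input, its sensitivity follows directly from the ratio above (both numbers are assumed):

# Sensitivity = change in output signal / change in input signal; units depend on the instrument.
change_in_output = 25.0   # scale divisions (assumed)
change_in_input = 0.05    # mm (assumed)
sensitivity = change_in_output / change_in_input
print(f"Sensitivity = {sensitivity:.0f} divisions per mm")   # 500 divisions per mm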
Readability:

Readability is a term frequently used in analog measurement; it depends on both the instrument and the observer.

Readability is defined as the closeness with which the scale of an analog instrument can be read. It is the susceptibility of a measuring instrument to having its indications converted to a meaningful number, and it implies the ease with which observations can be made accurately.

For better readability, the instrument scale should be as large and clearly graduated as possible.
Accuracy:

Accuracy may be defined as the ability of an instrument to respond to a true value of a measured variable under the reference conditions. It refers to how closely the measured value agrees with the true value.

Precision:

Precision is defined as the degree of exactness for which an instrument is designed or intended to perform. It refers to the repeatability or consistency of measurement when measurements are carried out under identical conditions at short intervals of time.

It can also be defined as the ability of the instrument to reproduce a group of measurements of the same measured quantity under the same conditions.
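The distinction can be illustrated numerically: closeness of the mean to the true value indicates accuracy, while the scatter of repeated readings indicates precision. The readings below are invented for the example.

import statistics

true_value = 10.000                              # mm, assumed true value
readings = [10.02, 10.03, 10.02, 10.01, 10.03]   # repeated readings (invented)

bias = statistics.mean(readings) - true_value    # systematic offset -> reflects (in)accuracy
spread = statistics.stdev(readings)              # repeatability -> reflects precision

print(f"Bias = {bias:+.3f} mm, standard deviation = {spread:.3f} mm")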
Correction:
Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error.

Calibration:

Calibration is the process of determining and adjusting an instrument's accuracy to make sure it is within the manufacturer's specifications.

It is the process of determining the values of the quantity being measured corresponding to a pre-established arbitrary scale; in effect, it is the measurement of the measuring instrument. The quantity to be measured is the "input" to the measuring instrument. The "input" affects some parameter which is the "output" and is read out. The amount of "output" is governed by that of the "input". Before we can read any instrument, a "scale" must be framed for the "output" by successive application of some already standardized input signals. This process is known as "calibration".
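As a rough sketch of that process, the snippet below applies a set of known standard inputs, records the (invented) instrument outputs, and fits a straight-line scale with NumPy so that later readings can be converted back to the measured quantity.

import numpy as np

standard_inputs = np.array([0.0, 10.0, 20.0, 30.0, 40.0])      # known standard values (assumed)
instrument_outputs = np.array([0.1, 10.3, 20.2, 30.4, 40.5])   # observed readings (invented)

# Fit a straight-line calibration: input ~ gain * output + offset
gain, offset = np.polyfit(instrument_outputs, standard_inputs, 1)

raw_reading = 25.0
calibrated_value = gain * raw_reading + offset
print(f"A reading of {raw_reading} corresponds to {calibrated_value:.2f} units")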

Interchangeability:

A part which can be substituted for a component manufactured to the same shape and dimensions is known as an interchangeable part. The operation of substituting a part for similarly manufactured components of the same shape and dimensions is known as interchangeability.

• Uses of interchangeability:
 Replacement of worn parts is easy.
 Less maintenance is required.
 Repairs can be carried out easily.

International standards are used to obtain universal acceptance in two ways:
1. Universal (or full) interchangeability.
2. Selective assembly.
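A toy Python sketch of selective assembly is given below; the size groups and measured part sizes are assumptions chosen only to show the idea of grading parts before mating them.

# Selective assembly: grade parts into size groups; mating parts are then picked from matching groups.
group_limits = [(24.990, 24.994), (24.994, 24.998), (24.998, 25.002)]   # mm, assumed group boundaries

def grade(size_mm):
    # Return the index of the size group a part falls into, or None if it is out of tolerance.
    for index, (low, high) in enumerate(group_limits):
        if low <= size_mm < high:
            return index
    return None

shaft_sizes = [24.991, 24.995, 25.001, 24.989]      # invented measured sizes, mm
print([grade(size) for size in shaft_sizes])        # -> [0, 1, 2, None]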
Types of Errors:
Error = Measured Value - True Value

Errors in Measurement:

I. Absolute Error:
a. True absolute error = measured value - true value
b. Apparent absolute error = one result of measurement - arithmetic mean of the results
II. Relative Error = absolute error / (true value or arithmetic mean)
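Written out numerically (all values invented for illustration):

import statistics

true_value = 50.000                               # mm, assumed true value
results = [50.002, 50.004, 50.001, 50.003]        # repeated measurement results (invented)
single_result = results[0]

true_absolute_error = single_result - true_value                     # measured value - true value
apparent_absolute_error = single_result - statistics.mean(results)   # one result - arithmetic mean
relative_error = true_absolute_error / true_value                    # absolute error / true value

print(true_absolute_error, apparent_absolute_error, relative_error)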
Types of Errors:
1. Static errors: errors that do not vary with time.
a. Characteristic errors, e.g. linearity, repeatability
b. Reading errors, e.g. parallax error, interpolation error
c. Environmental errors
2. Loading errors: the measured quantity loses energy due to the act of measurement.
Example: flow rate cannot be measured accurately while steam is flowing through a nozzle.
3. Dynamic errors: errors which vary with time.
A) Errors of measurement:
1) Systematic error:
i. Calibration errors
ii. Ambient errors
iii. Avoidable errors, e.g. parallax, non-alignment of workpiece centres, improper holding of instruments
iv. Stylus pressure errors

2) Random error:
Examples: errors due to displacement of joints, due to friction, variation in the position of settings, etc.

3) Parasitic error:
It is the error, often gross, which results from incorrect execution of the measurement.
B) Instrumental errors:

1) Error of a physical measure:
It is the difference between the nominal value and the conventional true value reproduced by the physical measure.

2) Error of a measuring mechanism:
It is the difference between the value indicated by the measuring mechanism and the conventional true value of the measured quantity.

3) Zero error:
It is the indication of a measuring instrument for the zero value of the quantity measured.

4) Calibration error of a physical measure:
It is the difference between the conventional true value reproduced by the physical measure and the nominal value of that measure.

5) Complementary error of a measuring instrument:
It is the error of a measuring instrument arising from the fact that the values of the influence quantities are different from those corresponding to the reference conditions.

6) Error of indication of a measuring instrument:
It is the difference between the measured values of a quantity, when an influence quantity takes successively two specified values, without changing the quantity measured.

7) Error due to temperature:
It is the error arising from the fact that the temperature of the instrument does not maintain its reference value.
8) Error due to friction:
It is the error due to friction between the moving parts of the measuring instrument.

9) Error due to inertia:
It is the error due to the inertia (mechanical, thermal or otherwise) of the parts of the measuring instrument.

C) Errors of observation:
1) Reading error: It is the error of observation resulting from incorrect reading of the indication of a measuring instrument by the observer.
2) Parallax error: It is the reading error which is produced when, with the index at a certain distance from the surface of the scale, the reading is not made in the direction of observation provided for the instrument used.
3) Interpolation error: It is the reading error resulting from the inexact evaluation of the position of the index with regard to the two adjacent graduation marks between which the index is located.

D) Based on the nature of errors:
1) Systematic error
2) Random error

E) Based on control:
1) Controllable errors: The sources of error are known and it is possible to have control over these sources. These can be calibration errors, environmental errors, and errors due to non-similarity of conditions while calibrating and measuring.

Calibration errors:
These are caused by variation of the calibrated scale from its nominal value. The actual length of standards such as slip gauges will vary from the nominal value by a small amount, which causes an error of constant magnitude.

Environmental (ambient / atmospheric condition) errors:

International agreement has been reached on the ambient conditions: 20 °C temperature, 760 mm of Hg barometric pressure and 10 mm of Hg vapour pressure (humidity). Instruments are calibrated at these conditions. If there is any variation in the ambient conditions, errors may creep into the final results. Of the three, the temperature effect is the most significant.
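As an illustration of the temperature effect, the error in a steel length standard used away from 20 °C can be estimated from the usual linear-expansion relation; the expansion coefficient below is an assumed typical value for steel.

# Thermal error estimate: delta_L = L * alpha * (T - 20 degC); all values assumed.
length_mm = 100.0           # nominal length of the standard
alpha_per_degc = 11.5e-6    # coefficient of linear expansion, per degC (assumed for steel)
temperature_c = 27.0        # actual ambient temperature

delta_length_mm = length_mm * alpha_per_degc * (temperature_c - 20.0)
print(f"Expansion error: {delta_length_mm * 1000:.2f} micrometres")   # about 8.05 um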

Stylus pressure errors:

Though the pressure involved during measurement is generally small, it is sufficient to cause appreciable deformation of both the stylus and the workpiece. This will cause an error in the measurement.

Avoidable errors:
These errors may occur due to parallax in the reading of measuring instruments. This occurs when the scale and the pointer are separated relative to one another.

The two common practices to minimize this error are:

 Reduce the separation between the scale and the pointer to a minimum.
 Place a mirror behind the pointer to ensure normal reading of the scale in all cases.

These avoidable errors also occur due to non-alignment of workpiece centres, improper location of measuring instruments, etc.
