Instrumentation and Measurement Lecture-1
UNIT-I-INTRODUCTION
CONTENTS
Measurement
Basic requirements
Significance of measurement
Methods of measurement
Instrument and measurement systems
Evolution of instruments
Classification of Instruments
Types of Instrumentation system
Elements of generalized measurement system
Functional elements of an instrument
Static and dynamic characteristics
Errors in measurement
Statistical evaluation of measurement data
Standards
Calibration
1. Measurement:
Measurement is essentially the act of comparing an unknown quantity with a predefined, accepted standard.
2. Basic Requirements:
The standard used for comparison purposes must be accurately defined and
should be commonly accepted.
The apparatus used and the method adopted must be provable (verifiable).
3. Significance of Measurement
4. Methods of Measurement
Direct Methods (the unknown quantity is compared directly against a standard)
Indirect Methods (the unknown quantity is computed from other, directly measured quantities)
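As a minimal sketch of an indirect method (the values are illustrative, not from the lecture): resistance is rarely read directly; it is deduced from a measured voltage and a measured current via Ohm's law, R = V / I.

```python
def indirect_resistance(voltage_v, current_a):
    """Deduce resistance (ohms) from a voltage reading and a current reading.

    This is an indirect measurement: resistance itself is never compared
    against a standard; it is computed from two directly measured quantities.
    """
    return voltage_v / current_a

# Hypothetical meter readings: 12.0 V across the element, 0.5 A through it.
print(indirect_resistance(12.0, 0.5))  # -> 24.0 (ohms)
```

Measuring a length with a metre scale, by contrast, is a direct method: the unknown is compared against the standard itself.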
6. Evolution of Instruments
Mechanical
Electrical
Electronic Instruments.
MECHANICAL: These instruments are very reliable for static and stable conditions.
Their disadvantage is that they are unable to respond rapidly to measurements of
dynamic and transient conditions.
ELECTRICAL: Electrical instruments indicate the output more rapidly than
mechanical methods, but they still depend on the mechanical movement of a meter.
The response time is 0.5 to 24 seconds.
ELECTRONIC: Electronic instruments are more reliable than the other systems. They use
semiconductor devices, so even weak signals can be detected.
7. Classification of Instruments
Absolute Instruments.
Secondary Instruments.
ABSOLUTE: These instruments give the magnitude of the quantity under
measurement in terms of the physical constants of the instrument.
SECONDARY: These instruments are calibrated by the comparison with absolute
instruments which have already been calibrated.
Secondary instruments are further classified as
Deflection Type Instruments
Null Type Instruments.
The functions of instruments and measurement systems can be classified into three
categories. They are:
i) Indicating function.
ii) Recording function.
iii) Controlling function.
The applications of measurement systems are:
i) Monitoring of process and operation.
ii) Control of processes and operation.
iii) Experimental engineering analysis.
Intelligent Instrumentation (data has been refined for the purpose of presentation )
Dumb Instrumentation (data must be processed by the observer)
A generalized measurement system consists of three stages:
Detector-Transducer Stage -> Intermediate Stage -> Terminating Stage
Static Characteristics
An application involving measurement of quantities that are either constant or vary
slowly with time is known as static. The main static characteristics are:
Accuracy
Drift
Dead Zone
Static Error
Sensitivity
Reproducibility
Static Correction
Scale Range
Scale Span
Noise
Dead Time
Hysteresis
Linearity
ACCURACY: It is the closeness with which an instrument reading approaches the true
value of the quantity being measured.
TRUE VALUE: The true value of a quantity may be defined as the average of an infinite
number of measured values.
SENSITIVITY: It is defined as the ratio of the magnitude of the output response to
the magnitude of the input.
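The sensitivity ratio can be sketched as follows; the thermocouple figure used here (4 mV output change for a 100 °C input change) is a hypothetical example, not a value from the lecture.

```python
def sensitivity(delta_output, delta_input):
    """Static sensitivity: change in output per unit change in input."""
    return delta_output / delta_input

# Hypothetical thermocouple: a 4.0 mV output change for a 100.0 degC input change.
print(sensitivity(4.0, 100.0))  # -> 0.04 (mV per degC)
```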
STATIC ERROR: It is defined as the difference between the measured value and the true
value of the quantity.
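The static-error definition can be sketched numerically (the readings below are illustrative, not from the lecture); the static correction, also listed among the static characteristics, is simply the negative of the error.

```python
def static_error(measured, true_value):
    """Static error: measured value minus true value."""
    return measured - true_value

def static_correction(measured, true_value):
    """Static correction: the amount to add to the reading to obtain the true value."""
    return true_value - measured

# Hypothetical readings: instrument shows 101.5 when the true value is 100.0.
m, t = 101.5, 100.0
print(static_error(m, t))       # -> 1.5
print(static_correction(m, t))  # -> -1.5
```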
Reproducibility is specified in terms of scale readings over a given period of time.
Drift is an undesirable quality in industrial instruments because it is rarely apparent
and cannot be easily compensated for.
It is classified as
Zero drift
Span drift or sensitivity drift
Zonal drift.
Noise
A spurious current or voltage extraneous to the current or voltage of interest in an
electrical or electronic circuit is called noise.
Dynamic Characteristics
Speed of response
Measuring lag
Fidelity
Dynamic error
Random (Residual) Errors
These are also known as residual errors. These errors are due to a multitude
of small factors which change or fluctuate from one measurement to another. The
happenings or disturbances about which we are unaware are lumped together and called
"Random" or "Residual"; hence the errors caused by them are called random or residual
errors.
Arithmetic Mean
The most probable value of a measured variable is the arithmetic mean of the number of
readings taken.
Deviation
Deviation is the departure of an observed reading from the arithmetic mean of the group
of readings.
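These statistical quantities can be sketched together (the five readings below are hypothetical): the arithmetic mean of the readings, each reading's deviation from that mean, and the standard deviation taken as the square root of the mean squared deviation.

```python
import math

# Hypothetical repeated readings of the same quantity.
readings = [100.2, 99.8, 100.5, 99.9, 100.1]

# Arithmetic mean: the most probable value of the measured variable.
mean = sum(readings) / len(readings)

# Deviation: departure of each observed reading from the arithmetic mean.
deviations = [r - mean for r in readings]

# Standard deviation: square root of the mean of the squared deviations
# (the divisor n - 1 is often preferred for small finite samples).
std_dev = math.sqrt(sum(d * d for d in deviations) / len(readings))

print(round(mean, 2))     # -> 100.1
print(round(std_dev, 3))  # -> 0.245
```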
Standard Deviation
The standard deviation of an infinite number of data is defined as the square root of the
sum of the individual deviations squared, divided by the number of readings.
15. Calibration
Calibration is the process of checking instruments against a known standard and
subsequently finding their errors and accuracy.
The calibration procedure involves a comparison of the particular instrument with
a primary standard.
16. Standards
A standard is a physical representation of a unit of measurement. The term
'standard' is applied to a piece of equipment having a known measure of a physical
quantity.
Types of Standards
International Standards (defined based on international agreement)