BITS Pilani: INSTR F311: Electronic Instruments and Instrumentation Technology
INTRODUCTION
The measurement of any quantity plays a very important role, not only in science but in all branches of engineering, in medicine, and in almost all day-to-day human activities.
The measurement of a given parameter or quantity is the act of quantitative comparison between a predefined standard and the unknown quantity to be measured.
If the measurand is an electrical or electronic quantity, the measurement is called an electrical or electronic measurement.
The major problem with any measuring instrument is the error. Hence, it is necessary to select the
appropriate measuring instrument and measurement procedure which minimizes the error.
Error is defined as the difference between the value measured by the instrument and the true (actual) value of the quantity.
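As a quick illustration of this definition (the voltage values below are assumed, not from the text), absolute and percentage error can be computed directly:

```python
def absolute_error(measured, true_value):
    """Error = measured value - true (actual) value."""
    return measured - true_value

def percent_error(measured, true_value):
    """Error expressed as a percentage of the true value."""
    return abs(measured - true_value) / abs(true_value) * 100

# Assumed example: a voltmeter reads 5.1 V when the true voltage is 5.0 V.
err = absolute_error(5.1, 5.0)   # about 0.1 V
pct = percent_error(5.1, 5.0)    # about 2 % of the true value
```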
Calibration is the process of making an adjustment or marking a scale so that the readings of an instrument
agree with the accepted certified standard.
Calibration guarantees that the instrument operates with the required accuracy under the stipulated environmental conditions.
International standards: defined by international agreement and maintained at the international level; they are not available to ordinary users for calibration.
Primary standards: absolute standards maintained by national standards laboratories in different countries; used to verify and calibrate secondary standards.
Secondary standards: reference standards used in industrial measurement laboratories. Checked locally against reference standards available in that area.
Working standards: the principal tools of a measurement laboratory, used to check and calibrate the instruments in everyday use.
1. Most physical quantities can be converted into electrical or electronic signals by transducers.
2. An electrical or electronic signal can be amplified, filtered, multiplexed, sampled, and measured.
3. The measurement can easily be obtained in, or converted into, digital form for automatic analysis and recording.
4. Measured signals can be transmitted over long distances through cables or radio links without loss of information.
5. Electronic circuits can detect and amplify very weak signals and can measure events of very short duration as well.
6. Electronic measurement makes it possible to work with both analog and digital signals; digital signals are essential for computers, on which modern developments in science and technology depend.
7. Higher sensitivity, low power consumption, and a high degree of reliability are important features of electronic instruments and measurements. For any measurement, however, a well-defined set of standards and calibration units is essential.
[Figure: block diagram of a generalized measurement system, ending in a data presentation element whose reading is taken by the observer]
Primary sensing element
The first element in any measuring system (measuring instrument) is the primary sensing element: this
gives an output that is a function of the measurand (the input applied to it).
For most, but not all, instruments this function is at least approximately linear. Some examples of primary sensing elements are a liquid-in-glass thermometer, a thermocouple, and a strain gauge.
In the case of a mercury-in-glass thermometer, because the output reading is given in terms of the level of
the mercury, this particular primary sensor is also a complete measurement system in itself.
Variable conversion elements are needed where the output variable of a primary transducer is in an
inconvenient form and has to be converted to a more convenient form.
For instance, the force-measuring strain gauge has an output in the form of a varying resistance. Because
the resistance change cannot be measured easily, it is converted to a change in voltage by a bridge circuit,
which is a typical example of a variable conversion element.
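A rough sketch of how a bridge circuit turns a resistance change into a voltage (the supply voltage, arm arrangement, and 120 Ω gauge values below are assumed for illustration):

```python
def bridge_output(v_supply, r1, r2, r3, r4):
    """Output voltage of a Wheatstone bridge: the difference between
    the midpoints of the two voltage dividers (arms r1-r2 and r3-r4)."""
    return v_supply * (r2 / (r1 + r2) - r4 / (r3 + r4))

# Balanced bridge (all arms 120 ohm): output is zero.
balanced = bridge_output(5.0, 120.0, 120.0, 120.0, 120.0)

# Strain changes one gauge arm by 0.5 ohm: a small but easily
# measurable voltage (a few millivolts) appears at the output.
unbalanced = bridge_output(5.0, 120.0, 120.5, 120.0, 120.0)
```

The hard-to-measure resistance change is thus converted into a voltage that an amplifier or meter can handle directly.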
A variable manipulation element (signal processing element) exists to improve the quality of the output of a measurement system in some way.
A very common type of signal processing element is the electronic amplifier, which amplifies the output of
the primary transducer or variable conversion element, thus improving the sensitivity and resolution of
measurement.
The signal transmission element has traditionally consisted of single or multicore cable, which is often screened to minimize signal corruption by induced electrical noise. However, fiber-optic cables are being used in ever-increasing numbers in modern installations because of their low transmission loss and immunity to external electrical and magnetic fields.
Types of instruments
Instruments are divided into active or passive ones according to whether instrument output is produced
entirely by the quantity being measured or whether the quantity being measured simply modulates the
magnitude of some external power source.
In a petrol-tank level indicator, for example, the change in petrol level moves a potentiometer arm, and the output signal consists of a proportion of the external voltage source applied across the two ends of the potentiometer.
The energy in the output signal comes from the external power source; the float system merely modulates the value of the voltage from this external power source.
A pressure gauge is a good example of a deflection type of instrument, where the value of the quantity being measured is displayed in terms of the amount of movement of a pointer.
In a null-type instrument, such as the deadweight pressure gauge, weights are put on top of a piston until the downward force balances the fluid pressure. Weights are added until the piston reaches a datum level, known as the null point, and the pressure measurement is made in terms of the value of the weights needed to reach this null position.
An analogue instrument gives an output that varies continuously as the quantity being measured changes.
The output can have an infinite number of values within the range that the instrument is designed to
measure.
The pointer can therefore be in an infinite number of positions within its range of movement; however, the number of different positions that the eye can discriminate between is strictly limited, depending on how large the scale is and how finely it is divided.
A digital instrument has an output that varies in discrete steps and so can only have a finite number of
values. A cam is attached to the revolving body whose motion is being measured, and on each
revolution the cam opens and closes a switch.
The switching operations are counted by an electronic counter. This system can only count whole
revolutions and cannot discriminate any motion that is less than a full revolution.
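The counting behaviour described above can be mimicked in a few lines (a sketch, not the actual counter hardware): only whole revolutions register, and any fraction of a turn is invisible to the counter.

```python
def counted_revolutions(total_angle_deg):
    """One switch closure is counted per complete 360-degree revolution
    of the cam; fractional motion below a full turn is not registered."""
    return int(total_angle_deg // 360)

counted_revolutions(725)   # 2 whole revolutions; the remaining 5 degrees are lost
counted_revolutions(359)   # 0 - less than one full turn produces no count
```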
Revolution counter
BITS Pilani, Pilani Campus
Indicating vs Signal output
Instruments that have a signal-type output are used commonly as part of automatic control systems. In
other circumstances, they can also be found in measurement systems where the output measurement signal
is recorded in some way for later use.
Usually, the measurement signal involved is an electrical voltage, but it can take other forms in some
systems, such as an electrical current, an optical signal, or a pneumatic signal.
Bathroom scale
Smart sensor
A SMART instrument/sensor typically offers features such as:
- Self-diagnosis
- Energy harvesting
- Adjustment for non-linearity
Weighing balance: passive, null type, analog, indicating, non-smart
Weighing scale: passive, deflection type, analog, indicating, non-smart
Contd..
First instrument (shown in slide figure): passive, indicating and deflection type, analog, non-smart
Second instrument (shown in slide figure): active, instrument with signal output, analog, non-smart
Contact vs non contact
Digital thermometer (contact): active, indicating, digital, non-smart
Infrared thermometer (non-contact): active, indicating, digital, smart
The accuracy of an instrument is a measure of how close the output reading of the instrument is to the
true value.
Precision is a term that describes an instrument’s degree of freedom from random errors. If a large
number of readings are taken of the same quantity by a high-precision instrument, then the spread of
readings will be very small.
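Precision can be visualised as the spread of repeated readings (the six readings below are invented for illustration):

```python
import statistics

# Six assumed repeated readings of the same nominally 10 V quantity.
readings = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03]

mean_reading = statistics.mean(readings)
spread = statistics.stdev(readings)  # sample standard deviation of the readings

# A high-precision instrument gives a small spread, regardless of whether
# mean_reading is close to the true value (closeness to truth is accuracy).
```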
Tolerance is a term that is closely related to accuracy and defines the maximum error that is to be
expected in some value.
Although tolerance is not, strictly speaking, a static characteristic of measuring instruments, it is mentioned here because the accuracy of some instruments is sometimes quoted as a tolerance value.
Repeatability describes the closeness of output readings when the same input is applied repetitively
over a short period of time, with the same measurement conditions, same instrument and observer,
same location, and same conditions of use maintained throughout.
Reproducibility describes the closeness of output readings for the same input when there are changes in
the method of measurement, observer, measuring instrument, location, conditions of use, and time of
measurement.
The range or span of an instrument defines the minimum and maximum values of a quantity that the
instrument is designed to measure.
It is normally desirable that the output reading of an instrument is linearly proportional to the quantity being measured (linearity).
Nonlinearity is then defined as the maximum deviation of any of the output readings from the idealized straight line, and is usually expressed as a percentage of the full-scale reading.
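A sketch of how maximum nonlinearity could be computed from calibration data (the input/output values are assumed), taking the idealized straight line through the two endpoints:

```python
inputs  = [0.0, 25.0, 50.0, 75.0, 100.0]   # applied quantity
outputs = [0.0, 25.5, 51.0, 75.5, 100.0]   # instrument readings (assumed)

full_scale = outputs[-1]
slope = (outputs[-1] - outputs[0]) / (inputs[-1] - inputs[0])

# Deviation of each reading from the idealized straight line
deviations = [abs(y - (outputs[0] + slope * x)) for x, y in zip(inputs, outputs)]

# Maximum deviation expressed as a percentage of full-scale reading
nonlinearity_pct = max(deviations) / full_scale * 100
```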
Contd..
Sensitivity is defined as the ratio of the change in the magnitude of the output to the corresponding change in the magnitude of the input quantity being measured.
If the input to an instrument is increased gradually from zero, the input will have to reach a certain
minimum level before the change in the instrument output reading is of a large enough magnitude to
be detectable. This minimum level of input is known as the threshold of the instrument.
The minimum magnitude change in the input measured quantity that produces an observable change
in the instrument output is called resolution.
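For instance (calibration points assumed for a hypothetical thermocouple-like sensor), sensitivity is just the output change divided by the input change:

```python
# Assumed: output rises from 4.0 mV at 100 degC to 8.0 mV at 200 degC.
delta_output = 8.0 - 4.0      # change in output, mV
delta_input = 200.0 - 100.0   # change in input, degC

sensitivity = delta_output / delta_input  # mV per degC
```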
Zero drift or bias describes the effect where the zero reading of an instrument is modified by a
change in ambient conditions.
Sensitivity drift (also known as scale factor drift) defines the amount by which an instrument’s
sensitivity of measurement varies as ambient conditions change.
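One way to see the combined effect of the two drifts (the drift figures below are assumed) is to correct an indicated reading for a known zero drift and sensitivity drift:

```python
def corrected_reading(indicated, zero_drift, sensitivity_drift_pct):
    """Remove the zero offset, then rescale for the drifted sensitivity."""
    return (indicated - zero_drift) / (1 + sensitivity_drift_pct / 100)

# Assumed: zero drift of +0.2 units and sensitivity drift of +2 %.
corrected_reading(10.4, 0.2, 2.0)  # recovers a value close to 10.0
```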
If the input measured quantity to the instrument is increased steadily from a negative value, the output
reading varies in the manner shown in curve A. If the input variable is then decreased steadily, the output
varies in the manner shown in curve B. The non-coincidence between these loading and unloading curves is known as hysteresis.
Two voltmeters (A and B) have a full-scale accuracy of ±5%. Voltmeter A has a range of 0-1 V and B has a range of 0-10 V. Which voltmeter is more suitable for measuring a reading of 0.9 V?
Voltmeter A:
  error = (5/100) × 1 V = 0.05 V
  error percentage = (0.05 / 0.9) × 100 = 5.56%
Voltmeter B:
  error = (5/100) × 10 V = 0.5 V
  error percentage = (0.5 / 0.9) × 100 = 55.56%
Since its worst-case error at 0.9 V is an order of magnitude smaller, Voltmeter A is the more suitable instrument.
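The arithmetic of this example can be checked with a short script (the function name is just for illustration):

```python
def reading_error_pct(fs_accuracy_pct, fs_range, reading):
    """Worst-case error at a given reading, for an accuracy
    quoted as a percentage of full scale."""
    abs_error = fs_accuracy_pct / 100 * fs_range   # volts
    return abs_error / reading * 100               # % of the actual reading

err_a = reading_error_pct(5, 1, 0.9)    # about 5.6 %
err_b = reading_error_pct(5, 10, 0.9)   # about 55.6 %
# Voltmeter A is the better choice for a 0.9 V reading.
```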
Errors
Errors, or uncertainties, are inevitable in measurements.
Note that here "errors" means random errors; systematic errors should always be avoided.
Types of errors
- Gross errors
- Systematic errors
- Random errors
Gross errors- These mainly cover human mistakes in reading the instrument and in recording and calculating the measurement results.
Environmental errors- These errors (a class of systematic errors) are due to conditions external to the measuring device: effects of temperature, pressure, humidity, dust, vibrations, external electrical noise, external magnetic fields, electrostatic fields, etc.
Random errors- These errors are unpredictable and occur even when all systematic errors are accounted for, i.e. when the instrument is operated in a controlled environment and is accurately calibrated before measurement.
Contd..
Random errors show up as slight variations in the readings over a period of observation.
They can, however, be reduced by taking a larger number of readings and applying statistical methods to obtain the best approximation of the true value.
Due to the unpredictability of random errors, any error bounds placed on measurements can only be
quantified in probabilistic terms.
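A small simulation (the noise level and seed are chosen arbitrarily) illustrates how averaging many readings pulls the estimate toward the true value:

```python
import random
import statistics

random.seed(0)       # fixed seed so the sketch is repeatable
TRUE_VALUE = 5.0
NOISE_SD = 0.1       # assumed standard deviation of the random error

def averaged_measurement(n):
    """Mean of n simulated noisy readings of TRUE_VALUE."""
    readings = [TRUE_VALUE + random.gauss(0, NOISE_SD) for _ in range(n)]
    return statistics.mean(readings)

few = averaged_measurement(5)
many = averaged_measurement(500)
# With high probability, `many` lies much closer to TRUE_VALUE than `few`:
# the standard error of the mean shrinks roughly as 1/sqrt(n).
```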
Errors: random Vs systematic