Physical Quantity - Data

The document defines key terminology used in instrumentation and measurement systems. It discusses physical quantities, data, information, parameters, measurands, calibration, transducers, sensors, actuators, signals, and instrumentation. A typical measurement system architecture involves a sensor or transducer converting a physical quantity to an electrical signal, signal conditioning, analog-to-digital conversion, and processing with a computer controller. Instrumentation provides indicating, recording, and controlling functions, with advantages such as high sensitivity and the ability to monitor remote signals.

Terminology
• Physical quantity: a variable such as pressure, temperature, mass, length, etc.
• Data: information obtained from the instrumentation/measurement system as a result of the measurements made of the physical quantities.
• Information: data that has a calibrated numeric relationship to the physical quantity.
• Parameter: a physical quantity within defined (numeric) limits.
• Measurand: the physical quantity being measured.
• Calibration: implies that there is a numeric relationship throughout the whole instrumentation system and that it is directly related to an approved national or international standard.
• Test instrumentation: a branch of instrumentation most closely associated with the task of gathering data during the various development phases encountered in engineering, e.g. flight test instrumentation for testing and approving aircraft.
Terminology
• Signal
– Any physical quantity that varies in time (or in any other independent variable) and contains information
• Continuous
• Discrete (in amplitude and time)
• Electrical signal (voltage or current loop)
– Analog – continuous
– Digital – quantized

• Transducer: a device that converts one form of energy to another.
• Electronic transducer: a transducer whose input or output is electrical in nature (e.g., voltage, current or resistance).
• Sensor: an electronic transducer that converts a physical quantity into an electrical signal.
• Actuator: an electronic transducer that converts electrical energy into mechanical energy.
INTRODUCTION
• Instrumentation is the technology of measurement, which serves science, engineering, medicine, etc.

• By definition, measurement:
 Is the process of determining the amount, degree or capacity by comparison with the accepted standards of the system of units being used, OR
 A method of obtaining information regarding the physical values of the variable.

• An instrument is a device for determining the value or magnitude of a quantity or variable.

• An electronic instrument is based on electrical or electronic principles for its measurement functions.

• Units of measurement define physical quantities in type and magnitude.

• A unit of measurement may be defined as the standard measure of each kind of physical quantity.
• Efforts were made to standardise systems of measurement so that instrument professionals and specialists in other disciplines could communicate among themselves.
• Two types of units are used in science and engineering:
– Fundamental units (or quantities)
• E.g. meter (length), kilogram (mass), second (time)
– Derived units (or quantities), i.e. all units which can be expressed in terms of fundamental units
• E.g. The volume of a substance is proportional to its length (l), breadth (b) and height (h), or V = l × b × h.
• So, the derived unit of volume (V) is the cubic meter (m³) (see the sketch below).
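As a small, hedged illustration of a derived quantity being built from fundamental units (the dimensions below are assumed, not taken from the slides):

```python
# Volume as a derived quantity: V = l * b * h, with the derived unit m^3.
# The specific dimensions below are illustrative assumptions.
length_m = 2.0   # l, in meters (fundamental unit of length)
breadth_m = 1.5  # b, in meters
height_m = 0.5   # h, in meters

volume_m3 = length_m * breadth_m * height_m  # derived quantity
print(f"V = {volume_m3} m^3")  # the derived unit is the cubic meter
```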

Quantity                    Unit              Symbol

Fundamental (Basic) Units
Length                      Meter             m
Mass                        Kilogram          kg
Time                        Second            s
Electric current            Ampere            A
Thermodynamic temperature   Kelvin            K
Luminous intensity          Candela           cd
Quantity of substance       Mole              mol

Supplementary Units
Plane angle                 Radian            rad
Solid angle                 Steradian         sr

Derived Units
Area                        Square meter      m²
Volume                      Cubic meter       m³
Velocity                    Meter per second  m/s

• The foot-pound-second (F.P.S.) system is also used for:
– Length
– Mass
– Time

• A standard serves as a physical representation of a unit of measurement.
• It is used for obtaining the values of the physical properties of other equipment by comparison methods; e.g.
– The fundamental unit of mass in the SI system is the kilogram, originally defined as the mass of a cubic decimeter of water at its temperature of maximum density of 4 °C.
• Direct comparison
– Easy to do but… less accurate
• e.g. to measure a steel bar
• Indirect comparison
– A calibrated system consisting of several devices to convert, process (amplification or filtering) and display the output
• e.g. to measure force from strain gages located in a structure (see the sketch below)
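For instance, the chain from strain-gage voltage to force might look like the following minimal Python sketch; the gage factor, excitation voltage, Young's modulus, cross-sectional area and bridge output value are all illustrative assumptions, not values from the slides:

```python
# Indirect measurement of force from strain (illustrative assumptions only).
GAGE_FACTOR = 2.0          # typical metallic strain gage
V_EXCITATION = 5.0         # bridge excitation voltage, volts
E_STEEL = 200e9            # Young's modulus of steel, Pa
AREA = 1e-4                # cross-sectional area of the member, m^2

def strain_from_quarter_bridge(v_out: float) -> float:
    """Convert quarter-bridge output voltage to strain (small-strain approximation)."""
    return 4.0 * v_out / (GAGE_FACTOR * V_EXCITATION)

def force_from_strain(strain: float) -> float:
    """Axial force from strain: F = E * strain * A (linear elasticity)."""
    return E_STEEL * strain * AREA

v_out = 1.25e-3                        # measured bridge output, volts (example value)
eps = strain_from_quarter_bridge(v_out)
print(f"strain = {eps:.3e}, force = {force_from_strain(eps):.1f} N")
```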

ELECTRONIC INSTRUMENT
• Basic elements of an electronic instrument:
Transducer → Signal Modifier → Indicating Device
1) Transducer
- converts a non-electrical signal into an electrical signal
- e.g.: a pressure sensor detects pressure and converts it to an electrical signal for display at a remote gauge.
2) Signal modifier
- converts the input signal into a signal suitable for the indicating device
3) Indicating device
- indicates the value of the quantity being measured

General Structure of a Measuring System

• Stage 1: A detection-transducer or sensor-transducer stage; e.g. Bourdon tube
• Stage 2: A signal-conditioning stage; e.g. gearing, filters, bridges
• Stage 3: A terminating or readout-recording stage; e.g. printers, oscilloscope

• Active instruments
– the quantity being measured simply modulates (adjusts) the magnitude of some external power source.
• Passive instruments
– the instrument output is entirely produced by the quantity being measured.

• One difference between active and passive instruments is the level of measurement resolution that can be obtained.
Instrumentation Examples
• Every engineering discipline uses electrical
instrumentation to collect and analyze data.
• The following examples are illustrative of the
different types of sensors and instrumentation
that different engineering disciplines use.

• e.g. Float-type fuel tank level indicator


(Figure: the indicator circuit is excited by an external power source – a battery)
• The change in petrol level moves a potentiometer
arm, and the output signal consists of a proportion of
the external voltage source applied across the two
ends of the potentiometer.
• The energy in the output signal comes from the
external power source: the primary transducer float
system is merely modulating the value of the voltage
from this external power source.
• e.g. Pressure-measuring device

• The pressure of the fluid is translated into a movement of a pointer against a scale.
• The energy expended in moving the pointer is derived entirely from the change in the measured pressure: there are no other energy inputs to the system.

• An analogue instrument gives an output that varies continuously with the quantity being measured and can take an infinite number of values; e.g. a deflection-type pressure gauge.

• A digital instrument has an output that varies in discrete steps and can take only a finite number of values; e.g. a revolution counter (see the sketch below).
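To make the analog/digital distinction concrete, here is a minimal Python sketch that quantizes a continuously varying reading into the discrete steps a digital instrument would display; the step size and the test signal are illustrative assumptions:

```python
import math

# A digital instrument can only output discrete steps; an analog one
# follows the quantity continuously. The step size below is an assumption.
STEP = 0.01  # smallest displayable increment, e.g. 0.01 V

def digital_reading(true_value: float) -> float:
    """Round the continuously varying input to the nearest display step."""
    return round(true_value / STEP) * STEP

for t in range(5):
    analog = 1.0 + 0.013 * math.sin(t)   # continuously varying 'true' signal
    print(f"analog = {analog:.5f}  ->  digital display = {digital_reading(analog):.2f}")
```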

Further examples (figures): strain measurements, non-destructive testing, and automotive sensors – accelerometer, oxygen sensor, airflow sensor, CO sensor, oil pressure and water temperature sensors.
FUNCTION AND ADVANTAGES
• The 3 basic functions of instrumentation:
– Indicating – visualize the process/operation
– Recording – observe and save the measurement readings
– Controlling – control the measurement and the process

• Advantages of electronic measurement
– High sensitivity rating – through the use of amplifiers
– Increased input impedance – thus lower loading effects
– Ability to monitor remote signals
Typical Measurement System Architecture

(Block diagram) Process or Test → Sensor or Transducer → Amp → Signal Conditioner → ADC Converter → PC Controller (processing, computation and data storage) … and control over the process or experiment. Noise and interference act on the whole measurement chain. (Annotation on the diagram: “OUR TOPIC IS HERE”.) A code sketch of this chain follows.
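The following Python sketch mirrors this chain for a single sample – sensing, amplification/conditioning with added noise, and A/D conversion. Every numeric value (sensor scale factor, gain, ADC range and resolution, noise level) is an illustrative assumption, not a value from the slides:

```python
import random

# Illustrative parameters: a 12-bit ADC with a 0-5 V input range,
# an amplifier gain of 100 and additive Gaussian noise.
ADC_BITS = 12
V_REF = 5.0
GAIN = 100.0
NOISE_RMS = 0.002  # volts, models noise and interference on the path

def sensor(physical_quantity: float) -> float:
    """Transducer: physical quantity -> small electrical signal (assumed 10 mV per unit)."""
    return 0.010 * physical_quantity

def condition(v: float) -> float:
    """Amplify and clamp the signal into the ADC input range."""
    v = GAIN * v + random.gauss(0.0, NOISE_RMS)  # noise/interference enters here
    return min(max(v, 0.0), V_REF)

def adc(v: float) -> int:
    """Quantize the conditioned voltage to an integer code."""
    return round(v / V_REF * (2**ADC_BITS - 1))

code = adc(condition(sensor(3.2)))             # e.g. measuring '3.2 units'
print(f"ADC code = {code}, reconstructed = {code / (2**ADC_BITS - 1) * V_REF:.3f} V")
```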

PERFORMANCE CHARACTERISTICS
• Performance characteristics – characteristics that show the performance of an instrument.
– E.g.: accuracy, precision, resolution, sensitivity.
• Allows users to select the most suitable instrument for a specific measuring job.
• Two basic characteristics:
– Static – measuring a constant process condition.
– Dynamic – measuring a varying process condition.
PERFORMANCE CHARACTERISTICS
• Accuracy – the degree of exactness (closeness) of a measurement compared to the expected (desired) value.
• Resolution – the smallest change in a measured variable to which an instrument will respond.
• Precision – a measure of the consistency or repeatability of measurements, i.e. successive readings do not differ.
• Sensitivity – the ratio of the change in the output (response) of the instrument to a change in the input or measured variable.
• Expected value – the design value, or the most probable value that we expect to obtain.
• Error – the deviation of the measured value from the expected (true) value.
(A short numeric sketch of these terms follows.)
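The sketch below puts numbers to these terms; the expected value, readings and sensitivity figures are assumed for illustration only:

```python
# Assumed data: expected (true) value and a set of repeated readings.
expected = 100.0
readings = [99.2, 100.4, 99.8, 100.1, 99.7]

mean = sum(readings) / len(readings)

# Error and relative accuracy of the mean reading
error = mean - expected
relative_accuracy = 1 - abs(error) / expected

# Precision of the last reading relative to the mean of the set
precision = 1 - abs(readings[-1] - mean) / mean

# Sensitivity: change in output per change in input (assumed example)
d_output = 5.0   # e.g. mV of output change
d_input = 2.0    # e.g. degC of input change
sensitivity = d_output / d_input  # 2.5 mV/degC

print(f"error = {error:+.2f}, relative accuracy = {relative_accuracy:.4f}")
print(f"precision (last reading) = {precision:.4f}, sensitivity = {sensitivity} mV/degC")
```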
• Accuracy, Precision, Resolution, and Significant
Figures
– Accuracy (A) and Precision
• A measurement accuracy of 1% defines how close the measurement is to the actual measured quantity.
• Precision is not the same as measurement accuracy, but the two are related.
Measurement precision depends on the smallest change that can be observed in the measured quantity. For the digital voltmeter in the (omitted) figure, a 1 mV change will be indicated on the display; for the analog instrument, 50 mV is the smallest change that can be noted.
a) If the measured quantity increases or decreases by 1 mV, the reading becomes 8.936 V or 8.934 V respectively. Therefore, the voltage is measured with a precision of 1 mV.
b) The pointer position can be read to within one-fourth of the smallest scale division. Since the smallest scale division represents 0.2 V, one-fourth of a division is 50 mV.
(A small check of these figures is sketched below.)
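A quick check of the numbers above (the 8.935 V display reading is inferred from the stated 8.936 V / 8.934 V results):

```python
# Digital voltmeter: display resolves 1 mV, so a reading of 8.935 V
# (inferred from the text) moves to 8.936 V or 8.934 V for a +/- 1 mV change.
reading = 8.935
step = 0.001
print(f"{reading + step:.3f} V  /  {reading - step:.3f} V")   # 8.936 / 8.934

# Analog meter: smallest scale division is 0.2 V, readable to a quarter division.
division = 0.2
precision = division / 4
print(f"analog precision = {precision * 1000:.0f} mV")        # 50 mV
```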
 Resolution
The measurement precision of an instrument defines the smallest change in the measured quantity that can be observed. This smallest observable change is the resolution of the instrument.
 Significant Figures
The number of significant figures indicates the precision of the measurement.

Instrument ‘loading’ effect: Some measuring instruments depend for their operation on power taken from the circuit in which measurements are being made. Depending on the ‘loading’ effect of the instrument (i.e. the current taken to enable it to operate), the prevailing circuit conditions may change.
The resistance of a voltmeter may be calculated since each has a stated sensitivity (or ‘figure of merit’), often stated in ‘kΩ per volt’ of f.s.d. A voltmeter should have as high a resistance as possible (ideally infinite).
In a.c. circuits the impedance of the instrument varies with frequency, and thus the loading effect of the instrument can change.

Example:
Calculate the power dissipated by the voltmeter and by resistor R in Figure 10.9 when (a) R = 250 Ω, (b) R = 2 MΩ. Assume that the voltmeter sensitivity (sometimes called figure of merit) is 10 kΩ/V.
2. Systematic Error: due to shortcomings of the instrument (such as defective or worn parts, ageing, or effects of the environment on the instrument).
• In general, systematic errors can be subdivided into static and dynamic errors.
– Static – caused by limitations of the measuring device or the physical laws governing its behavior.
– Dynamic – caused by the instrument not responding fast enough to follow the changes in a measured variable.

- 3 types of systematic error:
(i) Instrumental error
(ii) Environmental error
(iii) Observational error
Types of static error
(i) Instrumental error
- inherent in the measuring instrument because of its mechanical structure (e.g.: in a D’Arsonval meter, friction in the bearings of various moving components, irregular spring tension, stretching of the spring, etc.)
- the error can be avoided by:
(a) selecting a suitable instrument for the particular measurement application
(b) applying a correction factor by determining the instrumental error
(c) calibrating the instrument against a standard
(ii) Environmental error
- due to external conditions affecting the measurement, including surrounding-area conditions such as changes in temperature, humidity, barometric pressure, etc.
- to avoid the error:
(a) use air conditioning
(b) seal certain components in the instrument
(c) use magnetic shields

(iii) Observational error
- introduced by the observer
- most common: parallax error and estimation error (while reading the scale)
- e.g.: an observer who tends to hold his head too far to the left while reading the position of the needle on the scale.
3) Random error
- due to unknown causes; occurs even after all systematic errors have been accounted for
- an accumulation of small effects; significant when a high degree of accuracy is required
- can be reduced by (see the sketch below):
(a) increasing the number of readings
(b) using statistical means to obtain the best approximation of the true value
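A minimal Python sketch of point (b), using assumed repeated readings to estimate the true value and the spread caused by random error:

```python
import statistics

# Assumed repeated readings of the same quantity (illustrative values only);
# random error scatters them around the true value.
readings = [10.03, 9.98, 10.01, 9.97, 10.02, 10.00, 9.99, 10.04]

best_estimate = statistics.mean(readings)          # best approximation of the true value
spread = statistics.stdev(readings)                # sample standard deviation
std_error = spread / len(readings) ** 0.5          # shrinks as the number of readings grows

print(f"mean = {best_estimate:.3f}, std dev = {spread:.3f}, "
      f"standard error of the mean = {std_error:.3f}")
```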
Systematic Errors versus Random Errors
• Systematic errors
– Instrumental errors: friction, zero positioning
– Environmental errors: temperature, humidity, pressure
– Observational errors
• Random errors
Dynamic Characteristics
• Dynamic – measuring a varying process condition.
• Instruments rarely respond instantaneously to changes in the
measured variables due to such things as mass, thermal
capacitance, fluid capacitance or electrical capacitance.
• Pure delay in time is often encountered where the instrument
waits for some reaction to take place.
• Such industrial instruments are nearly always used for
measuring quantities that fluctuate with time.
• Therefore, the dynamic and transient behavior of the
instrument is important.

Dynamic Characteristics
• The dynamic behavior of an instrument is determined by subjecting its primary element (sensing element) to some known and predetermined variations in the measured quantity.
• The three most common variations in the measured
quantity:
– Step change
– Linear change
– Sinusoidal change

Dynamic Characteristics
• Step change-in which the primary element is subjected to an
instantaneous and finite change in measured variable.
• Linear change-in which the primary element is following the
measured variable, changing linearly with time.
• Sinusoidal change-in which the primary element follows a
measured variable, the magnitude of which changes in
accordance with a sinusoidal function of constant amplitude.
• The dynamic performance characteristics of an
instrument are:

Dynamic Characteristics
– Speed of response – the rapidity with which an instrument responds to changes in the measured quantity.
– Dynamic error-The difference between the true and
measured value with no static error.
– Lag – delay in the response of an instrument to changes in
the measured variable.
– Fidelity – the degree to which an instrument indicates the changes in the measured variable without dynamic error (faithful reproduction). A simple first-order illustration of these characteristics is sketched below.
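The sketch assumes a first-order instrument (a model not stated in the slides) with time constant TAU responding to a step change, and reports the indicated value and dynamic error at a few instants:

```python
import math

# Assumed first-order instrument model: the output approaches a step input
# exponentially with time constant TAU. TAU and the step size are
# illustrative choices, not values from the slides.
TAU = 0.5          # seconds
STEP = 10.0        # size of the step change in the measured quantity

def response(t: float) -> float:
    """Indicated value at time t after a step from 0 to STEP."""
    return STEP * (1.0 - math.exp(-t / TAU))

for t in (0.5, 1.0, 2.0):
    indicated = response(t)
    dynamic_error = STEP - indicated        # true value minus indicated value
    print(f"t = {t:.1f} s: indicated = {indicated:.2f}, dynamic error = {dynamic_error:.2f}")
```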

Standard
• A standard is a known, accurate measure of a physical quantity.
• Standards are used to determine the values of other physical quantities by the comparison method.
• All standards are preserved at the International Bureau of Weights and Measures (BIPM), Paris.
• Four categories of standard:
– International Standard
– Primary Standard
– Secondary Standard
– Working Standard
Standard
• International Std
– Defined by international agreement
– Represents the closest possible accuracy attainable with current science and technology

• Primary Std
– Maintained at the National Standards Lab (different for every country)
– Function: the calibration and verification of secondary standards
– Each lab has its own secondary standards, which are periodically checked and certified by the National Standards Lab.
– For example, in Malaysia this function is carried out by SIRIM.
Standard
• Secondary Standard
– Secondary standards are basic reference standards used by measurement and
calibration laboratories in industries.
– Each industry has its own secondary standard.
– Each laboratory periodically sends its secondary standard to the National
standards laboratory for calibration and comparison against the primary
standard.
– After comparison and calibration, the National Standards Laboratory returns the secondary standards to the particular industrial laboratory with a certification of measuring accuracy in terms of the primary standard.

Standard
• Working Std
– Used to check and calibrate laboratory instruments for accuracy and performance.
– For example, manufacturers of electronic components such as capacitors,
resistors and many more use a standard called a working standard for checking
the component values being manufactured.

INSTRUMENT APPLICATION GUIDE

• Selection, care and use of the instrument:
 Before using an instrument, students should be thoroughly familiar with its operation – read the manual carefully.
 Select an instrument that provides the degree of accuracy required (accuracy + resolution + cost).
 Before using any selected instrument, inspect it for any physical problems.
 Before connecting the instrument to the circuit, make sure the ‘function switch’ and the ‘range selector switch’ have been set to the proper function and range.
INSTRUMENT APPLICATION GUIDE
(Figures: Analog Multimeter; Digital Multimeter)

LECTURE REVIEW
• Define the terms accuracy, error, precision, resolution, expected value and
sensitivity.
• State three major categories of error.
• A person using an ohmmeter reads the measured value as 470 ohm when
the actual value is 47 ohm. What kind of error does this represent?
• State the classifications of standards.
• What are primary standards? Where are they used?
• What is the difference between secondary standards and working
standards?
• State the three basic elements of an electronic instrument.
