
Fundamentals of Industrial Instrumentation and Process Measurement

Measurement Terminology

What is measurement?

A measurement is a comparison of a physical quantity to some definite standard or reference of the same dimension, called a unit.

Process variables
A process variable is a physical or chemical property, quantity or other condition that can be changed.
The main process variables to be controlled are:
Temperature, pressure, flow, level and chemical properties (pH, density, viscosity, ...)

Types of measurement
Static
Static measurements are made while the fluid is at rest, usually contained in a vessel.

Dynamic
Dynamic measurements are typically made while the fluid is flowing in a conduit or pipe, and tend to be more automated.

Basic measuring instrument


Sensor: gives an output that is a function of the measured variable (the input applied to it); it converts the magnitude of the parameter into a mechanical or electrical signal.

Transducer: converts the output signal of the sensor into a signal that can be used easily (the sensor and the transducer are often combined in the same device).

Amplifier: increases the process signal to a usable magnitude; signal conditioning also occurs at the amplifier (improving the sensitivity and resolution of the measurement).

Transmitter: filters out induced noise (signal processing) and transmits data from one instrument component to another when the components are physically separated.

Indicator: displays the process variable signal being measured.
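As a rough sketch of how these stages fit together, the chain below models each component as a function in Python. The thermocouple coefficient, amplifier gain and 4-20 mA scaling are illustrative assumptions, not values from these slides; here the sensor and transducer are treated as one device, as noted above.

```python
# Minimal measurement-chain sketch: sensor -> amplifier -> transmitter -> indicator.
# All numeric constants are invented for illustration.

def sensor(temperature_c):
    """Thermocouple-like sensor: temperature in, small voltage out (volts)."""
    return 41e-6 * temperature_c          # ~41 uV/degC, roughly a type-K junction

def amplifier(volts, gain=100.0):
    """Raises the weak sensor signal to a usable magnitude."""
    return volts * gain

def transmitter(volts, v_zero=0.0, v_full=4.1):
    """Scales the conditioned signal onto a 4-20 mA current loop."""
    fraction = (volts - v_zero) / (v_full - v_zero)
    return 4.0 + 16.0 * fraction          # 4 mA = bottom of range, 20 mA = top

def indicator(milliamps):
    """Displays the process variable implied by the loop current."""
    percent = (milliamps - 4.0) / 16.0 * 100.0
    print(f"{milliamps:.2f} mA  ({percent:.1f}% of span)")

indicator(transmitter(amplifier(sensor(temperature_c=500.0))))   # 12.00 mA (50.0% of span)
```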

The selection and specification of all instruments for process and utility applications includes:
Selection of the accuracy and range of the instrument to suit the process requirements.
Evaluation of the fluid properties, such as phase, viscosity, corrosiveness, erosiveness, toxicity, temperature and pressure, during normal and abnormal conditions.
Consideration of the maintenance aspects.
Purchase cost.

Instrument types and performance characteristics

Instrument types

Active and passive instruments


Active (direct): the instrument output is entirely produced by the quantity being measured.

Passive (inferred): the quantity being measured simply modulates the magnitude of some external power source.

Null-type and deflection-type instruments


Deflection type: the quantity to be measured produces an effect, either in the form of a voltage or a current. This effect is then utilized to produce a torque that causes a mechanical deflection.

Null type: the quantity to be measured produces an effect that is compared with an already calibrated effect of another system.

Analogue and digital instruments


Analogue instrument: gives an output that varies continuously as the quantity being measured changes. The output can have an infinite number of values within the range that the instrument is designed to measure.

Digital instrument: has an output that varies in discrete steps and so can only have a finite number of values.

Indicating instruments and instruments with a signal output

Instruments that have a signal-type output are commonly used as part of automatic control systems.

The class of indicating instruments normally includes all null-type instruments and most passive ones. Indicators can also be further divided into those that have an analogue output and those that have a digital display.

Smart and non-smart instruments

The advent of the microprocessor has created a new division in instruments between those that incorporate a microprocessor (smart) and those that do not (non-smart).

Static characteristics of instruments

Accuracy and inaccuracy (measurement uncertainty)


The accuracy of an instrument is a measure of how close the output reading of the instrument is to the correct value. Inaccuracy is the extent to which a reading might be wrong, and is often quoted as a percentage of the full-scale reading of an instrument. Accuracy can also be expressed as the percentage of span, percentage of reading, or an absolute value.
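For example, an inaccuracy quoted as a percentage of full-scale reading translates into a fixed absolute error band across the whole scale, which becomes a larger fraction of the reading at the low end. The pressure gauge figures below are invented for illustration:

```python
# Hypothetical pressure gauge: range 0-10 bar, inaccuracy +/-1.0% of full scale.
full_scale = 10.0        # bar
inaccuracy_pct = 1.0     # percent of full-scale reading

max_error = full_scale * inaccuracy_pct / 100.0   # +/-0.1 bar anywhere on the scale
print(f"Maximum expected error: +/-{max_error} bar")

# At a reading of 1 bar, this same +/-0.1 bar band is 10% of the reading,
# which is why instruments should be chosen so readings fall high on the scale.
reading = 1.0
print(f"Error as % of reading at {reading} bar: {max_error / reading * 100:.0f}%")
```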

Precision/repeatability/reproducibility

Precision is a term that describes an instrument's degree of freedom from random errors.
Precision refers to the limits within which a signal can be read, and may be somewhat subjective.

Repeatability: describes the closeness of output readings when the same input is applied repetitively over a short period of time, with the same measurement conditions, same instrument and observer, same location and same conditions of use maintained throughout.

Reproducibility: describes the closeness of output readings for the same input when there are changes in the method of measurement, observer, measuring instrument, location, conditions of use and time of measurement.

The degree of repeatability or reproducibility in measurements from an instrument is an alternative way of expressing its precision.
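As a sketch of how repeatability might be quantified in practice, the spread (for example the sample standard deviation) of repeated readings of the same input can be computed. The readings below are invented:

```python
import statistics

# Ten repeated readings of the same input, same instrument, same conditions
# (values invented for illustration).
readings = [100.1, 99.9, 100.2, 100.0, 99.8, 100.1, 100.0, 99.9, 100.2, 100.1]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)     # sample standard deviation

print(f"Mean reading: {mean:.2f}")
print(f"Repeatability (1 sigma): +/-{spread:.2f}")
```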

Precision and Accuracy

[Figure: scatter diagrams of repeated measurements around a target, showing the four cases: imprecise and inaccurate; precise and inaccurate; imprecise and accurate; precise and accurate.]

Accuracy and Repeatability

[Figure: the four cases: not repeatable and not accurate; not repeatable but accurate; repeatable but not accurate; repeatable and accurate.]

Range or Span

Range: the region between the limits within which a quantity is measured.

Span: the algebraic difference between the upper and lower range values.
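A one-line worked example (the temperature limits are invented): a sensor calibrated for -50 degC to +150 degC has a range of -50 to +150 degC and a span of 200 degC.

```python
lower_range_value = -50.0    # degC (illustrative)
upper_range_value = 150.0    # degC

span = upper_range_value - lower_range_value   # algebraic difference
print(f"Span = {span} degC")                    # 200.0 degC
```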

Linearity
Linearity is a measure of the proportionality between the actual value of a variable being measured and the output of the instrument over its operating range. Non-linearity is then defined as the maximum deviation of any of the output readings from a straight line.
It is normally desirable that the output reading of an instrument is linearly proportional to the quantity being measured.
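One common way to quantify non-linearity is to fit a straight line through the calibration points and take the maximum deviation from it, expressed as a percentage of full scale. A sketch with invented calibration data:

```python
import numpy as np

# Invented calibration data: applied inputs and the instrument's output readings.
inputs  = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
outputs = np.array([0.0, 2.1, 4.3, 6.2, 8.1, 9.9])

# Least-squares straight line through the calibration points.
slope, intercept = np.polyfit(inputs, outputs, 1)
fitted = slope * inputs + intercept

# Non-linearity: maximum deviation from the line, as % of full-scale output.
max_dev = np.max(np.abs(outputs - fitted))
full_scale = outputs.max()
print(f"Non-linearity: {max_dev / full_scale * 100:.1f}% of full scale")
```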

Sensitivity of measurement
Sensitivity is a measure of the change in the output of an instrument for a change in the measured variable, and is also known as the transfer function:

sensitivity = scale deflection / value of measurand producing deflection
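For instance, if a pressure change of 2 bar moves the pointer by 10 scale divisions (numbers invented), the sensitivity works out as follows:

```python
# Invented example: output deflection per unit change of the measured variable.
deflection = 10.0        # scale divisions of output movement
measurand_change = 2.0   # bar of input change

sensitivity = deflection / measurand_change
print(f"Sensitivity = {sensitivity} divisions per bar")   # 5.0
```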

Threshold
If the input to an instrument is gradually increased from zero, the input will have to reach a certain minimum level before the change in the instrument output reading is of a large enough magnitude to be detectable. This minimum level of input is known as the threshold of the instrument.

Resolution
Resolution is the smallest amount of a variable that an instrument can resolve, i.e., the smallest change in a variable to which the instrument will respond.

Sensitivity to disturbance
As variations occur in ambient conditions (temperature, pressure, etc.), certain static instrument characteristics change, and the sensitivity to disturbance is a measure of the magnitude of this change. Zero drift (or bias) describes the effect whereby the zero reading of an instrument is modified by a change in ambient conditions.

Sensitivity drift (also known as scale factor drift) defines the amount by which an instrument's sensitivity of measurement varies as ambient conditions change. It is quantified by sensitivity drift coefficients that define how much drift there is for a unit change in each environmental parameter to which the instrument characteristics are sensitive.
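A sketch of how zero drift and a sensitivity drift coefficient combine to distort a reading; every number below is invented for a hypothetical instrument calibrated at 20 degC:

```python
# Invented drift coefficients for a hypothetical instrument.
zero_drift_coeff = 0.02         # output units per degC of ambient change
sens_drift_coeff = 0.001        # (output units per input unit) per degC

calibration_temp = 20.0         # degC, conditions at calibration
ambient_temp = 30.0             # degC, conditions in service
nominal_sensitivity = 5.0       # output units per input unit

dT = ambient_temp - calibration_temp
zero_shift = zero_drift_coeff * dT                          # whole scale shifts
actual_sensitivity = nominal_sensitivity + sens_drift_coeff * dT

def reading(true_input):
    """Output of the drifted instrument for a given true input."""
    return zero_shift + actual_sensitivity * true_input

print(f"Reading at true input 10.0: {reading(10.0):.2f}")  # 50.30 vs. 50.00 undrifted
```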

Hysteresis
Hysteresis is the difference in readings obtained when an instrument approaches a signal from opposite directions.
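A sketch of how hysteresis might be measured: record outputs at the same input points with the input swept first upward, then downward, and take the largest difference between the two curves. The readings below are invented:

```python
# Outputs recorded at the same input points, approached from opposite directions
# (values invented for illustration).
inputs         = [0, 2, 4, 6, 8, 10]
output_rising  = [0.0, 1.9, 3.9, 5.9, 8.0, 10.0]   # input swept upward
output_falling = [0.3, 2.3, 4.3, 6.2, 8.1, 10.0]   # input swept downward

hysteresis = max(abs(up - down)
                 for up, down in zip(output_rising, output_falling))
print(f"Maximum hysteresis: {hysteresis:.1f} output units")   # 0.4
```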

Dead space
Dead space (Dead band) is defined as the range of different input values over which there is no change in output value.
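Dead band can be modeled as a region of input over which the output does not respond. A simple static version, assuming (for illustration) a band centred on zero input with an invented width:

```python
def output_with_deadband(x, band=0.5):
    """Instrument output with a dead band of +/-band around zero input.

    Inside the band the output stays at zero; outside it, the output
    tracks the input offset by the band edge. (Band width invented.)
    """
    if abs(x) <= band:
        return 0.0
    return x - band if x > 0 else x + band

for x in [0.0, 0.3, 0.5, 0.8, 2.0, -1.0]:
    print(f"input {x:+.1f} -> output {output_with_deadband(x):+.1f}")
```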
