M07 - Calibrating Instrumentation Devices

The document outlines the curriculum for a Level III module on Calibrating and Configuring Instrumentation and Control Devices, with a nominal duration of 60 hours. It includes various units covering topics such as safety policies, configuration planning, calibration processes, and inspection of control devices. The module aims to equip trainees with the necessary skills to ensure instrument accuracy and maintain safe working environments.


Industrial Electrical/Electronic

Control Technology
Level III
Based on October, 2023 Curriculum (Version-II)

Module Title: Calibrating and Configuring Instrumentation and


Control Devices

TTLM Code: EIS IEC3 M07 1023


Nominal duration: 60 Hours
Prepared by:

September 2023
Addis Ababa
Table of Contents
List of Figures......................................................................................................2
Acknowledgment..................................................................................................5
List of Acronyms...................................................................................................6
Introduction to the Module......................................................................................7
This module covers the units:..........................................................................................7
Learning Objective of the Module...................................................................................7
Module Instruction........................................................................................................ 7
UNIT 1: Instrumentation and control device configuration...............................................8
1.1. OH&S policies and procedures and use PPE...........................................................9
1.2. Configuration Work plan....................................................................................13
1.3. Instrumentation and control devices and their standard..........................................23
1.4. Instrumentation and control devices Configuration................................................46
1.5. Material, Tools, equipment and testing devices.....................................................51
Self-Check -1.1........................................................................................................... 65
Operation sheet (1.1) - Analog Oscilloscope Procedures:.................................................66
Operation sheet (1.2) Digital Oscilloscope Procedures:.............................................70
Operation sheet (1.3) Digital Oscilloscope Procedures:.............................................73
UNIT 2. Calibrate Instrumentation and Control Devices................................................75
2.1. Devices normal functions...................................................................................76
2.2. Condition instrumentation and control devices......................................................77
2.3. Calibrate or adjust instrumentation and control devices.........................................79
2.4. Maintain configured and calibrated devices........................................................128
2.5. Unplanned events or conditions.........................................................................134
Self-check (2.1).........................................................................................................135
Self-check (2.2).........................................................................................................136
Operation sheet (2.1)..................................................................................................137
Operation sheet (2.2) ------------- Calibration of P/I converter.........................................139
UNIT 3. Inspect, test and calibrate instruments and control devices................................141
3.1. Calibration work inspection..............................................................................142
3.2. Examine instrumentation and control devices.....................................................144
3.3. Report result................................................................................................... 149
Self-Check 3............................................................................................................. 152

Page 1 of 158 Ministry of Labor and skills Calibrating and Configuring instrumentation Version -1
Author/Copyright and Control Devices October, 2023
Expert Profile...................................................................................................154
Reference........................................................................................................155

List of Figures

Figure 1. 1 Valve actuators..........................................................................................................14


Figure 1. 2 Fully open and closed valve.......................................................................................14
Figure 1. 3 Valve not fully closed cannot fully open...................................................................15
Figure 1. 4 Valve not fully open cannot fully close......................................................................16
Figure 1. 5 Complementary valve sequence................................................................................17
Figure 1. 6 Opening and closing operation graphs for valve........................................................18
Figure 1. 7 Exclusive valve sequencing.......................................................................................19
Figure 1. 8 Relation of valve opening and controller’s output graph..........................................20
Figure 1. 9 pH control process.....................................................................................................22
Figure 1. 10 Output of two control valves graphs........................................................................22
Figure 1. 11 Configuration process interface...............................................................................26
Figure 1. 12 Pressure sensors a 4-20mA output signal and pressure transducers........................27
Figure 1. 13 Pressure sensor unit................................................................................................28
Figure 1. 14 Pressure transducer.................................................................................................29
Figure 1. 15 Pressure transducer.................................................................................................29
Figure 1. 16 Analogue indicator...................................................................................................30
Figure 1. 17 Digital indicator......................................................................................................30
Figure 1. 18 Control valve...........................................................................................................32
Figure 1. 19 Electric valve actuator controlling ½ needle valves................................................32
Figure 1. 20 Annunciators vs. SCADA alarm systems................................................................34
Figure 1. 21 P&ID of a compressed air control system...............................................................35
Figure 1. 22 Chlorine wastewater disinfection systems................................................36
Figure 1. 23 Calibration hierarchy described by the example of Germany.................................37
Figure 1. 24 Calibration sticker on a mechanical pressure gauge................................................39
Figure 1. 25 Housing rotating set screw.....................................................................46
Figure 1. 26 Wiring block.............................................................................................47
Figure 1. 27 Conduit installation diagram....................................................................47
Figure 1. 28 Sensor positions........................................................................................48

Figure 1. 29 Wiring diagram for the LD301 working as a transmitter.........................................49
Figure 1. 30 Wiring diagram for the LD301 working as a controller (optional)..........................49
Figure 1. 31 Wiring diagram for the LD301 in multidrop configuration.....................................50
Figure 1. 32 Load curve................................................................................................................50
Figure 1. 33 Different types of pliers...........................................................................................50
Figure 1. 34 Different types of screwdrivers...............................................................................51
Figure 1. 35 Different types of soldering irons............................................................................51
Figure 1. 36 Different types of wrenches.....................................................................................52
Figure 1. 37 Different types of water level..................................................................................52
Figure 1. 38 Tri-square.................................................................................................................52
Figure 1. 39 Measuring tape.........................................................................................................53
Figure 1. 40 Vernier scale.............................................................................................................54
Figure 1. 41 Wire gauge................................................................................................................54
Figure 1. 42 Calibration Bench....................................................................................................56
Figure 1. 43 Digital DC PS, analog DC PS, Digital dual PS..................................................58
Figure 1. 44 PSG Analog Signal Generator, Function-generator, Tone-generator-and-wire-
tracker............................................................................................................................................60
Figure 1. 45 Analog Oscilloscope................................................................................................60
Figure 1. 46 Oscilloscope.............................................................................................................62
Figure 1. 47 Pressure gauge.........................................................................................62
Figure 1. 48 Sample level control loop........................................................................................73

Figure 2. 1 The Seaward PV150 handheld meter provides multiple PV array testing functions.
.......................................................................................................................................................77
Figure 2. 2 A block diagram of a typical instrumentation system with several different output
devices...........................................................................................................................................79
Figure 2. 3 Zero shift calibration error graph...............................................................................85
Figure 2. 4 Span shift calibration error........................................................................................85
Figure 2. 5 Linearity calibration error...........................................................................................86
Figure 2. 6 Hysteresis calibration error........................................................................................86
Figure 2. 7 Electrical Calibration.................................................................................................87
Figure 2. 8 Mechanical Calibration (micrometer and vernier caliper).........................................88
Figure 2. 9 Flow Calibration........................................................................................................89
Figure 2. 10 Pipette Calibration...................................................................................................90

Figure 2. 11 Pressure Calibration.................................................................................................90
Figure 2. 12 Temperature Calibration..........................................................................................91
Figure 2. 13 Calibration of pressure transmitters.........................................................................93
Figure 2. 14 Calibration of temperature transmitters...................................................................93
Figure 2. 15 Illustration of five-point calibration.........................................................................98
Figure 2. 16 Calibration Procedure of the Pressure Switch.........................................................99
Figure 2. 17 RTD Transmitter Calibration diagram...................................................................101
Figure 2. 18 Thermocouple transmitter equipment setup..........................................................102
Figure 2. 19 Performing a Sensor Trim on a Smart Transmitter...............................................105
Figure 2. 20 Performing a 4 - 20mA Trim on a Smart Transmitter...........................................106
Figure 2. 21 Zero and span screw adjustment..............................................................................108
Figure 2. 22 Current to Pressure (I/P) Converter Calibration setup............................................109
Figure 2. 23 Temperature sensor with indicator........................................................................110
Figure 2. 24 Block diagram of a smart pressure transmitter.......................................................112
Figure 2. 25 Block diagram of an analog pressure transmitter...................................................113
Figure 2. 26 Block diagram of a smart pressure transmitter with signal reading.......................114
Figure 2. 27 High-accuracy voltmeter for calibration................................................................116
Figure 2. 28 Dead-weight test calibrator.....................................................................................120
Figure 2. 29 Pneumatic dead weight tester................................................................................121
Figure 2. 30 Manometer.............................................................................................................121
Figure 2. 31 Electronic manometers..........................................................................................122
Figure 2. 32 Typical self-calibration gas analyzer.....................................................................127

Acknowledgment

The Ministry of Labor and Skills would like to extend its gratitude to the Regional Labor and
Skills/Training Bureaus, TVET college deans, instructors, and industry experts for their financial
and technical support in developing the Calibrating and Configuring Instrumentation and Control
Devices training module. Finally, MOLS extends its gratitude to the following instructors and
experts who contributed to the development of this TTLM until its finalization.

List of Acronyms
ANSI American National Standards Institute
DAC Digital-to-Analog Converter
EDS Electronic Data Sheet
ESIA Environmental and Social Impact Assessment
ESMF Environmental and Social Management Framework
FAT Factory Acceptance Test
HMI Human Machine Interface
ILO International Labor Organization
ISA Instrumentation, Systems, and Automation Society
LRV Lower Range Value
OHS Occupational Health and Safety
OHSR Occupational Health and Safety Regulations
P&ID Piping and Instrumentation Diagram
pH Potential of Hydrogen
PLC Programmable Logic Controller
PPE Personal Protective Equipment
PS Power Supply
PSI Pounds per Square Inch
SCADA Supervisory Control and Data Acquisition
URV Upper Range Value
VDU Visual Display Unit
WHS Workplace Health and Safety

Introduction to the Module
Instrument calibration and configuration is one of the primary processes used to maintain
instrument accuracy. Calibration is the process of configuring an instrument to provide a
result for a sample within an acceptable range; eliminating or minimizing the factors that
cause inaccurate measurements is a fundamental aspect of instrumentation design.

The module also covers planning and preparation for configuration, demonstration of
calibration work, and inspection, testing and documentation of the task.

This module covers the units:

 Plan and prepare for configuration


 Calibrate instrumentation and control devices
 Inspect, test and calibrate instruments and control devices

Learning Objective of the Module

 Plan and prepare for configuration


 Apply calibration on instrumentation and control devices
 Inspect, test and calibrate instruments and control devices

Module Instruction

For effective use of this module, trainees are expected to follow these instructions:
1. Read the information written in each unit.
2. Accomplish the self-checks at the end of each unit.
3. Perform the operation sheets provided at the end of the units.
4. Do the "LAP test" given at the end of each unit.
5. Read the identified reference books for examples and exercises.

UNIT 1: Instrumentation and control device configuration

This learning guide is developed to provide you the necessary information regarding the following
content coverage and topics:
 OH&S policies and procedures and PPE
 Configuration work plan
 Instrumentation and control devices and their standards
 Instrumentation and control devices configuration
 Materials, tools, equipment and testing devices

This guide will also assist you to attain the learning outcome stated in the cover page. Specifically,
upon completion of this Learning Guide, you will be able to:

 Follow OH&S policies and procedures and use PPE


 Plan and prepare configuration
 Identify instrumentation and control devices
 Adopt instrumentation and Control standards
 Check instrumentation and control devices for configuration
 Obtain and Check materials, tools, equipment and testing devices for configuration

1.1. OH&S policies and procedures and use PPE

1.1.1. Safety Guidelines and Policies


The best way to provide a safe operating environment is to make personnel and equipment
safety part of the planning process. You should examine every aspect of the system to determine
which areas are critical to operator or machine safety. If you are not familiar with PLC system
installation practices, or your company does not have established installation guidelines, you
should obtain additional information from NEMA, NEC, and local and state agencies.
A. Health and Safety Policy
Your Company Name is committed to the goal of providing and maintaining a healthy and safe
working environment, with a view to continuous improvement. This goal is only achievable by
adherence to established objectives striving to exceed all obligations under applicable
legislation, and by fostering an enthusiastic commitment to health, safety and the environment
within Your Company Name personnel, contractors and visitors.
Health, safety, the environment and loss control in the workplace are everyone's responsibility.
Your Company Name expects that everyone will join in our efforts to provide a healthy and safe
working environment on a continuous, day-to-day basis. Only through the dedication and efforts
of all individuals can Your Company Name succeed in providing a healthy, safe working
environment.

B. Occupational Health and Safety in Workplaces


Duties of Workers
One of your most important responsibilities is to protect your Health and Safety as well as that
of your co-workers. This booklet will discuss some of your duties under the occupational Health
and Safety legislation and help you to make your workplace safer and healthier.
The law requires
Workplaces under its jurisdiction are governed by your provincial legislation.
The legislation places duties on owners, employers, workers, suppliers, the self-employed and
contractors to establish and maintain safe and healthy working conditions. The legislation is
administered by your provincial authority, whose officials are responsible for monitoring
compliance.
Duties of Your Employer

Your employer is responsible for providing you with safe and healthy working conditions. This
includes a duty to protect you from violence, discrimination and harassment. You must
cooperate with your employer in making your workplace safe and healthy.
You’re Responsibilities
 You must also comply with the legislation. You have responsibilities to:
 protect your own Health and Safety and that of your co-workers;
 not initiate or participate in the harassment of another worker; and
 Co-operate with your supervisor and anyone else with duties under the legislation.
You’re Rights
The legislation gives your three rights:
 The right to know the hazards at work and how to control them;
 The right to participate in occupational health and safety; and
 The right to refuse work which you believe to be unusually dangerous.
You may not be punished for using these rights. An employer can be required to legally justify
any action taken against a worker who is active in Health and Safety.
You’re Right to Know
The Act requires your employer to provide you with all the information you need to control the
hazards you face at work. For example, chemicals at the workplace must be listed. You are
entitled to review this list. Your employer must train you to safely handle the chemicals you will
work with. If you are inexperienced, you must receive an orientation which includes:
 What to do in a fire or other emergency;
 First aid facilities;
 Prohibited or restricted areas;
 Workplace hazards; and
 Any other information you should know.
You must also be supervised closely by a competent supervisor.
You’re Right to Participate
You have the right to become involved in occupational Health and Safety.
Committees have duties to:
 Regularly inspect the workplace;
 Conduct accident investigations;
 Deal with the health and safety concerns of employees;
 Investigate refusals to work;

 Meet at least four times a year (consult your provincial act); and return minutes of each
meeting to the Division.

You’re Right to Refuse


You have the right to refuse to do work which you believe is unusually dangerous. The unusual
danger may be to you or to anyone else. An unusual danger could include such things as:
 A danger which is not normal for your occupation or the job;
 A danger under which you would not normally carry out your job; and/or
 A situation, for which you are not properly trained, equipped or experienced.
To exercise this right, use the following guidelines.
 Once you believe that the work you have been asked to do is unusually dangerous, you
should inform your supervisor. Make sure that the supervisor understands that you are
refusing to do the disputed job for health and safety reasons. Work with the supervisor to
attempt to resolve the problem.
 If the problem cannot be resolved by the supervisor to your satisfaction, and no worker
health and safety representative or occupational health committee exists at the
workplace, your supervisor should phone the Division and ask for advice. You also have
the right to contact the Division at any time.
 The supervisor has the right to assign you to other work (at no loss in pay or benefits)
until the matter is resolved.

1.1.2. Safe Work Procedure and Practice


Safe Work Procedure
A Safe Work Procedure is a written step-by-step description of how a particular task is to be
performed that is used during performance of the work by the person performing the work (or by
two people doing the work – one reading and one doing). Examples of procedures include:
equipment start-up or shut-down procedures; normal operating procedures; written operating
instructions; abnormal operating procedures, emergency procedures, special test procedures,
maintenance procedures, construction installation procedures, calibration procedures,
hydrostatic test procedures, and inspection procedures.
Safe Work Practice
Safe Work Practices are written descriptions of how work is generally carried out and allow
flexibility in how the work is accomplished. Due to the diversity of circumstances and situations

within JACOS, the information contained in Safe Work Practices cannot be considered complete
or applicable in every situation.
Supervisors and employees must refer to federal and provincial health and safety legislation, and
industry practices to ensure that the work is accomplished safely.
Development
Procedures should be developed for high-hazard work or where historical information,
legislation, or a Hazard Assessment dictates.
Practices should be developed for commonly used equipment or processes that do not
necessarily follow a step-by-step order.
Employees, Supervisors, and Management will be involved in the development and/or review of
these Safe Work Procedures and Practices.
All Safe Work Procedures and Practices will be developed using the standard JACOS Safe
Work
Procedure and Practice format and are based on a job hazard assessment.
Review
Employees, Supervisors, and technical experts will periodically review Safe Work Procedures
and Practices to ensure that they are complete, accurate and applicable, on a minimum
three-year basis or when warranted.
Availability
Safe Work Procedures and Practices applicable to the work being performed will be available to
all employees at the work site.
Action Guidelines
IHI Aerospace has established the following action guidelines to put its basic policies into
practice, based on its five fundamental safety rules.
 Specify OH & S targets to achieve this policy; establish and implement action schedules.
 Strive to reduce risks and to identify factors that lead to hazards by applying OH & S risk
assessment activities to all business activities.
 Establish and adhere to voluntary guidelines to ensure compliance with OH & S
regulations and customer agreements.
 Improve health and safety awareness through health and safety training and in-house
information activities.
 Periodically review the OH & S management system to ensure constant improvements.
 Pay particular attention to the following points, based on the specific characteristics of
IHI Aerospace’s operations.

 Prevent accidents and disasters involving the handling of explosives and
pressurized gas.
 Prevent falls or accidents caused by hazardous operations.
 Prevent accidents and disasters involving the handling of hazardous substances
and chemicals.
 Provide inexperienced employees with safety training and comprehensive
instruction in work procedures.
 Institute improvements to create a safe, comfortable workplace.
 Eliminate accidents in commutes to and from the workplace.

1.2. Configuration Work plan

A. Configure instrumentation and control devices

System configuration is the process of setting up your hardware devices and assigning resources
to them so that they work together without problems. The term also refers to the way a system is
set up, or the assortment of components that make up the system.

 A properly-configured system will allow you to avoid nasty resource conflict problems,
and make it easier for you to upgrade your system with new equipment in the future.
 An improperly-configured system will lead to strange errors and problems, and make
upgrading a nightmare.

 Types of Configurations

Configuration can refer to either hardware or software, or the combination of both. For
instance, a typical configuration for a PC consists of 32MB (megabytes) main memory, a floppy
drive, a hard disk, a modem, a CD-ROM drive, a VGA monitor, and the Windows operating
system.

Many software products require that the computer have a certain minimum configuration. For
example, the software might require a graphics display monitor and a video adapter, a particular
microprocessor, and a minimum amount of main memory.

When you install a new device or program, you sometimes need to configure it, which means
to set various switches and jumpers (for hardware) and to define values of parameters (for

software). For example, the device or program may need to know what type of video adapter
you have and what type of printer is connected to the computer. Thanks to new technologies,
such as plug-and-play, much of this configuration is performed automatically.

 Hardware/Device Configuration

 Actuator bench-set

Valve actuators provide force to move control valve trim. For precise positioning of a control
valve, there must be a calibrated relationship between applied force and valve position. Most
pneumatic actuators exploit Hooke’s Law to translate applied air pressure to valve stem position.

F=kx

Where,

F = Force applied to spring in newtons (metric) or pounds (English)

k = Constant of elasticity, or spring constant, in newtons per meter (metric) or pounds per foot
(English)

x=Displacement of spring in meters (metric) or feet (English)

Hooke’s Law is a linear function, which means that spring motion will be linearly related to
applied force from the actuator element (piston or diaphragm). Since the working area of a
piston or diaphragm is constant, the relationship between actuating fluid pressure and force will
be a simple proportion (F=PA). By algebraic substitution, we may alter Hooke’s Law to include
pressure and area:

PA = kx

Solving for spring compression as a function of pressure, area, and spring constant:

x=PA/k
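A quick worked example of the relation above. The diaphragm area and spring constant used here are made-up illustrative values, not data for any real actuator:

```python
def stem_travel_m(pressure_pa, diaphragm_area_m2, spring_constant_n_per_m):
    """Spring compression from Hooke's Law: x = P*A / k."""
    force_n = pressure_pa * diaphragm_area_m2   # F = P * A
    return force_n / spring_constant_n_per_m    # x = F / k

# 100 kPa (about 14.5 PSI) on a 0.05 m^2 diaphragm against a 100 kN/m spring
x = stem_travel_m(100e3, 0.05, 100e3)
print(f"{x * 1000:.1f} mm")  # 50.0 mm of stem travel
```

Because the relation is linear, doubling the signal pressure doubles the stem travel, which is exactly why a calibrated 3 to 15 PSI signal can position the valve predictably.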

Figure 1. 1 Valve actuators

When a control valve is assembled from an actuator and a valve body, the two mechanisms must
be coupled together in such a way that the valve moves between its fully closed and fully open
positions with an expected range of air pressures. A common standard for pneumatic control
valve actuators is 3 to 15PSI.

There are really only two mechanical adjustments that need to be made when coupling a
pneumatic diaphragm actuator to a sliding-stem valve: the stem connector and the spring
adjuster.

 The stem connector mechanically joins the sliding stems of both actuator and valve
body so they move together as one stem. This connector must be adjusted so neither the
actuator nor the valve trim prevents full travel of the valve trim:

Figure 1. 2 Fully open and closed valve

Note how the plug is fully against the seat when the valve is closed, and how the travel indicator
indicates fully open at the point where the actuator diaphragm nears its fully upward travel
limit. This is how things should be when the stem connector is properly adjusted. If the stem
connector is set with the actuator and valve stems spaced too far apart (i.e. the total stem length
is too long), the actuator diaphragm will bind travel at the upper end and the valve plug will bind
travel at the lower end. The result is a valve that cannot ever fully open:

Figure 1. 3 Valve that cannot fully open


A control valve improperly adjusted in this manner will never achieve full-flow capacity, which
may have an adverse impact on control system performance. If the stem connector is set with the
actuator and valve stems too closely coupled (i.e. the total stem length is too short), the actuator
diaphragm will bind travel at the lower end and the valve plug will bind travel at the upper end.
The result is a valve that cannot ever fully close:

Figure 1. 4 Valve that cannot fully close

This is a very dangerous condition: a control valve that lacks the ability to fully shut off. The
process in which this valve is installed may be placed in jeopardy if the valve lacks the ability to
stop the flow of fluid through it!

Once the stem length has been properly set by adjusting the stem connector, the spring adjuster
must be set for the proper bench set pressure. This is the pneumatic signal pressure required to
lift the plug off the seat. For an air-to-open control valve with a 3 to 15PSI signal range, the
bench set pressure would be 3PSI.

Bench set is a very important parameter for a control valve because it establishes the seating
pressure of the plug when the valve is fully closed. Proper seating pressure is critical for tight
shut-off, which carries safety implications in some process services.

Consult the manufacturer’s instructions when adjusting the bench set pressure for any sliding-
stem control valve. These instructions will typically guide you through both the stem connector
and the spring adjuster procedures, to ensure both parameters are correctly set.

 Split-ranging

There are many process control applications in industry where it is desirable to have multiple
control valves respond to the output of a common controller. Control valves configured to
follow the command of the same controller are said to be split-ranged, or sequenced.

Split-ranged control valves may take different forms of sequencing. A few different modes of
control valve sequencing are commonly seen in industry: complementary, exclusive, and
progressive.

 Complementary valve sequencing

The first is a mode where two valves serve to proportion a mixture of two fluid streams, such
as this example where base and pigment liquids are mixed together to form colored paint:

Both base and pigment valves operate from the same 3 to 15PSI pneumatic signal output by the
I/P transducer (AY), but one of the valves is Air-To-Open while the other is Air-To-Close.

Figure 1. 5 Complementary valve sequence
The following table shows the relationship between valve opening for each control valve and the
controller’s output:

Table 1. 1 The relationship between valve opening and controller’s output

Controller output (%)   I/P output (PSI)   Pigment valve (stem position)   Base valve (stem position)
0%                      3PSI               fully shut                      fully open
25%                     6PSI               25% open                        75% open
50%                     9PSI               half-open                       half-open
75%                     12PSI              75% open                        25% open
100%                    15PSI              fully open                      fully shut
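The table above can be reproduced with a minimal sketch, assuming an ideal, linear 3 to 15 PSI calibration with no hysteresis (not the behavior of any particular valve):

```python
def signal_psi(controller_pct):
    """Map a controller output (0-100 %) onto the 3-15 PSI signal range."""
    return 3.0 + (controller_pct / 100.0) * 12.0

def pigment_valve_pct(controller_pct):
    """Air-to-open valve: opens in direct proportion to controller output."""
    return float(controller_pct)

def base_valve_pct(controller_pct):
    """Air-to-close valve: always the complement of the pigment valve."""
    return 100.0 - controller_pct

# Reproduce the table rows
for out in (0, 25, 50, 75, 100):
    print(f"{out:3d}%  {signal_psi(out):4.1f} PSI  "
          f"pigment {pigment_valve_pct(out):5.1f}%  base {base_valve_pct(out):5.1f}%")
```

Note that the two openings always sum to 100%, which is the defining property of complementary sequencing.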

An alternative expression for this split-range valve behavior is a graph showing each valve
opening as a colored stripe of varying width (wider representing further open). For this
particular mode of split-ranging, the graph would look like this:

Figure 1. 6 Opening and closing operation graphs for valve

With this form of split-ranging, there is never a condition in the controller’s output range where
both valves are fully open or fully shut. Rather, each valve complements the other's position.

 Exclusive valve sequencing

Other applications for split-ranged control valves call for a form of valve sequencing where both
valves are fully closed at a 50% controller output signal, with one valve opening fully as the
controller output drives toward 100% and the other valve opening fully as the controller output
goes to 0%. The nature of this valve sequencing is to have an “either-or” throttled path for
process fluid. That is, either process fluid flows through one valve or through the other, but
never through both at the same time.

A practical example of this form of split-ranging is in reagent feed to a pH neutralization
process, where the pH value of process liquid is brought closer to neutral by the addition of
either acid or caustic:

Figure 1. 7 Exclusive valve sequencing


Here, a pH analyzer monitors the pH value of the mixture and a single pH controller commands
two reagent valves to open when needed. If the process pH begins to increase, the controller
output signal increases as well (direct action) to open up the acid valve. The addition of acid to
the mixture will have the effect of lowering the mixture's pH value. Conversely, if the process
pH begins to decrease, the controller output signal will decrease as well, closing the acid valve
and opening the caustic valve. The addition of caustic to the mixture will have the effect of
raising the mixture's pH value.

Both reagent control valves operate from the same 3 to 15PSI pneumatic signal output by I/P
transducer (AY), but the two valves' calibrated ranges are not the same. The Air-To-Open acid

valve has an operating range of 9 to 15PSI, while the Air-To-Close caustic valve has an
operating range of 9 to 3PSI.

The following table shows the relationship between valve opening for each control valve and the
controller’s output:

Table 1. 2 Relationship between valve opening and controller’s output


Controller output (%)   I/P output (PSI)   Acid valve (stem position)   Caustic valve (stem position)
0%                      3PSI               fully shut                   fully open
25%                     6PSI               fully shut                   half-open
50%                     9PSI               fully shut                   fully shut
75%                     12PSI              half-open                    fully shut
100%                    15PSI              fully open                   fully shut
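One generic linear-range helper covers both the air-to-open 9 to 15 PSI acid valve and the reverse-acting 9 to 3 PSI caustic valve. This is a sketch under ideal-calibration assumptions, not a model of any specific valve:

```python
def valve_open_pct(signal_psi, closed_psi, open_psi):
    """Percent open for a valve calibrated from closed_psi to open_psi.
    Handles reverse-acting ranges (e.g. 9 -> 3 PSI) and clamps to 0-100 %."""
    fraction = (signal_psi - closed_psi) / (open_psi - closed_psi)
    return min(100.0, max(0.0, fraction * 100.0))

# At a 75 % controller output (12 PSI): acid valve half-open, caustic fully shut
print(valve_open_pct(12.0, 9.0, 15.0))  # 50.0
print(valve_open_pct(12.0, 9.0, 3.0))   # 0.0
```

The clamping is what produces the exclusive behavior: any signal on one side of 9 PSI drives the opposite valve hard against its closed limit.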

Again, we may express the two valves' exclusive relationship in the form of a graph, with
colored stripes representing valve opening:

Figure 1. 8 Relation of valve opening and controller’s output graph


Exclusive-sequenced control valves are used in applications where it would be undesirable to
have both valves open simultaneously. In the example given of a pH neutralization process, the
goal is for the controller to be able to call for either acid reagent or caustic reagent to push
the pH value in either direction as needed. However, simultaneously adding both acid and caustic
to the process would be wasteful, as one reagent would simply neutralize the other with no benefit
to the process liquid itself.

 Progressive valve sequencing

A third form of control valve sequencing is used to expand the operating range of flow
control for some fluid beyond that which a single control valve could muster. Once again pH
control provides a suitable example to illustrate an application of this form of sequencing.

pH is an especially challenging application of process control because the dynamic range of
the process is enormous. Each unit of pH value change represents a ten-fold change in
hydrogen ion concentration within the process liquid. This means the difference in ion
concentration between a process liquid having a value of 10 pH and a process liquid having a
value of 7 pH is a factor of one thousand! Consequently, the flow rate of reagent necessary to
neutralize a process liquid stream may vary widely.
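The ten-fold-per-unit claim is easy to verify numerically, since hydrogen ion concentration is 10^(-pH) mol/L:

```python
def hydrogen_ion_molarity(ph):
    """Hydrogen ion concentration in mol/L: [H+] = 10 ** (-pH)."""
    return 10.0 ** (-ph)

# A 3-unit pH difference (10 pH vs. 7 pH) is a thousand-fold concentration change
ratio = hydrogen_ion_molarity(7.0) / hydrogen_ion_molarity(10.0)
print(round(ratio))  # 1000
```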

It is quite possible that a control valve sized to handle minimum flow will simply be too small
to meet the demands of high flow when needed.

Yet, a control valve sized large enough to meet the maximum flow rate may be too large to
precisely turn down when just a trickle of reagent is needed.

This general control problem was encountered by automotive engineers in the days when
carburetors were used to mix gasoline with air prior to combustion in an engine.

A carburetor sized to idle well and respond to the needs of in-town driving would not flow
enough air to provide high-end performance. Conversely, a large carburetor suitable for
performance driving would be almost uncontrollable for low-speed and idling operation.

Their solution to this problem was the progressive carburetor, having two butterfly valves to
throttle the flow of air into the engine.

One butterfly valve handled low amounts of air flow only, while a larger butterfly valve
opened up only when the accelerator pedal was nearly at its maximum position. The
combination of two differently-sized butterfly valves progressively opened gave drivers the
best of both worlds. Now, an automobile engine could perform well both at low power levels
and at high power levels.

On a fundamental level, the problem faced in pH control as well as by early automotive
engineers is the same thing: insufficient rangeability.

Some processes demand a greater range of control than any single valve can deliver, and it is
within these processes that a pair of progressively-sequenced control valves is a valid solution.

Applying this solution to a pH control process where the incoming liquid always has a high pH
value, and must be neutralized with acid:

Figure 1. 9 pH control process


Proper sequencing of the small and large acid control valves is shown in the table and the graph:

Table 1. 3 sequencing of the small and large acid control valves


Controller output (%)   I/P output (PSI)   Small acid valve (stem position)   Large acid valve (stem position)
0%                      3PSI               fully shut                         fully shut
25%                     6PSI               half-open                          fully shut
50%                     9PSI               fully open                         fully shut
75%                     12PSI              fully open                         half-open
100%                    15PSI              fully open                         fully open
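Progressive sequencing can be sketched the same way: the small valve strokes over the lower half of the controller's range, the large valve over the upper half. The numbers below simply reproduce the table and are illustrative, not a vendor specification:

```python
def small_acid_valve_pct(controller_pct):
    """Small valve travels fully open over the 0-50 % controller range."""
    return min(100.0, max(0.0, controller_pct * 2.0))

def large_acid_valve_pct(controller_pct):
    """Large valve only begins to open once the controller passes 50 %."""
    return min(100.0, max(0.0, (controller_pct - 50.0) * 2.0))

for out in (0, 25, 50, 75, 100):
    print(out, small_acid_valve_pct(out), large_acid_valve_pct(out))
```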

Figure 1. 10 Output of two control valves graphs
With the two acid control valves sequenced progressively, the control system will have
sufficient rangeability to handle widely varying process conditions.

1.3. Instrumentation and control devices and their standard


Instrumentation and control devices are used to measure and control process variables in various
industries. These variables include pressure, temperature, humidity, flow, pH, force, and
speed. The field of instrumentation and control engineering is interdisciplinary and requires
knowledge of chemistry, mechanics, electricity and magnetism, electronics, microcontrollers
and microprocessors, software languages, process control, pneumatics and hydraulics principles,
and communications.

The primary objective of instrumentation is to measure the process variables accurately. Control
devices are used to maintain the process variables at a desired set point.
Instruments are devices that measure or manipulate variables such as flow, temperature, level, or
pressure. They can be classified into different types based on various criteria, such as location
(in-field or panel), power source (pneumatic or electronic), output signal (analog or digital), or
measurement parameter (pressure, temperature, flow, level, etc.) .
Control devices are mechanical, electro-mechanical, or electronic devices that use input signals
to change conditions or values in processes or oversee access to buildings, gated areas,
etc. Controllers generally receive voltage inputs from sources, analyze the inputs, and then
oversee condition changes via signal outputs.
Instrumentation and control engineering (ICE) is a branch of engineering that studies the
measurement and control of process variables using instruments and software tools. ICE
involves the design and implementation of systems that incorporate sensors, transmitters,
controllers, actuators, and displays.

1.3.1 Types of instrumentation and control device to be configured
Configuration is the process of arranging parts or elements in a particular form, figure, or
combination. In the context of Instrumentation and Control (I&C) Design, configuration
refers to the planning, preparation, and arrangement of instrumentation and control devices in
line with job requirements.
The purpose of I&C design is to cover the project-specific technical requirements that are to be
followed throughout the Feed or Detailed Engineering Phase while preparing engineering
deliverables.
Control System Philosophy

1. Package Control System Philosophy
2. Power Supply & Instrument Air Supply Philosophy
3. Hazardous Area Classification Requirements
4. Basic Requirements Related to Field Instruments and Cables
5. Basic Requirements Related to Installation & Related Items
6. Spare Philosophy

The design basis is considered as a mother document for all the engineering activities or
deliverables to be carried out in a particular project.

In summary, configuration is important in I&C design because it ensures that instrumentation


and control devices are arranged in a way that meets job requirements and technical
specifications

Access Control Systems

Access Control Systems are electronic or electro-mechanical devices or systems composed of
remote stations and centralized controlling stations which are used for security monitoring and
to manage the movement of personnel, vehicles, materials, etc. through entrances and exits. Key
specifications include the intended application, type, and method of identification. An access
control system uses identifying methods such as facial recognition, fingerprints, metal detection,
bar coding, and swipe cards, to permit entrance and exit from various secured areas. They can be
used as well for tracking the movements of bulk materials, for allowing the use of certain
machines such as x-ray equipment, or for vending various tools. Typical applications include
high security operations in military facilities.

Flow Controllers

Flow Controllers are mechanical or electro-mechanical devices composed of measuring sensors
or controlling elements that are used to ensure media flow in manufacturing processes. Key
specifications include the intended application, type, media, flow rate, connection style, pressure
rating, as well as mounting style. Flow controllers are used primarily in process control
applications. They are made for gases as well as liquids of many types, and are available in
many sizes and configurations depending on the application. Some flow controllers consist of
mechanical valves, while others are electronically operated and controlled. Flow controllers are
sometimes called batch controllers for their ability to accurately control different ingredients in a
chemical batching process, for instance.

Level Controllers

Level Controllers are mechanical or electro-mechanical devices used for controlling the levels of
tanks, vats, etc. usually by means of pumps, and they are sometimes called pump controllers.
Some level controllers incorporate sensors or other means of detecting the level of products in
containers, etc. while others require inputs from remote switches or sensors. Key specifications
include the intended application, medium type, and the control method. Level controllers rely on
a variety of sensor styles including conductive, capacitance, optical, and ultrasonic, in addition
to mechanical arms, floats, and levers. They can be used for liquids or bulk dry goods such as
grain or powders. Many industries use level controllers in various processes. A level controller
receives an input signal, compares it to a set point and via an output signal adjusts the level as
the process requires.

Pressure Controllers

Pressure Controllers are electro-mechanical devices used for controlling process/system pressure
in various industrial processes. Key specifications include the intended application, type, control
method, sensing element, and the pressure range. Pressure controller types include differential
gap, proportional, on/off types, among others. They rely on a variety of sensing elements such as
a bellows, diaphragms, capsules, bourdon tubes, etc. In operation, the controller receives a
process/system pressure input, compares it to the desired set point in the pressure controller,
then outputs a signal (usually to a control valve) which adjusts the process/system pressure (if
necessary) back to the set point.

Programmable Logic Controllers
Programmable Logic Controllers are electronic devices used for controlling automatic machinery,
processes, etc. Key specifications include the intended application, type, function, mounting style, as well
as power requirements. Programmable logic controllers are configurable with a range of input and output
modules. They control various operating parameters and functions by receiving input signals from
various sources and adjusting machine functions as required by the processes through sets of
programmed instructions. Some PLC makers have begun marketing programmable automation
controllers, or PACs, which have features beyond those of ordinary PLCs but perform similar tasks.
PLCs are modular in construction and can be fitted with various modules for inputs, outputs, etc.
Data Acquisition System
DAS involves the process of sampling real world physical conditions and conversion of the
resulting samples into digital numeric values that can be manipulated by a computer. Physical
conditions relate to process variable or process conditions, e.g. pressure, temperature, level,
flow, alarm conditions, events, etc.
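The sampling-and-conversion step at the heart of a DAS can be sketched as a simple quantizer. The 12-bit resolution and 0-10 V input span below are assumptions for illustration, not properties of any particular converter:

```python
def adc_counts(voltage, v_min=0.0, v_max=10.0, bits=12):
    """Quantize an analog voltage into a digital count (0 .. 2**bits - 1),
    clamping readings outside the converter's input span."""
    full_scale = 2 ** bits - 1
    fraction = (voltage - v_min) / (v_max - v_min)
    return max(0, min(full_scale, round(fraction * full_scale)))

print(adc_counts(10.0))  # 4095 (full scale)
print(adc_counts(2.5))   # roughly a quarter of full scale
```

Each count is then a digital numeric value a computer can store, trend, and compare against alarm limits.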

Figure 1. 11 Configuration process interface
Universal Process/Temperature Controllers

Universal Process/Temperature Controllers are electronic devices used to control various
process parameters, including temperature. Key specifications include the intended application,
control method, input and output types, features, connections type and mounting style, number
of inputs, communication interface, as well as the power specifications for the input and output
process. Universal process/temperature controllers are used mainly in manufacturing
applications for ensuring various process values are within their operational ranges. Several
types of control methods include set point, proportional, etc. as well as input types such as
thermocouples, voltage, etc. that are used to sense and control process parameters. Controllers
can be mounted in panels, walls, DIN rails, etc. Typical applications include boilers, lasers,
tanks, molding machines, pumps, furnaces, etc.

Sensors/Transmitters/Transducers
The terms pressure sensor, pressure transducer and pressure transmitter are somewhat
interchangeable in the industrial world, but there are differences between them. Pressure sensors
can be described with a 4-20mA output signal and pressure transducers with a millivolt signal.
Once the details are described to define the output signal and application, the proper term can be
set. Here is a quick guideline on the terms and some benefits and limitations of each.

Figure 1. 12 Pressure sensors with a 4-20mA output signal and pressure transducers

Pressure sensor
A pressure sensor simply monitors pressure and can display it in one of several units known
around the world, commonly the "Pascal", the "Bar", and, in the United States, "PSI" (Pounds
per Square Inch).

Figure 1. 13 Pressure sensor unit


In a nutshell, a pressure sensor converts the pressure to a small electrical signal that is
transmitted and displayed. These are also commonly called pressure transmitters because of this.
Two common signals that are used are a 4 to 20 milliamp signal and a 0 to 5 volt signal.

Pressure transducer
A pressure transducer provides a high-level voltage or frequency output signal, including 0.5 to
4.5V ratiometric (the output signal is proportional to the supply), 1-5V and 1-6 kHz. These
output signals should be used within
twenty (20) feet of the electronics. Supply voltages are typically from 8-28VDC, except for
the 0.5-4.5V output, which requires a 5VDC regulated supply. Older voltage output signals,
such as 0-5V, do not have a "live zero" where there is signal when the sensor is at zero
pressure. The risk is that the system does not know the difference between a failed sensor with
no output and zero pressure.

Figure 1. 14 Pressure transducer

Pressure transmitter
A pressure transmitter provides a current output signal, i.e. 4-20mA (4 to 20mA); the current,
rather than the voltage, is measured on the device. TE pressure transmitters are two wire devices
(red for supply, black for the ground). 4-20mA pressure transmitters offer good electrical noise
immunity (EMI/RFI), and will need a power supply of 8-28VDC. Because the signal is a current,
it can consume more battery life when operating at full pressure.
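The "live zero" advantage of the 4-20 mA signal is straightforward to exploit in software. A hedged sketch, assuming a hypothetical transmitter ranged 0-100 PSI; the 3.8 mA fault threshold follows the common NAMUR NE 43 convention:

```python
def loop_current_to_psi(current_ma, range_min=0.0, range_max=100.0):
    """Convert a 4-20 mA loop current to engineering units.
    Currents below the live zero indicate a probable sensor or wiring fault."""
    if current_ma < 3.8:
        raise ValueError(f"{current_ma} mA is below live zero: sensor fault?")
    return range_min + (current_ma - 4.0) / 16.0 * (range_max - range_min)

print(loop_current_to_psi(12.0))  # 50.0 -- mid-range
```

With a 0-5 V signal, by contrast, a dead sensor and a genuine zero-pressure reading both produce 0 V and are indistinguishable.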

Figure 1. 15 Pressure transmitter
Indicators both analogue and digital

 Analogue Indicator
An indicator on which the value of the physical quantity measured is indicated by an index and
graduated scale, one of which is fixed and the position of the other is a continuous function of
the magnitude of the physical quantity being measured. On analogue indicators, the indication
after resetting shall be within 0.2 scale interval of zero.

Figure 1. 16 Analogue indicator


 Digital Indicator

An indicator on which the value of the physical quantity measured is represented by a series of
aligned digits which change abruptly such that no indication can be obtained between digits; a
digital indicator does not have graduation lines.

On digital indicators, the indication after resetting shall be zero.

Figure 1. 17 Digital indicator


Controllers including PLC controlled devices
Some terms associated with control systems in which the system elements are dispersed but
operated in coupled manners:

Fault Tolerant System: - A system which is designed to carry out its assigned function even
in the presence of one or more faults in the hardware or software.

Final Control Element: - The final control element is often an on/off valve or a control valve,
but may be another device such as a pump.

Mean Time between Failures: - This is the average time between failures of the different
components that make up a system, including the time to repair the fault.

Mean Time to Repair: - This is the average time taken to identify and repair a fault.
Control valves

Control valves are industrial valves specifically designed to control liquid media and gases
transmitted through a pipeline.
The operation principle of control valves is based on the need for a permanent change of flow
path by way of changing the size of the orifice of the valve. Control valves can be operated
manually, by means of a pneumatic single-piston actuator, electrically, by a solenoid or a
diaphragm actuator.
In most cases, control valves tend to leak even when fully closed. This occurs due to the peculiar
design features of these valves. Manufacturing methods and techniques help to minimize the
leakage to acceptable levels. In this case, the valve is referred to as a “shutoff” control valve.
Control valves are applied in a variety of environments: water and heat supply systems, oil and

gas pipelines, the chemical industry, combined heat and power stations, hydroelectric power
stations and nuclear power stations, etc.
The main structural components of a control valve are:
 Body,
 Trim and
 Actuator.
The trim controls fluid flow.
Trims have different designs and they are selected according to control process requirements and
operating medium characteristics.
Taking into account that control valves are often installed in pipelines with aggressive and
abrasive media, high pressure and high temperature conditions and under conditions of
cavitation, the trim sees heavy use and wears out relatively quickly. Many manufacturers
produce control valves with trims constructed as separate units. This design strategy has a
number of advantages:
 The capability to repair or replace the trim without valve removal makes trim assembly,
fitting and finishing in the course of control valve installation or repair less labor-intensive;
 The trim can be made of different materials than other body parts, thus providing better
corrosion and erosion resistance;
 For various operating media and working conditions, different trims can be installed into
a typical control valve body.

Figure 1. 18 Control valve


Actuators
An actuator is a component of a machine that is responsible for moving and controlling a
mechanism or system, for example by opening a valve. In simple terms, it is a "mover".

An actuator requires a control signal and a source of energy. The control signal is relatively low
energy and may be electric voltage or current, pneumatic, or hydraulic fluid pressure, or even
human power. Its main energy source may be an electric current, hydraulic pressure,
or pneumatic pressure. When it receives a control signal, an actuator responds by converting the
source's energy into mechanical motion. In the electric, hydraulic, and pneumatic sense, it is a
form of automation or automatic control.

Figure 1. 19 Electric valve actuator controlling ½ needle valves


Recorders
Recording precipitation automatically has the advantage that it can provide better time
resolution than manual measurements, and it is possible to reduce the evaporation and wetting
losses. Three types of automatic precipitation recorders are in general use, namely
 The weighing recording type,
 The tilting or tipping-bucket type and
 The float types.
Only the weighing type is satisfactory for measuring all kinds of precipitation, the use of the
other two types being for the most part limited to the measurement of rainfall. Some new
automatic gauges that measure precipitation without using moving parts are available. These
gauges use devices such as capacitance probes, pressure transducers, and optical or small radar
devices to provide an electronic signal that is proportional to the precipitation equivalent. The
clock device that times intervals and dates the time record is a very important component of the
recorder. Because of the high variability of precipitation intensity over a 1 min timescale, a
single 1 min rainfall intensity value is not representative of a longer time period. Therefore,
1 min rainfall intensity should not be used in a temporal sampling scheme, such as one synoptic
measurement every one or three hours. Very good time synchronization, better than 10 s, is
required between the reference time and the different instruments of the observing station.

Annunciator associated with the installed devices
In industrial process control, an annunciator panel is a system to alert operators of alarm
conditions in the plant. Multiple back-lit windows are provided, each engraved with the name of
a process alarm.
Lamps in each window are controlled by hard-wired switches in the plant, arranged to operate
when a process condition enters an abnormal state (such as high temperature, low pressure, loss
of cooling water flow, or many others).
Single point or multipoint alarm logic modules operate the window lights based on a preselected
ISA 18.1 or custom sequence. In one common alarm sequence, the light in a window will flash
and a bell or horn will sound to attract the operator’s attention when the alarm condition is
detected.
The operator can silence the alarm with a button, and the window will remain lit as long as the
process is in the alarm state. When the alarm clears (process condition returns to normal), the
lamps in the window go out.
Annunciator panels were relatively costly to install in a plant because they had dedicated wiring
to the alarm initiating devices in the process plant.
Since incandescent lamps were used, a lamp test button was always provided to allow early
detection of failed lamps. Modern electronic distributed control systems usually require less
wiring since the process signals can be monitored within the control system, and the engraved
windows are replaced by alphanumeric displays on a computer monitor.
The behavior of alarm systems, and the colors used to indicate alarms, are standardized. Standards such as ISA 18.1 or EN 60073 simplify the purchase of systems and the training of operators by defining standard alarm sequences.
Principle
Whenever an input contact changes state, from normally open to closed or from normally closed to open, the annunciator changes from the rest condition to the alarm condition. The fault input is thus recognized immediately, with a corresponding visual and audio alarm as per the selected program sequence.
The base unit of an alarm annunciator has four programmable keys for the mute, acknowledge, reset, and test functions.
 Pressing the Mute key deactivates the internal buzzer.
 The Acknowledge key is used to accept the fault condition.
 The Reset key resets the alarm annunciator to its default state.
 The Test key performs a complete test of the system.
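The sequence described above can be sketched as a small state machine. This is an illustrative model, not code from any annunciator product: the state names and methods are invented here, and only the mute and acknowledge keys are modeled.

```python
# Minimal sketch of one annunciator window following a common
# ISA 18.1-style sequence: flash + horn on a new alarm, steady lamp
# after acknowledge, lamp out once the process returns to normal.

class AnnunciatorWindow:
    NORMAL, UNACKNOWLEDGED, ACKNOWLEDGED = "normal", "unacked", "acked"

    def __init__(self):
        self.state = self.NORMAL
        self.lamp_flashing = False
        self.horn_on = False

    def process_input(self, in_alarm: bool):
        """Called when the field contact changes state."""
        if in_alarm and self.state == self.NORMAL:
            self.state = self.UNACKNOWLEDGED
            self.lamp_flashing = True   # flashing lamp attracts the operator
            self.horn_on = True         # audible alarm sounds
        elif not in_alarm and self.state == self.ACKNOWLEDGED:
            self.state = self.NORMAL    # alarm cleared: lamp goes out
            self.lamp_flashing = False

    def mute(self):
        self.horn_on = False            # silences the buzzer only

    def acknowledge(self):
        if self.state == self.UNACKNOWLEDGED:
            self.state = self.ACKNOWLEDGED
            self.lamp_flashing = False  # lamp stays lit, but steady
            self.horn_on = False

    @property
    def lamp_on(self):
        return self.state != self.NORMAL
```

A new alarm flashes the lamp and sounds the horn; acknowledging steadies the lamp; the lamp goes out only once the process condition returns to normal, matching the sequence described above.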
Figure 1. 20 Annunciators vs. SCADA alarm systems
SCADA systems were formerly considered the preferred alternative to discrete annunciators. A
software-based solution, with almost endless ability to analyze, present and process alarms, has
the potential for replacing discrete alarms switches altogether. However, software carries its own
reliability risks.
New annunciator panels use long-lasting, bright LEDs that significantly reduce the cost and maintenance of the panels. These new versions of the traditional system are still preferred over computer-based systems, especially in critical plants such as nuclear power generation and oil and gas.
In addition, the latest annunciator designs feature clever electronics that give them very high immunity to noise, and can therefore reduce the number of false alarms due to noise.
Process switches
Another type of instrument commonly seen in measurement and control systems is the process switch. The purpose of a switch is to turn devices such as heaters, motors, and valves on and off as process conditions vary.
Usually, switches are used to activate alarms that alert human operators to take special action, or to trip or initiate interlocks.
In other situations, switches are used directly as control devices. For example, a level switch installed in a tank can protect a pump: the pump is automatically stopped (tripped) when the tank level drops too low.
Process Switch with Alarms
The following P&ID of a compressed air control system shows both uses of process switches:
Figure 1. 21 P&ID of a compressed air control system

The “PSH” (pressure switch, high) activates when the air pressure inside the vessel reaches its
high control point. The “PSL” (pressure switch, low) activates when the air pressure inside the
vessel drops down to its low control point.
Both switches feed discrete (on/off) electrical signals to a logic control device (symbolized by
the diamond) which then controls the starting and stopping of the electric motor-driven air
compressor.
Another switch in this system labeled “PSHH” (pressure switch, high-high) activates only if the
air pressure inside the vessel exceeds a level beyond the high shut-off point of the high-pressure
control switch (PSH).
If this switch activates, something has gone wrong with the compressor control system, and the
high-pressure alarm (PAH, or pressure alarm, high) activates to notify a human operator.
All three switches in this air compressor control system are directly actuated by the air pressure
in the vessel: in other words, these are direct process-sensing switches allowing us to build
on/off control systems and alarms for any type of process.
Example of Process Alarms and Switches
For example, the chlorine wastewater disinfection system shown earlier may be equipped with a
couple of electronic alarm switches to alert an operator if the chlorine concentration ever
exceeds pre-determined high or low limits:
Figure 1. 22 chlorine wastewater disinfection systems

1.3.2 Adopt instrumentation and Control standards


A. Traceability and calibration hierarchy

 Hierarchy of the standards and calibration services


To be able to compare measuring results, they must be “traceable” to a national or international standard
via a chain of comparative measurements. To this end, the displayed values of the measuring instrument
used or a measurement standard are compared over one or several stages to this standard. At each of
these stages, calibration with a standard previously calibrated with a higher-ranking standard is carried
out. In accordance with the ranking order of the standards, from the working or factory standard through the reference standard up to the national standard, the calibration bodies form a calibration hierarchy. This ranges from the in-house calibration laboratory to the accredited calibration laboratory and on to the national metrological institute (Figure 1.23).

Figure 1. 23 Calibration hierarchy described by the example of Germany
 Traceability in practice
The German Calibration Service DKD (Deutscher Kalibrierdienst) designates the following as essential elements of traceability:
• The chain of comparison must not be interrupted.
• In each stage of the calibration chain, the measurement uncertainty must be known so
that the total measurement uncertainty can be calculated. As a rule, a higher-ranking
measuring instrument should have a measuring accuracy three to four times higher than
the instrument calibrated by it.
• Each stage of the calibration chain must be documented as must the result.
• All bodies carrying out a stage in this chain must prove their competence by means of
accreditation.
• Depending on the required measuring accuracy and technical requirements, calibrations
must be repeated at appropriate intervals
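Two of the requirements above can be illustrated with a short sketch: combining the measurement uncertainties of the stages of a calibration chain, and the rule of thumb that the higher-ranking instrument should be three to four times more accurate than the instrument it calibrates. All numbers here are invented for illustration.

```python
# Root-sum-square combination assumes the stage uncertainties are
# independent; the 4:1 default ratio reflects the rule of thumb above.
import math

def combined_uncertainty(*stage_uncertainties):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in stage_uncertainties))

def accuracy_ratio_ok(dut_tolerance, reference_uncertainty, ratio=4.0):
    """True if the reference is at least `ratio` times better than the DUT."""
    return dut_tolerance / reference_uncertainty >= ratio

# Example: a gauge with a 0.10 bar tolerance checked against a reference
# standard with 0.02 bar uncertainty gives a 5:1 ratio -> acceptable.
```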
B. Calibration on an international level
 BIPM
On an international level, the BIPM (International Bureau of Weights and Measures, from the French Bureau International des Poids et Mesures) coordinates the development and maintenance of primary standards and the organization of international comparative measurements. Decisions about the representation of the primary standards are made by the CGPM (General Conference on Weights and Measures, from the French Conférence Générale des Poids et Mesures). The participants of the conferences, which take place every four to six years, are the representatives of the 51 signatory states of the international Metre Convention and the representatives of the 26 associated member states without full voting rights.
 National metrological institutes
On a national level, national metrological institutes are in most cases responsible for metrology. They maintain the national standards to which all calibrated measuring instruments can be traced and ensure that these primary standards are comparable on an international level.
 Accredited calibration laboratories
Accredited calibration laboratories often take on calibration as external service providers for
those companies that do not have the required equipment themselves. However, they themselves
can also be part of a company and calibrate all measuring instruments within it.
To this end, they are equipped with their own working or factory standards which are calibrated
at the proper time intervals with the smallest possible measurement uncertainty using the
reference standard of the appropriate national metrological institute or other accredited
calibration laboratories.
 Professional calibration
The professional execution of calibrations is governed by various standards, regulations and
directives. For a measuring instrument to be calibrated in the first place, it must fulfill certain
basic requirements. The physical conditions under which calibration can be carried out must also
be known and taken into account.
Under these conditions, it is possible to select a calibration procedure suitable for the
requirements.
 Standards, regulations and calibration directives
In essence, regulations for the calibration of measuring instruments take effect whenever a
company decides to observe a standard or directive for its calibration or when it manufactures
products whose production is subject to legal regulations.
 Quality assurance standards
Of great importance for quality assurance are standards and directives such as the ISO 9000
series of standards, which is being implemented more and more frequently in all industrialized
nations. In Clause 7.6 “Control of monitoring and measuring equipment” of the ISO 9001:2008
standard “Quality management systems – Requirements”, there is a specific requirement that
any inspection equipment that directly or indirectly affects the quality of the products must be
calibrated. This includes, for example, test equipment used as a reference in measurement rooms
or directly in the production process.
The ISO 9000 standards do not stipulate a validity period for calibrations – which would not
make a lot of sense given the different technologies of measuring devices – but they do specify
that any inspection equipment must be registered and then a distinction must be made as to
whether or not it must be regularly calibrated. Inspection plans must be drawn up in which the
scope, frequency, method and acceptance criteria are defined. Individual calibrations are to be
documented in detail. Labels on the measuring instruments (Fig. 1.24) or appropriate lists must
show when each piece of inspection equipment needs to be recalibrated.
Figure 1. 24 Calibration sticker on a mechanical pressure gauge

It is essential to recalibrate when a measuring instrument has been altered or damaged during
handling, maintenance or storage.
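The recalibration-due information that a calibration sticker encodes can be modeled with a simple date calculation. The 365-day interval below is only an example; as noted above, the standards deliberately leave the interval to the inspection plan.

```python
# Last calibration date plus a defined interval gives the next due date.
from datetime import date, timedelta

def next_due(last_calibrated: date, interval_days: int) -> date:
    """Date by which the instrument must be recalibrated."""
    return last_calibrated + timedelta(days=interval_days)

def is_overdue(last_calibrated: date, interval_days: int, today: date) -> bool:
    """True if the recalibration deadline has passed."""
    return today > next_due(last_calibrated, interval_days)

# A gauge calibrated on 2023-01-10 with a 365-day cycle is due 2024-01-10.
```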
C. Requirements for measurement management systems
Closely related to the ISO 9000 series of standards in terms of its structure is the ISO
10012:2004 standard “Measurement management systems – Requirements for measurement
processes and measuring equipment”. It defines the requirements of the quality management
system that can be used by companies in order to establish confidence in the measurement
results obtained. In measurement management systems, it is not only the measuring device but
the entire measuring process that is considered. This means that those responsible not only have
to determine the measurement uncertainty during calibration, but also have to verify and
evaluate the measurement uncertainty in use. To this end, statistical methods are also used.
 Industry-specific directives
In addition to such universal standards, individual sectors of industry have their own directives
for the quality assurance of measuring devices, for example the automotive industry. American
automobile manufacturers have developed the QS 9000 directive in which the ISO 9000
standards have been substantially supplemented by industry- and manufacturer-specific requirements, and in part tightened.
In the meantime, the American QS 9000, the German VDA 6.1 and other country-specific regulations have been combined to some extent in the international ISO/TS 16949 standard. This saves many suppliers multiple certifications.
 Legal provisions
Quality assurance standards and directives must only be observed by companies that want to be
certified. The situation is completely different when, for example, drugs, cosmetics or foodstuffs
are being manufactured. Here legal regulations, whose compliance is controlled by state
agencies, often apply. Due to international trade relations, the regulations of the American Food
and Drug Administration (FDA) are important. Thus, the Code of Federal Regulation (CFR)
requires the “calibration of instruments, apparatus, gauges, and recording devices at suitable
intervals in accordance with an established written program containing specific directions,
schedules, limits for accuracy and precision, and provisions for remedial action in the event
accuracy and/or precision limits are not met”. European laws have similar stipulations.

 Here are some examples of organizations that develop instrumentation and control standards:

 ISA (Instrumentation, Systems, and Automation Society, formerly Instrument Society of America)
The International Society of Automation (ISA) was formerly known as The Instrumentation, Systems, and Automation Society.
The Society's membership includes:
 Engineers
 Technicians
 Businesspeople
 Educators
 Students
The society is more commonly known by its acronym, ISA, and the society's scope now
includes many technical and engineering disciplines. ISA is one of the foremost professional
organizations in the world for setting standards and educating industry professionals in
automation. Instrumentation and automation are some of the key technologies involved in nearly
all industrialized manufacturing. Modern industrial manufacturing is a complex interaction of
numerous systems. Instrumentation provides regulation for these complex systems using many
different measurement and control devices. Automation provides the programmable devices that
permit greater flexibility in the operation of these complex manufacturing systems.
In addition to the member-driven aspects of the ISA, major ISA interests and products are
divided into departments headed by a department vice president.
These departments are:
 Automation & Technology
 Industries & Sciences
 Image & Membership
 Professional Development
 Publications
 Standards & Practices
 Strategic Planning
 Web
Technical divisions
ISA’s technical divisions, established for the purpose of increased information exchange within
tightly focused segments of the fields of instrumentation, systems, and automation are organized
under the Automation & Technology or Industries & Sciences Departments, depending upon the
nature of the division.
The divisions in the Automation & Technology Department are:
 Analysis
 Automatic Control Systems
 Computer Technology
 Management
 Process Measurement &Control
 Robotics & Expert Systems
 Safety
 Telemetry & Communications
 Test Measurement

Industries & Sciences Divisions are:


 Aerospace Industries
 Chemical & Petroleum Industries
 Construction & Design
 Food & Pharmaceuticals Industries
 Mining & Metals Industries
 Power Industry
 Pulp & Paper Industries
 Water & Wastewater Industries

 ANSI (American National Standards Institute)


The American National Standards Institute (ANSI) is a private non-profit organization that
oversees the development of voluntary consensus standards for products, services, processes,
systems, and personnel in the United States. The organization also coordinates U.S. standards
with international standards so that American products can be used worldwide.
ANSI accredits standards that are developed by representatives of other standards
organizations, government agencies, consumer groups, companies, and others. These standards
ensure that the characteristics and performance of products are consistent, that people use the
same definitions and terms, and that products are tested the same way. ANSI also accredits
organizations that carry out product or personnel certification in accordance with requirements
defined in international standards. ANSI's members are government agencies, organizations,
corporations, academic and international bodies, and individuals. In total, the Institute represents
the interests of more than 125,000 companies and 3.5 million professionals.
Process
Though ANSI itself does not develop standards, the Institute oversees the development and use
of standards by accrediting the procedures of standards developing organizations. ANSI
accreditation signifies that the procedures used by standards developing organizations meet the
Institute's requirements for openness, balance, consensus, and due process.
ANSI also designates specific standards as American National Standards, or ANS, when the
Institute determines that the standards were developed in an environment that is equitable,
accessible and responsive to the requirements of various stakeholders.
Voluntary consensus standards quicken the market acceptance of products while making clear
how to improve the safety of those products for the protection of consumers. There are
approximately 9,500 American National Standards that carry the ANSI designation.
 The American National Standards process involves: consensus by a group that is open to
representatives from all interested parties
 Broad-based public review and comment on draft standards
 Consideration of and response to comments
 Incorporation of submitted changes that meet the same consensus requirements into a
draft standard
 Availability of an appeal by any participant alleging that these principles were not
respected during the standards development process.

 ASME (American Society of Mechanical Engineers)


The American Society of Mechanical Engineers (ASME) is a professional association that, in its
own words, "promotes the art, science, and practice of multidisciplinary engineering and allied
sciences around the globe" via "continuing education, training and professional development,
codes and standards, research, conferences and publications, government relations, and other
forms of outreach.” ASME is thus an engineering society, a standards organization, a research
and development organization, a lobbying organization, a provider of training and education,
and a nonprofit organization. Founded as an engineering society focused on mechanical
engineering in North America, ASME is today multidisciplinary and global.
ASME has over 130,000 members in 158 countries worldwide.
ASME is one of the oldest standards-developing organizations in America. It produces
approximately 600 codes and standards covering many technical areas, such as fasteners,
plumbing fixtures, elevators, pipelines, and power plant systems and components. ASME's
standards are developed by committees of subject matter experts using an open, consensus-based
process. Many ASME standards are cited by government agencies as tools to meet their
regulatory objectives. ASME standards are therefore voluntary, unless the standards have been
incorporated into a legally binding business contract or incorporated into regulations enforced
by an authority having jurisdiction, such as a federal, state, or local government agency. ASME's
standards are used in more than 100 countries and have been translated into numerous
languages.
The largest ASME standard, both in size and in the number of volunteers involved in its
preparation, is the ASME Boiler and Pressure Vessel Code (BPVC). The BPVC provides rules
for the design, fabrication, installation, inspection, care, and use of boilers, pressure vessels, and
nuclear components. The code also includes standards on materials, welding and brazing
procedures and qualifications, nondestructive examination, and nuclear in-service inspection.

 NEC (National Electric Code)


The National Electrical Code (NEC), or NFPA 70, is a regionally adoptable standard for the safe
installation of electrical wiring and equipment in the United States. It is part of the National Fire
Codes series published by the National Fire Protection Association (NFPA), a private trade
association. Despite the use of the term "national", it is not a federal law. It is typically adopted
by states and municipalities in an effort to standardize their enforcement of safe electrical
practices. In some cases, the NEC is amended, altered and may even be rejected in lieu of
regional regulations as voted on by local governing bodies.
The "authority having jurisdiction" inspects for compliance with these minimum standards.

 IEC (International Electrotechnical Commission)


The International Electrotechnical Commission (IEC) is the leading global organization that prepares and publishes International Standards for all electrical, electronic and related technologies.
The International Electrotechnical Commission was formed as a result of the Resolution of the Chamber of Government Delegates at the International Electrical Congress of St. Louis (U.S.A.) in September 1904.
The name of the organization is the International Electrotechnical Commission; the abbreviated title is "the IEC". The organization is constituted as a corporate association with legal entity in accordance with Articles 60 et seq. of the Swiss Civil Code.
The work of the IEC is conducted under the IEC Statutes and Rules. These IECEx Conformity
Mark Licensing Regulations are subordinate to the IEC Statutes and Rules with the IEC Statutes
and Rules taking precedence over any requirement contained within these IECEx Conformity
Mark Licensing Regulations, should a conflict arise.
The IECEx Scheme is the Scheme of the IEC for Conformity Assessment to Standards relating
to Equipment for Use in Explosive Atmospheres, as provided for in accordance with Article 2 of
the IEC Statutes and Rules, 2001 edition, (incorporating amendments approved by Council in
2004 and 2005).
The standard is suitable for use whenever any reference to an instrument is required in the
chemical, petroleum, power generation, air conditioning, metal refining, and numerous other
industries.

The standard is intended to provide sufficient information to enable anyone reading a flow
diagram and having a reasonable amount of plant knowledge to understand the means of
measurement and control of the process without having to go into the details of the
instrumentation that require the knowledge of an instrument specialist.

1.4. Instrumentation and control devices Configuration

Configuration refers to the arrangement of elements in a system to achieve a specific purpose.


It can be divided into two types: hardware configuration and software configuration.

Hardware configuration refers to the physical arrangement of hardware components in a


system. It includes the selection of hardware components, their compatibility, and their
placement in the system.

Software configuration refers to the arrangement of software components in a system. It


includes the selection of software components, their compatibility, and their placement in the
system.

Calibration is the comparison of measurement results provided by a device under test with those of a calibration standard of established accuracy; it is a core activity in measurement technology and metrology. In essence, calibration is simply a comparison of measurements between the equipment under test and the standard equipment. Calibration can be done on a variety of instruments in a variety of industries. Let’s have a look at some of the most common calibration procedures:
1. Electrical Calibration: Electrical calibration is the process of ensuring that any instrument that measures or tests electrical properties such as voltage, current, resistance, inductance, capacitance, time, and frequency is operating properly. The following instruments are frequently submitted for electrical calibration: data loggers, electric meters, multimeters, oscilloscopes, frequency counters, insulation testers, loop testers, etc.
2. Mechanical Calibration: Mechanical instruments are prone to drift as a result of repeated use, mechanical stress, exposure to fluctuating atmospheric conditions, and similar influences; mechanical calibration is the remedy for the errors this induces in the equipment. Mass, volume, density, force, torque, dimension, angle, flatness, and vibration are the major properties calibrated during mechanical calibration, in a temperature-controlled atmosphere. The following are some of the most commonly tested mechanical calibration instruments: accelerometers; scales and balances; force gauges and load cells; micrometers, vernier calipers, and height gauges; screwdrivers and torque wrenches; and weight and mass sets.
3. Flow Calibration: A flow meter (also known as a flow sensor) is a device that measures
the linear or non-linear, mass or volumetric flow rate of a liquid or gas. The flow rate is
the rate at which a process fluid moves through pipelines, orifices, or vessels at a given
time, and it is measured by instruments through a controlled manner to monitor and
regulate the speed and efficiency of industrial flow processes and devices.
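Common to all of these procedures is the comparison step itself: apply known standard values, record the device-under-test readings, and judge the error at each point against an allowed tolerance. The test points and tolerance below are invented for illustration.

```python
# Minimal sketch of an as-found calibration comparison.

def calibration_report(points, tolerance):
    """points: list of (standard_value, dut_reading) pairs.
    Returns (standard, reading, error, within_tolerance) per point."""
    report = []
    for standard, reading in points:
        error = reading - standard
        report.append((standard, reading, error, abs(error) <= tolerance))
    return report

# Five-point check of a 0-100 unit instrument with a +/-0.5 unit tolerance:
data = [(0, 0.1), (25, 25.3), (50, 50.6), (75, 74.8), (100, 99.7)]
results = calibration_report(data, tolerance=0.5)
```

In this example the mid-scale point fails (error 0.6 units against a 0.5-unit tolerance), which is exactly the kind of finding that would trigger adjustment or repair.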
The configuration process is based on electronic data sheets (EDS files) provided by the device manufacturers, which contain the relevant communication parameters. Two networks commonly configured this way are:

 DeviceNet, a digital, multi-drop network that shares its upper-layer protocol (CIP) with EtherNet/IP, serving as a communication network between industrial controllers and offering a single point of connection for configuration by supporting both I/O and explicit messaging.
 ControlNet, which offers good real-time capabilities, likewise uses the CIP application layer, and provides high-speed deterministic transmission for time-critical I/O data and messaging data.

The first step in operating a smart differential pressure transmitter is to set it up properly. This involves correct electrical wiring and positioning.

ELECTRIC WIRING

Reach the wiring block by removing the electrical connection cover. This cover can be locked closed by the cover locking screw (Figure 1.25). To release the cover, rotate the locking screw clockwise.
Figure 1. 25 Housing rotating Set Screw
The test and communication terminals allow you, respectively, to measure the current in the 4-20 mA loop without opening it, and to communicate with the transmitter. To measure the current, connect a multimeter on the mA scale across the “-” and “+” terminals; to communicate, use a HART configurator across the “COMM” and “-” terminals.

The wiring block has screws on which fork or ring-type terminals can be fastened. See Figure 1.26.

Figure 1. 26 wiring block


For convenience there are two ground terminals: one inside the cover and one external, located close to the conduit entries.

Use of twisted-pair cable (22 AWG or larger) is recommended.

Avoid routing signal wiring close to power cables or switching equipment.

Unused outlet connections should be plugged and sealed accordingly.

The LD301 is protected against reverse polarity.
Figure 1.27, the conduit installation diagram, shows the correct installation of the conduit, in order to avoid penetration of water or other substances, which may cause malfunctioning of the equipment.

Figure 1. 27 Conduit installation diagram

When the sensor is in the horizontal position, the weight of the fluid pushes the diaphragm down, making a lower-pressure trim necessary (see Figure 1.28).
Figure 1. 28 sensor positions
Connection of the LD301 working as a transmitter should be done as in Figure 1.29.
Connection of the LD301 working as a controller should be as indicated in Figure 1.30.
Connection of the LD301 in multi-drop configuration should be done as in Figure 1.31. Note that a maximum of 15 transmitters can be connected on the same line and that they should be connected in parallel.
Pay attention to the power supply as well when many transmitters are connected on the same line: the current through the 250-ohm resistor will be high, causing a large voltage drop. Therefore, make sure that the power supply voltage is sufficient.
The hand-held terminal can be connected to the communication terminals of the transmitter, or at any point of the signal line, by using the alligator clips. It is also recommended to ground the shield of shielded cables at only one end; the ungrounded end must be carefully isolated.
Note: Make sure that the transmitter is operating within the operating area shown on the load curve (Figure 1.32). Communication requires a minimum load of 250 ohms.
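The supply-voltage concern can be checked with a quick calculation. Two assumptions are made here that should be verified against the actual datasheet: that each multi-dropped transmitter parks at a fixed 4 mA, and that the transmitter needs at least 12 V at its terminals.

```python
# Sketch of the minimum supply-voltage check for the multi-drop case.
# ASSUMPTIONS (not from the LD301 manual): 4 mA per parked device,
# 12 V minimum terminal voltage at the transmitter.

def min_supply_voltage(n_transmitters, r_load=250.0,
                       i_per_device=0.004, v_min_terminal=12.0):
    loop_current = n_transmitters * i_per_device      # total current, A
    drop = loop_current * r_load                      # V lost across resistor
    return v_min_terminal + drop

# 15 transmitters: 0.060 A * 250 ohm = 15 V drop, so about 27 V is needed.
```

This makes the warning concrete: with the full complement of 15 transmitters, the drop across the 250-ohm resistor alone is 15 V, so an ordinary 24 V supply would not be sufficient under these assumptions.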
Figure 1. 29 wiring diagram for the LD301 working as a transmitter

Figure 1. 30 wiring diagram for the LD301 working as a controller (optional)

Figure 1. 31 wiring diagram for the LD301 in multidrop configuration
Figure 1. 32 Load curve
When using the instrument as a transmitter or as a controller, the connection diagrams are essentially the same; their responses to the input, however, are different.
1.5. Material, Tools, equipment and testing devices

1.5.1. Materials and Tools


A. Pliers (assorted)
These are made of metal with insulated handles and are used for cutting, twisting, bending, holding, and gripping wires and cables.

Figure 1. 33 Different types of pliers

B. Screw drivers (assorted)


These tools are made of steel, hardened and tempered at the tip, and are used to loosen or tighten screws with slotted heads. They come in various sizes and shapes.
Figure 1. 34 Different types of screwdrivers

C. Soldering iron/gun
Soldering irons are devices that convert electrical energy into heat through a purpose-designed high-resistance wire serving as the heating element. They are used to solder electronic circuits or to connect wires and other materials using solder, as well as fluxes and other aids that either strengthen the connection or clean the contacts.

Figure 1. 35 different types of soldering iron


D. Wrenches (assorted)
The wrenches most often used in maintenance are classified as open-end, box-end, socket,
adjustable, ratcheting and special wrenches. The Allen wrench, although seldom used, is
required on one special type of recessed screw. Solid, nonadjustable wrenches with open parallel
jaws on one or both ends are known as open-end wrenches. Box-end wrenches are popular tools
because of their usefulness in close quarters. They are called box wrenches since they box, or
completely surround the nut or bolt head.

Figure 1. 36 Different types of wrenches

1.5.2. Device, instrument and Equipment


i. Water level
It is a device used for establishing or matching surface elevations at locations too far apart for a
spirit level to span. Alcohol such as ethanol is often used rather than water because alcohols have
lower viscosity and surface tension.

It is also used for measuring 90° (right) angles accurately; millimetre measurements are marked
on its scale.

Figure 1. 37 Different types of water level

ii. Tri-square
When measuring angles, you will frequently have to determine the angle between parts or units of
aircraft. In layout work, it is also sometimes necessary to measure angles. The tools most
frequently used for measuring angles are tri-squares, combination sets, angle gauges, and levels.

Figure 1. 38 Tri-square

iii. Measuring tape
They extend easily and coil back into their cases for stowage, and you can conveniently carry
one in your pocket. They are also less bulky to handle than the large steel tapes. You pull the
flexible steel tape, shown in Figure 1.39, from its case by hand; when you want it back in the
case, wind it in with the crank. Tapes of this type are long, flexible steel rules, usually
furnished in 3 m, 8 m, and 15 m lengths.

Figure 1. 39 measuring tape


iv. Calipers
Layout and measuring devices are precision tools. They are carefully machined, accurately
marked and, in many cases, are made up of very delicate parts. When using these tools, be
careful not to drop, bend, or scratch them. The finished product will be no more accurate than
the measurements or the layout; therefore, it is very important to understand how to read, use,
and care for these tools.

A vernier scale is a device that lets the user measure more precisely than could be done unaided
when reading a uniformly divided straight or circular measurement scale. It is a scale that
indicates where the measurement lies in between two of the marks on the main scale. Verniers
are common on sextants used in navigation, scientific instruments used to conduct experiments,
machinists' measuring tools (all sorts, but especially calipers and micrometers) used to work
materials to fine tolerances. An ordinary vernier caliper has jaws you can place around an object,
and on the other side jaws made to fit inside an object. These secondary jaws are for measuring
the inside diameter of an object. Also, a stiff bar extends from the caliper as you open it that can
be used to measure depth. Gauges come in several types; they may include strain gauges, dial
gauges, combination gauges, etc.

Figure 1. 40 vernier scale

v. Gauges
Used for determining the size of wires/conductors; the gauge ranges from 0 to 60 AWG
(American Wire Gauge).

Figure 1. 41 wire gauge
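Where a physical wire gauge tool is not at hand, the AWG number maps to conductor diameter by the standard relation d(mm) = 0.127 × 92^((36−n)/39), sketched below (valid for gauge numbers 0 and above):

```python
# The standard AWG size-to-diameter relation:
# d(mm) = 0.127 * 92 ** ((36 - n) / 39)

def awg_diameter_mm(awg: int) -> float:
    """Nominal conductor diameter in millimetres for a given AWG number."""
    return 0.127 * 92 ** ((36 - awg) / 39)

print(round(awg_diameter_mm(10), 2))  # 2.59 mm
print(round(awg_diameter_mm(0), 2))   # 8.25 mm -> smaller AWG number = thicker wire
```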

1.5.3. Equipment/testing devices


A. Signal sources & generators
GS610 - Source measure unit: The GS610 is a high accuracy, high speed programmable voltage
and current source that incorporates both generation and measurement functions as well as USB
storage and an Ethernet interface. As the GS610 can operate as a current source or a current
sink, a wide range of electrical characteristics can be evaluated.
 Wide range sink and source operation (3.2 A, 110 V, 60 W)
 Precise pulse generation (down to 100 µs width with 1 µs resolution)
 Battery simulator version available
GS820 - Multi channel source measure unit: The GS820 is a highly accurate multi-channel
voltage/current source measure unit that incorporates voltage/current generation as
well as USB storage and an Ethernet interface. Since the two source channels and two
measuring channels can be operated independently, almost all electrical characteristics can be
evaluated.
 Dual sink and source operation: 7 V and 3.2 A or 18 V and 1.2 A
 Precise pulse generation (down to 100 μsec width with 0.1 μsec res.)
 50 V version available, 50 V and 0.6 A or 20 V and 1.2 A
FG400 – Arbitrary/function generators: The FG400 provides basic waveforms plus 25 types of
application-specific waveforms as standard and generates signals quickly and easily. Acquire
signals using a Yokogawa oscilloscope or ScopeCorder and use the 16-bit arbitrary waveform
capabilities to reproduce them or add them to other signals.

 1 or 2 independent or synchronised channels
 0.01 μHz to 30 MHz
 Precise phase and frequency control between channels
 20 V peak to peak and 42 V isolation between outputs
Refer to other tools and equipment in your unit of competence on calibrating instrumentation
and control devices, and also the Level I unit of competence, Use Hand Tools and Testing Equipment.
B. Calibration bench
The Calibration Bench is the ultimate multifunction calibration station from Time Electronics. Each
calibration bench is custom-made to meet specific user requirements. Offering versatility and
precision, it is ideal for laboratories and workshops in need of multi-product testing and
instrument calibration that meets the highest industry standards.

A wide range of modules can be fitted to the primary console creating a highly flexible system
that is both functional and easy to use. Further expansion can be achieved by adding the
secondary console, mounted under the primary.

Calibration modules cover:

 Electronic signal
 Temperature
 Loop and pressure applications
 Power supplies
 DMMs
 Oscilloscopes
 Signal generators
Functions are clearly defined on each module, and a competent technician will quickly master
the operation of the system without training or constant reference to manuals. Various fittings,
functions, and additional devices can be added to the Calibration Bench to create a comprehensive
work environment.

Figure 1. 42 Calibration Bench

Pressure: Ranges from vacuum to 600 bar. Also available is an Automatic Pressure Calibrator
that allows 4 preset points to be selected at the push of a button, or by EasyCal remote
control.
Power: Variable AC mains, variable DC and fixed quad or dual DC supplies can be fitted.
Loop and Temperature: High-accuracy loop calibrator modules with source, measure and
sink functions. Temperature calibrators are capable of measuring and simulating
RTDs/thermocouples.
External Options: Pneumatic and hydraulic handheld calibration pumps, dry block
calibrators, solder stations, vices, laboratory furniture, and more.
C. Air condition Equipped room
For calibration purposes, a calibration bench and equipment are not sufficient by themselves;
depending on the type of calibration, there are additional requirements for performing correct
and standardized calibration. An air-conditioned room is a room fitted with equipment that
maintains the room's internal temperature at the optimum level required for a specific purpose.
This is especially necessary in a laboratory during calibration and testing, to control or avoid
undesired environmental influences.

In addition, several areas require air conditioning; such controlled conditions are also a
requirement during calibration. Typical applications that benefit from precision air
conditioning include:

 Medical equipment suites (MRI, CAT scan)
 Hospital facilities (operating and isolation rooms)
 Clean rooms
 Laboratories
 Data centers
 Server and computer rooms
 Telecommunications (wiring closets, switchgear rooms, cell sites)
 Printer/copier/CAD centers
D. Air supply equipment or instrument
Compressor: Compressors used for instrument air delivery are available in various types and
sizes, from rotary screw (centrifugal) compressors to positive displacement (reciprocating
piston) types. The size of the compressor depends on the size of the facility, the number of

control devices operated by the system, and the typical bleed rates of these devices. The
compressor is usually driven by an electric motor that turns on and off, depending on the
pressure in the volume tank. For reliability, a full spare compressor is normally installed.

Power Source: A critical component of the instrument air control system is the power source
required to operate the compressor. Because high-pressure natural gas is abundant and readily
available, gas pneumatic systems can run uninterrupted on a 24-hour, 7-day per week schedule.
The reliability of an instrument air system, however, depends on the reliability of the
compressor and electric power supply. Most large natural gas plants have either an existing
electric power supply or have their own power generation system. For smaller facilities and
remote locations, however, a reliable source of electric power can be difficult to assure. In some
instances, solar-powered battery-operated air compressors can be cost effective for remote
locations, which reduce both methane emissions and energy consumption. Small natural gas-
powered fuel cells are also being developed.

Dehydrators: Dehydrators, or air dryers, are an integral part of the instrument air compressor
system. Water vapor present in atmospheric air condenses when the air is pressurized and
cooled, and can cause a number of problems to these systems, including corrosion of the
instrument parts and blockage of instrument air piping and controller orifices. For smaller
systems, membrane dryers have become economical. These are molecular filters that allow
oxygen and nitrogen molecules to pass through the membrane, and hold back water molecules.
They are very reliable, with no moving parts, and the filter element can be easily replaced. For
larger applications, desiccant (alumina) dryers are more cost effective.

Volume Tank: The volume tank holds enough air to allow the pneumatic control system to have
an uninterrupted supply of high-pressure air without having to run the air compressor
continuously. The volume tank allows a large withdrawal of compressed air for a short time,
such as for a motor starter, pneumatic pump, or pneumatic tools, without affecting the process
control functions.

E. Power supply equipment


A power supply is an electronic device that supplies electric energy to an electrical load.

The primary function of a power supply is to convert one form of electrical energy to another.
As a result, power supplies are sometimes referred to as electric power converters.

Some power supplies are discrete, stand-alone devices, whereas others are built into larger
devices along with their loads. Examples of the latter include power supplies found in desktop
computers and consumer electronics devices.

Figure 1. 43 Digital DC power supply, analog DC power supply, digital dual power supply

Power supplies are categorized in various ways, including by functional features.

Example:

A regulated power supply is one that maintains a constant output voltage or current despite
variations in load current or input voltage.

An unregulated power supply's output can change significantly when its input voltage or load
current changes.

Adjustable power supplies allow the output voltage or current to be programmed by mechanical
controls (e.g., knobs on the power supply front panel), or by means of a control input, or both.

Generally, depending on the output voltage, the types of power supplies are:

DC power supply: A DC power supply is one that supplies a constant DC voltage to its load.
Depending on its design, a DC power supply may be powered from a DC source or from an AC
source such as the power mains.

AC-to-DC supply: Some DC power supplies use AC mains electricity as an energy source.
Such power supplies will sometimes employ a transformer to convert the input voltage to a
higher or lower AC voltage. A rectifier is used to convert the transformer output voltage to a

varying DC voltage, which in turn is passed through an electronic filter to convert it to an
unregulated DC voltage.
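The smoothing step just described can be quantified with the usual back-of-envelope estimate for a full-wave rectifier with a reservoir capacitor: ripple ≈ I_load / (2·f·C). The values below are assumed, illustrative figures:

```python
# Approximate peak-to-peak ripple of a full-wave rectified, capacitor-filtered
# DC supply: V_ripple = I_load / (2 * f_mains * C). Illustrative values.

def ripple_voltage(i_load_a, mains_freq_hz, cap_f):
    """Estimated peak-to-peak ripple voltage (volts)."""
    return i_load_a / (2 * mains_freq_hz * cap_f)

# 0.5 A load, 50 Hz mains, 4700 uF reservoir capacitor
print(round(ripple_voltage(0.5, 50.0, 4700e-6), 2))  # ~1.06 V
```

A larger capacitor or a lighter load reduces the ripple, which is why the filter stage precedes any regulation.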

An instrument to be calibrated:

The manufacturer usually does the initial calibration on its equipment. Subsequent calibrations
may be done in-house, by a third-party lab, or by the manufacturer. The frequency of
recalibration will vary with the type of equipment. Deciding when to recalibrate a flow meter,
for example, depends mainly on how well the meter performs in the application. If liquids
passing through the flow meter are abrasive or corrosive, parts of the meter may deteriorate in a
very short time. Under favorable conditions, the same flow meter might last for years without
requiring recalibration. As a rule, however, recalibration should be performed at least once a
year. Of course, in critical applications the frequency will be much greater.
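The scheduling rule above ("at least once a year") can be tracked very simply; the interval parameter below is just that rule of thumb, shortened as the application demands:

```python
# Sketch: compute the next recalibration due date from the last calibration
# date and the interval chosen for the application.
from datetime import date, timedelta

def next_due(last_calibrated, interval_days=365):
    """Next recalibration date: last calibration plus the chosen interval
    (use a shorter interval for abrasive, corrosive, or critical service)."""
    return last_calibrated + timedelta(days=interval_days)

print(next_due(date(2023, 10, 1)))  # 2024-09-30
```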

Signal generator:

A signal generator is an electronic device that generates repeating or non-repeating electronic


signals in either the analog or the digital domain. It is generally used in designing, testing,
troubleshooting, and repairing electronic or electroacoustic devices, though it often has
artistic uses as well.

There are many different types of signal generators with different purposes and applications and
at varying levels of expense. These types include function generators, RF and microwave signal
generators, pitch generators, arbitrary waveform generators, digital pattern generators and
frequency generators. In general, no device is suitable for all possible applications.

Traditionally, signal generators have been embedded hardware units, but since the age of
multimedia PCs, flexible, programmable software tone generators have also been available.

A function generator is a device which produces simple repetitive waveforms. Such devices
contain an electronic oscillator, a circuit that is capable of creating a repetitive waveform.
(Modern devices may use digital signal processing to synthesize waveforms, followed by a
digital to analog converter, or DAC, to produce an analog output). The most common waveform
is a sine wave, but sawtooth, step (pulse), square, and triangular waveform oscillators are
commonly available, as are arbitrary waveform generators (AWGs). If the oscillator operates
above the audio frequency range (>20 kHz), the generator will often include some sort of
modulation function such as amplitude modulation (AM), frequency modulation (FM), or phase

modulation (PM) as well as a second oscillator that provides an audio frequency modulation
waveform.

Figure 1. 44 PSG analog signal generator, function generator, tone generator and wire tracker
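The basic repetitive waveforms described above can be expressed mathematically. The sketch below is a generic illustration of the shapes themselves, not any particular generator's synthesis method:

```python
# One instantaneous sample of a basic function-generator waveform,
# parameterized by frequency, time, and amplitude.
import math

def sample(waveform, freq_hz, t_s, amplitude=1.0):
    phase = (freq_hz * t_s) % 1.0          # position within the cycle, 0..1
    if waveform == "sine":
        return amplitude * math.sin(2 * math.pi * phase)
    if waveform == "square":
        return amplitude if phase < 0.5 else -amplitude
    if waveform == "triangle":
        # ramps -A -> +A over the first half cycle, back down over the second
        return amplitude * (4 * phase - 1) if phase < 0.5 else amplitude * (3 - 4 * phase)
    raise ValueError(waveform)

# At a quarter cycle (t = 0.25 ms at 1 kHz) the sine is at its positive peak
print(sample("sine", 1000, 0.00025))
```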
i. Oscilloscope

An oscilloscope is a laboratory instrument commonly used to display and analyze the waveform
of electronic signals. In effect, the device draws a graph of the instantaneous signal voltage as a
function of time.

An oscilloscope's primary function is to provide a graph of a signal's voltage over time. Usually
the Y axis represents the voltage and the X axis time. This is useful for measuring such things as
clock frequencies, duty cycles of pulse-width-modulated signals, propagation delay, or signal
rise and fall times based on the input to its probes.

Figure 1. 45 Analog Oscilloscope

A typical oscilloscope can display alternating current (AC) or pulsating direct current (DC)
waveforms having a frequency as low as approximately 1 hertz (Hz) or as high as several

megahertz (MHz). High-end (sampling) oscilloscopes can display signals with frequencies of
100 GHz or more. The display is broken up into so-called horizontal divisions (hor div)
and vertical divisions (vert div). Time is displayed from left to right on the horizontal scale.
Instantaneous voltage appears on the vertical scale, with positive values going upward and
negative values going downward.
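The graticule arithmetic implied above (volts per division times vertical divisions, seconds per division times horizontal divisions) can be written out directly:

```python
# Converting oscilloscope graticule readings into electrical quantities:
# amplitude from vertical divisions, period and frequency from horizontal ones.

def from_divisions(volts_per_div, vert_div, sec_per_div, horiz_div):
    vpp = volts_per_div * vert_div      # peak-to-peak voltage
    period = sec_per_div * horiz_div    # time for one full cycle
    freq = 1.0 / period                 # cycles per second (Hz)
    return vpp, period, freq

# e.g. 2 V/div over 4 divisions vertically, 0.1 ms/div over 5 divisions
vpp, t, f = from_divisions(2.0, 4.0, 1e-4, 5.0)
print(vpp, t, f)  # 8 V peak-to-peak, 0.5 ms period, 2 kHz
```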

The oldest form of oscilloscope, still used in some labs today, is known as the cathode-ray
oscilloscope. It produces an image by causing a focused electron beam to travel, or sweep, in
patterns across the face of a cathode ray tube (CRT). More modern oscilloscopes electronically
replicate the action of the CRT using a liquid crystal display (LCD) similar to
those found on notebook computers. The most sophisticated oscilloscopes employ computers to
process and display waveforms. These computers can use any type of display, including CRT,
LCD, and gas plasma.

In any oscilloscope, the horizontal sweep is measured in seconds per division (s/div),
milliseconds per division (ms/div), microseconds per division (µs/div), or nanoseconds per
division (ns/div). The vertical deflection is measured in volts per division (V/div), millivolts per
division (mV/div), or microvolts per division (µV/div). Virtually all oscilloscopes have
adjustable horizontal sweep and vertical deflection settings. In the design and construction of
circuits, the oscilloscope is a very handy equipment, electronic labs cannot do without it. Its
functions are:

 Show and calculate the frequency and amplitude of an oscillating signal.


 Shows the voltage and time of a particular signal. This function is the most used in all
labs.
 Helps to troubleshoot malfunctioning components of a project by looking at the
expected output after a particular component.
 Shows the content of the AC voltage or DC voltage in a signal.

Figure 1. 46 Oscilloscope
ii. Standard gauges
The standard gauge is a widely used railway track gauge; approximately 55% of the railway
lines in the world use this gauge.

Pressure gauges are manufactured in many configurations and sizes from 50mm up to 300 mm
dial size in ranges of 2.5 kPa up to 100,000kPa with brass or stainless-steel wetted parts. Scales
can be offered in different units.

Regarding application, there are several types of standard gauges.

General Purpose Pressure Gauges: - The ASG Series of General Purpose pressure gauges
guarantees long life & durability for indoor, outdoor and harsh environmental conditions.

Figure 1. 47 pressure gauge


Low Pressure Gauges: - Low pressure gauges operate on a capsule system and are only suitable
for use with air and some gases. Pressure ranges from -2.5-0 kPa up to 0-60 kPa pressure or
vacuum. Offered with a stainless steel casing and brass connections for use on non-corrosive
applications, but can be offered with stainless steel wetted parts on request. Dial sizes available
are 63mm, 100mm and 150mm.

Test Gauge: - Precision test gauges are manufactured to the highest standard of quality. Used for
the testing of industrial gauges or equipment of the same standard. For quality control testing
requirements, it is not always necessary to use a primary standard such as a dead weight tester,
therefore a secondary standard such as a test gauge can be used, being a more convenient and
economical method of testing.
Safety Pattern Pressure Gauges: - These gauges, generally used within the gas industry, are
designed with operator safety in mind: in case of a bourdon tube rupture, no projectiles
will blow out from the front of the gauge. Safety pattern construction consists of a front baffle
wall, a Perspex window, and a blowout disc in the rear of the case. Dial sizes available are 100mm
and 150mm with pressure ranges up to 100,000 kPa.
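The gauge specifications above mix kPa and bar; since 1 bar = 100 kPa, a trivial converter helps when comparing ranges:

```python
# Quick pressure-unit conversions for comparing gauge specifications
# (1 bar = 100 kPa exactly).

def bar_to_kpa(bar):
    return bar * 100.0

def kpa_to_bar(kpa):
    return kpa / 100.0

print(bar_to_kpa(600))     # 60000.0 kPa -> the 600 bar upper range above
print(kpa_to_bar(100000))  # 1000.0 bar -> the 100,000 kPa safety-gauge limit
```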

Self-Check -1.1 Written Test

Directions: Answer all the questions listed below. Write down the correct answer in sheet
provided.
I. Choose the best answer (each 2point)
1. ________the ratio of the error to the full scale output.
A. Accuracy B. Tolerance C. A & B D. None

2. ____________re-ranging could (usually) only be accomplished by re-calibration, since the


same adjustments.
A. In analog instruments B. In digital instruments C. A & B D. None
3. ___________is defined as “the region between the limits within which a quantity is
measured, received or transmitted, expressed by stating the lower and upper range values.”
A. Accuracy B. Calibration tolerances C. The calibration range
4. ________ for quality control testing requirements, it is not always necessary to use a primary
standard such as a dead weight tester.
A. Test Gauge B. Low Pressure Gauges C. Standard gauges D. All

5. ______ is an electronic device that supplies electric energy to an electrical load.


A. AC-to-DC supply B. Power supply C. Volume tank D. None

6. ______ is device that converts electrical energy to heat energy.


A. Compressor B. Soldering iron/gun C. Pressure switch D. All

7. _____________ high accuracy loop calibrator modules with source measure and sink functions.

A. Loop and Temperature B. Recorders C. Control valves D. All

8. ____________ is indicated by an index and graduated scale


A. Analogue indicators B. digital indicators C. A & B D. none

II. Part II fill the blank space


1. List reasons why calibration is needed. (5 points)
____________________________________,______________________________
_____________________________________,_____________________________
2. What can some advanced calibrators do? (4 points)
II.1. ______________________________
II.2. _______________________________
II.3. _________________________________

Operation sheet (1.1) - Analog Oscilloscope Procedures:


The purpose of this operation is to introduce students to the basic tools used by engineers and
technicians in analyzing electronic equipment: the function generator, the analog oscilloscope,
and the digital oscilloscope.
During this operation, the student will use the function generator to generate a number of signals
and will analyze those signals using either of the oscilloscopes. The student will become familiar
with the basic waveforms -- sine, square, and triangle waves -- and the components of the
waveforms -- amplitude, period, and frequency.
Equipment:
Qty Equipment
1 Leader LFG-1300S Function Generator
1 BNC to 2 alligator clips cable
1 Tektronix 2225 Analog Oscilloscope
1 Tektronix P6103 10X probe
1 Hewlett-Packard 54502A Digital Oscilloscope
1 HP 10430A 10X probe

Overview:
In this part of the lab, you will use the function generator to generate a signal and then use the
analog oscilloscope to make some measurements of the data. If you need any assistance in
identifying or configuring the equipment, please see the on-duty GSA.

Function Generator Setup:
1. Turn on the function generator.
2. Make sure that the Sweep, Amplitude Modulation, and DC offset groups are turned OFF.
Note: The Sweep and Amplitude Modulation buttons should be in the OUT position, and the DC
offset knob should be pushed IN.
3. Select a TRIANGLE wave.
4. Set frequency to 1.0 kHz.
5. Set the amplitude knob to 12 o'clock.
6. Make sure all the attenuation buttons are OUT.
7. Attach the BNC end of the cable to the BNC socket in the function generator's Output
Group.
8. Tie back the black lead alligator clip by clipping the black alligator clip to the insulated
wire to avoid short circuits and blown fuses.
Analog Oscilloscope Setup:
1. Turn on the power to the oscilloscope.
2. Connect the Tektronix 10X probe to the Channel 1 input on the oscilloscope.
3. Tie back the black lead on the probe, then expose the hook and clip it to the red alligator
clip on the function generator cable.
4. Set CH1/BOTH/CH2 switch to CH1.
5. In the Vertical Control Group, set the Channel 1 AC/GND/DC switch to GND.
6. Use the CH1 vertical position knob to move the ground (0V) reference (horizontal line)
to the center line on the screen.
7. After referencing the signal to ground, set the AC/GND/DC switch to AC.
8. Intensity knob: adjust the signal intensity and focus to a comfortable level, by using the
intensity and focus knobs, respectively.
Note: Keep the signal intensity to within a reasonable range to minimize the chances of burning
the phosphor and damaging the display screen.
9. Set the Volts-per-Division knob to 2 V/div.
Note: you are using a 10X probe, so be sure to take all readings from the 10X position.
10. Set the NORMAL/INVERT switch to NORMAL.
11. Set the ADD/ALT/CHOP switch to ALT.
12. Set X1/ALT/MAG to X1.
13. Set the seconds-per-division knob to 0.1 ms/div.
14. Set the rising/falling-edge switch to positive.

15. Set the trigger-mode switch to AUTO.
16. Set the trigger-source switch to VERT MODE.
17. Set the trigger-coupling switch to AC.
Note: If the signal is still running, try adjusting the trigger level or holdoff knobs.
18. Make sure that the 'cal' knobs on the V/div and S/div knobs are both pushed in and
turned all the way to the right. They will click into position. This is so the scope will be
calibrated properly.
Procedures:
Step1. Using the settings completed above, answer the following questions.
1. What is the setting on the Volts/Div control knob? _______ volts/div
2. How many vertical divisions from peak-to-peak? _______ div
3. What is the peak to peak voltage (Vpp)?
Vpp = _______ volts/div * _______ div = _______ volts
4. What is the setting on the Sec/Div control knob? _______ ___ seconds/div
5. How many horizontal divisions from positive going crossing to positive going
crossing? _______ div
6. What is the period of the signal (T)?
T = _______ seconds/div * _______ div = _______ ___ seconds
7. What is the frequency of the signal (f)? _______ ___ hertz
8. Draw the displayed signal on Graph 4.1.
Be neat, to scale, and concise. Be sure to note the (scale) V/div and Sec/div settings.

Graph 4.1

Step2. Next you will adjust the function generator to output another signal and repeat the
measurements from Step 1.
1. Generate a square wave between 5 and 10 kHz with an amplitude setting of 3 o'clock.
2. Adjust the V/div and Sec/div settings to maximize the display of the signal on the CRT.
Make sure you show the signal from peak-to-peak and at least one full cycle (period) of the
signal.
3. What is the setting on the Volts/Div control knob? _______ volts/div
4. How many vertical divisions from peak-to-peak? _______ div
5. What is the peak to peak voltage (Vpp)?
Vpp = _______ volts/div * _______ div = _______ volts
6. What is the setting on the Sec/Div control knob? _______ ___ seconds/div
7. How many horizontal divisions from positive going crossing to positive going crossing?
_______ div
8. What is the period of the signal (T)?
T = _______ sec/div * _______ div = _______ ___ seconds
9. What is the frequency of the signal (f)? _______ ___ hertz.
10. Draw the displayed signal on Graph 4.2.
Be neat, to scale, and concise. Be sure to note the (scale) V/div and Sec/div settings.

Graph 4.2

step3. In this part of the lab, you will experiment with the triggering of the analog oscilloscope.
You will need to make some changes to the controls of the analog oscilloscope and the function
generator, and you will then have to answer a few questions.
1. Set TRIGGER MODE to NORM.
Set TRIGGER SOURCE to CH1.
2. Turn the TRIGGER LEVEL knob all the way to the right.
Turn the TRIGGER HOLDOFF knob all the way to the right.
3. Generate a 1.4 kHz sine wave with the function generator; setting the amplitude knob
all the way to the left. Then display the signal on the oscilloscope.
4. On the oscilloscope, turn the horizontal position (coarse) knob until the left edge of
the trace is shown.
5. While watching the display, turn the TRIGGER LEVEL knob to the left until you get
a steady display.
6. Keep turning the TRIGGER LEVEL knob to the left, and watch the trigger level go
down. Note that the oscilloscope ceases to trigger when the level moves below the
signal.
7. Set the TRIGGER LEVEL as close to the top of the signal as possible while still
retaining a steady picture.
8. What is the peak-to-peak voltage (Vpp)? _______ volts
9. On the function generator, depress the 10 dB attenuation button.
10. Use the TRIGGER LEVEL knob to steady the display. It's a delicate adjustment, so
watch the screen carefully while you turn the knob to the left.
11. Why did the signal run or disappear (i.e., become unsteady or unlocked)?
12. Without changing the V/div setting, what is the new Vpp? _______ volts
13. Using the formula on page 3, calculate the attenuation in dB? _______dB (note
sign!)
14. Disconnect the analog oscilloscope, and turn off the power.
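The "formula on page 3" mentioned in step 13 is not reproduced in this excerpt; the usual voltage-ratio definition of decibel attenuation, dB = 20·log10(Vout/Vin), is assumed in the sketch below:

```python
# Decibel attenuation from a voltage ratio (negative result = loss).
import math

def attenuation_db(v_out, v_in):
    """Attenuation in dB, assuming the standard 20*log10 voltage-ratio form."""
    return 20.0 * math.log10(v_out / v_in)

# A 10 dB attenuator reduces voltage by a factor of 10**(10/20), about 3.16
print(round(attenuation_db(1.0, 3.1623), 1))  # -10.0 dB (note the sign)
```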

Operation sheet (1.2) - Digital Oscilloscope Procedures:

Overview:
In this part of the operation, you will use the function generator to generate a signal and then use
the digital oscilloscope to make some measurements of the data. If you need any assistance in
identifying or configuring the equipment, please see the on-duty GSA.
Digital Oscilloscope Setup:
1. Generate a 6kHz triangle wave with the amplitude turned fully to the right using the
function generator.
2. Connect the HP probe to the Channel 1 input on the oscilloscope, then connect the
probe to the red lead from the function generator.
3. Press the AUTOSCALE button to make the oscilloscope automatically adjust its
settings.
4. Select the Channel menu by pressing the CHAN button.
i. Select Channel 1.
ii. Set v/div to 5.0 v/div, if it is not already set.
iii. Set offset to 0 volts, if not already set.
iv. Select AC. Select 1 MΩ, if not already set.
v. Select More by pressing the button to the right of the more label.
vi. Make sure that 10:1 is selected. If not, use the wheel to set it.

vii. Press the More button.
5. Select the Time base menu by pressing the TIME BASE button.
i. Set the time base to 50 µs/div, if not already set.
ii. Set delay to 0 seconds.
iii. Select reference: center, if not already set.
iv. Set window: off, if it is on.
v. Select realtime.
6. Select the Trigger menu by pressing the TRIG button.
i. Select AUTO.
ii. Select EDGE.
iii. Select source: channel 1.
7. Select the Display menu by pressing the DISPLAY button.
i. Select NORM.
ii. Set persistence: 'minimum' or 'single'.
iii. Set No. of screens: 1. Select GRID.
iv. Set connect dots: ON.
PROCEDURES:
Step1. Using the settings completed above, answer the following questions.
1. Select the ∆t/∆v menu by pressing the ∆t/∆v button.
2. Select Vmarkers and Tmarkers.
3. Select Vmarker 1, then use the data entry wheel to set the marker at the bottom of
the signal.
Be sure to adjust the voltage level, not the channel number.
4. Similarly, set Vmarker 2 to the top of the signal.
5. Read ∆V at the bottom of the screen.
i. Vmarker 1: _______ ___ volts
ii. Vmarker 2: _______ ___ volts
iii. ∆V: _______ ___ volts
6. Using the same procedure as for the Vmarkers, set the ∆T start marker to either a
positive or negative peak.
7. Similarly, set the ∆T stop marker to the next positive or negative peak. (Choose the
same polarity as you did in Step 1.2.)
8. Read ∆T and 1/∆T (period and frequency) at the bottom of the screen.
i. Start Marker: _______ ___ seconds

ii. Stop Marker: _______ ___ seconds
iii. ∆T (period): _______ ___ seconds
iv. 1/∆T (frequency): _______ ___hertz
9. Turn the markers off.
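The ∆T and 1/∆T readouts in step 8 are just the difference of the two time markers and its reciprocal; a small sketch of that arithmetic (function name and test values are our own illustration):

```python
def delta_t_readout(start_s, stop_s):
    """Mimic the oscilloscope's ∆T / 1/∆T readout from two time markers
    placed on successive peaks of the same polarity."""
    period = stop_s - start_s    # seconds between successive peaks
    frequency = 1.0 / period     # hertz
    return period, frequency

# For the 6 kHz triangle wave, adjacent peaks are 1/6000 s apart:
t, f = delta_t_readout(0.0, 1 / 6000)
print(t, round(f))  # ~1.667e-4 s, 6000 Hz
```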
Step2. Using the automatic measurement capability of the digital oscilloscope, you will repeat
the measurements you made in Step 1.
1. Press the blue shift button, press button 1 (Vpp), and then press button 1 for channel
1 when 'c#' appears at the bottom of the screen in reverse text.
2. Measure Vpp: _______ ___ volts
Note: If 'm#' or 'f#' appears in reverse text at the bottom of the screen, use the entry wheel to
change this to 'c#'. If this fails, try pressing RECALL and CLEAR simultaneously to reset the
oscilloscope. You would then need to restart by pressing AUTOSCALE, as described in the
digital oscilloscope setup section.
3. Press the blue shift button, press button 9 (Freq.), and then press button 1 for channel
1 when 'c#' appears at the bottom of the screen in reverse text.
Measure Frequency (f): _______ ___ hertz
4. Press the blue shift button, press button s-V (period) to the right of button 9, and
then press button 1 for channel 1 when 'c#' appears at the bottom of the screen in
reverse text.
Measure Period (T): _______ ___ seconds
5. Draw the displayed signal on Graph 4.3.
Be neat, to scale, and concise. Be sure to note the (scale) V/div and Sec/div settings.

Graph 4.3
1. To clear the measurements from the screen, press the blue shift button and then press the
clear button.
2. Disconnect the digital oscilloscope from the function generator and turn it off.
3. Make sure all probes and connectors are disconnected from the equipment and neatly
placed on the shelf above the work area.
Be sure to turn off both oscilloscopes and the function generator.

Operation sheet (1.3) Plan and Prepare for Configuration

Learning Activity:
 Identify instrumentation and control devices for configuration.
 Specify minimum specification required to configure instrumentation and control
devices.

Refer to the diagram below. Analyze the process flow and identify the important elements
that are part of the instrumentation. Answer the questions that follow.

Figure 1. 48 Sample level control loop

1. Describe the process as you understand it.


2. Identify the elements of the level control loop and describe their function in the above
process control application.
3. What are the important specifications required of each device involved in the level
control loop? Define those specifications.

UNIT 2. Calibrate Instrumentation and Control Devices
This learning guide is developed to provide you the necessary information regarding the
following content coverage and topics:
 Devices normal functions
 Condition instrumentation and control devices
 Calibrate or adjusting instrumentation and control devices
 Maintain configured and calibrated devices
 Unplanned events or conditions
This guide will also assist you to attain the learning outcome stated in the cover page.
Specifically, upon completion of this Learning Guide, you will be able to:

 Check normal functions of devices


 Condition instrumentation and control devices to be calibrated
 Calibrate or adjust instrumentation and control devices
 Maintain configured and calibrated devices with standard
 Respond to unplanned events or conditions

2.1. Devices normal functions

Introduction

Checking is commonly used to verify grounding and bonding connections in electrical systems.
These tests also verify the proper operation of disconnecting means and the function of
overcurrent protection devices like fuses and circuit breakers.
Checking normal functions of devices
PV systems should be thoroughly tested at the time of calibrating and periodically over their
lifetime to ensure proper performance and safe operation. Baseline measurements at the time of
system commissioning are compared to the system ratings and expectations for acceptance, and
serve as a baseline for comparison with future measurements. Changes in test results over time
are used to track system degradation, and identify problems that require attention or service for
safety or performance reasons. Circuits or components that are modified or replaced should be
retested accordingly.
There are several types of electrical tests conducted on PV systems that are used to verify NEC
requirements and system performance.
Many of these tests can be conducted with common electrical test equipment, while some
measurements require special meters and instruments. In many cases, system performance
information is measured, recorded and displayed by PV system inverters or charge controllers,
and can be used to verify system functions and proper operation.
The following summarizes common types of testing conducted on PV systems and the
information each provides:
 Continuity and resistance testing verifies the integrity of grounding and bonding
systems, conductors, connections and other terminations.
 Polarity testing verifies the correct polarity for PV dc circuits, and proper terminations
for dc utilization equipment
 Voltage and current testing verify that PV array and system operating parameters are
within specifications.
 Insulation resistance testing verifies the integrity of wiring and equipment, and is used
to detect degradation and faults in wiring insulation.
 Performance testing verifies that system power and energy output are consistent with
expectations. These tests also require measurements of array temperature and solar
irradiance.
For stand-alone or hybrid PV systems incorporating energy storage and additional
energy sources, the following additional tests may be conducted:
 Measurements of battery voltage, capacity and specific gravity.
 Verification of charge controller set points and temperature compensation.
 Verification of charging current and load control functions.

 Verification of performance and wiring integrity for other sources, such as
generators.
Multi-function PV system testers are now available, such as the Seaward PV150, that conduct
many of the recommended tests, including continuity and resistance, polarity, voltage and
current tests, and insulation resistance tests. By combining these test functions into single
instruments, testing personnel avoid having to purchase, carry and maintain multiple meters.
Multi-functional PV system testers simplify and speed up testing.
These instruments can also store data for later retrieval and processing into commissioning test
reports that become part of the system documentation record. See Figure 2.1.

Figure 2. 1 The Seaward PV150 handheld meter provides multiple PV array testing functions.

2.2. Condition instrumentation and control devices


Conditioning instrumentation and control devices is a process of adjusting, calibrating, or
modifying the devices to ensure their accuracy, reliability, and performance. Conditioning can
involve various types of control devices and controllers, such as:

 Access control systems: These are devices or systems that use identification methods to
manage the movement of personnel, vehicles, materials, etc. through entrances and
exits. They can also be used for security monitoring and tracking purposes.

 Flow controllers: These are devices that measure and regulate the flow of media in
manufacturing processes. They can use mechanical valves or electronic sensors to
control the flow rate, pressure, temperature, etc. of gases or liquids.
 Level controllers: These are devices that monitor and control the levels of tanks, vats,
etc. by means of pumps or valves. They can use various types of sensors, such as
conductive, capacitance, optical, or ultrasonic, to detect the level of products in
containers.
 Electrical calibration: This is a type of calibration that tests and adjusts the accuracy of
instruments that measure electrical parameters, such as voltage, current, resistance,
frequency, etc. Electrical calibration can use standard instruments or sources to compare
the output of the device under test with the input signal.

Any instrumentation system must include an input transducer (sensor), such as a strain
gauge, whose response to a particular stimulus can be measured electrically. The other
component that is generally present in modern instrumentation systems is a digital processor,
such as a computer or a micro-controller. These programmable components have the
flexibility to be used for a variety of functions. The most important function that they
perform is to convert data into information. In the simplest situation the processing required
to extract information may only involve converting an input signal by a scale factor so that
the final result is in conventional units. For example, the output voltage signal from a strain
gauge may be converted to the corresponding actual strain. Alternatively, within a more
sophisticated system the signal from a strain gauge placed on an engine mounting might be
processed to extract the vibrational spectrum of an engine, which is then used to detect any
unusual frequency that might be indicative of wear. This information can then be displayed
to a user, stored for later analysis, transmitted to a remote location, or used by a controller.
The signal from a transducer is usually analogue in nature, i.e. it is continuously varying and
can take any value (within an allowed range). This continuous analogue data has to be
converted to a digital format prior to being transferred to the digital processor. Any
instrumentation system must therefore include an analogue-to-digital (A/D) converter (ADC
for short) to convert an analogue signal into a digital format. A typical ADC will be an
existing component that has been designed to convert an analogue input voltage, typically
with a range of a few volts, into a digital word, which usually contains 8 or more bits.
However, the output from a typical transducer, such as a strain-gauge, might have an
amplitude of less than 10 mV. This transducer output signal must therefore be amplified in

an analogue signal conditioning circuit before it can be converted into a digital word.
Another aspect of the performance of the ADC that must also be taken into consideration
when designing the signal conditioning circuit is that the ADC samples the transducer output
at specific time intervals. An unfortunate consequence of this is that several frequencies will
become indistinguishable at the ADC output. This is referred to as aliasing, and the effect
can only be avoided by using a low-pass, anti-alias filter to ensure that only the low
frequencies that can be represented accurately are present in the signal applied to the ADC
input. Since the requirement for the anti-alias filter arises from a fundamental property of
the ADC, this type of filter should always be present.
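The aliasing described above can be demonstrated numerically: two tones whose frequencies differ by the sample rate produce identical samples. A minimal sketch (the sample rate and tone frequencies are illustrative choices, not from the text):

```python
import math

fs = 8000          # sample rate in Hz (illustrative)
n = range(8)       # a few sample instants

# A 1 kHz tone and a 9 kHz tone, both sampled at 8 kHz:
low  = [math.sin(2 * math.pi * 1000 * k / fs) for k in n]
high = [math.sin(2 * math.pi * 9000 * k / fs) for k in n]

# The samples are indistinguishable: 9 kHz has aliased onto 1 kHz,
# which is why an anti-alias low-pass filter must precede the ADC.
print(all(abs(a - b) < 1e-9 for a, b in zip(low, high)))  # -> True
```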

Figure 2. 2 A block diagram of a typical instrumentation system with several different output
devices

As shown in Figure 2.2, the characteristics of typical sensors and ADCs mean that the data
collection (or acquisition) part of a typical modern instrumentation system can be split into
the three functional blocks, a sensor, signal conditioning circuits and an ADC. The digital
output from the ADC can then be processed in a programmable digital processor to extract
information that can be displayed to an operator, stored in a memory or transmitted via a
data link or used in feedback control.

2.3. Calibrate or adjust instrumentation and control devices

2.3.1 Basic introduction to calibration


There are as many definitions of calibration as there are methods. According to ISA’s The
Automation Systems and Instrumentation Dictionary, the word calibration is defined as “a test
during which known values of measurand are applied to the transducer and corresponding output
readings are recorded under specified conditions.” Therefore, it makes sense that calibration is
required for a new instrument. We want to make sure the instrument is providing accurate
indication or output signal when it is installed.

The definition includes the capability to adjust the instrument to zero and to set the desired
span.
An interpretation of the definition would say that a calibration is a comparison of measuring
equipment against a standard instrument of higher accuracy to detect, correlate, adjust, rectify
and document the accuracy of the instrument being compared.
Purpose of Calibration:

1. The calibration of any measuring system is very important to get meaningful results.
2. In the case where the sensing system and measuring system are different, then it is
imperative to calibrate the system as an integrated whole in order to take into account the
error producing properties of each component.
3. Calibration is usually carried out by making adjustments such that the readout device
produces zero output for zero-measured input, and similarly, it should display an output
equivalent to the known measured input near the full-scale input value.
4. It is important that any measuring system calibration should be performed under
environmental conditions that are as close as possible to those conditions under which actual
measurements are to be made.
5. It is also important that the reference measured input be known to a much greater
degree of accuracy – usually, the calibration standard for the system should be at least one
order of magnitude more accurate than the desired measurement system accuracy.
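Point 5 above is a simple ratio test; as a sketch (the function name and example values are ours, not from the text), the rule of thumb can be checked like this:

```python
def standard_is_adequate(standard_uncertainty, required_accuracy, ratio=10):
    """Rule of thumb from the text: the calibration standard should be at
    least one order of magnitude (10x) more accurate than the desired
    measurement system accuracy. Returns True if the standard qualifies."""
    return standard_uncertainty <= required_accuracy / ratio

# Calibrating to +/-2 psig with a +/-0.15 psig reference standard:
print(standard_is_adequate(0.15, 2.0))  # -> True
# A +/-0.5 psig standard would not be accurate enough:
print(standard_is_adequate(0.5, 2.0))   # -> False
```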

Typically, calibration of an instrument is checked at several points throughout the calibration


range of the instrument.

 The calibration range is defined as “the region between the limits within which a
quantity is measured, received or transmitted, expressed by stating the lower and upper
range values.” The limits are defined by the zero and span values.
 Calibration and ranging are two tasks associated with establishing an accurate
correspondence between any instrument’s input signal and its output signal.
To calibrate an instrument means to check and adjust (if necessary) its response so the output
accurately corresponds to its input throughout a specified range. In order to do this, one must
expose the instrument to an actual input stimulus of precisely known quantity. For a
pressure gauge, indicator, or transmitter, this would mean subjecting the pressure instrument to
known fluid pressures and comparing the instrument response against those known pressure
quantities.

To range an instrument means to set the lower and upper range values so it responds with the
desired sensitivity to changes in input.
For example, a pressure transmitter set to a range of 0 to 200 PSI (0 PSI = 4 mA output;

200 PSI = 20 mA output) could be re-ranged to respond on a scale of 0 to 150 PSI (0 PSI =
4 mA ; 150 PSI = 20 mA).
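The linear input-to-output mapping used in this re-ranging example can be sketched in Python (a minimal illustration; the function name `output_ma` is our own):

```python
def output_ma(pressure, lrv, urv, out_lo=4.0, out_hi=20.0):
    """Map a pressure within [lrv, urv] (lower/upper range values)
    linearly onto a 4-20 mA output signal."""
    return out_lo + (pressure - lrv) / (urv - lrv) * (out_hi - out_lo)

# Original range 0-200 PSI, then re-ranged to 0-150 PSI.
# The same 100 PSI input now produces a different output current:
print(output_ma(100, 0, 200))  # -> 12.0 mA (mid-scale of 0-200 PSI)
print(output_ma(100, 0, 150))  # -> ~14.67 mA after re-ranging
```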

 The zero value is the lower end of the range.


 Span is defined as the algebraic difference between the upper and lower range values.
The calibration range may differ from the instrument range, which refers to the capability of the
instrument.

For example, an electronic pressure transmitter may have a nameplate instrument range of 0–
750 pounds per square inch, gauge (psig) and output of 4-to-20 milliamps (mA). However, the
engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA.
Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA.

In this example, the zero-input value is 0 psig and zero output value is 4 mA. The input span is
300 psig and the output span is 16 mA. Different terms may be used at your facility. Just be
careful not to confuse the range the instrument is capable of with the range for which the
instrument has been calibrated.

 In analog instruments, re-ranging could (usually) only be accomplished by recalibration,
since the same adjustments were used to achieve both purposes.
 In digital instruments, calibration and ranging are typically separate adjustments (i.e. it
is possible to re-range a digital transmitter without having to perform a complete
recalibration), so it is important to understand the difference.

2.3.2 Characteristics of Calibration


Every calibration should be performed to a specified tolerance. The terms tolerance and
accuracy are often used incorrectly. In ISA’s The Automation, Systems, and Instrumentation
Dictionary, the definitions for each are as follows:

Accuracy: The ratio of the error to the full-scale output or the ratio of the error to the output,
expressed in percent span or percent reading, respectively.

Tolerance: Permissible deviation from a specified value; may be expressed in measurement


units, percent of span, or percent of reading.

As you can see from the definitions, there are subtle differences between the terms. It is
recommended that the tolerance, specified in measurement units, is used for the calibration
requirements performed at your facility. By specifying an actual value, mistakes caused by
calculating percentages of span or reading are eliminated. Also, tolerances should be specified in
the units measured for the calibration.

For example, you are assigned to perform the calibration of the previously mentioned 0-to-300
psig pressure transmitter with a specified calibration tolerance of ±2 psig. The output tolerance
would be:

(2 psig/300 psig)*16 mA = 0.1067 mA

The calculated tolerance is rounded down to 0.10 mA, because rounding to 0.11 mA would
exceed the calculated tolerance. It is recommended that both ±2 psig and ±0.10 mA tolerances
appear on the calibration data sheet if the remote indications and output milliamp signal are
recorded.
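The tolerance conversion above, including the round-down step, can be sketched as follows (the function name is our own):

```python
import math

def output_tolerance_ma(tol_input, input_span, output_span=16.0):
    """Convert a calibration tolerance given in input units (e.g. psig)
    to the equivalent tolerance on the 4-20 mA output. The result is
    rounded DOWN, because rounding up would exceed the calculated
    tolerance (as the text notes for 0.11 mA vs 0.10 mA)."""
    exact = tol_input / input_span * output_span
    return math.floor(exact * 100) / 100   # round down to 0.01 mA

# +/-2 psig on a 0-300 psig transmitter with a 16 mA output span:
print(output_tolerance_ma(2, 300))  # -> 0.1 mA
```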

Note the manufacturer’s specified accuracy for this instrument may be 0.25% full scale (FS).
Calibration tolerances should not be assigned based on the manufacturer’s specification only.

Calibration tolerances should be determined from a combination of factors. These factors


include:

 Requirements of the process


 Capability of available test equipment
 Consistency with similar instruments at your facility
 Manufacturer’s specified tolerance

2.3.3 Typical instrument error and calibration error


The instrument error can occur due to a variety of factors: drift, environment, electrical
supply, addition of components to the output loop, process changes, etc. Since a calibration is
performed by comparing or applying a known signal to the instrument under test, errors are
detected by performing a calibration.

Typical errors that occur include:

A. Instrument errors
Any given instrument is prone to errors either due to aging or due to manufacturing tolerances.
Here are some of the common terms used when describing the performance of an instrument.

 Range
The range of an instrument is usually regarded as the difference between the maximum and
minimum reading.

For example: a thermometer that has a scale from 20 to 100 °C has a range of 80 °C. This is also
called the full-scale deflection (f.s.d.).

 Accuracy
The accuracy of an instrument is often stated as a % of the range or full-scale deflection.

For example: a pressure gauge with a range 0 to 500 kPa and an accuracy of plus or minus 2%
f.s.d. could have an error of plus or minus 10 kPa. When the gauge is indicating 10 kPa the
correct reading could be anywhere between 0 and 20 kPa and the actual error in the reading
could be 100%. When the gauge indicates 500 kPa the error could be 2% of the indicated
reading.
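The point of this example, that a fixed percent-of-full-scale error becomes a huge relative error at low readings, can be sketched numerically (function name is ours):

```python
def worst_case_relative_error(reading, fsd, accuracy_pct_fsd):
    """Worst-case error as a percentage of the actual reading, for an
    instrument specified as +/- accuracy_pct_fsd of full-scale deflection."""
    abs_error = accuracy_pct_fsd / 100 * fsd   # fixed error, e.g. 10 kPa
    return abs_error / reading * 100

# 0-500 kPa gauge with +/-2% f.s.d. accuracy => +/-10 kPa everywhere:
print(worst_case_relative_error(10, 500, 2))   # -> 100.0 % at 10 kPa
print(worst_case_relative_error(500, 500, 2))  # -> 2.0 % at full scale
```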

 Repeatability
If an accurate signal is applied and removed repeatedly to the system and it is found that the
indicated reading is different each time, the instrument has poor repeatability. This is often
caused by friction or some other erratic fault in the system.

 Stability
Instability is most likely to occur in instruments involving electronic processing with a high
degree of amplification. A common cause of this is unfavorable environmental factors such as
temperature and vibration.

For example, a rise in temperature may cause a transistor to increase the flow of current which
in turn makes it hotter and so the effect grows and the displayed reading DRIFTS. In extreme
cases the displayed value may jump about. This, for example, may be caused by a poor electrical
connection affected by vibration.

 Time lag error

In any instrument system, it must take time for a change in the input to show up on the indicated
output. This time may be very small or very large depending upon the system. This is known as
the response time of the system. If the indicated output is incorrect because it has not yet
responded to the change, then we have time lag error.

A good example of time lag error is an ordinary glass thermometer. If you plunge it into hot
water, it will take some time before the mercury reaches the correct level. If you read the
thermometer before it settled down, then you would have time lag error.

A thermocouple can respond much more quickly than a glass thermometer but even this may be
too slow for some applications. When a signal changes a lot and quite quickly, (speedometer for
example), the person reading the dial would have great difficulty determining the correct value
as the dial may be still going up when in reality the signal is going down again.

 Reliability
Most forms of equipment have a predicted life span. The more reliable it is, the less chance it
has of going wrong during its expected life span. The reliability is hence a probability ranging
from zero (it will definitely fail) to 1.0 (it will definitely not fail).

 Drift
This occurs when the input to the system is constant but the output tends to change slowly. For
example, when switched on, the system may drift due to the temperature change as it warms up.

 NIST traceability
As defined previously, calibration means the comparison and adjustment (if necessary) of an
instrument’s response to a stimulus of precisely known quantity, to ensure operational accuracy.

In order to perform a calibration, one must be reasonably sure that the physical quantity used to
stimulate the instrument is accurate in itself.

For example, if I try calibrating a pressure gauge to read accurately at an applied pressure of
200 PSI, I must be reasonably sure that the pressure I am using to stimulate the gauge is actually
200 PSI. If it is not 200 PSI, then all I am doing is adjusting the pressure gauge to register 200 PSI
when in fact it is sensing something different.

B. Calibration errors in instrumentation
A zero-shift calibration error shifts the function vertically on the graph. This error affects all
calibration points equally, creating the same percentage of error across the entire range.

Figure 2. 3 Zero shift calibration error graph

A span shift calibration error shifts the slope of the function. This error's effect is unequal at
different points throughout the range:

Figure 2. 4 Span shift calibration error

A linearity calibration error causes the function to differ from a straight line. This type of
error does not directly relate to a shift in either zero (b) or span (m) because the slope-intercept
equation only describes straight lines.

If an instrument does not provide a linearity adjustment, the best you can do for this type of error
is split the error between high and low extremes, so the maximum absolute error at any point in
the range is minimized:

Figure 2. 5 linearity calibration error


A hysteresis calibration error occurs when the instrument responds differently to an increasing
input compared to a decreasing input.

The only way to detect this type of error is to do an up-down calibration test, checking for
instrument response at the same calibration points going down as going up:

Figure 2. 6 Hysteresis calibration error


Hysteresis errors are almost always caused by mechanical friction on some moving element
(and/or a loose coupling between mechanical elements) such as bourdon tubes, bellows,
diaphragms, pivots, levers, or gear sets.

Flexible metal strips called flexures, which are designed to serve as frictionless pivot points in
mechanical instruments, may also cause hysteresis errors if cracked or bent.

In practice, most calibration errors are some combination of zero, span, linearity, and hysteresis
problems. Hysteresis also occurs widely in devices involving magnetization and demagnetization.

The calibration may be correct at the maximum and minimum values of the range, but the graph
joining them may not be a straight line (when it ought to be). This is a non-linear error. The
instrument may have some adjustments for this, and it may be possible to make it correct at
mid-range as shown.
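As a rough sketch (the function name and as-found data are ours, not from the module), the errors at several calibration test points can be compared against the ideal straight line; the error pattern then helps distinguish the error types described above (equal errors everywhere suggest a zero shift, errors growing with input suggest a span shift, a curved pattern suggests non-linearity):

```python
def calibration_errors(points, lrv, urv, out_lo=4.0, out_hi=20.0):
    """Return the error (as-found reading minus ideal output) at each
    calibration test point, for a linear instrument spanning
    [lrv, urv] input onto a 4-20 mA output.
    points: list of (input_value, measured_output) pairs."""
    span_in, span_out = urv - lrv, out_hi - out_lo
    errors = []
    for x, measured in points:
        ideal = out_lo + (x - lrv) / span_in * span_out
        errors.append(round(measured - ideal, 3))
    return errors

# A pure zero-shift error on the 0-300 psig transmitter:
# every point reads 0.2 mA high, so all errors are equal.
found = [(0, 4.2), (150, 12.2), (300, 20.2)]
print(calibration_errors(found, 0, 300))  # -> [0.2, 0.2, 0.2]
```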

2.3.4 Calibration of instrument and control device procedures


The purpose of this section is to describe procedures for efficiently calibrating different types of
instruments. Calibration of measurement equipment can be performed on a variety of instruments
across a variety of industries.

A. Typical calibration
 Electrical Calibration:

Figure 2. 7 Electrical Calibration


Electrical calibration is the process of ensuring that any instrument that measures or tests
electrical properties such as voltage, current, resistance, inductance, capacitance, time, and
frequency is operating properly. Electrical calibration is a high-end process which needs the use
of precise instruments or calibrators to assess the performance of important criteria in other
devices referred to as units under test.

The following instruments are frequently submitted for electrical calibration:
 Data loggers
 Electric meters
 Multi-meters
 Oscilloscopes
 Frequency counters
 Insulation Testers
 Loop testers etc.

 Mechanical Calibration:

Figure 2. 8 Mechanical Calibration (micrometer and varnier caliper)


Mechanical instruments are prone to drift as a result of repeated use, mechanical stress, and
exposure to fluctuating ambient conditions; mechanical calibration is the remedy for the errors
this induces in the equipment. Mass, volume, density, force, torque, dimension, angle, flatness,
and vibration are the major properties that are calibrated during mechanical calibration in a
temperature-controlled atmosphere.

The following are some of the most commonly tested mechanical calibration instruments:
 Accelerometers
 Scales/Balances
 Force Gauges & Load Cells
 Micrometers, Vernier, and height gauges
 Screwdrivers & Torque Wrenches

 Sets of Weight and Mass

 Flow Calibration:

Figure 2. 9 Flow Calibration


A flow meter (also known as a flow sensor) is a device that measures the linear or non-linear,
mass or volumetric flow rate of a liquid or gas. The flow rate is the rate at which a process fluid
moves through pipelines, orifices, or vessels at a given time, and it is measured by instruments
through a controlled manner to monitor and regulate the speed and efficiency of industrial flow
processes and devices.

Calibration of flow equipment helps maximize production, profitability, and compliance with
regulatory standards. Flow meters that verify product or feedstock quality and quantity, measure
fuel or energy quantity, or function in a vital process require flow calibration services on a
regular basis to guarantee that measurements are accurate, allowing operations to proceed
safely and on time.

The four most common types of flow meters that need calibration are:
 Thermal Mass Flowmeters

 Laminar flowmeters
 Gas and Air Rotameters
 Turbine meters

 Pipette Calibration:

Figure 2. 10 Pipette Calibration

Pipette calibration is necessary for accurate and precise pipetting results in laboratories that use
this measurement device often. The calibration process and methods must be followed for all
types of pipettes used in laboratories, including single-channel, multi-channel manual pipettes,
and electronic pipettes. The primary goal of pipette calibration is to guarantee that dispensing is
done with the desired precision.

 Pressure Calibration:

Figure 2. 11 Pressure Calibration
Pressure calibration is a critical operation performed in a variety of industries where
measurement equipment is required to monitor process performance and safety, with gas and
hydraulic pressure being the most common measurements. Many businesses are now certified to
quality standards such as ISO 9000. Defined procedures must be followed in order to maintain
these quality standards, and because many industrial processes depend upon pressure
measurement, pressure calibration is a crucial aspect of a company’s quality assurance.

Pressure calibration is carried out using various pressure balances and calibrators, as well as
high-accuracy pressure sensors and pressure gauges.

The following are some examples of pressure devices that are calibrated on a regular basis:
 Digital Pressure Gauges
 Digital Indicators
 Transducers
 Transmitters
 Analogue Pressure Gauges
 Barometers
 Test Gauges
 Temperature Calibration:

Figure 2. 12 Temperature Calibration
Temperature calibration is undertaken and carried out in a controlled environment in all
processes where temperature readings play an important role in allowing equipment to run
without interruption. Thermistors, thermocouples, or platinum resistance thermometers (PRTs),
sometimes known as resistance temperature devices (RTDs), are commonly employed in
temperature calibration.

It’s important to remember that measuring the temperature from a temperature sensor using an
RTD or thermocouple indicator and then comparing the readings to the in-line field indicator is
not a temperature calibration. A temperature calibration can only be done by comparing the
probe being tested to a recognized reference in a stable temperature environment.

The following are some examples of equipment that need temperature calibration on a
regular basis:
 Data Acquisition Systems
 Thermometers/Thermocouples
 Dial Thermometers
 Chambers/Furnaces
 Infrared Meters
 PRTs and Thermistors
 Thermal Cameras
Steps or precautions to be observed during calibration of a measurement system:

 Specified environmental conditions are to be maintained so that similar conditions
prevail when the system is calibrated and when the actual measurements are made.

 The device to be calibrated is checked for any physical defects.

 The standard measurement system used for calibration should be at least ten times
more accurate than the desired measurement system accuracy, i.e. an accuracy ratio of
10:1.
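The 10:1 accuracy-ratio rule can be sketched as a simple calculation; the instrument accuracy used below is a hypothetical value, not taken from this module:

```python
# Rough illustration of the 10:1 accuracy-ratio rule.
def required_standard_accuracy(instrument_accuracy_pct, ratio=10):
    """Maximum allowable error of the calibration standard, in percent."""
    return instrument_accuracy_pct / ratio

# A 0.5%-accurate instrument needs a standard accurate to 0.05% or better.
print(required_standard_accuracy(0.5))
```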

B. Calibration Procedures in Linear, Non-Linear and Discrete Instruments

Calibration refers to the adjustment of an instrument so its output accurately corresponds to its
input throughout a specified range. The only way we can know that an instrument’s output
accurately corresponds to its input over a continuous range is to subject that instrument to
known input values while measuring the corresponding output signal values. This means we
must use trusted standards to establish known input conditions and to measure output signals.
The following examples show both input and output standards used in the calibration of pressure
and temperature transmitters:

Figure 2. 13 Calibration of pressure transmitters

Figure 2. 14 Calibration of temperature transmitters
 Procedures for efficiently calibrating different types of instruments

 Linear instruments
The simplest calibration procedure for an analog, linear instrument is the so-called zero-and-
span method. The method is as follows:

1. Apply the lower-range value stimulus to the instrument, wait for it to stabilize
2. Move the “zero” adjustment until the instrument registers accurately at this point
3. Apply the upper-range value stimulus to the instrument, wait for it to stabilize
4. Move the “span” adjustment until the instrument registers accurately at this point
5. Repeat steps 1 through 4 as necessary to achieve good accuracy at both ends of the
range

An improvement over this crude procedure is to check the instrument’s response at several
points between the lower- and upper-range values. A common example of this is the so-
called five-point calibration where the instrument is checked at 0% (LRV), 25%, 50%, 75%,
and 100% (URV) of range. A variation on this theme is to check at the five points of 10%,
25%, 50%, 75%, and 90%, while still making zero and span adjustments at 0% and 100%.
Regardless of the specific percentage points chosen for checking, the goal is to ensure that we
achieve (at least) the minimum necessary accuracy at all points along the scale, so the
instrument’s response may be trusted when placed into service.

Yet another improvement over the basic five-point test is to check the instrument’s response at
five calibration points decreasing as well as increasing. Such tests are often referred to as
up-down calibrations. The purpose of such a test is to determine if the instrument has any
significant hysteresis: a lack of responsiveness to a change in direction.
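The up-down data can be reduced to a hysteresis figure at each check point. A minimal sketch in Python, using hypothetical readings (in percent of span) rather than data from any particular instrument:

```python
# Evaluate hysteresis from an up-down five-point test.
# Keys are the applied input (% of span); values are hypothetical readings.
up_readings   = {0: 0.1, 25: 25.2, 50: 50.3, 75: 75.2, 100: 100.1}
down_readings = {0: 0.3, 25: 25.6, 50: 50.8, 75: 75.5, 100: 100.1}

# Hysteresis at each point: difference between falling and rising readings.
hysteresis = {pt: round(down_readings[pt] - up_readings[pt], 2)
              for pt in up_readings}
worst = max(abs(v) for v in hysteresis.values())
print(hysteresis, worst)
```

If the worst-case hysteresis exceeds the instrument's accuracy specification, the instrument warrants attention before being returned to service.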

Some analog instruments provide a means to adjust linearity. This adjustment should be moved
only if absolutely necessary! Quite often, these linearity adjustments are very sensitive, and
prone to over-adjustment by zealous fingers. The linearity adjustment of an instrument should be
changed only if the required accuracy cannot be achieved across the full range of the instrument.
Otherwise, it is advisable to adjust the zero and span controls to “split” the error between the
highest and lowest points on the scale, and leave linearity alone.

The procedure for calibrating a “smart” digital transmitter – also known as trimming – is a bit
different. Unlike the zero and span adjustments of an analog instrument, the “low” and “high”
trim functions of a digital instrument are typically non-interactive. This means you should only
have to apply the low- and high-level stimuli once during a calibration procedure. Trimming the
sensor of a “smart” instrument consists of these four general steps:

1. Apply the lower-range value stimulus to the instrument, wait for it to stabilize
2. Execute the “low” sensor trim function
3. Apply the upper-range value stimulus to the instrument, wait for it to stabilize
4. Execute the “high” sensor trim function

Likewise, trimming the output (Digital-to-Analog Converter, or DAC) of a “smart” instrument
consists of these six general steps:

1. Execute the “low” output trim test function
2. Measure the output signal with a precision milliammeter, noting the value after it
stabilizes
3. Enter this measured current value when prompted by the instrument
4. Execute the “high” output trim test function
5. Measure the output signal with a precision milliammeter, noting the value after it
stabilizes
6. Enter this measured current value when prompted by the instrument

After both the input and output (ADC and DAC) of a smart transmitter have been trimmed (i.e.
calibrated against standard references known to be accurate), the lower- and upper-range values
may be set. In fact, once the trim procedures are complete, the transmitter may be ranged and
ranged again as many times as desired. The only reason for re-trimming a smart transmitter is to
ensure accuracy over long periods of time where the sensor and/or the converter circuitry may
have drifted out of acceptable limits. This stands in stark contrast to analog transmitter
technology, where re-ranging necessitates re-calibration every time.

 Nonlinear instruments
The calibration of inherently nonlinear instruments is much more challenging than for linear
instruments. No longer are two adjustments (zero and span) sufficient, because more than two
points are necessary to define a curve.

Examples of nonlinear instruments include expanded-scale electrical meters, square root
characterizers, and position-characterized control valves.

Every nonlinear instrument will have its own recommended calibration procedure, so I will
defer you to the manufacturer’s literature for your specific instrument. I will, however, offer one
piece of advice: when calibrating a nonlinear instrument, document all the adjustments you
make (e.g. how many turns on each calibration screw) just in case you find the need to “re-set”
the instrument back to its original condition. More than once I have struggled to calibrate a
nonlinear instrument only to find myself further away from good calibration than where I
originally started. In times like these, it is good to know you can always reverse your steps and
start over!

 Discrete instruments
The word “discrete” means individual or distinct. In engineering, a “discrete” variable or
measurement refers to a true-or-false condition. Thus, a discrete sensor is one that is only able to
indicate whether the measured variable is above or below a specified setpoint.

Examples of discrete instruments are process switches designed to turn on and off at certain
values. A pressure switch, for example, used to turn an air compressor on if the air pressure ever
falls below 85 PSI, is an example of a discrete instrument.

Discrete instruments require periodic calibration just like continuous instruments. Most discrete
instruments have just one calibration adjustment: the set-point or trip-point. Some process

switches have two adjustments: the set-point as well as a dead band adjustment. The purpose of
a dead band adjustment is to provide an adjustable buffer range that must be traversed before the
switch changes state. To use our 85 PSI low air pressure switch as an example, the set-point
would be 85 PSI, but if the dead band were 5 PSI it would mean the switch would not change
state until the pressure rose above 90 PSI (85 PSI + 5 PSI).
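The reset arithmetic for the 85 PSI example can be sketched as:

```python
# Reset point of a discrete low-pressure switch (figures from the text).
setpoint_psi = 85   # switch trips when pressure falls to 85 PSI
deadband_psi = 5    # buffer that must be traversed before the switch resets

reset_psi = setpoint_psi + deadband_psi  # switch re-arms only above 90 PSI
print(reset_psi)
```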

When calibrating a discrete instrument, you must be sure to check the accuracy of the set-
point in the proper direction of stimulus change. For our air pressure switch example, this would
mean checking to see that the switch changes states at 85 PSI falling, not 85 PSI rising. If it
were not for the existence of dead band, it would not matter which way the applied pressure
changed during the calibration test. However, dead band will always be present in a discrete
instrument, whether that dead band is adjustable or not.

C. Calibration of Instruments with Their Procedures

Owing to the physical limitations of measuring devices and the system under study, every
practical measurement will always have some errors. Several types of errors occur in a
measurement system. These include:

Static Errors:
They are caused by limitations of the measuring device or the physical laws governing its
behavior.

Dynamic Errors:

They are caused by the instrument not responding fast enough to follow the changes in
measured variable. A practical example can be seen in a situation where the room thermometer
does not show the correct temperature until several minutes after the temperature has reached a
steady value.

Random Errors:

These may be due to causes which cannot be readily established; they could also be caused by
random variations in the system under study.

Basic Steps in Instrument Calibration

Calibration is a process whereby we ascertain the output of an instrument after being used over a
definite period, by measuring and comparing against a standard reference and to carry out the

necessary adjustments required to confirm whether its present accuracy conforms to that
specified by its manufacturer. There are three basic steps involved in the calibration of an
instrument. These include:

(a) To collect measured values (output values) corresponding to standard values (input
values) provided by a standard input reference.

(b) To complete verification/calibration tables for upscale and down scale values (5 or 3
points)

(c) To calculate the error on the output signal and to compare the result with the
expected accuracy.

If ERROR ≤ EXPECTED ACCURACY, no adjustment is required; in other words, we are
simply verifying the accuracy of the instrument.

If ERROR is greater than the EXPECTED ACCURACY, we carry out the necessary adjustments
to bring this error to within the expected accuracy (calibration process).
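The decision between verification and calibration can be sketched as follows; the error values are hypothetical:

```python
# Decide between "verify only" and "adjust" from the measured error.
def needs_adjustment(measured_error_pct, expected_accuracy_pct):
    """True when the error exceeds the expected accuracy and the instrument
    must be adjusted (calibration); False when it is within tolerance
    (verification only)."""
    return abs(measured_error_pct) > expected_accuracy_pct

print(needs_adjustment(0.3, 0.5))  # within tolerance
print(needs_adjustment(0.8, 0.5))  # out of tolerance, adjust
```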

Five Point Calibration Basics


In a five point calibration for an instrument, the output is measured at 0%, 25%, 50%, 75% &
100% of the calibration range of the instrument. In the five-point calibration process, output
readings are taken for upscale and down scale values of the calibration range to determine the
repeatability and hysteresis of the instrument. In a five-point calibration, LRV = 0% Input; URV
= 100%

Steps Involved in a Five Point Calibration


In a five-point calibration exercise, three basic steps are involved. They include:

(a) Zero Adjustments (at LRV)

(b) Span Adjustments (at URV)

(c) Linearity Adjustments - at 25%, 50% & 75%

These basic steps are illustrated in the flow chart for a five-point calibration below:

Figure 2. 15 Illustration of five-point calibration
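For a 4-20 mA transmitter, the expected output at each of the five check points follows directly from the percent of span. A minimal sketch:

```python
# Expected 4-20 mA outputs at the standard five-point check (0/25/50/75/100%).
LRV_mA, URV_mA = 4.0, 20.0

def expected_output(percent_of_span):
    """Ideal output current for a linear 4-20 mA instrument."""
    return LRV_mA + (URV_mA - LRV_mA) * percent_of_span / 100.0

for pct in (0, 25, 50, 75, 100):
    print(pct, expected_output(pct))   # 4.0, 8.0, 12.0, 16.0, 20.0 mA
```

Comparing each measured output against these ideal values, for both the upscale and downscale runs, gives the error table used to judge repeatability and hysteresis.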

Example of instrumentation and Control device with their procedure

 Pressure Switch Calibration and Adjustment


Before we get down to the nitty-gritty of how to calibrate and adjust a pressure switch, let us get
to understand some basic concepts with pressure switch calibration:

Setpoint:

This is the pressure at which the pressure switch is required to operate. A pressure switch may
be set to operate on either a rising pressure (high level alarm) or a falling pressure (low level
alarm). Most switches are designed to operate at a 'gauge' pressure setpoint i.e. relative to
atmospheric pressure. Some applications require an 'absolute' pressure setpoint i.e. relative to
absolute zero pressure, and an absolute pressure switch is required for these. Ideally the range of
the switch should be chosen such that the setpoint is between 25% to 75% of this range.

Dead-band or Reset:

This is a setting that determines the amount of pressure change required to re-set the switch to
its normal state after it has tripped. The dead-band or reset or switching differential is the
difference in the rising and falling pressures at which, the pressure switch operates. For a fixed

differential output switch this is typically about 1% to 3% of the switch range. For an adjustable
differential output switch it may be adjusted from about 5% to 12% of the switch range.

The pressure switch is a ubiquitous device. It is practically everywhere in your plant. But how
do you calibrate this simple device? The answer is here. Just follow the simple steps that I have
outlined below.

Before you calibrate your pressure switch, confirm the following:

• The setpoint of the pressure switch

• The dead-band of the switch

• Also depressurize and isolate the pressure switch from the process. If opening the switch
exposes voltages or energy that is not intrinsically safe, please follow the specified
procedure for your plant. For example, if in an explosive environment, use a continuously
monitoring gas detector to monitor for the presence of explosive gasses.

Figure 2. 16 Calibration Procedure of the Pressure Switch

 Step 1: Connect the pressure switch to a pressure source e.g air supply via a hand
pressure regulator and test gauge, as shown in the diagram above.
 Step 2: Use an Ohmmeter or a Digital Multimeter (DMM) set to the continuity range to
check and verify that the switch contacts are as indicated: NO (Normally open) and NC
(Normally close).

 Step 3: Connect the Ohmmeter or DMM between the normally open contacts (NO) and
the common terminal (C) of the switch. The meter should read "open circuit". Adjust the
hand pressure regulator to increase the pressure to the setpoint of the pressure switch
until the contacts change over. The meter should now read "short circuit". Note the
pressure reading and write it down. This pressure is the switch setpoint for a "rising"
pressure.
 Step 4: Increase the pressure to the switch to its maximum rating. Slowly reduce the
pressure to the switch until the switch changes over from closed to normally open again.
Note and write down this pressure reading. This pressure is the switch setting for a
"falling" pressure.
 Step 5: From the readings you have taken work out the pressure difference between the
rising and falling pressure settings. This is called the "dead-band" of the switch. The
dead-band calculated should be equal to or less than the manufacturers' dead-band. The
maximum dead-band is usually stated by the manufacturer. The switch is unserviceable
if the measured dead-band is more than the manufacturer's recommendation (the dead-band
on the nameplate of the switch)
 To calibrate the switch for a low pressure, go through the steps in this order: Step 1 to
Step 2 to Step 4 to Step 3 to Step 5
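The dead-band arithmetic of Step 5 can be sketched as follows; the trip pressures and nameplate dead-band below are hypothetical values, not from any particular switch:

```python
# Work out the dead-band from the recorded trip pressures (Step 5).
rising_trip_psi  = 86.0   # noted in Step 3 (hypothetical reading)
falling_trip_psi = 83.5   # noted in Step 4 (hypothetical reading)
max_deadband_psi = 3.0    # assumed value from the switch nameplate

deadband_psi = rising_trip_psi - falling_trip_psi
serviceable = deadband_psi <= max_deadband_psi
print(deadband_psi, serviceable)
```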

 RTD Transmitter Calibration


The RTD transmitter is usually factory calibrated to the temperature range shown on the device
name plate. When the performance deteriorates and the transmitter needs recalibration, the
transmitter is normally calibrated by using a resistance decade box.

Materials Required for Calibration


To calibrate the RTD transmitter, the following equipment will be required:

1. Digital voltmeter of suitable accuracy and very high resolution (1 mV)
2. A 24 VDC power source
3. A 5-dial resistance decade box with high precision, providing 100 Ω steps

Calibration Steps:

Connect the above equipment as in the setup below, including lead simulator resistors if required:

Figure 2. 17 RTD Transmitter Calibration diagram

How to Calibrate RTD Transmitter


1. Locate the RTD transmitter terminals by removing the housing cover
2. If an RTD is already connected, remove all the RTD lead connections
3. Determine the RTD resistance at the desired base (0 °C) and full-scale temperatures
4. Turn the power supply on
5. Set the resistance decade box to the resistance that corresponds to the desired base
temperature. Adjust the zero pot (potentiometer) of the transmitter until the output is 4 mA
6. Set the resistance decade box to the resistance that corresponds to the desired full-scale
temperature. Adjust the span pot (potentiometer) of the transmitter until the output is 20 mA
7. Repeat the above steps until both 4 and 20 mA readings are obtained without readjusting
the span and zero potentiometers.
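Setting the decade box (step 3 above) requires knowing the RTD resistance at each temperature. For a standard Pt100 at t ≥ 0 °C, the IEC 60751 Callendar-Van Dusen equation gives this directly; a sketch assuming a 0-100 °C calibration range:

```python
# Pt100 resistance for t >= 0 degC per the IEC 60751 Callendar-Van Dusen equation.
R0 = 100.0          # ohms at 0 degC
A  = 3.9083e-3      # standard IEC 60751 coefficient
B  = -5.775e-7      # standard IEC 60751 coefficient

def pt100_resistance(t_degc):
    """Resistance of a standard Pt100 element, valid for 0 <= t <= 850 degC."""
    return R0 * (1 + A * t_degc + B * t_degc**2)

print(round(pt100_resistance(0), 2))    # decade box setting for 4 mA (0 degC)
print(round(pt100_resistance(100), 2))  # setting for 20 mA on a 0-100 degC range
```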

 Thermocouple Transmitter Calibration


Calibrating a thermocouple transmitter requires a thermocouple simulator with an accuracy
at least four times greater than that of the thermocouple sensor we desire to calibrate.

Equipment and Materials Required

The following equipment/materials are required to successfully calibrate a thermocouple
transmitter:

1. Thermocouple simulator (of at least four times the accuracy of the sensor)
2. Digital voltmeter (five-digit readout) with an accuracy of at least ±0.01% and a
resolution of 1 mV
3. 24 VDC power supply with at least 35-40 mA current output
4. Thermocouple wire of the same type as that used in the thermocouple transmitter

Equipment Setup

Below is the equipment set up for the calibration

Figure 2. 18 Thermocouple transmitter equipment setup

Calibration Procedure:

1. Remove the thermocouple transmitter terminal housing cover

2. If the transmitter is already connected, remove all the thermocouple lead connections.

3. Determine the base and full scale temperatures. Read: How to convert thermocouple
millivolt to temperature.

4. Turn power supply on.

5. Consult the thermocouple simulator manual for instructions on setting the thermocouple
type and engineering units.
6. Set the simulator to the base (zero) temperature and adjust the zero pot until the output
is 4mA or 40mV at the test terminals
7. Set the simulator to the full scale temperature and adjust the span pot until the output is
20mA
8. Repeat steps (1-7) above until both the 4 and 20mA readings are obtained without re-
adjusting the span and zero pots.

 Smart Transmitters Calibration


A smart transmitter is remarkably different from a conventional analog transmitter.
Consequently, the calibration methods for both devices are also very different. Remember that
calibration refers to the adjustment of an instrument so its output accurately corresponds to its
input throughout a specified range. Therefore, a true calibration requires a reference standard,
usually in the form of one or more pieces of calibration equipment to provide an input and
measure the resulting output. If you got here looking for information on analog pressure
transmitter calibration, you may consult:
How to Calibrate Your DP Transmitter
The procedure for calibrating a smart digital transmitter is known as Digital trimming. A digital
trim is a calibration exercise that allows the user to correct the transmitter's digital signal to
match plant standard or compensate for installation effects. Digital trim in a smart transmitter
can be done in two ways:

(a) A Sensor Trim: This consists of matching the process variable (be it pressure, level,
flow or temperature) reading of the transmitter to a precision input. This process normally
involves trimming the digital circuit of the input Analog-to-Digital converter in the smart
transmitter.

(b) A 4 - 20mA or Current Loop Trim: This is done by trimming the output Digital-
to-Analog converter in the transmitter.

Actions That Do Not Constitute Proper Calibration in Smart Transmitters

Before we discuss in detail what constitute a proper calibration, let us mention certain common
practice that are not proper calibrations:

(a) Changing the range (LRV and URV) of a smart transmitter constitutes a configuration
change and not a calibration. This range change merely affects the mathematical computation
done by the microprocessor. It has no effect on the digital process variable as read by a hand-
held digital communicator.

(b) Using only the zero and span adjustments to calibrate a smart transmitter often corrupts
the internal digital readings. You may not notice this if you don't use a hand-held digital
communicator to read the range or digital process data.

(c) Using a hand-held digital communicator to adjust the current loop so that an accurate
input to the transmitter agrees with some readout device on the loop does not constitute a proper
calibration.

Procedure for Calibrating a Smart Transmitter:

To do a proper calibration on a smart transmitter will involve a sensor trim and/or a
4-20 mA trim, depending on the application where the transmitter is being used. A smart
transmitter typically has high and low trim functions which, unlike the zero and span
adjustments of an analog transmitter, are non-interactive. That is, adjusting the high trim
function has no effect on the low trim function and vice versa.

Before proceeding to the section below note that a smart transmitter has three outputs which
must be clearly understood:

(a) Digital Process Variable (PV) usually read by a hand-held communicator

(b) Digital Value of the output current in mA (PVAO) which the communicator also reads.

(c) The analog 4-20 mA signal output, which can be read with a suitable milliammeter but
cannot be read by the digital hand-held communicator. If these are not clearly understood,
please see: Introduction to Smart Transmitters for a clearer understanding.
For the smart transmitter to be properly calibrated, the error between the applied input to the
transmitter and the digital output (PV) must be within the error specification of the manufacturer

otherwise a sensor trim will be required to correct this. Similarly, the error between the digital
milliamp value (PVAO) and the analog mA value must be within the error specification of the
manufacturer, otherwise a 4-20 mA trim is required.
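The two as-found error checks can be sketched numerically; the accuracy specification, span, and readings below are all hypothetical values, not from any particular transmitter:

```python
# As-found pass/fail checks for a smart transmitter (illustrative numbers).
spec_pct_of_span = 0.075        # assumed manufacturer's spec, % of span
span_inh2o = 250.0              # assumed calibrated span, 0-250 inH2O

# Sensor check: applied input vs digital PV, in the same engineering units.
applied, pv = 125.00, 125.25
sensor_error_pct = abs(pv - applied) / span_inh2o * 100

# Loop check: digital mA value (PVAO) vs analog mA read by the milliammeter.
pvao_ma, measured_ma = 12.000, 12.004
loop_error_pct = abs(measured_ma - pvao_ma) / 16.0 * 100   # 4-20 mA span

print(sensor_error_pct > spec_pct_of_span)  # sensor trim required?
print(loop_error_pct > spec_pct_of_span)    # 4-20 mA trim required?
```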

Performing a Sensor Trim:

Before performing a sensor trim, run a test, commonly referred to as the AS-FOUND TEST to
confirm the consistency of the sensor and the input Analog-to-Digital converter. Connect the
test setup as shown below:

Figure 2. 19 Performing a Sensor Trim on a Smart Transmitter


Use a precision calibrator to measure the applied input to the transmitter. Read the resulting
output (PV) with a hand-held communicator. Calculate the resulting error between the applied
input and the output (PV) since both are in the same engineering units. Note that the desired
accuracy for this test will be the manufacturer's accuracy specification. If this test does not
pass, then follow the manufacturer's recommended procedure for trimming the sensor. Below
are general guidelines for performing a sensor trim:

(a) Apply the lower-range value stimulus to the transmitter, wait for it to stabilize

(b) Execute the "low" sensor trim function

(c) Apply the upper-range value stimulus to the transmitter, wait for it to stabilize

(d) Execute the "high" sensor trim function

Stimulus as used here should be understood to mean the process variable input to the
transmitter.

Performing a 4 - 20mA Trim:

Before performing a 4 - 20mA trim, run a test, commonly referred to as the AS-FOUND TEST
to confirm the consistency of the output Digital-to-Analog converter and the analog output of

the transmitter. This procedure may also be called a 4-20 mA trim, a current loop trim, or a
Digital-to-Analog converter trim. Connect the test setup as shown below:

Figure 2. 20 Performing a 4 - 20mA Trim on a Smart Transmitter


Use a hand-held digital communicator to put the smart transmitter into a fixed current output
mode. The input value for this test is the mA value that you instruct the transmitter to produce.
The output value is obtained using a precision milliammeter to measure the resulting current.
Calculate the error between the digital mA value produced by the transmitter and the analog mA
value measured by the current meter. The desired accuracy for this test should also reflect the
manufacturer's accuracy specification. If the test does not pass, then follow the manufacturer's
recommended procedure for trimming the output section. The trim procedure should require two
trim points close to or just outside of 4mA and 20 mA. Do not confuse this with any form of re-
ranging or any procedure that involves using zero and span buttons on the transmitter. Below
are the general guidelines for performing a 4 - 20mA trim:

(a) Execute the "low" output trim test function on the transmitter.

(b) Measure the output signal with a precision milliammeter, noting the value after it stabilizes

(c) Enter this measured current value when prompted by the transmitter

(d) Execute the "high" output trim test function

(e) Measure the output signal with a precision milliammeter, noting the value after it stabilizes

(f) Enter this measured current value when prompted by the transmitter

After both the input and output (ADC and DAC) of a smart transmitter have been trimmed (i.e.
calibrated against standard references known to be accurate), the lower- and upper-range values
(LRV and URV) may be set. In fact, once the trim procedures are complete, the transmitter may

be ranged and ranged again as many times as desired. The only reason for re-trimming a smart
transmitter is to ensure accuracy over long periods of time where the sensor and/or the converter
circuitry may have drifted out of acceptable limits. The situation is very different in an analog
transmitter, where re-ranging necessitates re-calibration.

Transmitter Damping:

Many HART transmitters support a parameter called damping. If this is not set to zero, it can
have an adverse effect on tests and adjustments. Damping induces a delay between a change in
the transmitter input and the detection of that change in the digital value for the transmitter input
reading and the corresponding output value. It is advisable to adjust the transmitter's damping
value to zero prior to performing tests or adjustments. After calibration, be sure to return the
damping constant to its required value.

 I/P or E/P pressure transducer calibration


Before using your I/P or E/P transducer, it must be properly calibrated in your system. Such
things as orientation, supply pressure, temperature, and flow rates could affect the output
pressure, especially with the use of any of our 500 series units. Please find the instruction
sheets for your unit below. You can also follow along with the quick start guide on how to
calibrate your pressure transducer.

The following quick start calibration guide is for our Type 500 unit.

1. Open protective covers to expose Zero and Span adjustment screws.

2. Make the appropriate air supply connections and connect an accurate pressure gauge to
verify calibration results.
3. Connect the electrical signal and supply the I/P or E/P with the minimum input signal
that corresponds to the lowest psi output for the unit, e.g., for a 4-20 mA input unit,
apply a 4 mA signal to the I/P.

4. Observe the output pressure and use the “Zero” screw to adjust the minimum output
pressure. Turning the zero screw counter-clockwise increases pressure; turning it
clockwise decreases pressure. See the calibration guide for proper screw orientation.
5. Increase the signal to the maximum input, e.g., for a 4-20 mA unit, apply a 20 mA signal.
6. Observe the output pressure and use the “Span” screw to adjust the maximum pressure.
7. Repeat steps 3 to 6 in order to verify results. Adjust as necessary.

The zero and span screws are shown below to demonstrate what they actually control.

Figure 2. 21 zero and span screws adjustment


 Current to Pressure Transducer Calibration

Current to Pressure (I/P) Converter Calibration Procedure

A “current to pressure” converter (I/P) converts an analog signal (4 to 20 mA) to a proportional
linear pneumatic output (3 to 15 psig).

Its purpose is to translate the analog output from a control system into a precise, repeatable
pressure value to control pneumatic actuators/operators, pneumatic valves, dampers, vanes, etc.
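The ideal I/P transfer function is a straight linear interpolation between the input and output ranges. A minimal sketch (the function name and check points are our own, not from any vendor):

```python
def ip_output_psi(ma, in_lo=4.0, in_hi=20.0, out_lo=3.0, out_hi=15.0):
    """Ideal I/P transfer: linear mapping from mA input to psig output."""
    return out_lo + (ma - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

print(ip_output_psi(4.0))   # 3.0 psig (zero)
print(ip_output_psi(12.0))  # 9.0 psig (50% of span)
print(ip_output_psi(20.0))  # 15.0 psig (full span)
```

Comparing the measured output against these ideal values at several check points is exactly what the zero/span procedure below verifies.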

Principle:
The I/P converter operates on a force-balance principle: a coil is suspended in a magnetic field
on a flexible mount. At the lower end of the coil is a flapper valve that operates against a
precision-ground nozzle to create a backpressure on the servo diaphragm of a booster relay. The
input current flows through the coil and produces a force between the coil and the flapper valve,
which controls the servo pressure and the output pressure.

Calibration Equipment:
1. Air Filter Regulator (AFR)
2. I/P Converter
3. Master pressure gauge (for measurement of I/P output pressure)
4. mA source (to feed mA to the I/P converter)

Calibration Setup:

Figure 2. 22 Current to Pressure (I/P) Converter Calibration setup


Calibration Procedure:

1. Set the inlet air supply to 20 psi at the air filter regulator (AFR).

2. Feed 4 mA from the source to the I/P converter.
3. Observe the master pressure gauge; it should show 3 psi.
4. If it does not show 3 psi, adjust the I/P ZERO until 3 psi is obtained.
5. Feed 20 mA from the source to the I/P converter.
6. Observe the master pressure gauge; it should show 15 psi.
7. If it does not show 15 psi, adjust the I/P SPAN until 15 psi is obtained.
8. After completing the above steps, again feed 4 mA (observe 3 psi) and 20 mA
(observe 15 psi).
9. Repeat steps 2 to 7 until the correct output is obtained from the I/P converter.

After calibration, the following is to be done:

1. Enter all readings in the calibration report.

2. Affix the calibration sticker.

 Calibration of Temperature Sensor with Indicator

This section covers calibration of a temperature sensor with an indicator, using a temperature
bath, a master sensor, and a multi-function calibrator. Temperature sensors are broadly
classified as RTD sensors and thermocouple sensors.

- RTD sensors are further classified as 2-wire, 3-wire, and 4-wire RTD sensors.
- Thermocouple sensors are further classified as Type B, E, J, K, N, R, S, and T.
The temperature sensor is connected to a temperature indicator to display temperature directly in
units of temperature, i.e., degrees Celsius, kelvin, or degrees Fahrenheit. This calibration
procedure can be used for either RTD or thermocouple calibration and validation.

Figure 2. 23 Temperature sensor with indicator


The below 10 steps will help you to calibrate your temperature sensor using a temperature bath
and a master sensor.
Step 1: Choose a temperature bath for the calibration.
Even if the temperature bath has its own temperature indicator/display, that display cannot be
used as the master for calibration. The bath is used only as a source, i.e., only to generate the
temperature. A separately calibrated RTD sensor or thermocouple, called the master sensor, is
used as the reference. The temperature bath is selected according to the range of the
temperature sensor.

Step 2:
Insert the temperature sensor which is to be calibrated into the temperature bath.
In this procedure the temperature sensor to be calibrated is assumed to have an inbuilt
temperature indicator or display. We use this indicator to note down the temperature readings
during the calibration. If the temperature sensor does not have an inbuilt indicator, then we have
to connect a multi-function calibrator to read the temperature readings.
Insert the master RTD sensor/thermocouple into another slot of the same temperature
bath.
Connect a calibrated multi-function calibrator to this master sensor to read the sensor output in
temperature units such as deg C (since our master sensor has no inbuilt indicator, we need to
connect a device to read its temperature). Make sure to insert the above two sensors at equal
depth so that both experience the same generated temperature. If you do not have a multi-
function calibrator, you can use calibrated multimeters: note down the output and convert it to
temperature using RTD or thermocouple standard temperature tables.
Step 3: Decide the calibration points of the temperature sensor before starting the actual
calibration. For example, calibration might be done at 50 deg C, 75 deg C, 100 deg C,
125 deg C, and 150 deg C.
Step 4: Set the required setpoint in the temperature bath.
Allow the temperature bath to reach the desired setpoint, and let the bath stabilize at the
required temperature.
Step 5: Note down the master sensor reading and the temperature sensor (unit under
calibration, UUC) with indicator reading.
As per our example, set 50 deg C in the temperature bath, wait 30 seconds, and note down the
master sensor and UUC readings.
Step 6:
Take 5 readings at each setpoint, at intervals of 30 seconds.
Step 7: If both readings (master and UUC) are fluctuating, take the average of the 5 readings
for each setpoint.
Step 8: Repeat the above step 4 to step 7 for every setpoint or calibration point. That means, do
the same calibration steps for 75 deg C, 100 deg C, 125 deg C, and 150 deg C setpoints.
Step 9: After Calibration is completed, set the temperature bath temperature to room
temperature.

Step 10: Switch off the temperature bath after it reaches room temperature.
If the temperature bath is switched off directly at a setpoint other than room temperature, it
may lose accuracy, stability, and uniformity.
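Steps 5 to 7 amount to averaging paired readings and taking the difference as the calibration error at each setpoint. A minimal sketch with hypothetical readings for the 50 deg C setpoint:

```python
# Hypothetical data: five samples at 30 s intervals at the 50 deg C
# setpoint, from the master sensor and the unit under calibration (UUC).
master = [50.02, 50.05, 50.03, 50.04, 50.01]
uuc    = [50.30, 50.34, 50.31, 50.33, 50.32]

avg_master = sum(master) / len(master)
avg_uuc = sum(uuc) / len(uuc)
error = avg_uuc - avg_master   # calibration error at this setpoint

print(round(avg_master, 2), round(avg_uuc, 2), round(error, 2))
# 50.03 50.32 0.29
```

The same averaging is repeated for each setpoint (75, 100, 125, and 150 deg C) and the errors are entered in the calibration report.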

LRV and URV settings, digital trim (digital transmitters)

The advent of smart field instruments containing microprocessors has been a great advance for
industrial instrumentation. These devices have built-in diagnostic ability, greater accuracy (due
to digital compensation of sensor nonlinearities), and the ability to communicate digitally with
host devices for reporting of various parameters.

A simplified block diagram of a smart pressure transmitter looks something like this:

Figure 2. 24 block diagram of a smart pressure transmitter


It is important to note all the adjustments within this device, and how this compares to the
relative simplicity of an all-analog pressure transmitter:

Figure 2. 25 block diagram of analog pressure transmitter
Note how the only calibration adjustments available in the analog transmitter are the zero and
span settings. This is clearly not the case with smart transmitters.

Not only can we set lower- and upper-range values (LRV and URV) in a smart transmitter, but it
is also possible to calibrate the analog-to-digital and digital-to-analog converter circuits
independently of each other.

What this means for the calibration technician is that a full calibration procedure on a smart
transmitter potentially requires more work and a greater number of adjustments than an all-
analog transmitter.

A common mistake made by students and experienced technicians alike is to confuse the
range settings (LRV and URV) with actual calibration adjustments.

Just because you digitally set the LRV of a pressure transmitter to 0.00 PSI and the URV to
100.00 PSI does not necessarily mean it will register accurately at points within that range!

The following example will illustrate this fallacy.

Suppose we have a smart pressure transmitter ranged for 0 to 100 PSI with an analog output
range of 4 to 20 mA, but this transmitter's pressure sensor is fatigued from years of use, such
that an actual applied pressure of 100 PSI generates a signal that the analog-to-digital converter
interprets as only 96 PSI.

Assuming everything else in the transmitter is in perfect condition, with perfect calibration, the
output signal will still be in error:

Figure 2. 26 block diagram of a smart pressure transmitter with signal reading
As the saying goes, a chain is only as strong as its weakest link. Here we see how the calibration
of even the most sophisticated pressure transmitter may be corrupted despite perfect calibration
of both analog/digital converter circuits, and perfect range settings in the microprocessor. The
microprocessor thinks the applied pressure is only 96 PSI, and it responds accordingly with a
19.36 mA output signal. The only way anyone would ever know this transmitter was inaccurate
at 100 PSI is to actually apply a known value of 100 PSI fluid pressure to the sensor and note the
incorrect response. The lesson here should be clear: digitally setting a smart instrument's LRV
and URV points does not constitute a real calibration of the instrument.
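The 19.36 mA figure follows directly from applying the 4-20 mA transfer function to the perceived 96 PSI. A quick numeric check (the helper name is our own):

```python
def output_ma(perceived_psi, lrv=0.0, urv=100.0):
    """DAC output of a perfectly trimmed 4-20 mA transmitter."""
    return 4.0 + 16.0 * (perceived_psi - lrv) / (urv - lrv)

# Fatigued sensor: 100 PSI is applied, but the ADC reports only 96 PSI,
# so the output falls short of the expected 20.0 mA.
print(output_ma(96.0))   # 19.36 mA
```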

For this reason, smart instruments always provide a means to perform what is called a digital
trim on both the ADC and DAC circuits:

 to ensure the microprocessor sees the correct representation of the applied stimulus, and
 to ensure the microprocessor's output signal gets accurately converted into a DC current,
respectively.
Some technicians use the LRV and URV settings in a manner not unlike the zero and span
adjustments on an analog transmitter to correct errors such as this.

Following this methodology, we would have to set the URV of the worn-out transmitter to
96 PSI instead of 100 PSI, so an applied pressure of 100 PSI would give us the 20 mA output
signal we desire.

In other words, we would let the microprocessor think it was only seeing 96 PSI, then skew the
URV so it outputs the correct signal anyway. Such an approach will work to an extent, but any

digital queries to the transmitter (e.g. using a digital-over-analog protocol such as HART) will
result in conflicting information, as the current signal represents full scale (100 PSI) while the
digital register inside the transmitter shows 96 PSI.

The only comprehensive solution to this problem is to trim the analog-to-digital converter so
the transmitter’s microprocessor knows the actual pressure value applied to the sensor.

Once digital trims have been performed on both input and output converters, of course, the
technician is free to re-range the microprocessor as many times as desired without re-calibration.
This capability is particularly useful when re-ranging is desired for special conditions, such as
process start-up and shut-down when certain process variables drift into uncommon
regions.

Standards in Instrumentation

The next few subsections describe various standards used in instrument shops to
calibrate industrial instruments.
 Electrical standards

Electrical calibration equipment used to calibrate instruments measuring voltage,


current, and resistance must be periodically calibrated against higher-level
standards maintained by outside laboratories.

In years past, instrument shops would often maintain their own standard cell
batteries (often called Weston cells) as a primary voltage reference. These
special-purpose batteries produced 1.0183 volts DC at room temperature with low
uncertainty and drift, but were sensitive to vibration and non-trivial to actually use.
Now, electronic voltage references have all but displaced standard cells in
calibration shops and laboratories, but these references must be checked and
adjusted for drift in order to maintain their NIST traceability.

One enormous benefit of electronic calibration references is that they are able to
generate accurate currents and resistances in addition to voltage (and not just
voltage at one fixed value, either).

Modern electronic references are digitally-controlled as well, which lends them
well to automated testing in assembly-line environments, and/or programmed
multi-point calibrations with automatic documentation of as-found and as-left
calibration data.

If a shop cannot afford one of these useful references for bench top calibration use,
an acceptable alternative in some cases is to purchase a high-accuracy
multimeter and equip the calibration bench with adjustable voltage, current,
and resistance sources. These sources will be simultaneously connected to the
high-accuracy multimeter and the instrument under test, and adjusted until the
high-accuracy meter registers the desired value.

The measurement shown by the instrument under test is then compared against the
reference meter and adjusted until matching (to within the required tolerance).
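Agreement with the reference meter is usually judged as a percentage of span. A minimal sketch with hypothetical readings (the check-point values below are invented for illustration):

```python
def percent_of_span_error(reading, reference, span):
    """Error of the instrument under test, as a percentage of span."""
    return (reading - reference) / span * 100.0

# Hypothetical check point: the reference meter shows 5.000 V while the
# meter under test shows 5.020 V, on a 0-10 V span.
err = percent_of_span_error(5.020, 5.000, 10.0)
print(round(err, 2))   # 0.2 (% of span)
```

The instrument under test passes if this error is within the required tolerance at every check point.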

The following illustration shows how a high-accuracy voltmeter could be used to


calibrate a handheld voltmeter in this fashion:

Figure 2. 27 High-accuracy voltmeter for calibration

It should be noted that the variable voltage source shown in this test arrangement
need not be sophisticated. It simply needs to be variable (to allow precise

adjustment until the high-accuracy voltmeter registers the desired voltage value)
and stable (so the adjustment will not drift appreciably over time).

 Temperature standards

The most common technologies for industrial temperature measurement are


electronic in nature: RTDs and thermocouples. As such, the standards used to
calibrate such devices are the same standards used to calibrate electrical
instruments such as digital multimeters (DMMs).

However, there are some temperature-measuring instruments that are not electrical
in nature. This category includes bimetallic thermometers, filled bulb
temperature systems, and optical pyrometers.

In order to calibrate these types of instruments, we must accurately create the


calibration temperatures in the instrument shop. A time-honored standard for low-
temperature industrial calibrations is water, specifically the freezing and boiling
points of water.

Pure water at sea level (full atmospheric pressure) freezes at 32 degrees Fahrenheit
(0 degrees Celsius) and boils at 212 degrees Fahrenheit (100 degrees Celsius). In
fact, the Celsius temperature scale was originally defined by these two points of phase change
for water at sea level.

To use water as a temperature calibration standard, simply prepare a vessel for one
of two conditions:

 thermal equilibrium at freezing or


 thermal equilibrium at boiling.
Thermal equilibrium in this context simply means equal temperature throughout
the mixed-phase sample.

In the case of freezing, this means a well-mixed sample of solid ice and liquid
water.

In the case of boiling, this means a pot of water at a steady boil (vaporous steam
and liquid water in direct contact).

Temperature calibration provides a means of quantifying uncertainties in


temperature measurement in order to optimize sensor and/or system accuracies.

Uncertainties result from various factors including:

a) Sensor tolerances, which are usually specified according to published standards and
manufacturers' specifications.

b) Instrumentation (measurement) inaccuracies, again specified in manufacturers'
specifications.

c) Drift in the characteristics of the sensor due to temperature cycling and ageing.

d) Possible thermal effects resulting from the installation, for example thermal
voltages created at interconnection junctions.

A combination of such factors will constitute overall system uncertainty.


Calibration procedures can be applied to sensors and instruments separately or in
combination.

Calibration can be performed to approved, recognized standards (national and
international) or may simply constitute checking procedures on an “in-house”
basis.

Temperature calibration has many facets: it can be carried out thermally in the case
of probes, or electrically (simulated) in the case of instruments, and it can be
performed directly with certified equipment or indirectly with traceable standards.

Thermal (temperature) calibration is achieved by elevating (or depressing) the
temperature sensor to a known, controlled temperature and measuring the
corresponding change in its associated electrical parameter (voltage or resistance).
The accurately measured parameter is compared with that of a certified reference
probe; the absolute difference represents a calibration error. This is a comparison
process. If the sensor is connected to a measuring instrument, the sensor and
instrument combination can be effectively calibrated by this technique.

Absolute temperatures are provided by fixed point apparatus and comparison


measurements are not used in that case.

Electrical Calibration is used for measuring and control instruments which are
scaled for temperature or other parameters. An electrical signal, precisely
generated to match that produced by the appropriate sensor at various temperatures
is applied to the instrument which is then calibrated accordingly. The sensor is
effectively simulated by this means which offers a very convenient method of
checking or calibration.

A wide range of calibration “simulators” is available for this purpose; in many


cases, the operator simply sets the desired temperature and the equivalent
electrical signal is generated automatically without the need for computation.
However, this approach is not applicable to sensor calibration, for which various
thermal techniques are used.

 Pressure standards

In order to accurately calibrate a pressure instrument in a shop environment, we
must create fluid pressures of known magnitude using two broad categories of devices:

• devices that inherently produce known pressures

• devices that accurately measure pressures created by some (other) adjustable
source.

A dead weight tester (sometimes referred to as a dead-test calibrator) is an
example of the former category. These devices create accurately known pressures
by means of precise masses and pistons of precise area:

Figure 2. 28 Dead- test calibrator


After connecting the gauge (or other pressure instrument) to be calibrated, the
technician adjusts the secondary piston to cause the primary piston to lift off its
resting position and be suspended by oil pressure alone. So long as the mass
placed on the primary piston is precisely known, Earth’s gravitational field is
constant, and the piston is perfectly vertical, the fluid pressure applied to the
instrument under test must be equal to the value described by the following
equation:

P=F/A

Where,

P=Fluid pressure

F=Force exerted by the action of gravity on the mass (F = mg)

A=Area of piston

The primary piston area, of course, is precisely set at the time of the dead weight
tester’s manufacture and does not change appreciably throughout the life of the
device.
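The equation P = F/A is easy to check numerically. The mass and piston area below are hypothetical; consistent SI unit handling is the main point:

```python
G = 9.80665  # standard acceleration of gravity, m/s^2

def deadweight_pressure_kpa(mass_kg, piston_area_m2):
    """P = F/A with F = m*g; result converted from Pa to kPa."""
    return mass_kg * G / piston_area_m2 / 1000.0

# Hypothetical tester: a 10 kg mass on a 1 cm^2 (1e-4 m^2) piston.
print(round(deadweight_pressure_kpa(10.0, 1e-4), 3))   # 980.665 kPa
```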

A pneumatic dead weight tester works on the same principle. In these devices, a constant flow
of gas such as compressed air or bottled nitrogen vents through a bleed port operated by the
primary piston.

Figure 2. 29 Pneumatic dead weight tester


For low-pressure calibrations, the simple manometer is a much more practical
standard. Manometers, of course, do not generate pressure on their own. To use one as a
calibration standard, connect both the manometer and the instrument under test to a source of
variable fluid pressure, typically instrument air through a precision pressure regulator:

Figure 2. 30 Manometer
The difference in liquid column heights (h) within the manometer shows the
pressure applied to the gauge. So long as the manometer's liquid density is
precisely known, Earth's gravitational field is constant, and the manometer tubes
are perfectly vertical, the fluid pressure indicated by the manometer must be equal
to the value described by the following equation (two different forms given):

P = ρgh (ρ = mass density of manometer liquid, g = acceleration of gravity)

P = γh (γ = weight density of manometer liquid)
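The hydrostatic relation P = ρgh can likewise be evaluated numerically. The water density below is an assumed room-temperature value, and the column height is hypothetical:

```python
G = 9.80665        # acceleration of gravity, m/s^2
RHO_WATER = 998.0  # kg/m^3, assumed room-temperature water density

def manometer_pressure_pa(height_m, rho=RHO_WATER):
    """Hydrostatic pressure of a vertical liquid column: P = rho*g*h."""
    return rho * G * height_m

# A 0.25 m water-column difference between the two manometer legs:
print(round(manometer_pressure_pa(0.25), 1))   # about 2446.8 Pa
```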

With pressure-measuring test instruments of suitable accuracy (preferably NIST-
traceable), the same sort of calibration jig may be used for virtually any desired
range of pressures:

Figure 2. 31 Electronic manometers


Where an electronic test gauge is designed for very low pressures (inches of water
column), it is sometimes referred to as an electronic manometer.

 Test equipment suitable for field pressure calibrations includes slack-tube
manometers made from flexible plastic tubing hung from any available anchor
point near eye level, and test gauges, typically of the helical bourdon tube
variety.
 Portable electronic test gauges are also available for field use, many with
built-in hand pumps for generating precise air pressures.
 A pneumatic pressure calibrator for field use was a device manufactured
by the Wallace & Tiernan Corporation, affectionately called a Wally box by
at least one generation of instrument technicians.
It consisted of:

 a large dial pressure gauge (several inches in diameter) with a multi-turn needle
and a very fine scale,

• Connected to a network of valves and regulators which were used to set
different air pressures from any common compressed air source. The entire
mechanism was housed in an impact-resistant case for ruggedness.

• One of the many nice features of this calibration instrument was a selector
valve allowing the technician to switch between two different pressures
output by independent pressure regulators.

• Once the two pressure regulator values were set to the instrument's lower-
and upper-range values (LRV and URV), it was possible to switch back and
forth between those two pressures at will, making the task of adjusting an
analog instrument with interactive zero and span adjustments much easier
than it would have been to precisely adjust a single pressure regulator again
and again.

 Flow standards
Most forms of continuous flow measurement are inferential; that is, we measure
flow indirectly by measuring some other variable (such as pressure, voltage, or
frequency) directly.

In the case of an orifice plate used to measure fluid flow rate, this would mean
calibrating the differential pressure transmitter to measure pressure accurately and
replacing the orifice plate if it shows signs of wear. In some cases, however, direct
validation of flow measurement accuracy is needed.

Most techniques of flow rate validation take the form of measuring accumulated
fluid volume over time.

For simple validation of liquid flow rates, the flow may be diverted from its
normal path in the process and into a container where either accumulated volume
or accumulated weight may be measured over time.

If the rate of flow into this container is constant, the accumulated volume (or
weight) should increase linearly over time.

• The actual flow rate may then be calculated by dividing the accumulated volume
(or weight) by the elapsed time: Q = ΔV / Δt
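The accumulated-volume method reduces to Q = ΔV/Δt. A minimal sketch with a hypothetical accumulated-volume log (the sample values are invented for illustration):

```python
def average_flow_rate(volumes_l, times_s):
    """Q = dV/dt between the first and last accumulated-volume samples."""
    dv = volumes_l[-1] - volumes_l[0]
    dt = times_s[-1] - times_s[0]
    return dv / dt   # liters per second

# Hypothetical accumulated-volume log sampled every 10 seconds:
t = [0, 10, 20, 30]
v = [0.0, 5.1, 10.0, 15.2]
print(round(average_flow_rate(v, t), 3))   # about 0.507 L/s
```

Checking that intermediate samples also fall close to this line is what confirms the flow rate was actually constant during the test.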

The accuracy of this technique rests on some additional factors, though:

• The accuracy of the level transmitter (as a volume measuring instrument).

• The ability to ensure only one flow path in or out of that vessel.

 Direct flow validation is the use of a device called a flow prover.


A flow prover is a precision piston-and-cylinder mechanism used to precisely
measure a quantity of liquid over time. Process flow is diverted through the
prover, moving the piston over time.

Sensors on the prover mechanism detect when the piston has reached certain
positions, and time measurements taken at those different positions enable the
calculation of average flow.

 Analytical standards
An analyzer measures intrinsic properties of a substance sample such as its
density, chemical content, or purity. In order to calibrate an analyzer, we must
expose it to known quantities of substances with the desired range of properties
(density, chemical composition, etc.).

Consider the calibration of a pH analyzer. pH is the measurement of hydrogen ion activity
in an aqueous solution. The standard range of measurement is 0 pH to 14 pH, the
number representing a negative power of 10 approximately describing the
hydrogen ion molarity of the solution.

The pH of a solution is typically measured with a pair of special electrodes
immersed in the solution, which generate a voltage proportional to the pH of the
solution. In order to calibrate a pH instrument, you must have a sample of liquid
solution with a known pH value.

For pH instrumentation, such calibration solutions are called buffers, because they
are specially formulated to maintain stable pH values even in the face of (slight
levels of) contamination.

pH buffers may be purchased in liquid form or in powder form.

Liquid buffer solutions may be used directly out of the bottle, while powdered
buffers must be dissolved in appropriate quantities of de-ionized water to generate
a solution ready for calibration use.

 Pre-mixed liquid buffers are convenient to use, but have a fairly limited
shelf life.

 Powdered buffer capsules are generally superior for long-term storage, and
also enjoy the advantage of occupying less storage space in their dry state
than a liquid buffer solution.

 The following photograph shows a few 7.00pH (+/- 0.02pH) buffer capsules
ready to be mixed with water to form a usable buffer solution:

After preparing the buffer solution in a cup, the pH probe is inserted into the
buffer solution and given time to stabilize. Once stabilized, the pH instrument
may be adjusted to register the proper pH value. Buffer solutions should not be
exposed to ambient air for any longer than necessary (especially alkaline buffers
such as 10.0 pH) due to contamination.
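Conceptually, adjusting the instrument against two buffers is a two-point linear fit from electrode voltage to pH. The sketch below uses the ideal electrode slope of roughly -59.2 mV per pH unit at 25 deg C; the specific millivolt readings are hypothetical:

```python
def two_point_cal(mv1, ph1, mv2, ph2):
    """Return a function converting electrode millivolts to pH (linear fit)."""
    slope = (ph2 - ph1) / (mv2 - mv1)
    return lambda mv: ph1 + slope * (mv - mv1)

# Hypothetical probe readings: 0.0 mV in the 7.00 pH buffer and
# -177.5 mV in the 10.00 pH buffer (about three ideal 59.2 mV/pH
# steps; alkaline solutions drive the electrode voltage negative).
to_ph = two_point_cal(0.0, 7.00, -177.5, 10.00)
print(round(to_ph(-59.2), 2))   # about 8.0 pH
```

A real analyzer performs this fit internally during its two-point buffer calibration; the "zero" corresponds to the 7.00 pH offset and the "span" to the mV-per-pH slope.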

Pre-mixed liquid buffer storage containers should be capped immediately after


pouring into working cups. Used buffer solution should be discarded rather than

re-used at a later date.

 Analyzers designed to measure the concentration of certain gases in air must


be calibrated in a similar manner.

 Oxygen analyzers, for example, used to measure the concentration of free


oxygen in the exhaust gases of furnaces, engines, and other combustion
processes must be calibrated against known standards of oxygen
concentration.

An oxygen analyzer designed to measure oxygen concentration over a range of
ambient (20.9% oxygen) to 0% oxygen may be calibrated with ambient air as one
of the standard values, and a sample of pure nitrogen gas (containing 0% oxygen)
as the other standard value.

An oxygen analyzer intended for the measurement of oxygen concentrations in


excess of ambient air would require a different standard, most likely a sample of
100% pure oxygen, as a calibration reference.

An analyzer designed to measure the concentration of hydrogen sulfide (H2S), a
toxic gas produced by anaerobic bacterial decomposition of organic matter, will
require a sample of gas with a precisely known concentration of hydrogen sulfide
mixed in it as a calibration reference.

A typical reference gas concentration might be 25 or 50 parts per million (ppm).

Gas mixtures with such precise concentration values as this may be purchased
from chemical laboratories for the purpose of calibrating concentration analyzers,
and are often referred to as span gases because they are used to set the span of
analyzer instruments.

A self-calibration system is a system of solenoid (electrically controlled on-off)


valves and reference gas bottles set up in such a way that a computer is able to

switch the analyzer off-line and subject it to standard reference gases on a regular
schedule to check calibration.

Many analyzers are programmed to automatically calibrate themselves against


these reference gases, thus eliminating tedious work for the instrument technician.

A typical self-calibration system for a gas analyzer might look like this:

Figure 2. 32 Typical self-calibration gas analyzer


The gas analyzer is equipped with its own auto-calibration controls and
programming, allowing it to periodically shut off the process sample and switch to
known reference gases for ‘zero’ and ‘span’ calibration checks.

If these checks indicate excessive drift or any other questionable results, the
analyzer has the ability to raise a maintenance alarm to alert an instrument
technician to a potential problem that may require servicing.

2.4. Maintain configured and calibrated devices

There are five types of maintenance for machines and equipment:

1. Preventative maintenance: This type of maintenance is performed at appropriate
intervals to prevent malfunction and should be proactive rather than reactive. It includes
cleaning, lubrication, inspection, calibration, and replacement of worn parts.
2. Predictive maintenance: This type of maintenance is performed by monitoring the
condition of the equipment to predict when maintenance is required. It includes vibration
analysis, oil analysis, and thermography.
3. Corrective maintenance: This type of maintenance is performed after a failure has
occurred. It includes repair or replacement of the failed component.
4. Routine maintenance: This type of maintenance is performed on a regular basis to
ensure that the equipment is operating correctly. It includes cleaning, lubrication,
inspection, and calibration.
5. Emergency maintenance: This type of maintenance is performed in response to an
unexpected failure or breakdown. It includes repair or replacement of the failed
component.

To maintain configured and calibrated devices, you need to perform calibration and preventative maintenance procedures. Calibration is a process of
comparing the output of measuring instruments and devices against the input signal of a
standard, verified instrument. It detects and identifies any error in the device under
test. Preventative maintenance is a type of maintenance that is performed at appropriate intervals
to prevent malfunction and should be proactive rather than reactive.
Calibration programs are required by regulatory authorities for equipment used in industries like
power plants, oil & gas, cement, manufacturing, processing, packing, etc. Calibration
requirements for laboratory instruments include specific directions, schedules, limits of accuracy
and precision, remedial actions, and systems to prevent usage of instruments failing calibration.
Qualified individuals responsible for calibrating and maintaining instrumentation should exist. A
second person check of all calibration and maintenance should be performed. The calibration
program and procedures should be reviewed and approved by quality. An instrument/equipment
master list system should be established for identification of all master equipment related to
instrumentation in a manufacturing/process area or laboratory. Retired equipment records
pertaining to retired/obsolete equipment must be kept according to the company’s records
retention procedures.
Calibration: A comparison of two instruments or measuring devices, one of which is a standard of known accuracy (traceable to national standards), to detect, correlate, report or eliminate by

adjustment, any discrepancy in accuracy of the instrument measuring device being compared to
the standard.

 Calibration Programs Required by Regulatory Authorities


“Automatic, mechanical, or electronic equipment or other types of equipment, including computers, or related systems that will perform a function satisfactorily, may be used in industries such as power plants, oil & gas, cement, manufacturing, processing, and packing. If such equipment is so used, it shall be routinely calibrated, inspected, or checked according to a written program designed to assure proper performance.”
 Maintenance shall be performed at appropriate intervals to prevent malfunction and shall be “preventative”, not “reactive”, maintenance.
 Calibration requirements for Laboratory Instruments
 Specific Directions
 Schedules
 Limits of accuracy & precision
 Remedial Actions
 Systems to prevent usage of instruments failing calibration
• “Control, weighing, measuring, monitoring and test equipment that is critical for assuring the
quality of intermediates or APIs should be calibrated according to written procedures and an
established schedule.

Each Manufacturing / Process Area:


 Written calibration procedures that use traceable calibration standards or calibration
equipment.
 Preventive maintenance procedures and / or referenced manuals
 Qualified individuals (having the appropriate education, training, background and
experience) responsible for calibrating & maintaining instrumentation
 Second person check of all calibration and maintenance
 Qualified individuals responsible for monitoring the calibration and maintenance
program.
 Ensure the calibration program and procedures are reviewed and approved by Quality
Instrument / Equipment Master List
 System for identification of all Master equipment related to instrumentation in a
manufacturing/process area or laboratory

 Include instrumentation details (serial number, model number & location)
 If automation components are tracked separately through the configuration management
then it is not necessary to include (must verify)
 Procedures must exist that identify the calibration and maintenance requirements for
each instrumentation / equipment on the master list
Retired Equipment
 Records pertaining to retired / obsolete equipment must be kept according to company’s
records retention procedures
 Records should include date the unit was retired, person responsible and the reason for
retirement / discard
Instrument Identification & Calibration Status
 Each instrument given a unique identifier
 Instrumentation details associated with this number must be documented and available
(e.g. serial number, model number, location, etc.)
 Each instrument should be labeled with the unique identifier
 Calibration status of each instrument , the date of calibration, the next calibration date
and the identification of person performing calibration should be readily available
 Appropriate systems to document calibration status include calibration logs, MAXIMO,
and calibration stickers
 System must be in place to prevent use of an instrument that is not qualified, unusable
due to damage or malfunction, or has exceeded its established calibration interval
 System must be in place that identifies instruments that do not require calibration to be
performed beyond the original or factory calibration to distinguish from those
instruments that do require scheduled calibrations
Documentation required for excluding equipment
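The requirement above, that a system prevent use of an instrument that has exceeded its established calibration interval, can be sketched as a minimal helper. The 180-day interval and dates are illustrative, not mandated by any standard:

```python
from datetime import date, timedelta

def calibration_status(last_cal, interval_days, today):
    """Return (next_due, usable): an instrument remains usable only while
    the current date has not passed its established calibration interval."""
    next_due = last_cal + timedelta(days=interval_days)
    return next_due, today <= next_due

# Example: calibrated 1 Jan 2023 on a 180-day interval, checked on 1 May 2023
due, usable = calibration_status(date(2023, 1, 1), 180, date(2023, 5, 1))
# due is 30 Jun 2023 and the instrument is still usable
```

In a real program the result would feed the labeling and lock-out mechanisms described above (calibration stickers, logs, or a system such as MAXIMO).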

Traceability of Standards / Calibration Equipment


 Calibration reference standards / calibration equipment shall be traceable to national
standards and be accompanied by certificates of traceability / analysis
 If recognized standards are not available, an independent reproducible standard may be
used
 The calibration tolerance of a given standard should be as tight or tighter than the
tolerance of the instrument to be calibrated

 A procedure must be in place to ensure tracking and monitoring of standard’s expiration
date and re-calibration / re-certification requirements
 Re-calibration records must be retained
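The "as tight or tighter" rule for calibration standards can be expressed as a simple ratio check. A minimal sketch; the 4:1 accuracy ratio shown in the example is a common industry practice, not a requirement stated in this text:

```python
def standard_ok(instrument_tol, standard_tol, min_ratio=1.0):
    """True when the standard's tolerance is as tight as or tighter than the
    tolerance of the instrument being calibrated (ratio >= min_ratio)."""
    return instrument_tol / standard_tol >= min_ratio

# A 0.1 psi standard may calibrate a 0.5 psi instrument, but not vice versa;
# many calibration programs additionally demand a 4:1 accuracy ratio.
```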
Instrument Calibration Tolerances
Instrument calibration tolerance limits should be established so problems are identified and
corrected in a timely manner

When assigning tolerances, considerations given to:

 Capability of the instrument being calibrated (what the manufacturer/OEM claims the
instrument can achieve).
 Parameters at which the instrument operates (e.g., if a testing accuracy of ±0.5% is required, the instrument calibration tolerance should be tighter than ±0.5%)
 Work environment – environmental conditions can affect the performance of the
instrumentation
Practice of using “Alert” & “Action” levels

“Alert” Tolerance (“Adjustment Limit”)

 Related to instrument performance


 Level at which the instrument is adjusted back into range
 Not required – an industry “best practice”
“Action” Tolerance (“Calibration Limit” or “Out-of-Tolerance”)

 Tied to process performance


 Level at which the potential for product impact exists
 Possible atypical investigation or reporting is required
Deviations beyond “Alert” level but not at “Action” level would not require investigation – May
require adjustment, changes to PMs

Setting of “Alert” and “Action” levels should be described in SOPs, be defendable and have
Quality review and approval
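The Alert/Action classification described above can be sketched minimally as follows. The tolerance values in the example are illustrative; actual limits must come from approved SOPs with Quality review:

```python
def classify_deviation(as_found, nominal, alert_tol, action_tol):
    """Classify an as-found calibration error against Alert/Action levels.

    - within alert_tol: PASS (no adjustment needed)
    - between alert_tol and action_tol: ALERT (adjust back into range)
    - at or beyond action_tol: ACTION (potential product impact; investigate)
    """
    error = abs(as_found - nominal)
    if error >= action_tol:
        return "ACTION"
    if error >= alert_tol:
        return "ALERT"
    return "PASS"

# Example: nominal 100.0 units, Alert at 0.25, Action at 0.5
result = classify_deviation(100.3, 100.0, alert_tol=0.25, action_tol=0.5)
# result is "ALERT": adjust the instrument, no investigation required
```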

Calibration and Maintenance Frequencies

May be determined for individual instruments or groups of instruments (similarity of
construction, reliability, and stability)

Considerations when determining calibration frequency:


 Accuracy of the measurement / instrument range
 Consequences of incorrect value caused by out of calibration
 Extent & criticality of use of the instrument & tendency to wear and drift
 The manufacturer’s/OEM’s recommendations
 Environmental & physical conditions (temperature, humidity, vibration)
 Previous calibration records, history of calibration problems & repair history
 Frequency of calibration checks prior to use or in-between intervals
 Redundant / back-up systems (provides secondary source of information available from
other calibrated primary instruments)
 Results of Qualification studies
 Process requirements
 Availability of built-in / automatic calibration checks
Changes to frequency must be approved per change control SOPs

References to specific instrument procedures listed in compendium (USP, BP or EP) for


calibration tests for specific laboratory instruments

Calibration time “windows” should be established around calibration due dates

 Policy for Calibration and Maintenance Intervals & Schedules Include:


 Extending Intervals
 Reducing Intervals
 Maximum Intervals
 Maintenance Requirements
 Manufacturer’s/OEM’s recommendations
 Parts that wear: gaskets, seals & bearings
 Parts requiring periodic replacement: filters, belts & fluids
 Parts requiring periodic inspection and cleaning
 Parts requiring periodic adjustments, tightening and lubrication
 Calibration and Maintenance Procedures

 Shall include specific directions and limits for accuracy & precision
 Shall include guidance for remedial action when accuracy & precision limits are not met
 Normally provided in the manufacturer’s/OEM’s manuals
 Some compendial requirements exist for some specific laboratory instrumentation
 Performance checks (e.g. system suitability, daily balance performance checks) are NOT
suitable substitutes for regularly scheduled calibrations
Each calibration & maintenance procedure should include the following:

 Identification of department responsible to perform the calibration or maintenance


 Step-by-step calibration instructions, reference to appropriate calibration procedures or
instrument manuals
 Methods for preventive maintenance or reference to appropriate instrumentation manuals
 Calibration equipment used in the calibration (e.g. spectroscopy filters, voltmeters,
digital thermometers, etc)
 Calibration parameter and tolerance ( ± )
Each calibration & maintenance procedure should include the following:
 Required environmental controls or conditions, where appropriate
 Provisions for adjustments, if needed
 Requirement for recording actual measurements before (“as found”) and after adjustment
or preventive maintenance (“as left”)
 Actions to be taken if instrumentation cannot be calibrated (e.g. contact appropriate
service people, label and remove from service)
 A step to record all calibration & maintenance activities.
Note : “as found” recordings are not required where routine performance checks are available to
provide evidence indicating instrument is operating properly and is suitable for use.

Use of Contractors / Vendor Service Personnel


 Must have a procedure for approving contractor activities
 Contractors must have appropriate training / qualification
 Responsible to review & approve the contractors calibration and maintenance
procedures.
 Responsible to perform actions or steps not part of the contractors procedures.
Calibration & Maintenance Records

 All calibration records must be retained per document retention procedures

 Should include “as found” measurements, results of adjustments (“as left”) and
appropriate review & approval of all results
 Tolerance or limit for each calibration point
 Identification of standard or test instrument used
 Identification of persons performing the work and checking the results with dates
 Review must ensure the approved activities have been completed and all results have
passed the established acceptance criteria
 Periodic review of historic calibration & maintenance data to evaluate appropriateness of
established frequencies
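The record contents listed above (as-found and as-left measurements, a tolerance for each point, the standard used, and the persons performing and checking the work) can be modeled as a minimal structure. Field names are illustrative, not a mandated record format:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CalPoint:
    """One calibration point in a record (illustrative field names)."""
    instrument_id: str
    nominal: float       # test point, in engineering units
    tolerance: float     # +/- limit for this point
    as_found: float      # measurement before adjustment
    as_left: float       # measurement after adjustment
    standard_id: str     # identification of standard / test instrument used
    performed_by: str
    checked_by: str
    performed_on: date

    def passed(self):
        """True when the as-left reading meets the acceptance criterion."""
        return abs(self.as_left - self.nominal) <= self.tolerance
```

A review step would then confirm that every point's `passed()` result is true and that the record carries the required approvals.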

2.5. Unplanned events or conditions

Accidents, Malfunctions and Unplanned Events


Accidents, Malfunctions and Unplanned Events refers to events or upset conditions that are
not part of any activity or normal operation of the Project as planned by North cliff.
Even with the best planning and the implementation of preventative measures, the potential
exists for accidents, malfunctions or unplanned events to occur during any Project phase, and
if they occur, for adverse environmental effects to result if these events are not addressed or
responded to in an environmentally appropriate manner. Many accidents, malfunctions and
unplanned events are, however, preventable and can be readily addressed or prevented by
good planning, design, emergency response planning, and mitigation. By identifying and
assessing the potential for these events to occur, North cliff can also identify and put in place
prevention and response procedures to minimize or eliminate the potential for significant
adverse environmental effects, should an accidental event occur. The Project is being
designed, and will be constructed and operated, according to best practice for health, safety,
and environmental protection to minimize the potential environmental effects that could result
from the Project, as well as those that could result from accidents, malfunctions or unplanned
events. Prevention and mitigation will be accomplished by the following general principles:
 Use best management practices and technology for carrying out the Project while
controlling permitted/allowable releases to the environment and consequent
environmental effects;
 Incorporate safety and reliability by design, and application of principles and practices
of process and mine safety management;

 Develop and apply procedures and training aimed at safe operation of the facilities that
prevent or avoid the potential upsets that might lead to accidents, malfunctions or
unplanned events; and
 Implement effective emergency preparedness and response.

Self-check (2.2)

Directions: For the following questions, say TRUE if the statement is correct
and FALSE if it is incorrect (wrong).

1. Seaward PV150 is a multi-function PV system tester.


2. Performance testing verifies the correct polarity for PV dc circuits.

3. Polarity testing verifies the system power and energy output.

Instruction 2: Choose the correct answer for the following question

1. What are the common types of testing conducted on PV systems?


A. Continuity and resistance testing C. Performance testing
B. Polarity testing D. All are correct answer
2. Conditioning control devices and controllers can involve the following except,
A. Access control systems C. Electrical calibration
B. Level controllers D. Process variable

Instruction 3: match the correct answer under column B to Column A

A                                    B
1. Random Errors                     A. refers to the adjustment of an instrument
2. Accuracy                          B. the region between the limits within which a quantity is measured
3. Calibration ranges                C. permissible deviation from a specified value
4. Drift                             D. occurs when the instrument responds differently to an increasing input compared to a decreasing input
5. Tolerance                         E. tasks associated with establishing an accurate correspondence
6. Calibration                       F. the ratio of the error to the full-scale output
7. Dynamic error                     G. occurs when the input to the system is constant but the output tends to change slowly
8. Hysteresis calibration error      H. caused by limitations of the measuring device
9.                                   I. caused by the instrument not responding fast enough
10. Static error                     J. caused by random variations in the system under study

Self-check (2.2)

Instruction: Discuss and explain the following questions

1. List down the purpose of Calibration


2. List any types of instrument error
3. Define a five-point calibration for an instrument
4. What is the purpose of calibration?
5. What are the three basic steps involved in the calibration of an instrument?
6. What is a live zero?
7. Write an equation for instrument Calibration of Zero and span adjustments?
8. What are two adjustments on the actual instrument?
9. The two adjustments in analog instruments are interactive, what does it mean?
10. How can damping be adjusted in digital transmitters and pneumatic transmitters?
11. What are the types of instruments?
12. Write the calibration procedure for a linear instrument.
13. What are up-down calibrations?
14. What is the set-point or trip-point calibration adjustment?
15. What is the difference between a set-point and a dead band adjustment?

Operation sheet (2.1)

Pressure transmitter Calibration

Introduction
Pressure is an important physical quantity. In industry it must be maintained, controlled, and monitored at specific values, which is why pressure transmitters are so important; calibration of a pressure transmitter is what this operation sheet covers.
- Equipment required for calibration of a pressure transmitter:
 Pressure transmitter
 Multimeter
 HART communicator
Basic procedure for calibration
1. Isolate the pressure transmitter from the process.
2. Slowly open the vent plug and the vent valve to release the pressure.
3. Connect the multimeter to the transmitter and ensure that the output is 4 mA when zero pressure is applied.
4. Connect the hand-held test pump (pressure source) to the transmitter.
5. Ensure there is no leak.
6. Apply pressure at 0%, 25%, 50%, 75%, and 100% of range and check for any error.
7. If there is any error, calibration should be done.
If the transmitter is an analog transmitter:
8. Apply 0% pressure as per the LRV with the hand-held test pump and check the multimeter; if the output is not 4 mA, adjust the zero pot in the transmitter to correct it to 4 mA.
9. Apply 100% pressure as per the URV and correct the output to 20 mA on the multimeter by adjusting the span pot in the transmitter.
10. Repeat these steps to rectify any error.

In case of a SMART transmitter:

1. Use a HART communicator: connect the communicator to the transmitter and select the HART communicator menu for lower range value trim and upper range value trim.
2. Basic Set up – Calibration – Zero Trim/Sensor Trim – Lower/Upper range value trims.
3. The HART communicator will automatically calibrate the transmitter.
4. Restore the process connection.
5. Take the transmitter on line. Ensure there is no leak.

A small example of a five-point calibration is given below:

Lower range value = 0 psi
Upper range value = 200 psi
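For this range (LRV = 0 psi, URV = 200 psi), the ideal 4-20 mA output of a linear transmitter at each of the five test points can be computed; a minimal sketch:

```python
def expected_ma(pressure, lrv=0.0, urv=200.0):
    """Ideal output of a linear 4-20 mA transmitter at a given pressure."""
    return 4.0 + 16.0 * (pressure - lrv) / (urv - lrv)

# Five-point check at 0%, 25%, 50%, 75%, 100% of span
points = [pct / 100 * (200.0 - 0.0) for pct in (0, 25, 50, 75, 100)]
ideal = [expected_ma(p) for p in points]
# points: 0, 50, 100, 150, 200 psi -> ideal: 4, 8, 12, 16, 20 mA
```

Comparing the measured output at each point against these ideal values gives the error to be corrected by the zero and span adjustments.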

Operation sheet (2.2) ------------- Calibration of I/P converter.

Current to Pressure (I/P) Converter Calibration Procedure


A “current to pressure” converter (I/P) converts an analog signal (4 to 20 mA) to a proportional
linear pneumatic output (3 to 15 psig).

Its purpose is to translate the analog output from a control system into a precise, repeatable
pressure value to control pneumatic actuators/operators, pneumatic valves, dampers, vanes, etc.
Principle:
The device operates on a force-balance principle: a coil is suspended in a magnetic field on a flexible mount. At the lower end of the coil is a flapper valve that operates against a precision-ground nozzle to create a back-pressure on the servo diaphragm of a booster relay. The input current flows through the coil and produces a force between the coil and the flapper valve, which controls the servo pressure and thus the output pressure.
Equipment required for calibration:
1. Air Filter Regulator
2. I/P Converter
3. Master Pressure gauge (For Measurement of I/P Output pressure)
4. mA Source (to feed mA to I/P Converter)
Calibration Setup:

Calibration Procedure:

1. Set the inlet air supply to 20 psi at the AFR.
2. Feed 4 mA from the source to the I/P converter.
3. Observe the master pressure gauge; it should show 3 psi.
4. If it is not showing 3 psi, adjust the I/P ZERO until 3 psi is obtained.
5. Feed 20 mA from the source to the I/P converter.
6. Observe the master pressure gauge; it should show 15 psi.
7. If it is not showing 15 psi, adjust the I/P SPAN until 15 psi is obtained.
8. After completing the above procedure, again feed 4 mA and observe 3 psi, then feed 20 mA and observe 15 psi.
9. Repeat steps 2 to 7 until the correct output is obtained from the I/P converter.
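The ideal 4-20 mA to 3-15 psig relationship being checked in this procedure can be sketched as:

```python
def ip_output_psi(ma):
    """Ideal 3-15 psig output of an I/P converter for a 4-20 mA input signal."""
    return 3.0 + 12.0 * (ma - 4.0) / 16.0

# 4 mA -> 3 psi (zero), 12 mA -> 9 psi (mid), 20 mA -> 15 psi (span)
```

Any deviation of the master gauge reading from this line at the zero and span points is what the ZERO and SPAN adjustments correct.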
After calibration, the following is to be done:

1. All readings are to be entered in the calibration report.
2. A calibration sticker is to be affixed.

UNIT 3. Inspect, test and calibrate instruments and control devices
This learning guide is developed to provide you the necessary information regarding the
following content coverage and topics:
 Calibration work inspection
 Examine instrumentation and control devices
 Report result
This guide will also assist you to attain the learning outcome stated in the cover page.
Specifically, upon completion of this Learning Guide, you will be able to:

 Undertake final inspections to ensure that the calibration is done


 Check and test instrumentation and control devices
 Prepare Report

3.1. Calibration work inspection


Control of Inspection, Measuring, & Test Equipment
Purpose:
To establish and maintain the documented procedures used to control, calibrate, and maintain
inspection, measuring, and test equipment.
Explanation:
The supplier shall develop procedures for the proper use and validation of inspection equipment.
Supplier Responsibilities:

The supplier shall establish and maintain documented procedures to control, calibrate, and
maintain inspection, measuring, and test equipment (including test software) used by the
supplier to demonstrate the conformance of product to the specified requirements. Inspection,
measuring and test equipment shall be used in a manner which ensures that the measurement
uncertainty is known and is consistent with the required measurement capability.
Where test software or comparative references such as test hardware are used as suitable forms
of inspection, they shall be checked to prove that they are capable of verifying the acceptability
of product, prior to release for use during production, installation, or servicing, and shall be
rechecked at prescribed intervals. The supplier shall establish the extent and frequency of such
checks and shall maintain records as evidence of control.
Where the availability of technical data pertaining to the inspection, measuring, and test
equipment is a specified requirement, such data shall be made available, when required by
HTNA, for verification that the inspection, measuring, and test equipment is functionally
adequate.
Note: The term “measuring equipment” includes measurement devices.
Control Procedure:
The supplier shall:
 Determine the measurements to be made and the accuracy required, and select the
appropriate inspection, measuring and test equipment that is capable of the necessary
accuracy and precision.
 Identify all inspection, measuring and test equipment that can affect product quality, and
calibrate and adjust them at prescribed intervals, or prior to use, against certified equipment
having a known valid relationship to internationally or nationally recognized standards.
Where no such standards exist, the basis used for calibration shall be documented.
 Define the process employed for the calibration of inspection, measuring and test equipment,
including details of equipment type, unique identification, location, and frequency of checks,
check method, acceptance criteria and the action to be taken when results are unsatisfactory.
 Identify inspection, measuring and test equipment with a suitable indicator or approved
identification record to show the calibration status.
 Maintain calibration records for inspection, measuring and test equipment.
 Assess and document the validity of previous inspection and test results when inspection,
measuring or test equipment is found to be out of calibration.
 Ensure that the environmental conditions are suitable for the calibrations, inspections,
measurements and tests being carried out.

 Ensure that the handling, preservation and storage of inspection, measuring and test
equipment is such that the accuracy and fitness for use is maintained.
 Safeguard inspection, measuring and test facilities, including both test hardware and test
software, from adjustments which would invalidate the calibration setting.
Inspection, Measuring, and Test Equipment Records:
The Supplier Quality System Requirements reflect the International standards
Calibration/Verification Records.
Records of the calibration / verification activity on all gages, measuring, and test equipment,
including employee-owned gages, shall include:
 Revisions following engineering changes (if appropriate).
 Any out of specification reading as received for calibration.
 Statement of conformance to specification after calibration.
Notification to customer if suspect material has been shipped

3.2. Examine instrumentation and control devices


Visual and Technical Checking of Instrumentation and Control System defines the minimum
requirements for initial inspection, loop check, and commissioning of control systems during
new plant construction, as well as those inspections, loop checks, and commissioning that might
be necessary following major revisions/modifications or repair.

This topic outlines specific steps that need to be taken in the course of Calibration. The
document is organized so that key sections may be extracted to be used as instructions for that
identified task.

This standard applies globally to all control systems undergoing initial inspection, loop
checking, and commissioning.

1.2.1 Visual inspection of devices


Typically, physical inspection is the first task to be performed once an instrument is turned over
from construction. Physical inspections do not have to be done in conjunction with loop
checking but depending on manpower and instrument location, physical inspections and loop
checking may be done at the same time. At a minimum, physical inspections must be
documented to provide evidence of what was checked and whether the device passed or failed. It
is recommended that field inspection reports be filled out for every piece of instrumentation.
Failed devices shall be corrected before proceeding to loop check.

A. General Inspection Checks:

 Transmitter support stands installed


 Proper wiring, glanding, and conduit connections such as low-point drains, grounding,
shielding and terminations, and bottom cabinet entry
 Verify proper labeling of all tags, warning signs, pressure ratings, etc…
 Confirm all covers, screws, fittings, etc. are installed and properly tightened
 Look for signs of moisture and/or corrosion in electrical conduit and process impulse
line
 Wire terminations are tightened

B. Pressure Transmitters Checking:

Verify sensing line size, material, slope, tap orientation, adequate supports, etc.
 Proper installation of all compression fitting.
 Root valves installation and location.
 Manifold valve selection and proper installation.
 Verify sufficient impulse line length for heat transfer.
 Verify loop seals where required.
 Verify configuration according to specification sheet.
 Instrument accessible for routine maintenance and correctly supported
 Environment acceptable, vibration, heat, splash, etc.
 Heat traced if necessary.
 Verify proper electrical connections.
C. DP (level and flow) Transmitters and orifice plate Checking:

 Verify beta ratio and flow direction for orifice plates directly from the handle or
nameplate.
 Verify sensing line size, material, slope, tap location, adequate supports, etc.
 Verify HI/LO pressure tap location relative to gas or liquid measurement.
 Proper installation of all compression fitting.
 Root valves installation and location.
 Manifold valve selection and proper installation.
 Inspect and confirm all sealed capillary sensing systems used for level measurement
against instrument specification.

 Verify sufficient impulse line length for heat transfer.
 Verify loop seals where required.
 Verify configuration according to specification sheet.
 Instrument accessible for routine maintenance.
 Instrument correctly supported.
 Environment acceptable, vibration, heat, splash, etc.
 Heat traced if necessary.
 Verify proper electrical connections.
D. Temperature Element/ Transmitters Checking:

 Verify insertion length within process pipe and ensure firm contact with bottom of
well.
 Verify that either RTD or T/C elements are connected and properly terminated to the
transmitter
 Verify T/C element type and extension wire type according to specification
 Verify proper grounding and shielding according to specification and electrical
installation drawings
 Confirm that all threaded connections are tight (for example, nipple-union-nipple, head
cover and gasket)
 Verify transmitter configuration according to specification sheet
 Instrument accessible for routine maintenance
 Instrument correctly supported
 Environment acceptable, vibration, heat, splash, etc.
 Heat traced if necessary
 Verify proper electrical connections
E. Control Valves Checking:

 Verify control valve flow direction.


 Ensure proper installation of all tubing, fittings, and solenoid valves.
 Verify positioner and solenoid wiring for proper connections and tagging.
 Inspect overall installation of valve body and actuator. Look for signs of overstressed
piping or improper actuator installation causing unnecessary stress.
 For rotary valves, ensure actuator rotation is indexed correctly with valve rotation.
 Verify installation of bug screens or other means of preventing water ingress for all
vented openings.

 Check fail action.
 Verify start-up flush kit has been removed where applicable.
 Verify all other ancillary valve equipment functions properly.
 Verify configuration according to specification sheet.
 Instrument accessible for routine maintenance, actuator removal, bonnet and plug
removal, hand wheel operation, positioner maintenance, and solenoid maintenance.
 Verify valve stroke is within the specified time.
 Proper support.
 Correct packing for application.
 Packing gland properly tightened.
 Verify all covers, screws, fittings, etc., are installed and properly tightened.
F. Field Switches Checking:
 Verify installation location and confirm against P&ID
 Verify wiring: NO/NC contacts
 Verify wiring labeling
 Instrument accessible for routine maintenance
 Verify all covers, screws, fittings, etc., are installed and properly tightened
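Inspection checklists like those in sections D through F lend themselves to simple record-keeping. The following Python sketch is purely illustrative: the item texts are abridged from the lists above, and the data structures and function names are assumptions, not a real inspection system. It shows one way to record pass/fail results per device and report outstanding items.

```python
# Hypothetical pre-commissioning checklist tracker (structure is illustrative).
CHECKLISTS = {
    "field_switch": [
        "Installation location confirmed against P&ID",
        "NO/NC contact wiring verified",
        "Wiring labels verified",
        "Accessible for routine maintenance",
        "Covers, screws and fittings tight",
    ],
    "control_valve": [
        "Flow direction verified",
        "Tubing, fittings and solenoid valves installed",
        "Fail action checked",
        "Stroke time within specification",
    ],
}

def inspect(device_type, results):
    """Return the checklist items that failed or were not recorded."""
    return [item for item in CHECKLISTS[device_type]
            if not results.get(item, False)]

# Example: one field switch with a missing wiring label
results = {item: True for item in CHECKLISTS["field_switch"]}
results["Wiring labels verified"] = False
failed = inspect("field_switch", results)
print(failed)  # -> ['Wiring labels verified']
```

Any non-empty result would be logged as a punch-list item before the loop check proceeds.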

1.2.2 Programmable Electronic System (PES) Staging Requirements


The tasks to be performed to fully commission an entire control/safety and independent safety system are outlined in the Task Flow Diagram figure below. All steps are clarified with references to other standards where the content falls outside the scope of this standard. The intent of this standard is to provide the minimum requirements for the initial inspection, loop check, and commissioning of a control system. Its objectives are as follows:

 Verify the proper physical installation (wiring, grounding, labels, tags, pressure ratings,
area classification) of instruments.
 Ensure wiring is landed on the proper termination and verify overall wiring loop
integrity.
 Verify proper calibration range, engineering units, tag name, and diagnostics.
 Verify PES input range.
 Verify all logic, including the interlock system
 Verify pre-alarms, bad quality, maintenance bypass switches, and proper configuration
of HMI displays.

 Verify and confirm proper operation of the instrument and sensor according to supplier
and Air Products specifications.
 Verify proper installation, power and grounding, backup power, network
communications, system diagnostics, and operational functionality of the PES.
 Verify auxiliary systems (Foreign Device Interfaces, Historians, and Billing Systems).
 Verify remote access and remote control if applicable.
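Several of these objectives (calibration range, engineering units, PES input range) come down to confirming that an injected test signal scales to the expected engineering value. The sketch below is a hedged Python illustration of one such loop check for a standard 4-20 mA input; the range and tolerance values are assumptions chosen for the example, not values from this standard.

```python
# Illustrative loop check: 4-20 mA input scaled to engineering units.
def ma_to_eng(ma, lrv, urv):
    """Linearly scale a 4-20 mA signal to the configured range [lrv, urv]."""
    return lrv + (ma - 4.0) / 16.0 * (urv - lrv)

def loop_check(injected_ma, expected_eng, lrv, urv, tol=0.25):
    """True if the scaled reading matches the expected engineering value."""
    return abs(ma_to_eng(injected_ma, lrv, urv) - expected_eng) <= tol

# Example: 0-100 degC range, inject 12 mA (mid-scale, so expect 50 degC)
print(ma_to_eng(12.0, 0.0, 100.0))         # -> 50.0
print(loop_check(12.0, 50.0, 0.0, 100.0))  # -> True
```

In practice, test points at 0%, 25%, 50%, 75%, and 100% of range are commonly injected and each reading is compared against the configured PES display value in this way.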
The following documents shall be available on site and shall be the latest revision:

3.3. Report results

People communicate both in their spare time and in their professional lives, either orally or in written form. When they communicate about technical topics, this process is called technical communication. In written form, they write or read "technical reports"; if a technical report is communicated orally, it becomes a presentation to an audience.

Therefore, all documents in the following list are technical reports, if they deal with a technical
subject:
 Reports about laboratory experiments
 Construction and design reports
 Reports about testing and measurements
Technical reports must be written so that they reach their readers. This requires a high level of systematic order, logic and clarity. These understandability aspects must be taken into account as early as the planning of the necessary work steps, because that is the only way to perform all of them accurately. As a result, all facts about the described items or processes, and the thoughts of the report's writer, become clear to the reader without open questions or doubt. In engineering, a systematic approach is used to solve tasks and larger projects: tasks are solved in the sequence planning, realization and checking. This proven approach should be applied in a similar way when creating technical reports, where the necessary work steps can be grouped into the phases planning, creation and finishing (with check-ups). However, before describing the individual measures of the planning phase, we first give a general overview of all required work steps for creating a technical report.

General overview of all required work steps


The following checklist 3-1 shows all required work steps to create technical reports.
 Accept and analyze the task
 Check or create the title
 Design a 4-point-structure
 Design a 10-point structure
 Search, read and cite literature
 Elaborate the text (on a computer)
 Create or select figures and tables
 Develop the detailed structure
 Perform the final check
 Print copy originals or create pdf file
 Copy and bind the report
 Distribute the report to the defined recipients
These work steps are to be performed partly in parallel or overlapping.
When you write a technical report, there is nearly always a task, which you either selected yourself or which was defined by someone else. You should analyze this task precisely during the planning of the technical report (checklist 3-2).
Checklist 3-2: Analysis of the task to write a technical report
 The task may be defined by:
 A professor or an assistant (in case of a report written during your studies)
 A supervisor
 The development team
 A consulting company
 A customer
 You yourself (e.g. if you write an article for a scientific journal)
 Make sure you understand the task correctly
 Identify the target group and write the report for the appropriate readers
 Take notes accordingly
 Clarify which contents the report must contain, and write that down!
The difference is in the way the results are reported: in the first case a specific value is reported, while in the second the result is reported as either in or out of tolerance (specification). The minimum information that must be supplied is illustrated by the content of a typical NIST report. Note that a NIST Report of Test generally has nothing to do with calibration; a NIST Report of Calibration gives:
 The value of the item calibrated
 The uncertainty of the calibration for the accepted reference standard, and details about the overall uncertainty
 The conditions under which the measurements were carried out, and
 Any special information regarding the calibration. It does not include uncertainties for the effects of transport to and from the calibrating laboratory, drift with time, or the effects of environmental conditions (e.g., temperature, humidity, barometric pressure).
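The items a calibration report must carry can be captured in a simple record structure. The following Python sketch is illustrative only: the field names and example values are assumptions, not a NIST reporting format.

```python
# Illustrative calibration-report record (field names are assumptions).
from dataclasses import dataclass

@dataclass
class CalibrationReport:
    item: str            # identification of the item calibrated
    value: float         # value of the item calibrated
    uncertainty: float   # overall uncertainty against the reference standard
    conditions: dict     # measurement conditions, e.g. temperature, humidity
    notes: str = ""      # any special information regarding the calibration

report = CalibrationReport(
    item="Pressure gauge PG-101",
    value=101.32,
    uncertainty=0.05,
    conditions={"temperature_C": 23.0, "humidity_pct": 45.0},
    notes="Transport and drift effects not included.",
)
print(report.value)  # -> 101.32
```

Keeping all four pieces of information together in one record makes it easy to reproduce the report and to audit the calibration later.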

Self-Check 3

Direction: Give short answers to the following questions


1. List the records of the calibration / verification activity on all gages, measuring, and test
equipment. (4%)
a. _______________________________________________________________
b. ______________________________________________________________
c. ______________________________________________________________
d. ______________________________________________________________

2. The _____________________ System Requirements reflect the international standards for
Calibration/Verification Records. (2%)
3. List some temperature element/transmitter checks. (3%)
4. List examples of technical reports. (three points)
5. What is visual inspection?

Expert Profile
The trainers who developed the curriculum

No | Name | Qualification level | Educational background | Region | College | Mobile number | E-mail
1 | Melaku Bekele Ofgaha | A | Electri/Automation Control Technology | Oromia | Amboo Poly TVET College | 0921051772 | [email protected]
2 | Bahiru Demeke | A | Electri/Elec Contr | Oromia | Bishoftu Poly TVET College | 0911394408 | [email protected]
3 | Worku Biru Mekonnen | A | Electri/Automation Control Technology | Oromia | Sebeta Poly TVET College | 0912106303 | [email protected]
4 | Sisay H/Mariam Toga | A | Electrical Power Engineering | | Arbaminch Poly & Satellite College | 0945234339 | [email protected]
5 | Hassen Husen Hamid | A | Electrical Eng | Oromia | M/G/M/B Poly TVET College | 0943402001 |

