ME3001-Lecture Notes 1 - Introduction
Automation:
➢ Automation is often used for processes that were previously operated by humans.
➢ When automated, the process can operate without human assistance or interference.
➢ In fact, most automated systems can perform their functions with greater accuracy and precision, and in less time, than humans can.
Simple systems can be represented as input–output blocks, for example:
➢ Motor: input – electric power; output – turning.
➢ Thermometer: input – temperature; output – value on the scale.
➢ Central heating: input – desired temperature; output – scaled temperature.
1.1.1. History of Control Systems
➢ In ancient Egypt, automated water clocks and automatically controlled water levels in aqueducts were developed.
➢ In 1932, Nyquist developed a simple procedure for determining the stability of closed-loop systems.
➢ In 1934, Hazen developed servomechanisms for position control systems.
➢ In the 1940s, frequency response methods, the Bode diagram and the Ziegler-Nichols tuning rules were introduced.
➢ Around 1960, modern control theory evolved, driven by the availability of digital computers. Optimal control of both deterministic and stochastic systems was investigated.
➢ From the 1980s to the 1990s, robust control and related topics were studied.
1.2. Classification of Control Systems
Automatic control systems may be classified in a number of ways, depending upon the purpose
of the classification. For instance, according to the effect of the output on the control action,
control systems are classified as open-loop control systems and closed-loop control systems.
There are many other ways of classifying control systems:
➢ Open-loop
➢ Continuous-time
➢ Linear
➢ Time-invariant
➢ Deterministic
➢ SISO (Single Input – Single Output)
➢ Centralized
•Control Systems can also be classified based on the control design strategy such as
Intelligent Control, Adaptive Control, Robust Control, Optimal Control, etc.
1.2.1. Open-Loop and Closed-Loop Control
A. Open-loop control systems
Fig. 1.2. Basic open-loop control system: the control action is independent of the output. No measurement is fed back!
Systems in which the output quantity has no effect upon the input quantity are called open-loop control systems.
➢ If there are any disturbances, the output changes and there is no adjustment of the input to bring the output back to its original value.
➢ Perfect calibration is required for good accuracy, and the system must be free from external disturbances.
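The effect of imperfect compensation can be seen with a minimal numerical sketch (all numbers below are assumed for illustration): the open-loop input is computed once from the nominal plant gain, so the output is correct only when the calibration is exact and no disturbance acts.

```python
# Open-loop control sketch (illustrative numbers only).
# Plant: output = gain * input + disturbance.
# The open-loop controller pre-computes the input from the nominal gain
# and never measures the output, so it cannot react to disturbances.

nominal_gain = 2.0
desired_output = 10.0
u = desired_output / nominal_gain          # "perfect calibration" of the input

for disturbance in (0.0, 1.5, -2.0):
    output = nominal_gain * u + disturbance
    print(f"disturbance {disturbance:+.1f} -> output {output:.1f} "
          f"(desired {desired_output:.1f})")
```

Only the undisturbed case reaches the desired output; the other two errors pass straight through because no measurement of the output is available to correct them.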
Examples:
B. Closed-loop control systems
➢ The output (the controlled variable) is measured and compared with the reference input, and an error signal is generated. This is the actuating signal to the controller which, by its action, tries to reduce the error. Thus the controlled variable is continuously fed back and compared with the input signal. If the error is reduced to zero, the output is the desired output and is equal to the reference input signal.
➢ In Watt’s flyball governor, the aim is to make the steam engine run at a constant speed (Fig. 1.6).
Working Principle: The amount of steam admitted to the turbine is adjusted according
to the difference between the desired and the actual engine speeds.
The speed governor is adjusted such that, at the desired speed, no pressurized oil flows into either side of the power cylinder. If the actual speed drops below the desired value due to a disturbance or loading, then the decrease in the centrifugal force of the speed governor causes the piston of the pilot cylinder to move downwards, which moves the control valve upwards and supplies more steam. The speed of the engine then increases, causing the pilot cylinder’s piston to move upwards until the desired value is reached. When the desired turbine speed is obtained, the pilot cylinder’s piston closes the ports of the power cylinder. On the other hand, if the speed of the engine increases above the desired value, then the increase in the centrifugal force of the governor causes the control valve to move downwards. This decreases the supply of steam, and the speed of the engine decreases until the desired value is reached.
Fig. 1.6. Watt’s Flyball Governor
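The working principle above can be sketched numerically. As long as a speed error exists, oil keeps flowing and the steam valve keeps moving, so the power cylinder behaves roughly like an integrator of the speed error. The engine and governor constants below are illustrative assumptions, not values taken from the figure.

```python
# Rough numerical sketch of the flyball-governor idea (all constants assumed).
# Engine speed w:          J * dw/dt = k_v * valve - load - d * w
# Governor + power cyl.:   d(valve)/dt = k_g * (w_ref - w)
#   -> the valve keeps opening while the speed is below the desired value,
#      i.e. an integral action on the speed error.

J, d, k_v = 1.0, 0.5, 1.0      # assumed inertia, damping, valve-to-torque gain
k_g = 2.0                      # assumed governor gain
w_ref = 10.0                   # desired engine speed
dt = 0.001                     # Euler integration step

w, valve, load = 10.0, 5.0, 0.0   # start at equilibrium (k_v * valve = d * w)
for step in range(30000):         # simulate 30 s
    if step == 5000:
        load = 2.0                # load disturbance applied at t = 5 s
    valve += dt * k_g * (w_ref - w)               # governor adjusts the valve
    w += dt * (k_v * valve - load - d * w) / J    # engine speed dynamics
print(f"speed after the load disturbance: {w:.2f} (desired {w_ref})")
```

Because the valve position integrates the speed error, the engine returns to the desired speed even though the load has changed; a purely proportional valve adjustment would leave a small steady-state speed error.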
1.3.2. Temperature Control System (Temperature control of an
electric furnace):
1.3.3. Position Control System:
The system uses a dedicated motor to drive each axis to the desired position in the x, y and z axes, respectively.
1.3.4. More Control Systems Examples
➢ Aerospace and Military Applications:
➢ Flights (autopilot control applications, take-off and landing control),
➢ Space shuttles (orbit tracking control applications, take-off and landing, etc.),
➢ Unmanned vehicles,
➢ Missile guidance and control, etc.
➢ Computer systems:
➢ Position control systems for printers, CD/DVD drives and hard drives.
➢ Network and Internet traffic control.
➢ Robotic Systems:
➢ Position, speed and force control for assembly robots,
➢ Balancing and motion control of humanoid robots,
➢ Precision control of robots for medical operations,
➢ Mobile robots.
1.3.4. More Control Systems Examples (continued):
➢ Biological systems:
➢ Insulin delivery control systems,
➢ Tumor growth control, etc.
➢ Artificial limbs, prosthetics, etc.
➢ Automobile industry:
➢ Anti-lock brake systems,
➢ Automatic car parking assistance,
➢ Cruise control, etc.
➢ Manufacturing systems:
➢ CNCs,
➢ Automatic packing machines,
➢ Assembly lines.
➢ Process control:
➢ Chemical processes,
➢ Nuclear power plants,
➢ Complex manufacturing processes.
Fig.: General closed-loop block diagram, showing the comparison unit (error detector), the feedback element H, and the feedback path.
1.4. Definitions of standard terminology (continued)
Plant or Process, G: The system to be controlled. A plant may be a piece of equipment, perhaps just a set of machine parts functioning together, whose purpose is to perform a particular operation, such as a mechanical device, a heating furnace, a chemical reactor, or a spacecraft. Since a system is a combination of components that act together and perform a certain objective, the word system is not limited to physical systems; the concept can be interpreted to cover physical, biological, economic, and similar dynamic phenomena.
•We need to have a mathematical model describing the plant.
Reference input, r: Also known as the set-point or desired output; it is an external signal applied in order to indicate a desired steady value for the plant output.
System output, c: Also known as the controlled output; it is the signal obtained from the plant which we wish to measure and control. Normally, the controlled variable is the output of the system.
Error detection element or comparison unit: compares the value of the controlled variable to the desired value, and then signals an error if a deviation exists between the actual and desired values. The error signal, e, is the difference between the reference input r and the feedback signal b (i.e. e = r − b).
Controller, D, is the element which ensures that the appropriate control signal is
applied to the plant. In many cases it takes the error signal as its input and provides an
actuating signal as its output.
Control input, u: Also known as the actuating signal, the manipulated variable m(t), the control action, or the control signal; it is applied to the plant G and is provided by the controller D operating on the error e. The manipulated variable is the quantity or condition that is varied by the controller so as to affect the value of the controlled variable. Note that the necessary controller action is computed from the controller error, i.e. the difference between the set-point and the measured process variable:
e(t) = r(t) − b(t)   (error = set-point − measured process variable)
Forward path is the path from the error signal e to the output c, and includes D and
G.
Disturbance, w, or noise is a signal which enters the system at a point other than the
reference input and has the effect of undermining the normal system operation. A
disturbance is a signal that tends to adversely affect the value of the output of a
system. If a disturbance is generated within the system, it is called internal, while an
external disturbance is generated outside the system and is an input.
Feedback signal, b, is the signal produced by the operation of H on the output c(t).
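The relationships among r, b, e, u and c can be made concrete with a minimal simulation sketch. The first-order plant, the proportional controller gain and the unity feedback element used below are illustrative assumptions, not values from the notes.

```python
# Minimal closed-loop simulation sketch (illustrative values only).
# Plant G:       dc/dt = -a*c + k*u     (first-order plant, Euler integration)
# Controller D:  u = Kp * e             (proportional control action)
# Feedback H:    b = H * c              (unity feedback)
# Comparison:    e = r - b              (error signal)

a, k = 1.0, 2.0          # assumed plant parameters
Kp = 5.0                 # assumed proportional controller gain
H = 1.0                  # unity feedback element
r = 1.0                  # reference input (set-point)
dt, T = 0.001, 5.0       # Euler step size and simulation time

c = 0.0                  # system output (controlled variable)
for _ in range(int(T / dt)):
    b = H * c            # feedback signal
    e = r - b            # error signal from the comparison unit
    u = Kp * e           # control input produced by the controller D
    c += dt * (-a * c + k * u)   # one Euler step of the plant G

print(f"output c ≈ {c:.3f} for reference r = {r}")
```

With this purely proportional controller the output settles close to, but not exactly at, the reference; the remaining steady-state error for this sketch is r·a/(a + k·Kp).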
1.6. Other Control Systems Classifications
Linear vs. Nonlinear Control Systems: Depending upon the differential equations used to describe a control system, control systems are classified as either linear or nonlinear.
➢ Linear system: ẋ(t) = 3x(t) + u(t). A transfer function can be obtained; both classical and state-space design can be used.
➢ Nonlinear system: ẋ(t) = x²(t) + u(t). A transfer function cannot be obtained; state-space design must be used.
For brevity, ẋ(t) = 3x(t) + u(t) is often written as ẋ = 3x + u.
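The distinction can also be checked numerically: started from rest, the linear system obeys superposition (its response to the sum of two inputs equals the sum of its individual responses), while the nonlinear one does not. The Euler integration and the test inputs below are illustrative assumptions.

```python
# Superposition check for the two example systems (illustrative sketch).
# Linear:    x_dot = 3*x + u
# Nonlinear: x_dot = x**2 + u

def simulate(f, u_func, T=1.0, dt=1e-4):
    """Euler-integrate x_dot = f(x, u(t)) from x(0) = 0 and return x(T)."""
    x = 0.0
    for k in range(int(T / dt)):
        x += dt * f(x, u_func(k * dt))
    return x

linear = lambda x, u: 3 * x + u
nonlinear = lambda x, u: x ** 2 + u

u1 = lambda t: 1.0          # constant test input
u2 = lambda t: 0.5 * t      # ramp test input

for name, f in (("linear", linear), ("nonlinear", nonlinear)):
    separate = simulate(f, u1) + simulate(f, u2)     # sum of the two responses
    combined = simulate(f, lambda t: u1(t) + u2(t))  # response to the summed input
    print(f"{name:9s}: sum of responses = {separate:.4f}, "
          f"response to summed input = {combined:.4f}")
```

For the linear system the two printed numbers coincide; for the nonlinear system they do not, which is exactly why a transfer function (a superposition-based description) exists only in the linear case.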
1.6. Other Control Systems Classifications (continued)
➢ Linear time-invariant system: ẋ(t) = 3x(t) + u(t).
➢ Linear time-varying system: ẋ(t) = 3t + u(t).
➢ Nonlinear time-invariant (autonomous) system: ẋ = 3x² + cos(x) + u.
➢ Nonlinear time-varying (non-autonomous) system: ẋ = 3x² + cos(t) + u.
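Time invariance can be tested in a similar way: for a system started at rest, delaying the input of a time-invariant system simply delays its output, whereas a time-varying system responds differently. The step input, delay and Euler integration below are illustrative assumptions applied to the two linear examples above.

```python
# Time-invariance check for the linear examples above (illustrative sketch).
# Time-invariant: x_dot = 3*x + u
# Time-varying:   x_dot = 3*t + u

def simulate(f, u_func, T, dt=1e-3):
    """Euler-integrate x_dot = f(t, x, u(t)) from x(0) = 0, return all samples."""
    x, samples = 0.0, []
    for k in range(int(T / dt)):
        t = k * dt
        x += dt * f(t, x, u_func(t))
        samples.append(x)
    return samples

lti = lambda t, x, u: 3 * x + u           # time-invariant dynamics
ltv = lambda t, x, u: 3 * t + u           # time-varying dynamics

step = lambda t: 1.0 if t >= 0 else 0.0   # unit step input
tau, T, dt = 0.5, 1.5, 1e-3
delayed_step = lambda t: step(t - tau)    # same input, delayed by tau

for name, f in (("time-invariant", lti), ("time-varying", ltv)):
    y = simulate(f, step, T, dt)
    y_delayed = simulate(f, delayed_step, T, dt)
    # For a time-invariant system, the delayed response at time T should
    # equal the original response at time T - tau.
    print(f"{name:14s}: y(T - tau) = {y[int((T - tau) / dt) - 1]:.3f}, "
          f"delayed response at T = {y_delayed[-1]:.3f}")
```

The two numbers match (up to integration error) only for the time-invariant system; the ẋ(t) = 3t + u(t) example keeps drifting with time regardless of when the input is applied.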
1.6. Other Control Systems Classifications (continued)
Centralized Control: In large-scale systems, the control action is governed from a single control centre.