Advanced Control Systems (EE3302)
Instructor: Dr. Manas Kumar Bera, Associate Professor, Dept. of EE, NIT Rourkela
Course Information
Course Objectives
Course Outcomes
References
What is Control Engineering or Control Science?
Examples
Control is All Around Us
- In addition to engineering systems, variables in biological systems, such as blood sugar and blood pressure in the human body, are controlled by processes that can be studied by automatic control methods.
- Similarly, variables in economic systems, such as unemployment and inflation, which are influenced by government fiscal decisions, can be studied using control methods.
- Our technological demands today impose extremely challenging and widely varying control problems.
- These problems range from aircraft and underwater vehicles to automobiles and space telescopes, and from chemical processes and the environment to manufacturing, robotics, and communication networks.
- Control system engineers are concerned with understanding and controlling systems to provide useful, economic products for society.
Example: An elevator
Elevator response
What are some common examples of control systems?
- Temperature Control in a house or building
- Aircraft or Ship Steering
- Automobile and Train Control
- Cruise Control
- Satellite Tracking with Antennas
- Motor Speed Control
- Temperature, Pressure and Flow Control in Petrochemical Processes
- Nuclear Reactor Control
- Voltage Regulators
- Blood Glucose Regulation
- Green Engineering: environmental monitoring, energy storage systems, power quality monitoring, solar energy, wind energy
- Smart Grid Control Systems
Methodology
- The first step in understanding the main ideas of control methodology is realizing that we apply control in our everyday life, for instance when we walk, lift a glass of water, or drive a car.
- The speed of a car can be maintained rather precisely by carefully observing the speedometer and appropriately increasing or decreasing the pressure on the fuel pedal. Higher accuracy can perhaps be achieved by looking ahead to anticipate road inclines that affect the speed. This is the way the average driver actually controls speed.
- If the speed is controlled by a machine instead of the driver, then one talks about automatic speed control systems, commonly referred to as cruise control systems.
- An automatic control system, such as the cruise control system in an automobile, implements in the controller a decision process, also called the control law, that dictates the appropriate control actions to be taken for the speed to be maintained within acceptable tolerances.
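The decision process above can be sketched in a few lines of code. The following is a minimal, hypothetical cruise-control simulation (the vehicle model, parameter values, and the choice of a proportional-integral control law are all illustrative assumptions, not taken from the lecture):

```python
# Hypothetical cruise-control sketch: first-order car model
#   m * dv/dt = u - b*v
# driven by a proportional-integral (PI) control law
#   u = Kp*e + Ki*integral(e),  where e = v_ref - v.
# All parameter values below are illustrative assumptions.

def simulate_cruise(v0=20.0, v_ref=30.0, t_end=200.0, dt=0.01,
                    m=1200.0, b=50.0, Kp=800.0, Ki=40.0):
    """Forward-Euler simulation; returns the final speed in m/s."""
    v, err_int = v0, 0.0
    for _ in range(int(t_end / dt)):
        e = v_ref - v               # speed error (the "speedometer" reading)
        err_int += e * dt           # accumulated (integral) error
        u = Kp * e + Ki * err_int   # control action: throttle force
        v += dt * (u - b * v) / m   # vehicle dynamics
    return v

print(simulate_cruise())  # settles near the 30 m/s set point
```

The integral term plays the role of the driver "anticipating" persistent disturbances: without it, the drag force b*v would leave a permanent steady-state error.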
Speed Control
Foundations and Methods
Challenges in Control
Emerging Control Areas
- The increasing availability of vast computing power at low cost, and advances in computer science and engineering, are influencing developments in control.
- Intelligent Autonomous Control Systems: Intelligent autonomous controllers are envisioned to emulate human mental faculties, such as adaptation and learning, planning under large uncertainty, and coping with large amounts of data, in order to effectively control complex processes.
- Hybrid Control Systems: Hybrid systems arise from the interaction of discrete planning algorithms and continuous processes, and as such they provide the basic framework and methodology for the analysis and synthesis of autonomous and intelligent systems. Hybrid systems frequently arise from computer-aided control of continuous processes in manufacturing, communication networks, autopilot design, computer synchronization, traffic control, and industrial process control.
A Brief History of Automatic Control
- The use of feedback to control a system has a fascinating history.
- The first applications of feedback control appeared in the development of float regulator mechanisms in Greece in the period 300 to 1 B.C.
- An oil lamp devised by Philon in approximately 250 B.C. used a float regulator to maintain a constant level of fuel oil.
- The first feedback system to be invented in modern Europe was the temperature regulator of Cornelis Drebbel (1572-1633) of Holland.
- Denis Papin (1647-1712) invented the first pressure regulator for steam boilers in 1681. Papin's pressure regulator was a form of safety regulator similar to a pressure-cooker valve.
Water-level float regulator
A Brief History of Automatic Control

1769
The first automatic negative feedback controller used in an industrial process is generally agreed to be James Watt's flyball governor, developed in 1769 for controlling the speed of a steam engine.
James Watt (1736∼1819)
Steam Engine
Steam engine: Watt's Flyball/Centrifugal governor
- The solution adopted by James Watt in 1788 did the trick: if properly tuned, the governor maintains a near-constant speed whatever the load or fuel supply conditions.
- Idea: when the rotation speed increases, the balls move outwards, reducing the valve aperture and steam admission.
- When the rotation speed falls, the balls move inwards, increasing the valve aperture and steam admission.
- This operating principle is called feedback control.
- The first automatic control system used in industrial processes, Watt's governor, tamed the steam engine and made the Industrial Revolution possible.
- Tuning centrifugal governors turned out to be not simple at all. Too aggressive a governor could cause oscillatory motion of the steam engine (the hunting phenomenon).
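The proportional action described above can be illustrated numerically. The sketch below uses a hypothetical linearized engine-plus-governor model (my own illustrative equations and numbers, not Watt's actual dynamics) and shows the hallmark of proportional-only feedback: the speed settles near, but not exactly at, the set point, leaving a steady-state offset (droop) of load/K.

```python
# Hypothetical linearized engine + proportional governor:
#   J * dw/dt   = tau - load            (rotor inertia)
#   T * dtau/dt = K*(w_ref - w) - tau   (lagged valve/torque response)
# Proportional-only feedback leaves a steady-state droop of load/K.
# All parameter values below are illustrative assumptions.

def simulate_governor(K=2.0, T=2.0, J=10.0, load=5.0,
                      w_ref=10.0, t_end=300.0, dt=0.001):
    """Forward-Euler simulation; returns the final rotation speed."""
    w, tau = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        w += dt * (tau - load) / J
        tau += dt * (K * (w_ref - w) - tau) / T
    return w

print(simulate_governor())  # approx. w_ref - load/K = 7.5, not 10.0
```

Raising K shrinks the droop, but with the torque lag T the closed loop (characteristic equation JT s^2 + J s + K = 0) becomes increasingly underdamped: exactly the oscillatory "hunting" tendency mentioned above.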
Block Diagram
1868
In "On Governors," Maxwell developed a third-order dynamic model of the governed engine. The characteristic roots were functions of the design parameters (m, ℓ, k). Oscillations corresponded to the proximity of the characteristic roots to the jω axis, as in "resonant" systems such as oscillators and bridges.
James Clerk Maxwell (1831∼1879)
1877
E. J. Routh developed a method of counting the distribution of the roots of a polynomial between the left and right half planes without calculating the roots, winning the Adams Prize (the Routh criterion).
Edward John Routh (1831∼1907)
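Routh's counting procedure is mechanical enough to sketch in a few lines. The following is a minimal pure-Python illustration that builds the Routh array and counts sign changes in its first column; it deliberately ignores the classical special cases (a zero first-column entry or a premature zero row):

```python
# Minimal Routh-array sketch: the number of sign changes in the
# first column of the array equals the number of right-half-plane
# roots of the polynomial. Special cases (zero pivot, zero row)
# are not handled in this illustration.

def routh_rhp_count(coeffs):
    """coeffs: real polynomial coefficients, highest degree first."""
    r1, r2 = list(coeffs[0::2]), list(coeffs[1::2])
    width = len(r1)
    r2 += [0.0] * (width - len(r2))       # pad the second row
    table = [r1, r2]
    for _ in range(len(coeffs) - 2):      # build the remaining rows
        up2, up1 = table[-2], table[-1]
        row = [(up1[0] * up2[j + 1] - up2[0] * up1[j + 1]) / up1[0]
               for j in range(width - 1)] + [0.0]
        table.append(row)
    first_col = [row[0] for row in table]
    return sum(1 for a, b in zip(first_col, first_col[1:]) if a * b < 0)

# (s+1)(s+2)(s+3): stable, no right-half-plane roots
print(routh_rhp_count([1, 6, 11, 6]))   # 0
# (s-1)(s+2)(s+3): one right-half-plane root
print(routh_rhp_count([1, 4, 1, -6]))   # 1
```

Applied to Maxwell's third-order governor model, this is exactly the test that decides stability from the design parameters without ever computing the roots.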
1892
Lyapunov's Ph.D. thesis develops criteria for the stability analysis of nonlinear systems.
Aleksandr Mikhailovich Lyapunov (1857∼1918)
1900∼1910
The Wright brothers' flight research.
Wilbur Wright (1867∼1912), Orville Wright (1871∼1948)
The Plane of the Wright Brothers: Milestones of Science
- Prior to World War II, control theory and practice developed differently in the United States and western Europe than in Russia and eastern Europe.
- The main impetus for the use of feedback in the United States was the development of the telephone system and electronic feedback amplifiers by Bode, Nyquist, and Black at Bell Telephone Laboratories.
- The frequency domain was used primarily to describe the operation of the feedback amplifiers in terms of bandwidth and other frequency variables.
- In contrast, the eminent mathematicians and applied mechanicians in the former Soviet Union inspired and dominated the field of control theory. Therefore, the Russian theory tended to utilize a time-domain formulation using differential equations.
1926
H. S. Black invents the feedback amplifier to provide accurate gains using unreliable components. This forms the basis of operational amplifiers and integrated circuits, fundamental to modern control, communication, and signal (audio and video) processing systems. Black's Amplifier.
Harold Stephen Black (April 14, 1898 ∼ December 11, 1983)
1932
H. Nyquist develops a criterion to predict closed-loop system stability based on measurements of the open-loop frequency response of the system. Nyquist Criterion.
Harry Nyquist (February 7, 1889 ∼ April 4, 1976)
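As a numerical illustration of the idea (using the made-up loop transfer function L(s) = K/(s+1)^3, which is my example, not one from the lecture), one can sweep the open-loop frequency response and count how many times the curve L(jω) winds around the critical point -1:

```python
# Numerical Nyquist sketch for the illustrative loop transfer
# function L(s) = K / (s + 1)^3. It has no open-loop right-half-plane
# poles, so the closed loop is stable iff L(jw) does not encircle -1.
import cmath

def encirclements_of_minus_one(K, w_max=1e3, n=200_000):
    """Net winding (in turns) of L(jw) about -1 for w in [-w_max, w_max]."""
    total, prev = 0.0, None
    for i in range(n + 1):
        w = -w_max + 2.0 * w_max * i / n
        z = K / (1j * w + 1.0) ** 3 + 1.0   # vector from -1 to L(jw)
        ang = cmath.phase(z)
        if prev is not None:
            d = ang - prev
            if d > cmath.pi:                 # unwrap phase jumps
                d -= 2.0 * cmath.pi
            elif d < -cmath.pi:
                d += 2.0 * cmath.pi
            total += d
        prev = ang
    return round(total / (2.0 * cmath.pi))

print(encirclements_of_minus_one(1))    # 0 -> closed loop stable
print(encirclements_of_minus_one(10))   # nonzero -> closed loop unstable
```

For this L(s), the plot crosses the negative real axis at -K/8, so the stability boundary sits at K = 8: below it there are no encirclements, above it the curve wraps around -1.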
1942∼45
H. W. Bode, building on the work of Nyquist, develops frequency response methods for designing control and feedback systems using logarithmic plots.
Hendrik Wade Bode (December 24, 1905 ∼ June 21, 1982)
- A large impetus to the theory and practice of automatic control occurred during World War II, when it became necessary to design and construct automatic airplane piloting, gun-positioning systems, radar antenna control systems, and other military systems based on the feedback control approach.
- The complexity and expected performance of these military systems necessitated an extension of the available control techniques and fostered interest in control systems and the development of new insights and methods.
- Prior to 1940, in most cases, the design of control systems was an art involving a trial-and-error approach. During the 1940s, mathematical and analytical methods increased in number and utility, and control engineering became an engineering discipline in its own right.
1940∼50
Servomechanism theory is researched and developed to a high level to meet the demands of World War II. The MIT Radiation Labs, Norbert Wiener, H. W. Bode, W. R. Evans, N. Nichols, and others drive the effort.
Norbert Wiener (November 26, 1894 ∼ March 18, 1964)
1950∼60
Richard Bellman invents dynamic programming. L. S. Pontryagin develops the Maximum Principle.
Richard E. Bellman (August 26, 1920 ∼ March 19, 1984); Lev Pontryagin (September 3, 1908 ∼ May 3, 1988)
- Frequency-domain techniques continued to dominate the field of control following World War II, with increased use of the Laplace transform and the complex frequency plane.
- During the 1950s, the emphasis in control engineering theory was on the development and use of s-plane methods and, particularly, the root locus approach.
- With the advent of Sputnik and the space age, another new impetus was imparted to control engineering. It became necessary to design complex, highly accurate control systems for missiles and space probes.
- Furthermore, the necessity to minimize the weight of satellites and to control them very accurately spawned the important field of optimal control.
- Later, during the 1980s, the use of digital computers as control components became routine. The technology enabling these new control elements to perform accurate and rapid calculations was formerly unavailable to control engineers.
- Due to these requirements, the time-domain methods developed by Lyapunov, Minorsky, and others have been met with great interest in the last two decades.
- The theories of optimal control developed by L. S. Pontryagin in the former Soviet Union and R. Bellman in the United States, as well as more recent studies of robust systems, have contributed to the interest in time-domain methods.
- It is now clear that control engineering must consider both the time-domain and the frequency-domain approaches simultaneously in the analysis and design of control systems.
1957∼
The Soviet Union launches Sputnik.

1960∼
Kalman introduces quadratic optimization and state-space methods. The space race is on. The U.S. decides to send a man to the Moon. Design of automatic pilots (autopilots). Man lands on the Moon, July 20, 1969.
Rudolf Emil Kalman (May 19, 1930 ∼ July 2, 2016)
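Kalman's quadratic-optimal (LQR) formulation is easy to demonstrate with modern tools. The sketch below uses my own illustrative plant, a double integrator, and SciPy's algebraic Riccati equation solver to compute the optimal state-feedback gain K = R⁻¹BᵀP:

```python
# Illustrative LQR example: double integrator x1' = x2, x2' = u,
# minimizing the quadratic cost  integral(x'Qx + u'Ru) dt.
# For Q = I and R = 1 the optimal gain is known in closed form:
# K = [1, sqrt(3)].
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

# Solve A'P + PA - P B R^{-1} B' P + Q = 0 for P
P = solve_continuous_are(A, B, Q, R)
# Optimal state feedback u = -K x
K = np.linalg.solve(R, B.T @ P)

print(K)   # approx. [[1.0, 1.7320508]]
```

The resulting closed-loop matrix A - BK has its eigenvalues in the left half plane, so the optimal controller is automatically stabilizing for this plant.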
1960’s
Tremendous progress in computer technology drives control
and communication.
1970’s
Researchers find that Linear Quadratic optimal systems can have very small stability margins.
1980’s
Research on Robust Control intensifies. H∞ optimal control is
introduced. Proofs of the stability of Adaptive Control Systems
are presented. Kharitonov’s Theorem and its extensions are
developed.
1990’s
UAV’s, disk drives, GPS, Machine Learning Control, Robotics.
Fragility of Optimal, High Order Systems.
2000’s
Genomic Signal Processing, Cancer treatment as a Control
Problem, Gene manipulation and control, Nano systems and
control, Atomic Force Microscopy. Developments in PID
Control.
2010’s
Control of UAV’s, drones and driverless cars. Control of Smart
Grids. Robotic surgery. Systems Biology. Flocking, Formation,
Consensus Control of Multi Agent Systems.
2020’s
Networked Control Systems, Cyber-Physical Systems, Internet
of Things, Industry 4.0, Artificial Intelligence, Reinforcement
Learning, Quantum Computing, Climate Change Mitigation,
Data Driven Control, etc.
Future Control Goals
- What does the future hold? The future looks bright. We are moving toward control systems that are able to cope and maintain acceptable performance levels under significant unanticipated uncertainties and failures: systems that exhibit considerable degrees of autonomy.
- We are moving toward autonomous underwater, land, air, and space vehicles; highly automated manufacturing; intelligent robots; highly efficient and fault-tolerant voice and data networks; reliable electric power generation and distribution; seismically tolerant structures; and highly efficient fuel control for a cleaner environment.
- Control systems are decision-making systems where the decisions are based on predictions of future behavior derived via models of the systems to be controlled, and on sensor-obtained observations of the actual behavior that are fed back. Control decisions are translated into control actions using control actuators.
- Developments in sensor and actuator technology strongly influence what control systems can achieve.
Put Control in Your Future
- The area of controls is challenging and rewarding, as our world faces increasingly complex control problems that need to be solved.
- Immediate needs include control of emissions for a cleaner environment, automation in factories, unmanned space and underwater exploration, and control of communication networks.
- Control is challenging since it requires strong foundations in engineering and mathematics, makes extensive use of computer software and hardware, and demands the ability to address and solve new problems in a variety of disciplines, ranging from aeronautical, electrical, and chemical engineering to chemistry, biology, and economics.
- We are very proud to be in control. Join us, and together we will face future challenges.