BE1608 Signal Processing
Dr Alexey Pichugin
Dr Matthias Maischak
with illustrations from S. Haykin & B. van Veen’s “Signals and Systems” [TK5102.5.H38]
• 𝑥(𝑡) and 𝑦(𝑡) denote continuous-time variables (CT stands for a “continuous-time” signal or process);
• 𝑥[𝑛] and 𝑦[𝑛] denote discrete-time variables, where 𝑛 is an integer index (DT stands for a “discrete-time” signal or process);
• We call 𝑥(𝑡), or 𝑥[𝑛], the independent (input) variable, and 𝑦(𝑡), or 𝑦[𝑛], the dependent (output) variable.
The separation between CT and DT can be somewhat arbitrary. Many signals we want to process are essentially continuous.
It is common to discretize CT signals (this is usually called “sampling”) and then work with them using DT approaches.
Other signals are essentially discrete (as is often the case in economics and finance), but CT may be a better setting for
modelling some of these phenomena.
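As a small illustration of sampling (not from the original notes), the Python sketch below evaluates a CT signal 𝑥(𝑡) = sin(2π𝑓𝑡) at the instants 𝑡 = 𝑛𝑇 to produce a DT signal 𝑥[𝑛] = 𝑥(𝑛𝑇); the frequency 𝑓 and sampling period 𝑇 are arbitrary values chosen for the example.

```python
import numpy as np

f = 5.0                 # frequency of the underlying CT signal, Hz (arbitrary choice)
T = 0.01                # sampling period, s (arbitrary choice)
n = np.arange(0, 100)   # sample indices

def x_ct(t):
    """The 'continuous-time' signal x(t) = sin(2*pi*f*t)."""
    return np.sin(2 * np.pi * f * t)

# Sampling: evaluate the CT signal at t = n*T to obtain the DT signal x[n]
x_dt = x_ct(n * T)

print(x_dt[:5])   # first few samples of the discretised signal
```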
• Stock prices vary all the time; day-traders try to exploit this by developing algorithms that analyse financial and economic
factors to predict stock prices;
• Cruise control in a modern car operating via throttle feedback;
• A chemical reactor whose reaction rates oscillate in time;
• Stability control via a MEMS (microelectromechanical system) accelerometer;
• Dynamics of an aircraft or space vehicle;
• An edge detection algorithm for medical images.
Did you notice that we only considered what system components do, not how? Effectively, we treat the components as “black
boxes”, to indicate that we do not care much about their internal specifics. We only consider the functionality of the components,
and then use their interconnections to direct the flow of information through systems.
The following signal flow (block) diagrams show three key types of
interconnections that can be used:
The parallel interconnection is the application of two (or more) systems to the same input signal, with the output taken as the
sum of the outputs of the individual systems.
In this case we can say that 𝑦 = 𝑦1 + 𝑦2, or 𝑦(𝑥) = 𝐺1(𝑥) + 𝐺2(𝑥). It is sometimes useful to write the second equation
differently, to highlight the parallel nature of the processing, as 𝑦(𝑥) = (𝐺1 + 𝐺2)(𝑥).
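To make the notation concrete, here is a minimal Python sketch (an illustration, not part of the notes) in which systems are represented as plain functions; 𝐺1 and 𝐺2 are arbitrary example systems. The parallel interconnection returns 𝐺1(𝑥) + 𝐺2(𝑥), while a cascade is simple function composition, 𝐺2(𝐺1(𝑥)).

```python
import numpy as np

# Two arbitrary example systems, each mapping an input signal to an output signal
def G1(x):
    return 2.0 * x            # amplitude scaling by 2

def G2(x):
    return x ** 2             # pointwise squaring

def parallel(G_a, G_b):
    """Parallel interconnection: apply both systems to the same input and sum."""
    return lambda x: G_a(x) + G_b(x)

def cascade(G_a, G_b):
    """Cascade interconnection: the output of G_a feeds the input of G_b."""
    return lambda x: G_b(G_a(x))

x = np.array([0.0, 1.0, 2.0])
print(parallel(G1, G2)(x))    # y = G1(x) + G2(x)
print(cascade(G1, G2)(x))     # y = G2(G1(x))
```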
In both cascade and parallel interconnections, the signal flows through each one of them in the forward direction only. This
does not have to be the case in every situation, and many practical systems make use of feedback interconnections.
In a feedback interconnection of two systems, the output of system 𝐺1 is fed back to its input through system 𝐺2. In this
context, an error signal 𝑒 characterizes the difference between the desired output signal and a direct measurement of the actual output.
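A feedback loop is easiest to simulate in discrete time. The sketch below is an illustration only and assumes a one-sample delay in the feedback path, so that at each step 𝑒[𝑛] = 𝑥[𝑛] − 𝐺2(𝑦[𝑛−1]) and 𝑦[𝑛] = 𝐺1(𝑒[𝑛]); the gains used for 𝐺1 and 𝐺2 are arbitrary example choices.

```python
import numpy as np

def G1(e):              # forward-path system: a simple gain (example choice)
    return 0.8 * e

def G2(y):              # feedback-path system: a measurement gain (example choice)
    return 1.0 * y

x = np.ones(20)         # a bounded step input
y = np.zeros_like(x)

# Simulate the loop assuming one sample of delay in the feedback path
for n in range(len(x)):
    y_prev = y[n - 1] if n > 0 else 0.0
    e = x[n] - G2(y_prev)      # error between the input and the fed-back output
    y[n] = G1(e)

print(y[:5])   # the output settles near 0.8/1.8 ≈ 0.44 for this choice of gains
```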
This is an example of a practical system with feedback that all of you would probably recognize:
Of course, a real system contains combinations of several (possibly many) interconnections of different types. Consider, for
example, a system built from four subsystems 𝐺1, 𝐺2, 𝐺3 and 𝐺4 acting on an input 𝑥[𝑛]. At the outermost level we have the
cascade interconnection of the system that generates 𝑠[𝑛] and 𝐺4, indeed 𝑦[𝑛] = 𝐺4(𝑠[𝑛]).
Here 𝑠[𝑛] is produced by the parallel interconnection of the system generating 𝑤[𝑛] and 𝐺3, so 𝑠[𝑛] = 𝑤[𝑛] − 𝑧[𝑛], with
𝑧[𝑛] = 𝐺3(𝑥[𝑛]); and 𝑤[𝑛] is produced by the cascade interconnection of 𝐺1 and 𝐺2, so 𝑤[𝑛] = 𝐺2(𝑣[𝑛]) with 𝑣[𝑛] = 𝐺1(𝑥[𝑛]).
Overall then,
𝑦[𝑛] = 𝐺4 (𝑠[𝑛]) = 𝐺4 (𝑤[𝑛] − 𝑧[𝑛]) = 𝐺4 (𝐺2 (𝑣[𝑛]) − 𝐺3 (𝑥[𝑛])) = 𝐺4 (𝐺2 (𝐺1 (𝑥[𝑛])) − 𝐺3 (𝑥[𝑛])).
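As a quick check of this expression, the sketch below (with arbitrarily chosen example systems 𝐺1–𝐺4, not taken from the notes) evaluates the block diagram step by step through the intermediate signals 𝑣, 𝑤, 𝑧, 𝑠 and compares the result with the single composed formula.

```python
import numpy as np

# Arbitrary example systems, for illustration only
G1 = lambda x: x + 1.0
G2 = lambda v: 2.0 * v
G3 = lambda x: x ** 2
G4 = lambda s: 3.0 * s

x = np.array([0.0, 1.0, 2.0])

# Step-by-step evaluation following the block diagram
v = G1(x)            # cascade: first stage
w = G2(v)            # cascade: second stage
z = G3(x)            # parallel branch acting on the same input
s = w - z            # combined signal
y_steps = G4(s)

# Single composed expression from the notes
y_composed = G4(G2(G1(x)) - G3(x))

print(np.allclose(y_steps, y_composed))   # True
```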
Suppose that 𝑥(𝑡) is a continuous-time signal. We say that signal 𝑦(𝑡) is obtained by amplitude scaling of 𝑥(𝑡) if
𝑦(𝑡) = 𝑐𝑥(𝑡),
and call the constant 𝑐 a scaling factor. The same operation can be applied to discrete-time signals:
𝑦[𝑛] = 𝑐𝑥[𝑛].
Given two CT signals 𝑥1(𝑡) and 𝑥2(𝑡), we can consider the signal obtained by addition:
𝑦(𝑡) = 𝑥1(𝑡) + 𝑥2(𝑡).
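Both amplitude scaling and addition act sample by sample, so they translate directly into array arithmetic; the short sketch below uses arbitrary example DT signals.

```python
import numpy as np

x  = np.array([1.0, -2.0, 3.0, 0.5])   # x[n], an example DT signal
x1 = np.array([1.0,  1.0, 1.0, 1.0])
x2 = np.array([0.0,  2.0, -1.0, 4.0])

c = 3.0
y_scaled = c * x          # amplitude scaling: y[n] = c*x[n]
y_sum = x1 + x2           # addition: y[n] = x1[n] + x2[n]

print(y_scaled)   # [ 3. -6.  9.  1.5]
print(y_sum)      # [ 1.  3.  0.  5.]
```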
One important operation is called time-scaling. Given a CT signal 𝑥(𝑡), we say that signal 𝑦(𝑡) is obtained by time-scaling if
𝑦(𝑡) = 𝑥(𝑎𝑡).
For DT signals only integer values of 𝑎 make sense, so you can only compress; when compressing a DT signal, values of
𝑎 > 1 result in samples being lost.
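For a DT signal, time-scaling with an integer factor 𝑎 > 1 amounts to keeping every 𝑎-th sample, 𝑦[𝑛] = 𝑥[𝑎𝑛], so the intermediate samples are discarded. A minimal sketch with an arbitrary example sequence:

```python
import numpy as np

x = np.arange(10)          # example DT signal x[n] = n, for n = 0..9
a = 2                      # integer time-scaling (compression) factor

y = x[::a]                 # y[n] = x[a*n]; the odd-indexed samples are lost

print(x)   # [0 1 2 3 4 5 6 7 8 9]
print(y)   # [0 2 4 6 8]
```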
Instead of scaling the time, we can consider its reflection, i.e. the replacement of 𝑡 by −𝑡:
𝑦(𝑡) = 𝑥(−𝑡).
Finally, we can shift a signal in time by a constant 𝑡0 (a time shift):
𝑦(𝑡) = 𝑥(𝑡 − 𝑡0).
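In discrete time, both operations become index manipulations: reflection is 𝑦[𝑛] = 𝑥[−𝑛] and a shift by 𝑛0 is 𝑦[𝑛] = 𝑥[𝑛 − 𝑛0]. The sketch below is an illustration over a finite index range, with an arbitrary example signal and shift; samples outside the stored range are treated as zero.

```python
import numpy as np

n = np.arange(-3, 4)                  # index range -3..3
x = {k: float(k) for k in n}          # example signal x[n] = n, stored by index

n0 = 2                                # shift amount (example value)

reflected = {k: x.get(-k, 0.0) for k in n}       # y[n] = x[-n]
shifted   = {k: x.get(k - n0, 0.0) for k in n}   # y[n] = x[n - n0]

print([reflected[k] for k in n])   # [3.0, 2.0, 1.0, 0.0, -1.0, -2.0, -3.0]
print([shifted[k] for k in n])     # [0.0, 0.0, -3.0, -2.0, -1.0, 0.0, 1.0]
```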
PROPERTIES OF SYSTEMS
Causality: A system is said to be causal if its present output depends only on present or past values of the input.
Example 1: Is the system 𝑦(𝑡) = 𝑥(𝑡 − 1) causal?
Answer: consider several values of time: 𝑦(1) = 𝑥(0), 𝑦(2) = 𝑥(1), etc. You can see that 𝑦(𝑡) always depends on a past value
of 𝑥(𝑡), hence this system is causal.
Example 2: Is the system 𝑦(𝑡) = 𝑥(𝑡 + 1) causal?
Answer: again, consider several values of time: 𝑦(1) = 𝑥(2), 𝑦(2) = 𝑥(3), etc. You can see that 𝑦(𝑡) always depends on a future
value of 𝑥(𝑡), hence this system is non-causal.
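The same two examples can be reproduced in discrete time (an illustrative sketch, not from the notes): a one-sample delay 𝑦[𝑛] = 𝑥[𝑛 − 1] can be computed as samples arrive, whereas the one-sample advance 𝑦[𝑛] = 𝑥[𝑛 + 1] needs a sample that has not arrived yet.

```python
x = [1.0, 4.0, 9.0, 16.0]   # example input samples arriving one by one

# Causal system: y[n] = x[n-1]. At time n we only need the previous sample.
prev = 0.0                  # assume x[n] = 0 before the signal starts
y_causal = []
for sample in x:
    y_causal.append(prev)
    prev = sample

# Non-causal system: y[n] = x[n+1]. At time n we would need the *next* sample,
# so it can only be evaluated once the whole record is available.
y_noncausal = [x[n + 1] for n in range(len(x) - 1)]

print(y_causal)      # [0.0, 1.0, 4.0, 9.0]
print(y_noncausal)   # [4.0, 9.0, 16.0]
```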
Memory: A system is said to possess memory if the output signal depends on past or future input signals. A system is
memoryless if its output at time 𝑡 depends only on the input at the same time.
Hopefully you can see that both systems in Examples 1 and 2 possess memory.
Stability: A system is said to be bounded-input bounded-output (BIBO) stable if for any bounded input, the corresponding
output is also bounded.
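As an illustration (not from the notes), compare two DT systems driven by the same bounded input 𝑥[𝑛] = 1: a three-point moving average stays bounded, whereas a running sum (accumulator) grows without bound, so the accumulator is not BIBO stable.

```python
import numpy as np

x = np.ones(1000)                 # bounded input: |x[n]| <= 1 for all n

# Moving average over the last 3 samples: the output stays bounded by max|x|
y_avg = np.convolve(x, np.ones(3) / 3.0, mode="full")[:len(x)]

# Accumulator y[n] = sum of x[k] for k <= n: the output grows without bound
y_acc = np.cumsum(x)

print(y_avg.max())   # ~1.0 -> bounded
print(y_acc.max())   # 1000.0 -> grows with the length of the input
```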
Invertibility: A system is said to be invertible if the input signal can be recovered from the output signal.
In an invertible system you can determine its input signal 𝑥(𝑡) uniquely by observing its output signal 𝑦(𝑡).
Example:
Consider the cascade interconnection where 𝑦1(𝑡) = 2𝑥(𝑡). If we choose 𝑦(𝑦1) = ½𝑦1, then 𝑦(𝑡) = ½ · 2𝑥(𝑡) = 𝑥(𝑡).
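The same cascade in code, as a minimal sketch: the system 𝑦1(𝑡) = 2𝑥(𝑡) is inverted by halving, and the cascade of the two recovers the input exactly.

```python
import numpy as np

def system(x):          # y1 = 2*x
    return 2.0 * x

def inverse(y1):        # y = (1/2)*y1
    return 0.5 * y1

x = np.array([1.0, -2.5, 3.3])
recovered = inverse(system(x))

print(np.allclose(recovered, x))   # True: the cascade is the identity system
```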
Time-invariance: A system is called time-invariant (TI) if its behaviour does not depend on what time it is.
Mathematically, for DT systems, we can say that the system 𝑥[𝑛] → 𝑦[𝑛] is time-invariant if, for any input 𝑥[𝑛] and any time shift
𝑛0, the shifted input 𝑥[𝑛 − 𝑛0] produces the shifted output 𝑦[𝑛 − 𝑛0].
Example: Is the system 𝑦(𝑡) = sin(𝑥(𝑡)) time-invariant?
Answer: Let us denote by 𝑦1(𝑡) the output produced by the shifted input 𝑥1(𝑡) = 𝑥(𝑡 − 𝑡0).
We have 𝑦1(𝑡) = sin(𝑥1(𝑡)) = sin(𝑥(𝑡 − 𝑡0)). At the same time, the shifted original output is 𝑦(𝑡 − 𝑡0) = sin(𝑥(𝑡 − 𝑡0)) = 𝑦1(𝑡), hence the system is time-invariant.
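A quick numerical check of the same conclusion (an illustrative sketch, not from the notes): shifting the input of 𝑦(𝑡) = sin(𝑥(𝑡)) and then applying the system gives the same result as applying the system and then shifting the output. The check uses a discrete grid, an arbitrary test signal and an integer shift; zero padding at the start is harmless here because sin(0) = 0.

```python
import numpy as np

n = np.arange(0, 50)
x = np.cos(0.2 * n)             # arbitrary test input on a discrete grid
n0 = 7                          # integer time shift

def system(x):
    return np.sin(x)            # the memoryless system y = sin(x)

def shift(sig, n0):
    """Delay a finite sequence by n0 samples, padding with zeros at the start."""
    return np.concatenate([np.zeros(n0), sig[:-n0]])

# Shift the input, then apply the system ...
lhs = system(shift(x, n0))
# ... versus apply the system, then shift the output
rhs = shift(system(x), n0)

print(np.allclose(lhs, rhs))    # True -> consistent with time invariance
```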