Memory and Correlation: Dr Mohamed Seghier

The document discusses various concepts related to memory, correlation, and signal processing including: 1) Memory refers to how past inputs or outputs influence present behavior in a system. Correlation analysis provides a quantitative measure of memory by examining the degree to which a signal reflects its past values. 2) Autocorrelation is used to analyze memory and find repeating patterns in signals. It measures the similarity of a signal with itself over successive time lags. 3) Cross-correlation is used to measure similarity between two signals as a function of their displacement. It can be used to find features within signals or estimate time delays between signals.

Memory and Correlation

Dr Mohamed Seghier
By Cmglee - Own work, CC BY-SA 3.0,
https://commons.wikimedia.org/w/index.php?curid=20206883
Memory: present behavior (or output) is influenced by past
behaviors (inputs or outputs).
➔ The ability of a system to increase, decrease or redistribute
its energy.

Correlation in a signal relates to the degree to which the signal
at the present time reflects its values in the past.
→ Analysis of correlation provides a quantitative measure of
memory.
Properties of operators and transformations
A transformation operator represents any system.
x(t) : input
y(t) : output, y(t) = T[x(t)]
T[.] : transformation operator

T is additive if: T[x1(t) + x2(t)] = T[x1(t)] + T[x2(t)]


T is homogeneous if: T[a x(t)] = a T[x(t)], where a is any real scalar

T is a linear transformation if it is both additive and homogeneous.


➔T[a x1(t) + b x2(t)] = a T[x1(t)] + b T[x2(t)], where a and b are real scalars
➔Also called the superposition condition
➔As a consequence, T[0] = 0.

Example: is y(t) = 10 x(t-2) + 1 a linear system?
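The example above can be settled numerically: a linear system must satisfy T[0] = 0, but here the "+1" term forces T[0] = 1, so the system is affine rather than linear. A minimal NumPy sketch (the signal lengths and random seed are illustrative; np.roll approximates the 2-sample delay with wrap-around):

```python
import numpy as np

def T(x):
    # y[n] = 10*x[n-2] + 1, the delay approximated on sampled data
    return 10 * np.roll(x, 2) + 1

rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(100), rng.standard_normal(100)

# Additivity fails: T[x1 + x2] != T[x1] + T[x2] (the "+1" is counted twice)
print(np.allclose(T(x1 + x2), T(x1) + T(x2)))   # False -> not linear
print(T(np.zeros(100))[0])                      # 1.0, so T[0] != 0
```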


T is memoryless if for all times t1, y(t1) depends on x(t1) and not on x(t) at
any other time. Otherwise, the system has a memory (e.g. a moving average
filter has memory).

Example:
Which systems have memory?
y(t) = a x(t)² + b x(t) + c
y[n] = a x[n] + b x[n − 1]
y(t) = ∫₀ᵗ x(τ) dτ
y[n] − a y[n − 1] = x[n], with |a| < 1 and y[−1] = y₀
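The first system is a static nonlinearity (output depends only on the present input), so it is memoryless; the other three depend on past inputs or outputs. A NumPy sketch (a = 0.5 and the length are illustrative) shows how the recursive system y[n] − a y[n−1] = x[n] remembers an impulse long after it has passed:

```python
import numpy as np

a, N = 0.5, 10
x = np.zeros(N); x[0] = 1.0        # unit impulse at n = 0
y = np.zeros(N); y_prev = 0.0      # initial condition y[-1] = y0 = 0
for n in range(N):
    y[n] = a * y_prev + x[n]       # y[n] = a*y[n-1] + x[n]
    y_prev = y[n]
print(y[:4])   # [1. 0.5 0.25 0.125] -- the past input still shapes the output
```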
T is time-invariant if shifting the input by t0 only results in shifting the output
by the same t0.
If T[x(t)] = y(t), then T[x(t – t0)] = y(t – t0)

Example: determine if a system is time-invariant?


System A: y(t)= t x(t) System B: y(t)= 10 x(t)

https://en.wikipedia.org/wiki/Time-invariant_system
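The time-invariance test above can be run numerically: shift the input, apply the system, and compare with the shifted output. A NumPy sketch (signal, shift amount, and lengths are illustrative choices):

```python
import numpy as np

def shift(x, k):
    # Delay x by k samples, zero-padding at the start
    return np.concatenate([np.zeros(k), x[:-k]]) if k else x

n = np.arange(50, dtype=float)
x = np.sin(0.3 * n)
k = 5

yA = lambda s: n * s     # System A: y(t) = t * x(t)
yB = lambda s: 10 * s    # System B: y(t) = 10 * x(t)

# System A is time-varying: the explicit "t" breaks the shift property
print(np.allclose(yA(shift(x, k)), shift(yA(x), k)))   # False
# System B is time-invariant: scaling commutes with the shift
print(np.allclose(yB(shift(x, k)), shift(yB(x), k)))   # True
```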
A transformation T is causal if y(t) at any time depends on x(t) at that time
or at earlier times but not on the input at future times.

T is causal if, for any t0, y(t0) depends on x(t) for t<= t0 only.
➔The output cannot anticipate the input.

Example: y[n] = 0.5(x[n] + x[n+1]) is not causal.


Energy and Power signals
A signal model can, in theory, have infinite power, but no practical signal does.
➔ Infinite energy indicates that the signal duration is likely infinite, so it
does not make sense to characterize such a signal by its energy.
➔ Power is energy per unit time averaged over all time, making power more
meaningful than energy.
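For example, a sinusoid of amplitude A has infinite energy over all time but finite average power A²/2. A NumPy sketch (the amplitude, frequency, and sampling rate are illustrative; energy here is approximated as the sum of squares divided by the sampling rate):

```python
import numpy as np

fs, f, A = 1000, 5, 2.0
t = np.arange(0, 10, 1 / fs)          # 10 s of a 5 Hz sinusoid
x = A * np.sin(2 * np.pi * f * t)

energy = np.sum(x**2) / fs            # grows without bound with duration
power  = np.mean(x**2)                # converges to A^2 / 2
print(energy)    # ~20, and it doubles if the duration doubles
print(power)     # ~2.0 = A^2 / 2, independent of duration
```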
The deterministic autocorrelation function
Autocorrelation contains information about the memory in the system from which the
signal arose.
➔ It represents the average degree to which a signal varies in relation to its
past values and is a function of the time lag into the past.
Autocorrelation for DT signals
Properties

The autocorrelation is an even function (symmetry)

𝑅(−𝜏) = 𝑅(𝜏)

The autocorrelation function reaches its peak at the origin

|𝑅 𝜏 | ≤ |𝑅(0)|

The autocorrelation is not necessarily linear:

If z(t) = x(t)+y(t) then Rz(τ)≠Rx(τ)+Ry(τ)
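The three properties above can be checked numerically with the deterministic DT autocorrelation R[m] = Σₙ x[n] x[n+m]. A NumPy sketch (np.correlate plays the role of MATLAB's xcorr; lengths and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(256)
y = rng.standard_normal(256)

def autocorr(v):
    # Deterministic autocorrelation R[m] = sum_n v[n] v[n+m], all lags
    return np.correlate(v, v, mode="full")

Rx = autocorr(x)
mid = len(Rx) // 2                        # index of lag m = 0

print(np.allclose(Rx, Rx[::-1]))          # even symmetry: R(-m) = R(m)
print(int(np.argmax(np.abs(Rx))) == mid)  # peak at the origin: |R(m)| <= R(0)
# Not additive: the cross terms between x and y do not vanish
print(np.allclose(autocorr(x + y), autocorr(x) + autocorr(y)))   # False
```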


Example: for a sinusoid x(t) = cos(Ωt + φ),

R(τ) = (1/2) cos(Ωτ)

➔ The autocorrelation is periodic with period 2π/Ω and is
independent of the phase φ.
Example: plot the autocorrelation of a sinewave with frequency 1 Hz and a
sampling frequency of 200 Hz.

N=1024; % Number of samples
f1=1; % Frequency of the sinewave
FS=200; % Sampling frequency
n=0:N-1; % Sample index numbers
x=sin(2*pi*f1*n/FS); % Generate the signal, x(n)
t=n*(1/FS); % Prepare a time axis
subplot(2,1,1); % Prepare the figure
plot(t,x); % Plot x(n)
title('Sinewave of frequency 1Hz [FS=200Hz]');
xlabel('Time, [s]');
ylabel('Amplitude');
grid;
[Rxx,lags]=xcorr(x); % Estimate its autocorrelation
subplot(2,1,2); % Prepare the figure
plot(lags,Rxx); % Plot the autocorrelation against lag
grid;
title('Autocorrelation function of the sinewave');
xlabel('Lags');
ylabel('Autocorrelation');
Periodicity

Autocorrelation is a mathematical tool for finding repeating patterns,
such as the presence of a periodic signal obscured by noise.

➔ The autocorrelation of a periodic function is, itself, periodic with
the same period.
Example: https://doi.org/10.1016/S0922-3487(08)70227-2

Random series with periodicity.

Although the observations hardly indicate that the time series exhibits
a periodicity, the autocorrelation reveals it clearly.

➔ Autocorrelation analysis enables one to determine the period, T, of
the fluctuations. In the example shown here, one can read that the
period of the fluctuations is around 16 sampling intervals.
Example: periodicity in an EMG signal https://doi.org/10.1002/9780471740360.ebs0094

(Left) Two signals obtained from an experiment involving a human pressing a
pedal with his right foot: (a) the EMG of the soleus muscle and (b) the force
or torque applied to the pedal. (Right) Autocorrelation functions of the
signals: (a) the autocorrelation function of the absolute value of the EMG
and (b) the autocorrelation function of the force.
Randomness
Randomness is ascertained by computing autocorrelations for data values at
varying time lags. If the data are random, such autocorrelations should be
near zero for any and all time-lag separations. If non-random, then one or
more of the autocorrelations will be significantly non-zero.
How to create confidence intervals for the autocorrelation sequence of a white
noise process?

L = number of datapoints (sample size)

https://www.mathworks.com/help/signal/ug/confidence-intervals-for-sample-autocorrelation.html
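For white noise of length L, the sample autocorrelation at nonzero lags is approximately normal with variance 1/L, so a 95% confidence band is ±1.96/√L (this is the band described on the MathWorks page above). A NumPy sketch (L and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
L = 1000
x = rng.standard_normal(L)

# Normalized sample autocorrelation r[m] = R[m] / R[0], nonnegative lags
R = np.correlate(x, x, mode="full")[L - 1:]
r = R / R[0]

band = 1.96 / np.sqrt(L)              # 95% band for a white noise process
inside = np.mean(np.abs(r[1:50]) < band)
print(r[0])        # 1.0 by construction
print(inside)      # roughly 0.95: most lags fall inside the band
```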

Advanced topic about confidence intervals of autocorrelations:
https://doi.org/10.1016/0167-9473(87)90005-3
Example: extracting useful information from random signals

(Left) Two random signals measured from two different systems. They seem to
behave differently, but it is difficult to characterize the differences based
only on a visual analysis. (Right) The autocorrelation functions of the two
signals are quite different from each other: in (a) the decay is monotonic to
both sides from the peak value of 1 at τ = 0, and in (b) the decay is
oscillatory. These major differences between the two random signals are not
visible directly from their time courses.
https://doi.org/10.1002/9780471740360.ebs0094
Example:

Estimate and plot the autocorrelation for a white noise sequence with
variable length (N=64, 1024, and 16384).

How will the results change in the presence of a signal drift?
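One way to sketch this exercise in NumPy (seeds, drift amplitude, and lag range are illustrative): off-peak autocorrelation values shrink roughly like 1/√N, while a slow drift introduces long-range correlation that dominates the small lags.

```python
import numpy as np

rng = np.random.default_rng(3)

def norm_autocorr(x):
    x = x - x.mean()
    R = np.correlate(x, x, mode="full")[len(x) - 1:]
    return R / R[0]

for N in (64, 1024, 16384):
    r = norm_autocorr(rng.standard_normal(N))
    # Off-peak values shrink roughly like 1/sqrt(N)
    print(N, np.max(np.abs(r[1:20])))

# A slow linear drift makes neighbouring samples look alike
N = 1024
drift = np.linspace(0, 5, N)
r_drift = norm_autocorr(rng.standard_normal(N) + drift)
print(r_drift[1] > 0.5)               # True: the drift dominates small lags
```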


The deterministic autocovariance function

The autocovariance function C(τ) is a measure of the memory in the deviation
of x(t) around its mean value.

If x(t) = A, and x(t) is nonzero only on the interval [0, T0], then R(τ) = A²
If x(t) = A + f(t), then Rx(τ) = A² + Rf(τ)

Rx(τ) = Cx(τ) + x̄²

➔ Autocovariance = looking at autocorrelation after removing the signal’s mean level
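The relation Rx(τ) = Cx(τ) + x̄² can be verified numerically with time averages. A NumPy sketch (the offset A, lag, and length are illustrative; the identity holds up to finite-sample edge effects, hence the tolerance):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
f = rng.standard_normal(N)            # zero-mean fluctuation f(t)
A = 3.0
x = A + f                             # x(t) = A + f(t)

def avg_corr(v, m):
    # Time-averaged autocorrelation R(m) = <v[n] v[n+m]>
    return np.mean(v[:-m] * v[m:]) if m else np.mean(v * v)

m = 7
Rx = avg_corr(x, m)                   # autocorrelation of x
Cx = avg_corr(x - x.mean(), m)        # autocovariance: mean removed first
print(np.isclose(Rx, Cx + x.mean()**2, atol=1e-2))   # True: R = C + mean^2
```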


Example: (a biomedical example)
Cross-correlation
Cross-correlation is a measure of similarity of two series as a function of the
displacement of one relative to the other. This is also known as a sliding dot
product or sliding inner-product.

➔ It is commonly used for searching a long signal for a shorter, known
feature.

For continuous real-valued functions f and g, the cross-correlation is
defined as

(f ⋆ g)(τ) = ∫ f(t) g(t + τ) dt, integrated over all t

For discrete functions, the cross-correlation is defined as

(f ⋆ g)[n] = Σₘ f[m] g[m + n]

Example: consider two real valued functions f and g differing only by an
unknown shift along the x-axis. One can use the cross-correlation to find how
much g must be shifted along the x-axis to make it identical to f.
Example: Estimating a time delay between signals

A cross-correlation method for cardiac output measurements (thermodilution).
https://doi.org/10.1186/1475-925X-11-24
Example:

Create a reference signal (a feature). Insert this reference signal into a
long signal at an arbitrary location. Add noise to the final long signal.
Use cross-correlation to identify the location of the feature signal.
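This exercise can be sketched in NumPy (the feature shape, noise level, and location are illustrative choices): the peak of the cross-correlation marks where the feature starts in the long signal.

```python
import numpy as np

rng = np.random.default_rng(5)

# Reference feature: a short sinusoidal burst
feature = np.sin(2 * np.pi * np.arange(50) / 10)

# Long noisy signal with the feature inserted at a known location
long_sig = 0.3 * rng.standard_normal(1000)
true_loc = 400
long_sig[true_loc:true_loc + len(feature)] += feature

# Cross-correlate: the peak index is the estimated start of the feature
xc = np.correlate(long_sig, feature, mode="valid")
est_loc = int(np.argmax(xc))
print(est_loc)     # 400: recovered despite the noise
```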
Normalized cross-correlation

The normalized cross-correlation for signals x and y is a function in the
interval [-1, 1], defined as follows:

ρxy(τ) = (1/N) Σₜ (xₜ − μx)(yₜ₊τ − μy) / (σx σy)

τ is the time lag,
N is the length of signals x and y,
μx is the mean of xₜ,
μy is the mean of yₜ,
σx is the standard deviation of xₜ,
σy is the standard deviation of yₜ.

The values of the normalized cross-correlation range between 1 (when the matching entities
are exactly the same) and −1 (when the matching entities are inverses of each other).
A value of zero indicates no relationship between the entities.
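A minimal NumPy sketch of this definition (the helper name and test signal are illustrative; lag handling is simplified and bounds-checking is omitted):

```python
import numpy as np

def norm_xcorr(x, y, tau):
    # Normalized cross-correlation at integer lag tau,
    # using full-series means and standard deviations as in the formula
    x = np.asarray(x, float); y = np.asarray(y, float)
    N = len(x)
    xs, ys = (x[:N - tau], y[tau:]) if tau >= 0 else (x[-tau:], y[:N + tau])
    return np.mean((xs - x.mean()) * (ys - y.mean())) / (x.std() * y.std())

t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 3 * t)
print(round(norm_xcorr(x, x, 0), 3))     # 1.0: identical signals
print(round(norm_xcorr(x, -x, 0), 3))    # -1.0: inverted signals
```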
Normalized Correlation Coefficient
Correlation with no displacement
This coefficient, the Pearson product-moment correlation coefficient or
Pearson's correlation coefficient, r, was developed by Karl Pearson, building
on ideas introduced by Francis Galton; tables of its distribution were later
computed by Florence Nightingale David.
Example:

Estimate the correlation between different white noise sequences with
variable length (N=16, N=1024).

Discuss statistical significance.
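A NumPy sketch of this exercise (seed and the rough 2/√N significance threshold are illustrative assumptions): independent noise sequences still produce nonzero sample correlations, but the plausible magnitude shrinks as N grows.

```python
import numpy as np

rng = np.random.default_rng(6)

def pearson_r(a, b):
    # Pearson product-moment correlation coefficient
    a = a - a.mean(); b = b - b.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))

for N in (16, 1024):
    r = pearson_r(rng.standard_normal(N), rng.standard_normal(N))
    # Rough 95% significance threshold for independent noise: ~2/sqrt(N)
    print(N, round(r, 3), "threshold ~", round(2 / np.sqrt(N), 3))
```

For N=16 the threshold is about 0.5, so sizeable spurious correlations are expected; for N=1024 it drops to about 0.06.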

Advanced topic: estimate causality with cross-correlation analysis of biomedical signals
https://doi.org/10.1109/TBME.2007.906519
