
Signal Reconstruction Using Neural Networks
Aniket Sujay
17122006
Let’s break down the title.
Signal Reconstruction.
● In signal processing, reconstruction usually means determining the original or expected signal from a sequence of samples.
● Given a type of signal, we train the neural network so that when some “broken” input is provided, it returns the original signal with some degree of accuracy.
Neural Networks
● Neural networks, or artificial neural networks (ANNs), are computing systems that perform tasks by considering examples, without being explicitly programmed to do so.
● ANNs are weighted, directed computational graphs: each node performs some computation on its input, multiplies the result by a weight, and sends it on to the next node. A single unit is sketched below.
● The network “learns” by comparing the expected output with the computed output and using that difference to adjust its parameters.
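A minimal sketch of one such unit in Python/NumPy. The numbers, the three-input shape, and the tanh activation are illustrative assumptions, not values from these slides:

```python
import numpy as np

def unit(inputs, weights, bias):
    """One processing unit: weight the incoming values, shift by the bias,
    and apply an activation function (tanh, as an example)."""
    return np.tanh(np.dot(weights, inputs) + bias)

# Illustrative numbers only: three preceding units feed this one.
inputs = np.array([0.5, -1.0, 2.0])   # outputs of the preceding units
weights = np.array([0.1, 0.4, -0.3])  # connection weights
bias = 0.2                            # bias / offset
print(unit(inputs, weights, bias))    # this unit's state of activation
```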
[Diagram: input layer → hidden layer of processing units → output layer]
● The first layer is the input layer, from which data is fed into the graph.
● Each circle in the diagram represents a processing unit.
● Each unit has a state of activation, which serves as that unit’s output.
● Connections within the system denote the influence of a preceding unit on the input fed into the next unit. This influence is represented by a number called a weight.
● Associated with each succeeding unit is a bias, or offset, which shifts the incoming input.
● A propagation rule defined for each unit determines that unit’s overall output.
● A learning rule is defined for the graph as a whole.
● Example: in most ANNs we define a cost function that represents the error of the computation. The job of the network is to reduce this error as much as possible, using algorithms such as stochastic gradient descent.
● After each computation, the parameters of the units are adjusted so that the error decreases.
● This is done using an algorithm called backpropagation; a toy sketch follows this list.
● After repeated iterations we obtain the result.
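As a toy illustration of the cost-function, gradient-descent, and backpropagation ideas (not the network used in these slides), here is a single linear unit fitted to made-up data with a mean-squared-error cost; for this one-unit graph, backpropagation reduces to the two gradient formulas below:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data, assumed for illustration: learn y = 3x - 1.
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x - 1.0

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(200):
    y_hat = w * x + b                 # propagation rule of the single unit
    error = y_hat - y
    cost = np.mean(error ** 2)        # cost function: mean squared error
    # Backpropagation for this tiny graph: gradients of the cost
    # with respect to each parameter.
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w                  # gradient-descent update
    b -= lr * grad_b

print(w, b, cost)                     # w approaches 3, b approaches -1
```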
Recurrent Neural Networks?
● Our aim in this demonstration is to showcase how well neural networks handle sequential data.
● For this purpose a special kind of neural network was developed: the recurrent neural network (RNN).
● Recurrent networks are built around the concept of memory: the network remembers patterns in the data and, based on those patterns, predicts the next value in the sequence.
● This memory is implemented by a feedback connection in the unit; a sketch follows the diagram below.
[Diagram: recurrent network with input layer, output layer, and a feedback connection]
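A minimal NumPy sketch of that feedback loop. The layer sizes and random weights are illustrative assumptions; in practice the weights are learned during training:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 1, 16, 1
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (the feedback)
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
b_h, b_y = np.zeros(n_hidden), np.zeros(n_out)

def rnn_forward(xs):
    """Run the recurrence over a sequence xs of shape (timesteps, n_in)."""
    h = np.zeros(n_hidden)            # the "memory" carried between timesteps
    outputs = []
    for x in xs:
        # Feedback: the previous hidden state h re-enters the computation.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        outputs.append(W_hy @ h + b_y)
    return np.array(outputs)

ys = rnn_forward(np.sin(np.linspace(0, 2 * np.pi, 50))[:, None])
```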
Demonstration.
● The demonstration consists of four parts.
● First, the input signal is constructed using a Fourier series.
● Second, the RNN is trained on the generated data.
● Third, the hyperparameters of the RNN are calibrated for maximum accuracy and fast convergence.
● Finally, the original and reconstructed signals are compared. A code sketch of the whole pipeline follows this list.
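A minimal sketch of such a pipeline, assuming a Keras SimpleRNN and an arbitrarily chosen truncated Fourier series; the coefficients, window length, layer size, and optimizer settings are assumptions, since the slides do not show the actual code:

```python
import numpy as np
import tensorflow as tf

# 1. Construct the input signal from a truncated Fourier series
#    (coefficients chosen arbitrarily for illustration).
t = np.linspace(0, 2 * np.pi, 500)
signal = sum((1.0 / k) * np.sin(k * t) for k in range(1, 6))

# 2. Turn the series into (window -> next value) training pairs.
window = 20
X = np.array([signal[i:i + window] for i in range(len(signal) - window)])
X = X[..., np.newaxis]                       # (samples, timesteps, features)
y = signal[window:].reshape(-1, 1)

# 3. A small recurrent model; sizes and optimizer are example hyperparameters.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.SimpleRNN(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=150, verbose=0)       # the slides compare 50/100/150 epochs

# 4. Reconstruct: predict each sample from the preceding window and
#    compare against the original signal.
reconstructed = model.predict(X, verbose=0).ravel()
```

Varying `epochs` (50, 100, 150) reproduces the kind of comparison shown on the results slide.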
Results:

[Plots: reconstructed signal after 50, 100, and 150 epochs of training]
Observations:
● We can see that recurrent neural networks are highly capable of recognising and remembering patterns in sequential data.
● By adjusting the hyperparameters we can achieve even faster convergence.
● This kind of analysis has a wide range of applications in signal processing.
What are the applications?
● Machine learning, and specifically deep neural networks, are at the core of numerous cutting-edge research fields.
● One of them is signal processing.
● Before probabilistic learning methods were introduced, signal analysis was largely deterministic. Armed with the power of learning, we can now make computers predict patterns in signals.
● One major research field is medicine. Researchers are working on predicting heart attacks before they happen using deep neural networks. These models study the signals and can give information about the heart’s condition in real time.
Some more applications:
● Astronomy is another such area, where the spectra of radiation coming from celestial bodies are observed.
● Astronomers are rapidly adopting machine learning and deep learning algorithms to extract novel information from the mountains of data collected every day.
● Data recovery is another field where deep learning algorithms excel. Deep learning models are now being used to reconstruct corrupted data from the data recorded before a system failure.
● Meteorology is yet another field where probabilistic learning is making headway.
Thank You!
