ANN Presentation
Uploaded by jamesfds007

Artificial Neural Networks (ANN)

Architecture, Merits and Demerits, Types of ANN
Architecture of ANN
• 1. Input Layer: Receives input data.
• 2. Hidden Layers: Perform computations and feature extraction.
• 3. Output Layer: Produces the final output.
• 4. Weights and Biases: Adjusted during training.
• 5. Activation Functions: Define the output of each neuron.
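The pieces above can be sketched as a single neuron: a weighted sum of inputs plus a bias, passed through an activation function. This is a minimal illustration with made-up weights, not any particular trained network; `sigmoid` and `relu` are two common activation choices.

```python
import math

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    # Passes positive values through, zeroes out negatives
    return max(0.0, z)

def neuron(inputs, weights, bias, activation):
    # Weighted sum of inputs plus bias, then the activation function
    return activation(sum(w * x for w, x in zip(weights, inputs)) + bias)

# Illustrative (untrained) weights and bias
out = neuron([1.0, 2.0], [0.4, -0.1], 0.3, sigmoid)
print(out)
```

During training, the weights and biases are the quantities that get adjusted; the activation function is fixed in advance.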
Merits of ANN
• 1. Ability to learn from data and improve over time.
• 2. High accuracy on complex tasks such as image and speech recognition.
• 3. Can generalize to unseen data.
Demerits of ANN
• 1. Requires large datasets for training.
• 2. High computational cost and long training times.
• 3. Black-box nature: lack of interpretability.
• 4. Prone to overfitting without proper regularization.
Types of ANN
• 1. Feedforward Neural Network (FNN): Data flows in one direction.
• 2. Convolutional Neural Network (CNN): Effective for image processing.
• 3. Recurrent Neural Network (RNN): Handles sequential data.
• 4. Modular Neural Network (MNN): Multiple independent networks work together.
• 5. Radial Basis Function Neural Network (RBFNN): Uses radial basis functions as activations.
Feedforward Neural Network (FNN)

• FNNs are the simplest type of neural network: data flows in one direction, from input to output. They consist of an input layer, one or more hidden layers, and an output layer. Each neuron in a layer is connected to every neuron in the next layer.
• Example: Predicting housing prices. Given input features such as house size (square footage), number of bedrooms, and location, an FNN processes this data through its hidden layers to output a predicted price. For instance, a 2,000 sq. ft. house in a good neighborhood may be predicted to cost $300,000.
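The housing-price example can be sketched as a forward pass through one hidden layer. All weights here are made up for illustration (a real network would learn them from data), and the three inputs are assumed to be square footage in thousands, bedroom count, and a 0-1 neighborhood score.

```python
def relu(x):
    # Common hidden-layer activation: zero for negatives, identity otherwise
    return max(0.0, x)

def forward(inputs, hidden_w, hidden_b, out_w, out_b):
    # Hidden layer: each neuron takes a weighted sum of all inputs,
    # adds its bias, and applies the activation (fully connected)
    hidden = [relu(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(hidden_w, hidden_b)]
    # Output layer: linear combination of hidden activations
    return sum(w * h for w, h in zip(out_w, hidden)) + out_b

# Illustrative, untrained parameters; two hidden neurons
hidden_w = [[0.8, 0.3, 0.5], [0.2, 0.6, 0.9]]
hidden_b = [0.1, -0.2]
out_w = [100.0, 50.0]  # output in thousands of dollars
out_b = 20.0

# Inputs: 2,000 sq. ft., 3 bedrooms, neighborhood score 0.9
price = forward([2.0, 3.0, 0.9], hidden_w, hidden_b, out_w, out_b)
print(price)
```

Note how data only moves forward: inputs to hidden layer to output, with no loops.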
Convolutional Neural Network (CNN)
• CNNs are specifically designed for processing structured grid data, such as images. They use convolutional layers to automatically detect spatial hierarchies in the data, focusing on local patterns.
• Example: Image classification. In a facial recognition system, a CNN takes an image as input, processes it through multiple convolutional layers to extract features like eyes, nose, and mouth, and finally classifies the image as a specific person. For instance, it might recognize the image as "John Doe" based on the learned features.
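The core operation of a convolutional layer can be sketched as sliding a small kernel over an image. This toy example (a 4x4 binary "image" with a vertical edge, and a hand-chosen edge-detector kernel) shows how local patterns light up in the output; in a real CNN the kernel values are learned.

```python
def conv2d(image, kernel):
    # "Valid" convolution (cross-correlation, as most CNN libraries compute it):
    # slide the kernel over the image and sum the elementwise products
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# 4x4 image with a vertical edge between columns 1 and 2
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]

# Hand-chosen vertical-edge detector
kernel = [[-1, 1],
          [-1, 1]]

result = conv2d(image, kernel)
print(result)  # large values mark where the edge sits
```

The same small kernel is reused at every position, which is why CNNs need far fewer parameters than a fully connected network on the same image.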
Recurrent Neural Network (RNN)
• RNNs are designed to handle sequential data by maintaining a hidden state that captures information about previous inputs. This architecture allows them to model temporal dependencies.
• Example: Text generation. Given the input "The cat sat on the," an RNN can predict the next word based on the context of previous words. It might output "mat," generating a complete sentence like "The cat sat on the mat." This is useful in applications like chatbots that generate responses based on previous dialogue.
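The defining mechanism, a hidden state carried from step to step, can be sketched with a single-unit RNN cell. The scalar weights below are made up for illustration (a trained RNN would learn vector-valued versions of them); the point is that the same cell is applied at every time step and the state threads the sequence together.

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    # New hidden state mixes the current input with the previous state,
    # squashed by tanh to keep it in (-1, 1)
    return math.tanh(w_x * x + w_h * h + b)

# Illustrative, untrained scalar weights
w_x, w_h, b = 0.5, 0.8, 0.0

h = 0.0                        # initial hidden state
for x in [1.0, 0.5, -0.3]:     # a toy input sequence
    h = rnn_step(x, h, w_x, w_h, b)
print(h)
```

Because `h` feeds back into the next step, the final state depends on the whole sequence, not just the last input; that is what lets an RNN use earlier words as context when predicting the next one.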
Modular Neural Network (MNN)
• MNNs consist of multiple independent neural networks (modules) that work together on a complex task. Each module can specialize in a specific function, enhancing the overall system's performance.
• Example: Autonomous vehicles. An MNN might include one module for image processing (detecting road signs and pedestrians), another module for path planning (calculating the best route), and a third module for controlling the vehicle (steering and acceleration). Each module operates independently but collaborates to navigate the vehicle safely.
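The modular structure can be sketched by stubbing each module as a function and wiring them through a coordinator. Everything here is hypothetical scaffolding (the module names, thresholds, and outputs are invented); in a real MNN each stub would be a separately trained network.

```python
def perception_module(frame):
    # Hypothetical stand-in for an image-processing network
    return {"pedestrian_ahead": frame.get("blob_size", 0) > 5}

def planning_module(perception):
    # Hypothetical stand-in for a path-planning network
    return "brake" if perception["pedestrian_ahead"] else "cruise"

def control_module(plan):
    # Hypothetical stand-in for a vehicle-control network
    if plan == "brake":
        return {"brake": 1.0, "throttle": 0.0}
    return {"brake": 0.0, "throttle": 0.4}

# Coordinator: modules run independently, outputs feed forward
perception = perception_module({"blob_size": 9})
action = control_module(planning_module(perception))
print(action)
```

Because the modules only interact through their outputs, each one can be trained, tested, or replaced independently, which is the main engineering appeal of the modular design.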
Radial Basis Function Neural Network (RBFNN)
• RBFNNs use radial basis functions as activation functions in their hidden layers. They excel at interpolation and approximation tasks by modeling the relationship between inputs and outputs based on distance from center points (basis functions).
• Example: Stock price prediction. An RBFNN can take historical stock prices as input and predict future prices. For example, if the historical data shows that stocks tend to rise after a certain pattern, the RBFNN learns this relationship and provides an estimate for future prices, helping investors make informed decisions.
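The distance-from-center idea can be sketched with a 1-D RBF network using Gaussian basis functions. The centers, widths, and output weights below are made-up parameters (a real RBFNN would fit them to historical data): each hidden unit responds most strongly when the input is near its center, and the output is a weighted sum of those responses.

```python
import math

def rbf_predict(x, centers, widths, weights, bias):
    # Hidden layer: Gaussian radial basis functions of the distance
    # between the input and each center point
    hidden = [math.exp(-((x - c) ** 2) / (2.0 * s ** 2))
              for c, s in zip(centers, widths)]
    # Output layer: linear combination of the basis activations
    return sum(w * h for w, h in zip(weights, hidden)) + bias

# Illustrative, unfitted 1-D model with three centers
centers = [100.0, 110.0, 120.0]
widths  = [5.0, 5.0, 5.0]
weights = [98.0, 112.0, 123.0]
bias    = 0.0

print(rbf_predict(110.0, centers, widths, weights, bias))
```

Inputs far from every center produce near-zero activations, so the network's output is dominated by whichever centers the input is close to; that locality is what makes RBFNNs natural interpolators.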
