
Int254 Unit 3

The presentation covers the foundations of machine learning, focusing on soft computing, neural networks, and their applications. It contrasts soft computing with hard computing, discusses the structure and functioning of artificial neurons, and introduces the perceptron learning algorithm. Additionally, it includes case studies on practical problem-solving using neural networks, such as digit recognition and optical computation of neural network operations.

Uploaded by

siddupodagatla07
Copyright © All Rights Reserved

Lovely Professional University

Department of Computer Science & Engineering


Presentation on
INT254-Foundations of Machine Learning

Presented By:
Dr. Premananda Sahu
Department of Machine Learning
Topics to be covered…
• Soft Computing: What is Soft Computing?
• Comparison with Hard Computing
• Components of Soft Computing
• Neural Network: Biological Motivation and History
• Applications of Neural Networks
• Structure and Working of Artificial Neurons
• Perceptron Learning Algorithm
• Architecture and Working of Feedforward Neural Networks
• Activation functions
• Case study: Practical Problem-solving using Neural Networks
What is Soft Computing?

• Fusion of Fuzzy Logic, Neuro Computing & Evolutionary Computing

• Designed to enable solutions to real-world problems

• Suited to problems that are difficult to model mathematically

Soft Computing (SC) = Evolutionary Computing (EC) + Neural Network (NN) + Fuzzy Logic (FL)
Soft Computing Vs Hard Computing

Hard Computing                 | Soft Computing
-------------------------------|--------------------------------
Precise analytical model       | Approximate model
Exact algorithms               | Heuristic ("clever") algorithms
Exact input data               | Ambiguous/noisy input data
Precision and categoricity     | Approximation
Rigidly written programs       | Trainable and adaptive programs
Precise answer                 | Approximate answer
Components of Soft Computing

Neural Network: Biological Motivation

Neural Network: Biological Neuron

•Inputs (red arrow) represent signals received from other neurons.

•Dendrites: Branch-like structures that receive input signals.

•Cell Body (Soma): The central part of the neuron that processes inputs.

•Nucleus: The core of the neuron, containing genetic material.

•Axon: The extended structure that transmits signals to other neurons.

•Outputs (red arrow) are signals sent to other neurons.

Neural Network: Artificial Neuron (Perceptron)

•Inputs (x₀, x₁, x₂, ..., xₙ): Numerical values fed into the artificial neuron.
•Weights (w₁, w₂, ..., wₙ): Each input is multiplied by a corresponding weight, which determines its importance.
•Bias (b): A constant value added to the weighted sum to shift the output. (Not to be confused with bias error, which arises from wrong modelling assumptions.)
•Activation Function (f): Applied to the linear output to introduce non-linearity, allowing the network to learn complex patterns.
•Output (y): The final processed value, determining the neuron's response.
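The bullets above can be sketched as a minimal computation, assuming a sigmoid as the activation function f (any non-linearity would do):

```python
import math

def neuron_output(x, w, b):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + b  # linear part: w·x + b
    return 1.0 / (1.0 + math.exp(-z))             # non-linearity f(z)

# Example: two inputs with equal weights and zero bias → z = 0.5
y = neuron_output([1.0, 0.0], [0.5, 0.5], 0.0)
```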

Comparison of Biological Neuron and Artificial Neuron

Neural Network
•Neural networks mimic biological brain functions.
•Consist of layers, neurons, and connections.
•Each neuron applies weights and biases.
•Activation functions introduce non-linear properties.
•Outputs help in decision-making processes.
•Process input data using mathematical transformations.

Neural Network: History

Applications of Neural Networks

Structure and Working of Artificial Neurons

Artificial neurons consist of 3 main parts: input processing, activation function, and weight update (learning).
Perceptron Learning Algorithm
• A fundamental supervised learning algorithm for binary classification problems.
• Iteratively adjusts weights until data points are correctly classified.
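As a minimal sketch of the algorithm (the function names and the AND example are illustrative, not from the slides): predict with a threshold on w·x + b, and update the weights only when a point is misclassified.

```python
def predict(x, w, b):
    """Perceptron decision rule: 1 if w·x + b > 0, else 0."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1 if z > 0 else 0

def train_epoch(data, w, b, eta=0.1):
    """One pass over the data: adjust weights only on misclassified points."""
    for x, target in data:
        error = target - predict(x, w, b)
        if error != 0:
            w = [wi + eta * error * xi for wi, xi in zip(w, x)]
            b = b + eta * error
    return w, b

# Learning logical AND (a linearly separable problem)
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = [0.0, 0.0], 0.0
for _ in range(20):
    w, b = train_epoch(data, w, b)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero errors.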

Perceptron Learning Algorithm

• Hyperparameters are variables that control the training of machine learning models.
• The learning rate (η) is a crucial hyperparameter that controls how much the model's weights are updated during training.
• η too high → updates overshoot, so training may oscillate or diverge instead of converging.
• η too low → the model learns too slowly.
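A small sketch of how η scales a single perceptron update, w ← w + η(t − y)x (the `update` helper and the numbers are illustrative):

```python
def update(w, b, x, target, pred, eta):
    """One perceptron update: the step size scales linearly with eta."""
    error = target - pred
    w = [wi + eta * error * xi for wi, xi in zip(w, x)]
    b = b + eta * error
    return w, b

# Same misclassified point, two learning rates: the larger eta takes a bigger step
w_small, b_small = update([0.0, 0.0], 0.0, [1.0, 1.0], 1, 0, eta=0.01)
w_large, b_large = update([0.0, 0.0], 0.0, [1.0, 1.0], 1, 0, eta=1.0)
# w_small == [0.01, 0.01]; w_large == [1.0, 1.0]
```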
Perceptron Learning Algorithm: Example Calculation
Architecture and Working of Feedforward Neural Networks

• An ANN is a mathematical model that can learn and is widely used in ML.

• A Feedforward Neural Network (FNN) is the simplest type of ANN, in which information flows in only one direction.

• Information flows from input to output without looping back.

• An FNN consists of 3 layers: the input layer, hidden layer, and output layer.

• A loss function measures how well the neural network's predictions match the actual target values.
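As a sketch of two loss functions commonly paired with FNNs (the slides do not name a specific one, so these are illustrative choices):

```python
import math

def mse(y_true, y_pred):
    """Mean squared error: average squared gap between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred):
    """Cross-entropy for binary targets; penalises confident wrong predictions."""
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(y_true, y_pred)) / len(y_true)

loss = mse([1.0, 0.0], [0.9, 0.2])  # close predictions give a small loss
```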
Architecture and Working of Feedforward Neural Networks

• Input Layer: Receives raw data or features (𝑥1,𝑥2,...,𝑥𝑛​).

• No computations are performed in this layer.

• Hidden Layer: Contains multiple neurons that process input data.

• Each neuron applies a weighted sum of inputs followed by an activation function.

• Output Layer: The number of neurons depends on the type of problem:

• Binary classification: 1 neuron (sigmoid activation).

• Multi-class classification: n neurons (softmax activation).

• Regression: 1 neuron (linear activation).
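The layer-by-layer flow above can be sketched as a tiny forward pass; the weights below are illustrative, and sigmoid is assumed in both layers (matching the binary-classification case):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(x, weights, biases, activation):
    """One dense layer: weighted sum per neuron, then the activation function."""
    return [activation(sum(wi * xi for wi, xi in zip(w_row, x)) + b)
            for w_row, b in zip(weights, biases)]

def forward(x, hidden_w, hidden_b, out_w, out_b):
    """Feedforward pass: input -> hidden layer -> output layer, no loops back."""
    h = layer(x, hidden_w, hidden_b, sigmoid)  # hidden layer
    return layer(h, out_w, out_b, sigmoid)     # output layer (binary case)

# Tiny network: 2 inputs, 2 hidden neurons, 1 output neuron
y = forward([1.0, 0.5],
            hidden_w=[[0.2, -0.1], [0.4, 0.3]], hidden_b=[0.0, 0.1],
            out_w=[[0.5, -0.5]], out_b=[0.0])
```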


Architecture and Working of Feedforward Neural Networks

Activation Functions
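The slide's figure is not reproduced here; as a sketch, the activation functions named elsewhere in this deck (sigmoid and softmax), plus two other standard choices, can be written as:

```python
import math

def sigmoid(z):
    """Squashes any real value into (0, 1); common for binary outputs."""
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    """Squashes into (-1, 1); a zero-centred relative of the sigmoid."""
    return math.tanh(z)

def relu(z):
    """Rectified linear unit: passes positives through, zeroes out negatives."""
    return max(0.0, z)

def softmax(zs):
    """Turns a vector of scores into probabilities that sum to 1."""
    m = max(zs)                        # subtract max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]
```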

Case study: Practical Problem-solving using Neural
Networks

Case study: Practical Problem-solving using Neural
Networks

(a) Neural Network Structure for Digit Recognition


• Represents a feedforward neural network used for handwritten digit recognition.
•Input layer consists of pixel values from a grayscale (shades of gray, without any color)
image of the digit "3."
•Hidden layer performs weighted summation and applies an activation function (σ).
•The output layer has 10 neurons (0-9), where the neuron corresponding to "3" has the highest
activation.
•Key Concept: The weighted sum of inputs and weights is computed using a dot product, followed by an activation function to introduce non-linearity.
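The key concept can be sketched for the output layer: a dot product per output neuron, softmax, then the most activated neuron names the digit. The 4-"pixel" toy input and weight values below are hypothetical stand-ins for the real pixel grid.

```python
import math

def softmax(zs):
    """Scores -> probabilities summing to 1 (max subtracted for stability)."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def predict_digit(pixels, weights, biases):
    """Dot product of the pixels with each output neuron's weights, softmax,
    then the predicted digit is the index of the most activated neuron."""
    scores = [sum(w * p for w, p in zip(w_row, pixels)) + b
              for w_row, b in zip(weights, biases)]
    probs = softmax(scores)
    return max(range(len(probs)), key=probs.__getitem__)

# Toy check with 4 "pixels": only neuron 3's weights align with the input
weights = [[0.0] * 4 for _ in range(10)]
weights[3] = [1.0, 1.0, 1.0, 1.0]
digit = predict_digit([1.0, 1.0, 1.0, 1.0], weights, [0.0] * 10)  # → 3
```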
Case study: Practical Problem-solving using Neural
Networks
(b) Optical Computation of Neural Network Operations
•Explains how optical components can perform vector-matrix multiplication, a key operation in neural
networks.
•Step 1: Element-wise multiplication
• Each input value xᵢ is multiplied by the corresponding weight wᵢⱼ.
•Step 2: Summation of products
• The results are summed to obtain the neuron activation y₁.
•Optical Interpretation
• A light source represents input data.
• A modulator applies weights to the input data in a spatially parallel manner.
• A bucket detector aggregates the weighted signals, simulating a neuron summation.
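The two steps the optics implement are just a vector-matrix multiplication; a minimal sketch (weight values are illustrative, with one weight list per output neuron):

```python
def vector_matrix_multiply(x, W):
    """For every output neuron j:
    step 1: element-wise products x_i * w_ij  (the modulator)
    step 2: sum of those products             (the bucket detector)"""
    return [sum(xi * wij for xi, wij in zip(x, w_row)) for w_row in W]

# y1 = 1*0.5 + 2*0.25 + 3*0.1 ≈ 1.3 for the first (hypothetical) weight row
y = vector_matrix_multiply([1.0, 2.0, 3.0],
                           [[0.5, 0.25, 0.1], [0.2, 0.2, 0.2]])
```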