
Artificial Neural Network

and Deep Learning


Lecture 1
Dr. Dina Elsayad
[email protected]

Thanks to Prof. Dr. Hala Mousher Ebied for her main contributions to the course content preparation.
Thanks to Dr. Ghada Hamed for her contributions to the course content.

NEURAL NETWORKS - LECTURE 1 1

Agenda
 Course Outlines
 Overview of Neural Networks
 Biological and Artificial Neuron Model
 Definition of Neural Networks
 Applications of Neural Networks
 Artificial Neuron Structures

Course Outlines
1. Introduction to NNs
2. Main characteristics of Neural Networks
3. Rosenblatt's Perceptron: Single-Layer Network
4. Least Mean Square algorithm for Single Layer Network
5. Multilayer Perceptron (MLP) Network
6. Optimization of Back-Propagation Algorithm
7. Deep Learning
8. Convolutional Neural Networks (CNNs)
9. Regularization and CNNs
10. YOLO for Object Detection
11. Fully CNNs and U-Net for Image Segmentation, Generative Models


Textbooks
 Simon Haykin, Neural Networks and Learning Machines, 3rd ed., Prentice Hall (Pearson), 2009.
 Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, Cambridge, 2016.
 Charu C. Aggarwal, Neural Networks and Deep Learning, Springer, 2018.

Course Assessment
Assessment
◦ Homework, Quizzes, Computer Assignments, and Project (35 Points)
◦ Midterm Exam (15 Points)
◦ Final Exam (50 Points)

Assessment (Old Bylaw)


◦ Homework, Quizzes, Computer Assignments, and Project (25 Points)
◦ Midterm Exam (10 Points)
◦ Final Exam (65 Points)

Programming and homework assignments


• Late answers are NOT accepted!


Course Materials
All lectures and labs will be uploaded to:
Neural Network & Deep Learning

Lecture 1:
Introduction to ANNs


Lecture Objectives
After studying this lecture, the student will be able to:

– Define Neural Networks.

– Explain how the biological neuron works.

– Summarize the difference between the Biological and the Artificial Neuron.

– Explain the mathematical representation of the artificial neuron unit.

– Summarize the properties and capabilities of Neural Networks.

Agenda
 Course Outlines

 Overview of Neural Networks


 Biological and Artificial Neuron Model
 Definition of Neural Networks
 Applications of Neural Networks
 Artificial Neuron Structures


Where is NN?

What is a Neural Network?
 Neural networks replicate the way humans learn, inspired by how the neurons in our brains fire, only much simpler.

 Researchers attempt to simulate the human brain by implementing artificial neural networks (ANNs).

 The human brain can give the correct response (output) for each input from its environment.


What is a Neural Network?, cont.


 Researchers consider the neural network as a black-box strategy that is trainable.

 The key aspect of black-box approaches is developing relationships between input and output.

 Researchers try to 'train' the neural black box to 'learn' the correct response (output) for each of the training samples.

The Human Nervous System
The human nervous system may be viewed as a three-stage system:
◦ The brain, represented by the neural (nerve) net, is central to the system. It continually
receives information, perceives it, and makes appropriate decisions.
◦ The receptors convert stimuli from the human body or the external environment into
electrical impulses that convey information to the brain.
◦ The effectors convert electrical impulses generated by the brain into discernible responses
as system output.

Block diagram representation of nervous system.


Human brain
 The human brain computes in a different way from the digital computer:
 The brain is a highly complex, nonlinear, and parallel computing system.
 It is characterized by the following:
 Robust and fault tolerant - it is always able to respond, and small changes in input do not normally cause a change in output.
 Flexible - it can adjust to a new environment by learning.
 Can deal with probabilistic, noisy, or inconsistent information.
 Is small and compact, and requires less power than a digital computer.

Human brain, cont.
 The brain is slower than the digital computer in mathematical computation;
however, the brain is many times faster than the digital computer in:
 vision, pattern recognition, perception, motor control
 Humans use 1% calculation, 99% understanding
 based on patterns, drawing information from experience
 Machines are the opposite: 99% calculation, 1% understanding
 though this understanding is growing


How does the Human brain do the task required from it?
At birth, a brain has great structure and the ability to build up its own rules through "experience".

◦ Experience is built up over time; dramatic development occurs within the first 2 years after birth and continues afterward.

Basic element in a biological brain
A neuron is the basic element of a biological brain.
There are approximately 100,000,000,000 neurons in a human brain.
Each neuron is connected to approximately 10,000 other neurons.
Each of these neurons is relatively simple in design.


How does the computer simulate the Human?
Allow computers to learn from experience like humans
◦ By gathering knowledge from experience (examples or a training set)
Understand the world as a hierarchy of concepts
◦ Build the computer in a hierarchical way, like the brain
◦ Biological neurons = nodes
◦ Neurons are connected to each other
◦ Thereby learn complicated concepts by building them out of simpler ones

Agenda
 Course Outlines
 Overview of Neural Networks

 Biological and Artificial Neuron Model


 Definition of Neural Networks
 Applications of Neural Networks
 Artificial Neuron Structures


Biological Neuron Structure

The biological neuron is composed of four major parts:
◦ a cell body for signal processing,
◦ many dendrites to receive signals (inputs are received through the dendrites),
◦ an axon for outputting the result, and
◦ a synapse between the axon and each dendrite.

How does a bio-neuron work?
Synaptic activity
Electrical signals (impulses) come into the dendrites through the synapses.
An electrical signal causes a change in synaptic potential and the release of transmitter chemicals.
Chemicals can have an excitatory effect on the receiving neuron (making it more likely to fire) or an inhibitory effect (making it less likely to fire).

 The total inhibitory and excitatory input from all dendrite connections to a particular neuron is summed in the cell body.
 When the sum is larger than a threshold, the neuron fires and sends out an impulse signal to other neurons through the axon.
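The sum-and-threshold firing rule above can be sketched in a few lines (a toy model; the threshold and input values are illustrative assumptions, not biological data):

```python
def fires(synaptic_inputs, threshold=1.0):
    """Toy firing rule: sum excitatory (+) and inhibitory (-)
    contributions; fire only if the total exceeds the threshold."""
    return sum(synaptic_inputs) > threshold

# Excitation outweighs inhibition: the neuron fires.
print(fires([0.6, 0.8, -0.3]))  # True
# Inhibition dominates: the neuron stays silent.
print(fires([0.2, -0.5, 0.1]))  # False
```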


Translate from Biological Neuron to Artificial Neuron

A physical neuron

An artificial neuron

Translate from Biological Neuron to Artificial Neuron


Network of Neurons
The human brain is composed of many "neurons" that cooperate to perform the desired objective, that is, to give the desired output for a specific input.

 We can consider a neural network as a network of many simple processors ("nodes" or "units").
 This is called forward propagation.
 The summation at each node can occur in parallel using parallel programming (the nodes are independent).
Agenda
 Course Outlines
 Overview of Neural Networks
 Biological and Artificial Neuron Model

 Definition of Neural Networks


 Applications of Neural Networks
 Artificial Neuron Structures


What is a Neural Network?


A neural network is a method of computing based on the interaction of multiple connected processing elements.

Its computational model is inspired by the neurological model of the brain.

It is a machine that is designed to model the way in which the brain performs a practical task.
It is usually implemented using electronic components or simulated in software.

What is a Neural Network?, cont.
It is a massively parallel distributed processor (formal definition in the book):
◦ made up of simple processing units (neurons);
◦ these simple processing units have a natural propensity for storing experiential knowledge and making it available for use.
It resembles the brain in two respects:
◦ Knowledge is acquired from the environment through a learning process.
◦ Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.


Computing power of Neural Networks

A neural network derives its computing power from:
◦ its massively parallel distributed structure, and
◦ its ability to learn and therefore generalize.

◦ Learning: the modification of synaptic weights in an orderly fashion to attain a desired objective.

◦ Generalization: the ability of the neural network to produce reasonable outputs for inputs not encountered during training (learning).

Properties and Capabilities of Neural Networks
1. Nonlinearity. An artificial neuron can be linear or nonlinear. A neural network, made up of an
interconnection of nonlinear neurons, is itself nonlinear.
2. Input-Output Mapping
The network is built by learning from examples in order to minimize the difference between the desired response and the actual response.
3. Adaptivity
NNs have a built-in capability to adapt their synaptic weights to changes in the surrounding environment, i.e., the network can easily be retrained to deal with minor changes in the operating environment.
In a non-stationary environment, a NN can be designed to change its synaptic weights in real time.


Agenda
 Course Outlines
 Overview of Neural Networks
 Biological and Artificial Neuron Model
 Definition of Neural Networks

 Applications of Neural Networks


 Artificial Neuron Structures

Problems Commonly Solved With Neural Networks
There are many different problems that can be solved with a neural network.
However, neural networks are commonly used to address particular types of problems.
The following types of problems are frequently solved with neural networks:
◦ Regression - Approximate an unknown function
◦ Classification
◦ Pattern recognition
◦ Prediction
◦ Optimization
◦ Clustering

These problems will be discussed briefly.


Regression (1/3)
A computer program is required to predict a numerical value given some input.
Example: for every house price, we know the area of that house.

House price: 100,000 | 130,000 | 200,000 | 500,000 | 1,000,000
Area (m2):        80 |      90 |     100 |     150 |       200

Then someone asks: there is a house with a specific area; what is its price?
If we follow rule-based methods or search methods, the algorithm will fail because the new price is not in the data.
As a result, we need curve fitting or interpolation to get the value of the house price.
The output of this problem is a continuous variable.
Can the face detection problem be categorized as regression?
Regression (2/3)
Another example: job salaries. A very large database tells us the domain of every job.

Job Salary: 10X | 20X | 30X | 35X | 50X
Domain:       X |   Y |   X |   Z |   Z
Location:   CAI |  NY | PAR |  NY | CAI
Grade:        B |   A |   A |   B |   A

If anybody searches for a new job, even if this job does not exist in our DB, the system can predict the output by using regression.
In all cases, the output is a continuous variable.
If we look at the problem mathematically, we will find that it has many names: curve fitting, regression; in general, it is function approximation.


Regression (3/3) - Function approximation

So, the system needs to learn a function that maps input x to output y.
This function could be a curve fitted to the data we have.
The curve could be linear or non-linear.
As a result, the fitted curve is expected to have errors:
in the opposite figure, y does not exactly equal some function of x;

 rather, y equals a function φ of x plus some error: $y = \varphi(x) + \varepsilon$.

 Estimation error: measures the distance between an example and the curve we fitted.
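The fitting idea can be tried on the house data from the earlier slide (a minimal sketch: a straight line is assumed as φ purely for simplicity, and the 120 m² query house is a made-up example):

```python
import numpy as np

# The house data from the slides: area in m^2 vs. price.
area  = np.array([80, 90, 100, 150, 200], dtype=float)
price = np.array([100_000, 130_000, 200_000, 500_000, 1_000_000], dtype=float)

# Fit a straight line price ~ a*area + b (the simplest choice of phi;
# a non-linear curve would fit this data better).
a, b = np.polyfit(area, price, deg=1)

# Interpolate the price of a house whose area (120 m^2) is not in the data.
estimate = a * 120 + b

# Estimation error: distance of each example from the fitted line.
errors = price - (a * area + b)
print(int(estimate))
```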
Classification or Recognition (1/2)
Here the output variable turns from a continuous variable into a discrete variable.
Instead of predicting the house price, we predict the house category: instead of returning numbers for prices, we use a low category, an economical category, a high category, or a premium (high-end) category.
The difference is that the problem is converted from function approximation into a classification problem.
We are still trying to estimate a function, but in this case we are trying to find something called a Decision Boundary instead of a curve.


Classification or Recognition (2/2)

The Decision Boundary is a break between the classes.

In binary classification, the break separates the two classes so that no example of class 1 lies in the class 2 region.
Classification is also known as discrimination, or finding a discriminative function.

Classification
Classification is the process of classifying input into groups.
For example, an insurance company may want to classify insurance applications into different risk categories, or an online organization may want its email system to classify incoming mail into groups.
Often, the neural network is trained by presenting it with a sample group of data and instructions as to which group each data element belongs.
This allows the neural network to learn the characteristics that may indicate group membership.


Pattern Recognition
Pattern recognition is one of the most common uses for neural networks.
Pattern recognition is a form of classification.
Pattern recognition is simply the ability to recognize a pattern. The pattern must be
recognized even when it is distorted.
In general, pattern recognition is the problem of classifying given patterns into several classes:
Character recognition
Speech recognition
Face detection/recognition

Pattern recognition is the basis for creating machines that can learn and think.

Application: Face detection
Face detection is an example of pattern classification or pattern
recognition
Face detection (“face” or “non-face”)
Face detection is to search for a face in a given image
The image can be a still picture taken by a digital camera, or moving
pictures captured by a video camera
This problem is important for many security related systems,
internet based media search, etc.
The problem is highly non-linear.


Application: Automatic driving

Automatic driving is a special case of pattern recognition.
Examples: car driving, control of mobile robots.
The inputs of the neural network may contain a video image and some distance information.
The outputs correspond directly to steering directions.
The problem is to find the parameters of the neural network controller from given observations.
A car can be considered a special mobile robot.

Prediction
Prediction = estimating the future value(s) in a time series.
Given a time-based series of input data, a neural network will predict future values.
The accuracy of the prediction will depend on many factors, such as the quantity and relevancy of the input data.
For example, neural networks are commonly applied to problems involving predicting movements in financial markets.
Example: given stock values observed over the past few days, decide whether we should buy or sell today.


Optimization
Optimization can be applied to many different problems for which an optimal solution is
sought.
The neural network may not always find the optimal solution; rather, it seeks to find an
acceptable solution.
Perhaps one of the most well-known optimization problems is the traveling salesman problem
(TSP).

Clustering
It is the process of grouping the data into classes (clusters) so that the data objects (examples) are:
similar to one another within the same cluster
dissimilar to the objects in other clusters

no predefined classes

The purpose is to extract rules or relations between the data, and to use this knowledge to gain profit.
Data visualization is an important way to see the relations between data points in a high-dimensional space (e.g., image segmentation).
The self-organizing feature map is a kind of neural network for this purpose.
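A self-organizing map is beyond a short sketch, so here is the clustering idea with k-means instead, a simpler algorithm; the 1-D data points are made up for illustration:

```python
import random

def kmeans_1d(points, k=2, iters=20, seed=0):
    """Tiny 1-D k-means: group points so that members of a cluster
    are similar to one another and dissimilar to other clusters."""
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign every point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two clear groups, around 1 and around 10 (no predefined classes given).
print(kmeans_1d([0.9, 1.1, 1.0, 9.8, 10.2, 10.0]))
```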


Clustering, Application
An example:
All users can be categorized into several groups according to their occupations (professors, engineers, housewives, students, salesmen, managers, lawyers, doctors, unemployed).
The data can be visualized by mapping them to a 2-D space.
From this map, we can see whether a person is a payable user or not.

Clustering Applications

Other examples: image segmentation (satellite images)
= finding homogeneous regions in the image (to facilitate the analysis of the image)


Agenda
 Course Outlines
 Overview of Neural Networks
 Biological and Artificial Neuron Model
 Definition of Neural Networks
 Applications of Neural Networks

 Artificial Neuron Structures

Artificial Neuron
The neuron is the basic information processing unit.
Three basic elements of the neural model:
1. A set of synapses, or connecting links
 each characterized by a weight or strength.
2. An adder
 sums the input signals, weighted by the synapses;
 it is called a linear combiner.
3. An activation function
 also called a squashing function;
 it squashes (limits) the output to some finite value.


Mathematical terms of Nonlinear Neuron Model

 Each input has an associated weight w.

 The neuron model in the figure includes: input signals (x1, x2, …, xn) and synaptic weights (wk1, wk2, …, wkn).
 Note that wki refers to the weight from unit i to unit k.

Mathematical terms of Nonlinear Neuron Model

 The weighted sum of all the inputs coming into neuron k is

$$u_k = \sum_{j=1}^{n} w_{kj} x_j$$

 The weighted sum $u_k$ is called the net input to unit k.
 The adder summing the input signals is a linear process.


Mathematical terms of Nonlinear Neuron Model

 Add the bias $b_k$: $v_k = u_k + b_k$

 The output signal is

$$y_k = \varphi(u_k + b_k)$$

where the function $\varphi(\cdot)$ is the activation function.
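Putting the three elements together, one forward pass through a single neuron can be sketched as follows (a sigmoid is used as φ purely for illustration; types of activation functions are covered in the next lecture):

```python
import math

def neuron_output(x, w, b):
    """One forward pass: y_k = phi(u_k + b_k),
    with u_k = sum_j w_kj * x_j and a sigmoid assumed as phi."""
    u = sum(wj * xj for wj, xj in zip(w, x))  # linear combiner u_k
    v = u + b                                 # activation potential v_k
    return 1.0 / (1.0 + math.exp(-v))         # squashed to (0, 1)

# u = 0.5*1 + (-0.25)*2 = 0, so v = 0 and phi(0) = 0.5.
print(neuron_output(x=[1.0, 2.0], w=[0.5, -0.25], b=0.0))  # 0.5
```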

Effect of adding a Bias
The use of the bias $b_k$ has the effect of applying an affine transformation to the output $u_k$:

$$v_k = u_k + b_k$$

Depending on whether the bias $b_k$ is positive or negative, the relationship between
◦ the activation potential $v_k$ of neuron k, and
◦ the linear combiner output $u_k$

is modified as shown in the next figure.


Effect of adding a Bias, cont.

• Note that as a result of this affine transformation, the graph of $v_k$ versus $u_k$ no longer passes through the origin.
• The bias helps convergence of the weights to an acceptable solution; it works as a threshold.
• Bias or threshold: if the effective input is larger than the bias, the neuron outputs a one; otherwise, it outputs a zero.

Bias as extra input
In this figure, the effect of the bias is accounted for by doing two things:
1. A bias unit can be thought of as a unit which always has an input value of 1, $x_0 = +1$.
2. The bias value is exactly equivalent to a weight $w_{k0}$ on an extra input line $x_0$.
• So we may formulate $v_k$ as follows:

$$v_k = \sum_{j=0}^{n} w_{kj} x_j$$
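The equivalence of the two formulations can be checked numerically (the weight and input values are arbitrary illustrations; powers of two keep the floats exact):

```python
def net_with_bias(x, w, b):
    # v_k = sum_j w_kj * x_j + b_k  (bias added explicitly)
    return sum(wj * xj for wj, xj in zip(w, x)) + b

def net_augmented(x, w, b):
    # Same sum with x0 = +1 prepended and the bias as weight w_k0.
    return sum(wj * xj for wj, xj in zip([b] + list(w), [1.0] + list(x)))

x, w, b = [2.0, -1.0], [0.5, 0.25], 1.0
print(net_with_bias(x, w, b), net_augmented(x, w, b))  # 1.75 1.75
```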


Vector-matrix formulation of a single unit

A single linear unit has a model of the form:

$$v_k = \text{net}_k = \sum_{j=0}^{n} w_{kj} x_j
     = w_{k0} x_0 + w_{k1} x_1 + w_{k2} x_2 + \dots + w_{kn} x_n
     = \begin{bmatrix} w_{k0} & w_{k1} & \dots & w_{kn} \end{bmatrix}
       \begin{bmatrix} x_0 \\ x_1 \\ \vdots \\ x_n \end{bmatrix}
     = W^T X$$

where

$$W = \begin{bmatrix} w_{k0} \\ w_{k1} \\ \vdots \\ w_{kn} \end{bmatrix}
\quad \text{and} \quad
X = \begin{bmatrix} x_0 \\ x_1 \\ \vdots \\ x_n \end{bmatrix}$$

Example
 The inputs (3, 1, 0, -2) are presented to a single neuron whose weights are (0.3, -0.1, 2.1, -1.1).
 i. What is the net input to the transfer function?
 ii. What is the neuron output?

x1 = 3  --( 0.3)-->
x2 = 1  --(-0.1)-->   Σ --> f --> y
x3 = 0  --( 2.1)-->
x4 = -2 --(-1.1)-->


Example
 Solution:
i. The net input is given by the summed weighted inputs:

$$u_k = \text{net}_k = W^T X
      = \begin{bmatrix} 0.3 & -0.1 & 2.1 & -1.1 \end{bmatrix}
        \begin{bmatrix} 3 \\ 1 \\ 0 \\ -2 \end{bmatrix}
      = 3(0.3) + 1(-0.1) + 0(2.1) + (-2)(-1.1)
      = 0.9 - 0.1 + 0 + 2.2
      = 3$$

ii. The output cannot be determined because the transfer function is not specified. Which function should be used as an activation function?
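The arithmetic in part i can be checked directly (the output in part ii still cannot be computed, since no activation function is given):

```python
weights = [0.3, -0.1, 2.1, -1.1]
inputs  = [3, 1, 0, -2]

# Net input u_k = W^T X, exactly as computed on the slide.
net = sum(w * x for w, x in zip(weights, inputs))
print(round(net, 10))  # 3.0
```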

Thank you!
Next Lecture:
◦ Types of activation functions
