
Introduction to Deep Learning and ANN

UNIT 1

Dr Manisha Khulbe
Artificial Neural Network

• The term "Artificial Neural Network" is derived from biological neural networks, which form the structure of the human brain. Just as the human brain has neurons interconnected with one another, an artificial neural network has neurons interconnected with one another in the various layers of the network. These neurons are known as nodes.
• Correspondence with a biological neural network: dendrites represent the inputs of an Artificial Neural Network, the nucleus represents the nodes, synapses represent the weights, and the axon represents the output.
Artificial Neural Network: decisions in a human-like manner
• An Artificial Neural Network is a system in the field of Artificial Intelligence that attempts to mimic the network of neurons making up the human brain, so that computers have an option to understand things and make decisions in a human-like manner. It is designed by programming computers to behave simply like interconnected brain cells.
• The human brain is made up of incredibly powerful parallel processors.
• There are roughly 86 billion neurons in the human brain, and each neuron has somewhere in the range of 1,000 to 100,000 connection points.
• In the human brain, data is stored in a distributed manner, and we can retrieve more than one piece of this data from memory in parallel when necessary.
• An Artificial Neural Network primarily consists of three layers: an input layer, one or more hidden layers, and an output layer.
The artificial neural network takes the inputs, computes their weighted sum, and adds a bias. This computation is represented in the form of a transfer (activation) function, as sketched below.
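A minimal sketch of that computation for a single node, in Python, assuming a sigmoid transfer function and illustrative input values (the function name and numbers are not from the slides):

import numpy as np

def node_output(inputs, weights, bias):
    # A single node: weighted sum of the inputs plus a bias,
    # passed through the transfer (activation) function.
    z = np.dot(weights, inputs) + bias       # weighted sum + bias
    return 1.0 / (1.0 + np.exp(-z))          # sigmoid transfer function (one common choice)

x = np.array([0.5, 0.1, 0.9])       # example inputs
w = np.array([0.4, -0.2, 0.7])      # example weights
print(node_output(x, w, bias=0.1))  # a single activation between 0 and 1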
Advantages of Artificial Neural Network (ANN)

• Parallel processing capability:
An ANN has the numerical strength to perform more than one task at the same time.
• Storing data on the entire network:
Unlike traditional programming, where data lives in a database, the information used by an ANN is stored across the whole network. The disappearance of a couple of pieces of data in one place does not prevent the network from working.
• Capability to work with incomplete knowledge:
After training, an ANN may produce output even with inadequate data. How much performance is lost depends on how important the missing data is.
• Having a memory distribution:
For an ANN to be able to adapt, suitable examples must be determined, and the network must be trained toward the desired output by demonstrating these examples to it.
• Having fault tolerance:
The corruption or failure of one or more cells of an ANN does not prevent it from generating output; this feature makes the network fault-tolerant.
Disadvantages
• Assurance of proper network structure: There is no particular guideline for determining the structure of an artificial neural network. The appropriate network structure is reached through experience and trial and error.
• Unrecognized behavior of the network: This is the most significant issue with ANNs. When an ANN produces a solution, it gives no insight into why or how, which decreases trust in the network.
• Hardware dependence: Artificial neural networks require processors with parallel processing power, in line with their structure, so realizing them depends on suitable hardware.
• Difficulty of presenting the problem to the network: ANNs can only work with numerical data, so problems must be converted into numerical values before being introduced to the network. The chosen representation directly affects the performance of the network and depends on the user's skill.
• The duration of the network is unknown: The network is trained until the error is reduced to a specific value, but reaching this value does not guarantee optimal results.
How do artificial neural networks work?
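As a rough answer, data flows forward through the three layers described earlier (input, hidden, output): each layer computes a weighted sum plus a bias and applies the transfer function. A minimal sketch, assuming a small fully connected network with illustrative layer sizes and randomly initialized weights (none of these values come from the slides):

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 input nodes, 4 hidden nodes, 2 output nodes
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input -> hidden weights and biases
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # hidden -> output weights and biases

def forward(x):
    # Each layer: weighted sum + bias, then the transfer function.
    h = sigmoid(W1 @ x + b1)      # hidden layer activations
    return sigmoid(W2 @ h + b2)   # output layer activations

print(forward(np.array([0.5, 0.1, 0.9])))   # two output values between 0 and 1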
Deep learning
• The word "deep" in "deep learning" refers to the number of layers through which the data is transformed. More precisely, deep learning systems have a substantial credit assignment path (CAP) depth. The CAP is the chain of transformations from input to output; CAPs describe potentially causal connections between input and output. For a feedforward network, the CAP depth is the number of hidden layers plus one (a sketch follows these bullets).
• Most modern deep learning models are based on multi-layered artificial neural networks such as convolutional neural networks and transformers, although they can also include propositional formulas.
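To make CAP depth concrete, a minimal sketch of a stacked feedforward network, assuming illustrative layer sizes and random weights (none of this comes from the slides); each weight layer applied between input and output adds one transformation to the credit assignment path:

import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(0.0, z)

# Illustrative stack: two hidden layers + output layer => CAP depth 3
layer_sizes = [4, 8, 8, 2]   # input, hidden, hidden, output
weights = [rng.normal(size=(n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

def forward(x):
    # Each pass through a weight layer is one transformation on the
    # chain from input to output (the credit assignment path).
    for W, b in zip(weights, biases):
        x = relu(W @ x + b)
    return x

print(forward(rng.normal(size=4)))   # CAP depth = len(weights) = 3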
History
• Two basic types of neural networks are feedforward neural networks (FNNs) and recurrent neural networks (RNNs); RNNs have cycles in their connectivity structure (a minimal sketch follows).
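A minimal sketch of the recurrent step that distinguishes an RNN from an FNN, assuming illustrative dimensions, random weights, and a tanh activation (not from the slides); feeding the hidden state back into the next step is the cycle in the connectivity structure:

import numpy as np

rng = np.random.default_rng(2)

# Illustrative sizes: 3-dimensional inputs, 5-dimensional hidden state
Wx = rng.normal(size=(5, 3))   # input -> hidden weights
Wh = rng.normal(size=(5, 5))   # hidden -> hidden weights (the recurrent cycle)
b = np.zeros(5)

def rnn_step(x_t, h_prev):
    # One recurrent step: the previous hidden state is fed back in,
    # which a purely feedforward network never does.
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

h = np.zeros(5)                        # initial hidden state
for x_t in rng.normal(size=(4, 3)):    # a short input sequence of 4 time steps
    h = rnn_step(x_t, h)
print(h)                               # final hidden state after the sequence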

• The term "Deep Learning" was introduced to the machine learning community by Rina Dechter in 1986,[51] and to artificial neural networks by Igor Aizenberg and colleagues in 2000, in the context of Boolean threshold neurons.
• Continued in the next lecture.
