Unit 1
In simple terms, soft computing is an emerging approach that emulates the remarkable ability of
the human mind to reason and learn under uncertainty and imprecision; the human mind is its
role model.
Hard computing is used for solving mathematical problems that need a precise answer, but it
fails to provide solutions for some real-life problems. For real-life problems whose precise
solution does not exist, soft computing helps.
When conventional mathematical and analytical models fail, soft computing helps; for example,
soft computing can even model aspects of the human mind.
Analytical models can be used for solving mathematical problems, but they are valid only for
ideal cases. Real-world problems rarely present an ideal case; they exist in non-ideal
environments.
Soft computing is not limited to theory; it also gives insights into real-life problems.
For all these reasons, soft computing helps to model the human mind, which is not possible
with conventional mathematical and analytical models.
Elements of soft computing
Soft computing is viewed as a foundation component for the emerging field of
computational intelligence. Fuzzy Logic (FL), Machine Learning (ML), Neural Networks
(NN), Probabilistic Reasoning (PR), and Evolutionary Computation (EC) are the
principal constituents of soft computing; they are the techniques soft computing uses to
resolve complex problems.
Many problems can be resolved effectively using these components. The following are three
types of techniques used by soft computing:
• Fuzzy Logic
• Artificial Neural Networks (ANN)
• Genetic Algorithms
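To make the fuzzy-logic component concrete, the sketch below (not part of the original notes; the set boundaries 15/25/35 °C and the name `triangular_membership` are illustrative assumptions) shows how a fuzzy set assigns partial degrees of membership instead of a crisp yes/no answer:

```python
def triangular_membership(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set rising from a,
    peaking at b, and falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "warm" for temperature, peaking at 25 degrees C.
warm = lambda t: triangular_membership(t, 15.0, 25.0, 35.0)

print(warm(25.0))  # full membership: 1.0
print(warm(20.0))  # partial membership: 0.5
print(warm(40.0))  # no membership: 0.0
```

Unlike a hard-computing threshold, 20 °C is neither fully "warm" nor fully "not warm"; it is warm to degree 0.5.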
Feature          | Soft Computing                                              | Hard Computing
Computation time | Takes less computation time.                                | Takes more computation time.
Example          | Neural networks such as Madaline, Adaline, and ART networks. | Any numerical problem or traditional methods of solving using personal computers.
What is Artificial Neural Network?
The term "Artificial Neural Network" is derived from the biological neural networks that make
up the structure of the human brain. Just as the human brain has neurons interconnected with
one another, artificial neural networks also have neurons interconnected with one another in
the various layers of the network. These neurons are known as nodes.
Biological Neural Network | Artificial Neural Network
Dendrites                 | Inputs
Cell nucleus              | Nodes
Synapse                   | Weights
Axon                      | Output
Features          | Artificial Neural Network                                              | Biological Neural Network
Definition        | A mathematical model mainly inspired by the biological neuron system of the human brain. | Composed of several processing pieces known as neurons that are linked together via synapses.
Processing        | Processes information sequentially and centrally.                      | Processes information in a parallel and distributed manner.
Control mechanism | A control unit keeps track of all computing operations.                | There is no central control unit; processing is distributed across the neurons.
Rate              | Processes information at a faster speed.                               | Processes information at a slower speed.
Complexity        | Cannot match the brain at complex pattern recognition.                 | The large quantity and complexity of the connections allow the brain to perform complicated tasks.
Memory            | Separate from the processor, localized, and not content-addressable.   | Integrated into the processor, distributed, and content-addressable.
Response time     | Measured in nanoseconds.                                               | Measured in milliseconds.
Evolution of Neural Networks
Since the 1940s, there have been a number of noteworthy advancements in the
field of neural networks:
•1960s-1970s: Perceptrons
This era is defined by Rosenblatt's work on perceptrons. Perceptrons are
single-layer networks whose applicability was limited to linearly separable
problems.
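As a minimal sketch of Rosenblatt's perceptron learning rule (the learning rate, epoch count, and helper name `train_perceptron` are illustrative assumptions), the following trains a single-layer perceptron on logical AND, which is linearly separable:

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Rosenblatt perceptron: adjusts weights by the error whenever the
    step-activated weighted sum misclassifies a sample."""
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, target in samples:
            # step activation on the weighted sum
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so a single-layer perceptron can learn it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```

XOR, by contrast, is not linearly separable, so this single-layer network can never learn it; that limitation is what motivated multilayer networks.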
Output Layer:
The input goes through a series of transformations in the
hidden layer, which finally results in output that is
conveyed through this layer.
The artificial neural network takes the inputs, computes
the weighted sum of the inputs, and adds a bias. This
computation is represented in the form of a transfer
function.
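The weighted sum with bias described above can be sketched as follows; the sigmoid transfer function and the sample input values are illustrative assumptions, not taken from the notes:

```python
import math

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid
    transfer function that squashes the result into (0, 1)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical values: z = 1.0*0.4 + 0.5*(-0.2) + 0.1 = 0.4
print(neuron_output([1.0, 0.5], [0.4, -0.2], 0.1))  # sigmoid(0.4) ~ 0.599
```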
An artificial neural network is characterized by:
•Interconnections
•Activation functions
•Learning rules
Interconnections:
• Interconnection can be defined as the way processing elements (neurons) in an ANN are
connected to each other. Hence, the arrangement of these processing elements and the
geometry of their interconnections are essential in an ANN.
• Two layers are common to all network architectures: the input layer and the output
layer. The input layer buffers the input signal, and the output layer generates the output
of the network.
• The third layer is the hidden layer, whose neurons belong to neither the input layer
nor the output layer. These neurons are hidden from the people interfacing with the
system and act as a black box to them.
• Increasing the number of hidden layers and neurons increases the system's computational
and processing power, but at the same time training the system becomes more
complex.
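A minimal sketch of the input/hidden/output arrangement described in these bullets; the weights, biases, and sigmoid activation are hypothetical values chosen only for illustration:

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: each neuron computes a weighted sum of
    the inputs plus its bias, then applies a sigmoid activation."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

# Hypothetical 2-input network: one hidden layer of 3 neurons, 1 output neuron.
x = [0.5, -1.0]  # the input layer just buffers this signal
hidden = layer(x, [[0.1, 0.4], [-0.3, 0.2], [0.5, -0.5]], [0.0, 0.1, -0.1])
output = layer(hidden, [[0.3, -0.6, 0.9]], [0.05])
print(output)  # a single value in (0, 1)
```

Adding more rows to the hidden-layer weight matrix (more neurons), or stacking more `layer` calls (more hidden layers), increases the network's capacity at the cost of harder training, as the last bullet notes.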
There exist five basic types of neuron connection architecture:
1. Single-layer feed-forward network
2. Multilayer feed-forward network
3. Single node with its own feedback
4. Single-layer recurrent network
5. Multilayer recurrent network
3. Single node with its own feedback
When outputs can be directed back as inputs to nodes of the same layer or a preceding layer,
the result is a feedback network. Recurrent networks are feedback networks with
closed loops. The above figure shows a single recurrent network having a single neuron
with feedback to itself.
4. Single-layer recurrent network
The above network is a single-layer network with a feedback connection in which the
processing element’s output can be directed back to itself or to another processing element or
both. A recurrent neural network is a class of artificial neural networks where connections
between nodes form a directed graph along a sequence.
5. Multilayer recurrent network
In this type of network, processing element output can be directed to the processing element in the
same layer and in the preceding layer forming a multilayer recurrent network. They perform the
same task for every element of a sequence, with the output being dependent on the previous
computations. Inputs are not needed at each time step. The main feature of a Recurrent Neural
Network is its hidden state, which captures some information about a sequence.