SC Unit 1
Zadeh coined the term "soft computing" in 1992. The objective of soft computing is to provide
cost-effective approximate yet quick solutions for complex real-life problems.
The following are some of the reasons why soft computing is needed:
Problem-1:
Given two strings w1 and w2, tell whether w1 is the same as w2 or not.
Solution –
The answer is simply YES or NO; there is an exact algorithm (character-by-character
comparison) by which conventional computing can decide it.
Problem-2:
Tell how similar these two strings are.
Solution –
The answer from conventional computing is either YES or NO. But the strings may be,
say, 80% similar; such a graded answer can be given only by soft computing.
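As an illustration, the kind of graded similarity score that soft computing aims at can be sketched with Python's difflib (the two strings here are hypothetical examples):

from difflib import SequenceMatcher

# Two strings that are neither identical nor completely different
w1 = "soft computing"
w2 = "soft computing!"

# Hard computing answers only YES/NO
print(w1 == w2)                               # False

# A graded answer: ratio() returns a similarity in [0, 1]
print(SequenceMatcher(None, w1, w2).ratio())  # ~0.97, i.e. about 97% similar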
How it works
Soft computing is tolerant of uncertainty, imprecision, and partial truths.
It's based on biological inspirations like genetics, evolution, and human nervous
systems.
Soft computing is used when finding an exact solution isn't as important as finding a
quick approximate solution.
1. Robustness: Soft computing techniques are robust and can handle uncertainty,
imprecision, and noise in data, making them ideal for solving real-world
problems.
2. Approximate solutions: Soft computing techniques can provide approximate
solutions to complex problems that are difficult or impossible to solve exactly.
3. Non-linear problems: Soft computing techniques such as fuzzy logic and neural
networks can handle non-linear problems effectively.
4. Human-like reasoning: Soft computing techniques are designed to mimic
human-like reasoning, which is often more effective in solving complex problems.
5. Real-time applications: Soft computing techniques can provide real-time
solutions to complex problems, making them ideal for use in real-time
applications.
Concept of computing:
According to the concept of computing, the input is called the antecedent and the output
is called the consequent.
For example: adding a record to a database, or computing the sum of two numbers with
a C program.
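A minimal sketch of this antecedent/consequent view (in Python rather than the C mentioned above):

# Antecedent: the input values
a, b = 3, 4

# Consequent: the computed output
total = a + b
print(total)  # 7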
There are two types of computing, as follows:
1. Hard computing
2. Soft computing
Characteristics of hard computing:
1. It requires a precisely stated analytical model.
2. It is based on binary logic and crisp systems.
3. It is deterministic and requires exact input data.
4. It produces precise answers, but often at high computational cost for complex problems.
Artificial Neural Networks:
The structures and operations of human neurons serve as the basis for artificial
neural networks, also known as neural networks or neural nets. The input layer of
an artificial neural network is the first layer: it receives input from external
sources and passes it to the hidden layer, the second layer. In the hidden
layer, each neuron receives inputs from the neurons of the previous layer, computes
their weighted sum, and sends the result to the neurons of the next layer. The
connections are weighted, meaning the effect of each input from the previous layer
is scaled up or down by the weight assigned to it; these weights are adjusted during
the training process to improve model performance.
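A minimal sketch of this weighted-sum behaviour for one hidden layer, assuming random example weights and a sigmoid activation:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

x = np.array([0.5, -1.2, 3.0])   # input layer: 3 features
W = rng.normal(size=(4, 3))      # weights: 4 hidden neurons x 3 inputs
b = np.zeros(4)                  # biases of the hidden layer

# Each hidden neuron computes the weighted sum of its inputs,
# then applies an activation function.
hidden = sigmoid(W @ x + b)
print(hidden)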
Fuzzy Sets:
What is Fuzzy?
The term fuzzy refers to things that are not clear or are vague. In the real world
we often encounter situations where we cannot determine whether a state is true or
false; in such cases, fuzzy logic provides very valuable flexibility for reasoning. In this
way, we can account for the inaccuracies and uncertainties of a situation.
Fuzzy Logic:
Fuzzy Logic is a form of many-valued logic in which the truth values of
variables may be any real number between 0 and 1, instead of just the traditional
values of true or false. It is used to deal with imprecise or uncertain information and
is a mathematical method for representing vagueness and uncertainty in decision-
making.
Fuzzy Logic is based on the idea that in many cases, the concept of true or
false is too restrictive, and that there are many shades of gray in between. It allows
for partial truths, where a statement can be partially true or false, rather than fully true
or false.
Fuzzy Logic is implemented using Fuzzy Rules, which are if-then statements
that express the relationship between input variables and output variables in a fuzzy
way. The output of a Fuzzy Logic system is a fuzzy set, which is a set of membership
degrees for each possible output value.
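A minimal sketch of a triangular membership function and one fuzzy if-then rule; the set names and breakpoints are hypothetical examples:

def triangular(x, a, b, c):
    """Membership degree of x in a triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Fuzzy sets for a room temperature in degrees Celsius (hypothetical)
def cold(t):  return triangular(t, -5, 5, 15)
def warm(t):  return triangular(t, 10, 20, 30)

t = 12.0
# Rule: IF temperature is cold THEN heater is high,
# evaluated as a degree of truth rather than a crisp YES/NO.
print("cold:", cold(t))   # partial membership: 0.3
print("warm:", warm(t))   # partial membership: 0.2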
Key Concepts of Rough Set Theory:
Rough Set Theory uses three main concepts. Let's understand them one by one:
1. Indiscernibility Relation: If two objects cannot be distinguished from one another
based on the available attributes, they are considered equivalent. For example, if two
animals have the same height and weight, we cannot differentiate between them using
just those attributes (see the sketch after this list).
2. Boundary Region: This is the area where the lower and upper approximations differ;
it represents the uncertainty about whether certain elements belong to the target set.
Understanding this region helps in identifying ambiguous cases within the data.
3. Reducts: A reduct is a minimal subset of attributes that can still represent the original
set without losing significant information. Finding reducts helps simplify data analysis
by reducing complexity while retaining essential features.
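A minimal sketch of the indiscernibility relation over a toy attribute table (the animals and attribute values are hypothetical):

from collections import defaultdict

# Toy information table: object -> (height, weight)
animals = {
    "a1": ("tall", "heavy"),
    "a2": ("tall", "heavy"),
    "a3": ("short", "light"),
}

# Indiscernibility: group objects whose attribute values coincide
classes = defaultdict(set)
for name, attrs in animals.items():
    classes[attrs].add(name)

# a1 and a2 are indiscernible: same height and weight
print(list(classes.values()))  # [{'a1', 'a2'}, {'a3'}]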
Recent trends in soft computing:
Recent trends in soft computing focus heavily on hybrid approaches that combine
techniques such as fuzzy logic, neural networks, and evolutionary algorithms to tackle
complex real-world problems, particularly in areas like big data analysis, decision
making, image processing, and advanced control systems, with an emphasis on
interpretability, explainability, and handling uncertainty in data. Further key trends
include:
Deep learning integration:
Incorporating deep neural networks within soft computing frameworks to enhance
learning capabilities and handle large datasets.
Swarm intelligence optimization:
Utilizing swarm intelligence algorithms like particle swarm optimization and ant
colony optimization for complex optimization problems.
Probabilistic reasoning:
Integrating probabilistic methods with fuzzy logic to manage uncertainty and provide
more robust solutions.
Specific application areas seeing significant advancements in soft computing include:
Healthcare:
Diagnosis support systems, personalized medicine, medical image analysis using
fuzzy logic and neural networks.
Robotics:
Motion control, path planning, obstacle avoidance using hybrid soft computing
approaches for intelligent robots.
Finance:
Risk assessment, portfolio optimization, fraud detection leveraging soft computing
techniques for complex financial data analysis.
Cyber security:
Intrusion detection, anomaly detection, threat analysis by applying soft computing
methods to network data.
Internet of Things (IoT):
Data analysis, decision making, and predictive maintenance in IoT systems using soft
computing algorithms.
Addressing complex problems:
Soft computing is increasingly used to tackle problems with inherent uncertainty,
vagueness, and incomplete information, which traditional computational methods
struggle with.
Explainable AI (XAI):
Research is focusing on developing interpretable soft computing models to
understand the reasoning behind decisions made by the system.
Hardware acceleration:
Integrating soft computing algorithms with specialized hardware to improve
computational efficiency for real-time applications.
1. Pattern Recognition:
2. Image Processing:
There are basically two classes of methods for image processing: frequency domain
methods and spatial domain methods.
In frequency domain methods, the image is processed in terms of its Fourier
transform. In spatial domain methods, the processing directly involves the pixels of
the image matrix R.
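A minimal sketch contrasting the two classes, assuming a small synthetic image: a spatial-domain operation touches pixels directly, while a frequency-domain operation works on the Fourier transform:

import numpy as np

img = np.random.default_rng(0).random((8, 8))  # synthetic image matrix R

# Spatial domain: operate directly on the pixels
brightened = np.clip(img + 0.1, 0.0, 1.0)

# Frequency domain: transform, modify coefficients, transform back
F = np.fft.fft2(img)
F[4:, :] = 0                        # crude low-pass: drop high-frequency rows
smoothed = np.real(np.fft.ifft2(F))
print(brightened.shape, smoothed.shape)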
A digital image processing system has the following components: a sensor unit, a
specialized image processing unit, a digital computer, some storage devices, and
output devices.
Natural images are captured by an image sensor device. Depending upon the
mechanism of the capturing device, the image may be analog or digital. Devices such
as a conventional camera capture the image on a chemical photographic plate. This is
called the negative of the image, which is further processed to get the actual image.
This image is analog by nature.
For further processing, the image must be converted into a digital, or discrete,
form. The digitizer unit takes care of this phase. The process of converting an
analog image into a digital one involves two phases, called sampling and
quantization.
The process of finding intensity values at selected points of the image is called
sampling, and the fitting of the resulting intensity values into a limited number of
intensity levels is called quantization.
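A minimal sketch of sampling and quantization on a synthetic 1-D intensity profile (the grid spacing and bit depth are example choices):

import numpy as np

# A continuous intensity profile, approximated on a dense grid (hypothetical signal)
t = np.linspace(0, 1, 1000)
signal = 0.5 + 0.5 * np.sin(2 * np.pi * 3 * t)

# Sampling: keep intensity values only at selected points
samples = signal[::100]          # 10 sample points

# Quantization: fit each sample into a limited number of levels (3 bits -> 8 levels)
levels = 8
quantized = np.round(samples * (levels - 1)) / (levels - 1)
print(quantized)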
The image processor is a general-purpose computer, ranging from a PC to a
supercomputing device. In some applications, specialized computers are used. The
processor is also equipped with software for image processing. Mass storage capability
is another requirement of the processing unit: since each image has a size of
1024 × 1024 pixels and the intensity at each pixel is an 8-bit quantity, a single image
already needs 1024 × 1024 × 1 byte = 1 MB, so a collection of images requires many
megabytes of storage space.
The generated output images must be stored for display or for making hardcopies.
Hardcopy devices may be laser printers, film cameras, inkjet printers, etc.
5. Fuzzy image processing
6. Image processing through clustering
Soft computing in real estate is primarily used for tasks such as accurate property
valuation, automated property management, efficient lead generation, and data
analysis. Techniques like neural networks, fuzzy logic, and genetic algorithms can
handle complex and uncertain real-estate data and provide more precise insights for
decision-making than traditional methods.
To find the shortest path in a network and update the routing table, soft
computing techniques are used. Fuzzy-logic-based Big Bang-Big Crunch (BB-BC)
and biogeography-based optimization (BBO) are recent SC approaches for
enumerating the shortest path. The BB-BC approach is based on the evolution of the
universe according to the Big Bang theory.
The BBO approach is based on the dynamic equilibrium in the number of species on
an island. This algorithm has low computational time and fast convergence. The
BB-BC algorithm shows superiority over BBO in finding the shortest path in a
network with minimum error. According to the Big Bang theory, kinetic energy is
discharged during the initial explosion and is counterbalanced by the gravitational
pull of the attracting bodies. When there is enough mass and a critical density is
reached, the expansion stops and contraction begins, leading back to the initial state,
called the Big Crunch.
This cycle of a Big Bang followed by a Big Crunch is repeated to form the
optimization algorithm called the Big Bang-Big Crunch (BB-BC) optimization
algorithm, which proceeds as follows:
An initial set of candidate solutions is generated randomly, and the fitness of each
solution is defined by the objective function. The centre of mass computed during
contraction in the Big Crunch phase, after the Big Bang phase, is

$$x_c = \frac{\sum_{i=1}^{N} \frac{x_i}{f_i}}{\sum_{i=1}^{N} \frac{1}{f_i}}$$

where $x_c$ is the position of the centre of mass, $f_i$ is the fitness value of the
$i$-th candidate, and $N$ is the population size. The new position after the Big
Crunch can be computed by

$$x_{new} = x_c + \frac{l\,r}{k}$$

where $x_{new}$ is the new position around the centre of mass, $l$ is the
upper limit of the parameter, $r$ is a random number, and $k$
represents the iteration number. The ILC of a candidate solution represents the
integrated link cost, which is evaluated using fuzzy logic as the objective function
to decide the fitness value.
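A minimal sketch of the BB-BC loop for a generic 1-D objective, with a simple quadratic fitness standing in for the fuzzy integrated-link-cost objective described above:

import numpy as np

def bb_bc(objective, lower, upper, pop_size=30, iterations=50, seed=0):
    """Big Bang-Big Crunch optimization (minimization sketch)."""
    rng = np.random.default_rng(seed)
    # Big Bang: random initial candidates
    x = rng.uniform(lower, upper, pop_size)
    for k in range(1, iterations + 1):
        f = np.array([objective(xi) for xi in x])
        # Big Crunch: fitness-weighted centre of mass (weights 1/f for minimization)
        w = 1.0 / (f + 1e-12)
        xc = np.sum(w * x) / np.sum(w)
        # New Big Bang: spread new points around xc, shrinking with iteration k
        r = rng.normal(size=pop_size)
        x = np.clip(xc + upper * r / k, lower, upper)
    return xc

# Example: minimize (x - 2)^2 over [-10, 10]
print(bb_bc(lambda x: (x - 2.0) ** 2, -10.0, 10.0))  # close to 2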
Biogeography-Based Optimization
In BBO, HSI represents the Habitat Suitability Index; it serves as the fitness measure
and is modified probabilistically using immigration and emigration rates. $S_{max}$
is the maximum species count, $P_s$ is the species-count probability of each habitat,
and $E$ and $I$ are the maximum emigration and immigration rates, respectively.
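For reference, the standard linear migration model of BBO (a sketch assuming Simon's original formulation, which the text above does not spell out) sets the immigration rate $\lambda_k$ and emigration rate $\mu_k$ of a habitat hosting $k$ species as

$$\lambda_k = I\left(1 - \frac{k}{S_{max}}\right), \qquad \mu_k = E\,\frac{k}{S_{max}}$$

so sparsely populated habitats mainly receive species, while crowded habitats mainly send them out.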
MOSES is a tool that monitors the staff and reminds them of deadlines and pending
jobs. iShopFloor is a multi-agent architecture capable of planning and controlling
industrial processes. EasyMeeting and AmI are intelligent systems used for organising
and assisting office meetings and for optimising official requirements. AIRE, iRoom,
and NIST are some of the tools used to assist in office meeting rooms and smart
spaces and to consider the emotional factors of participants and their associated
argumentation processes. Developing an intelligent tool based on ontology requires
knowledge-engineering skills to model the system; such a tool should support
overlapping temporal reasoning and should handle real-life data containing missing,
uncertain, and vague values. Soft computing techniques possess these qualities and
can perform approximate reasoning like humans.
The applications of fuzzy techniques in the semantic web are boundless. Some of them
are fuzzy-logic-based information retrieval, annotation of textual documents using
fuzzy logic, semantic portals based on fuzzy description logic, and semantic search
engines that incorporate inference, information retrieval, and security. A multimedia-
based information retrieval model uses fuzzy description logic for handling thousands
of images. In medical diagnosis, fuzzy techniques are used to identify tumours in X-ray
images. Fuzzy description logic provides a more expressive searching mechanism than
other techniques and is used for searching and comparing products in online shopping.
Information retrieval (IR) and information filtering (IF) have wide applications,
including the semantic web and mining. Clustering techniques are widely used to
categorise search results for topic diversification when presenting retrieved
documents, and to mine the contents for patterns of interest. In IR and IF, a document
may belong to more than one cluster, a situation that conventional (crisp) clustering
algorithms fail to handle. Soft computing techniques are ideal for handling
overlapping, multi-label classes and multi-objective problems.
The Hierarchical-Data Divisive Soft Clustering Algorithm (H2D-SC) was proposed
by Bordogna for text classification. The procedure of H2D-SC is given here.
Procedure H2D-SC
Step 1: At the beginning, all documents are viewed as full members of a unique
huge cluster C, the whole collection (i.e., all documents belong to the initial cluster
C = D with membership value equal to 1).
Step 2: The decision whether to split C into more specific sub-clusters is taken by
assessing the quality of the cluster.
Step 3: If cluster C has been evaluated as worth splitting, the number KC of
sub-clusters to generate is determined based on some of the cluster's properties.
Step 4: The algorithm applies a soft clustering algorithm (the fuzzy C-means, FCM,
algorithm), taking the number KC of clusters as input, so as to generate the soft
clusters of the next hierarchical level.
Step 5: The process iterates until no more clusters are evaluated worth splitting.
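A minimal sketch of the H2D-SC control loop, assuming a tiny fuzzy C-means implementation, a fixed KC, and a variance-based quality test standing in for the paper's cluster-quality measure (all thresholds here are hypothetical):

import numpy as np

def fcm(X, k, m=2.0, iters=50, seed=0):
    """Tiny fuzzy C-means: returns the membership matrix U (n x k)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(k), size=len(X))
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return U

def worth_splitting(X, min_size=4, var_threshold=0.5):
    # Stand-in quality test: split only large, high-variance clusters
    return len(X) >= min_size and X.var() > var_threshold

def h2d_sc(X, kc=2, depth=0):
    """Divisive soft clustering: recursively split while quality is poor."""
    if not worth_splitting(X):
        return [X]                      # leaf cluster of the hierarchy
    U = fcm(X, kc, seed=depth)
    clusters = []
    for j in range(kc):
        members = X[U[:, j] > 0.5]      # crisp cut of the soft memberships
        if len(members) == 0 or len(members) == len(X):
            return [X]                  # no useful split found
        clusters.extend(h2d_sc(members, kc, depth + 1))
    return clusters

# Toy data: two well-separated blobs of 2-D points
X = np.vstack([np.random.default_rng(1).normal(0, 0.2, (10, 2)),
               np.random.default_rng(2).normal(3, 0.2, (10, 2))])
print([len(c) for c in h2d_sc(X)])      # e.g. [10, 10]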
Apart from the above points, engineers strive hard to find optimal solutions
for complex problems. Search-based software engineering finds optimal or
near-optimal solutions and is also useful for solving combinatorial search problems.
It applies meta-heuristic search techniques to explore the space of possibilities and
produce good solutions according to a quality function, something that is not feasible
for humans when the search space is large and the quality function is complex.