
CHAPTER - 1

BASIC CONCEPTS OF SOFT COMPUTING

1.0 INTRODUCTION

The present era is an era of information: 'We are drowning in information and starving for knowledge' (Rutherford, 1985). Real-life problems have exploded both in size and complexity. With the advent of computers and the digital revolution, storing large amounts of data has become fairly easy, but traditional mathematical and statistical techniques are inadequate for extracting knowledge from such databases. During the last decade, challenges in data storage, organization and searching have led to a new field, 'Data Mining', and the requirement for such databases has become inevitable in various fields, such as medicine (Mitra et al., 2002).

In several domains, including health care, where a wealth of diagnostic information is recorded, voluminous data are stored either in a centralized database or in distributed databases (Mitra et al., 2002; Mitra & Hayashi, 2000). The analysis of such databases cannot be handled manually; it goes beyond human capacity. Computing technology is employed to automate the process, and hence the need arises for developing methodologies for Intelligent Data Analysis.

Intelligent Data Analysis systems have evolved by mimicking the natural process of evolution; Biology and Computer Science, life and computation, are closely related. Computational problems in Biology and Medicine have created Bioinformatics. The research field that explores models and computational techniques for solving highly complex real-life problems is known as Natural Computing.

The computational methods developed are based on the information gathered in the form of data; the data become the source for developing the model, and this is called learning, or 'Learning from Data'. The science of learning plays a key role in research. Learning comes through experience, and experience is gained by the proper utilization of data. The data are exploited as an information source, and Soft Computing techniques play an important role in developing Expert Systems that learn from data and in designing and developing models. Natural Computing includes computational methods such as cellular automata, which derive their inspiration from self-reproduction, and evolutionary computational methods, which are largely due to the Darwinian evolution of species. Computational techniques such as neural computation, which mimics the functioning of the brain, and swarm intelligence methods, motivated by the group behaviour of organisms, have drawn their inspiration from Nature, and Soft Computing is among the most successful of these techniques for developing models for Expert Systems (Gupta & Kulkarni, 2013).

Professor L. A. Zadeh describes Soft Computing (SC) as an association of computing methodologies that includes Fuzzy Logic (FL), Neural Computing and Probabilistic Computing (PC) as its main components. An important aspect of Soft Computing is that its constituent methodologies are complementary and symbiotic rather than competitive and exclusive. Models for Expert Systems are therefore developed in combination, and the most popular hybrid system is Neuro-Fuzzy. Other hybrid systems, such as Fuzzy-Genetic (FG), Neuro-Genetic (NG) and Neuro-Fuzzy-Genetic (NFG), have also occupied an important place in addressing highly complex problems in several domains, including the medical domain (Dounias, 2003).

1.1 ARTIFICIAL INTELLIGENCE

Artificial Intelligence (AI) is the field that studies how machines can be made to act intelligently. AI research is progressing rapidly, and its potential benefits are very high. Everything that civilisation has to offer is a product of human intelligence, and that intelligence may be magnified by the tools that AI provides (Hawking, 2014). Success in creating AI would be the biggest event in human history. The aim of AI is to discover how to program a computer to perform the remarkable functions that make up human intelligence, for which a deeper understanding of human intelligence and the human mind is needed (Nath, 2009).

1.1.1 Machine Learning

Machine Learning (ML), a branch of Artificial Intelligence, concerns the construction and study of systems that can learn from data. It is a scientific field that has attracted researchers from many disciplines, including computer science, engineering, mathematics, physics, neuroscience and cognitive science (Ramadas & Chunawala, 2006). The objective of ML techniques is to build models that can adapt to their environment and learn from the experience that is provided in the form of data. According to Samuel (1959), ML is the field of study that gives computers the ability to learn without being explicitly programmed. A computer program is said to learn from experience E with respect to some task T and a performance measure P if its performance on T, as measured by P, improves with experience E (Mitchell, 1998); a small sketch of this view is given after the list below. ML builds models with the help of computers to optimise a performance criterion using a large amount of example data (experience). ML techniques are used to develop algorithms that discover knowledge from data and experience using statistical and computational methods. ML becomes essential when human experts are unable to articulate their expertise, for example in:

1. Speech recognition
2. Customer relationship management
3. Customer credit scoring
4. Fraud detection
5. Robotics
6. Data mining
7. Call patterns in telecommunication
8. Performance improvement of search engines on the World Wide Web (WWW)
9. Medical diagnosis
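
To make Mitchell's definition concrete, the following is a minimal, purely illustrative Python sketch (the dataset, classifier and numbers are hypothetical, not from this thesis): the task T is binary classification, the performance measure P is accuracy on held-out data, and performance improves as more experience E (labelled examples) is supplied.

    # Illustration of Mitchell's (1998) definition: task T = classification,
    # performance P = test accuracy, experience E = labelled training examples.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for n in (20, 200, 1500):  # increasing experience E
        model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
        p = accuracy_score(y_test, model.predict(X_test))  # performance P on T
        print(f"E = {n:4d} examples -> P = {p:.3f}")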
Machine learning is a challenging field which paves the way for work in unknown domains, and machine learning systems are able to keep learning from a database in a changing environment.

1.1.2 Soft Computing

Soft Computing is a coalition of methodologies that are tolerant of imprecision, uncertainty and partial truth and that collectively provide the foundation for the conception, design and utilisation of intelligent systems (Liu & Li, 2004). The principal members of the coalition are Fuzzy Logic, Neuro Computing, Evolutionary Computing (EC) and Probabilistic Computing. A basic credo underlying soft computing is that, in general, better results can be obtained by employing the constituent methodologies of soft computing in combination rather than as stand-alone models (Zadeh, 2004).

1.2 HISTORICAL DEVELOPMENT

Computers are used in almost all disciplines, including science, technology and medicine. Computing techniques are used to find exact solutions to scientific problems; the solutions are attempted on the basis of two-valued logic and classical mathematics. However, not all real-life problems can be handled by conventional methods. Zadeh, known as the father of Fuzzy Logic, observed that humans are able to resolve tasks of high complexity without measurement or computation. Hence the need arose for developing systems that work on Artificial Intelligence.

In 1955, John McCarthy organised the Summer Research Project on Artificial Intelligence (AI) to initiate research programmes in AI, with the proposal that AI research would proceed on the basis of the conjecture that every aspect of learning, or any other feature of intelligence, can in principle be so precisely described that a machine can be made to simulate it (McCarthy et al., 1955). AI became a field of research aimed at building models and systems that act intelligently without human intervention. In the mid-1980s, Zadeh focused on building systems that make computers think like humans (Zadeh, 1984). For this purpose, the machine's ability to compute with numbers, named hard computing, has to be supplemented by an additional ability more similar to human thinking, named soft computing (Seising et al., 2013).

The idea of soft computing evolved in 1981, when Zadeh published his paper on soft data analysis (Zadeh, 1997). Soft computing is a term coined by Zadeh for a collection of computing techniques, spanning many fields, that fall under various categories of computational intelligence. Soft computing has three main branches, Fuzzy Systems, Evolutionary Computation and Artificial Neural Computing, which subsume Machine Learning (ML), Probabilistic Reasoning (PR), belief networks, chaos theory, parts of learning theory and Wisdom Expert Systems (WES).

Soft computing, according to Professor Zadeh, is 'an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision' (Seising, 2000). In general, it is a collection of computing tools and techniques shared by closely related disciplines. Fuzzy logic is used to mimic the ability of the human mind to effectively employ modes of reasoning that are approximate rather than exact. Later, neural network techniques combined with fuzzy logic were employed in a wide variety of applications to adapt and learn from experience. The basic thesis of soft computing is that precision and certainty carry a cost, and that intelligent systems should exploit, wherever possible, the tolerance for imprecision and uncertainty. In effect, the role model for soft computing is the human mind. The guiding principle of soft computing is 'Exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost' (Zadeh, 2004).

In the 1950s, medical reasoning problems were addressed primarily with rule-based systems (Szolovits et al., 1998). The practice of building knowledge-intensive systems was viewed as extracting rules from application experts and putting those rules into an expert system shell; a survey of the use of soft computing methods in medicine notes that such knowledge is catalogued in the form of if-then rules (Yardimci, 2007). Though this trend continued up to the late 1980s, this kind of problem-solving behaviour worked only in relatively well-constrained domains. The knowledge-mining view has weaknesses for the following reasons:

1. The impossibility of extracting an adequate set of rules from professionals in a given application area (which has been called the knowledge acquisition bottleneck, a mournful phrase repeated throughout the literature). As a result, there is considerable degradation in the performance of rule-based disease models, and in addition it is difficult to construct automatic systems that provide classification or pattern recognition tools to help specialists make a decision.

2. The resulting knowledge-based system is generally difficult to maintain: the addition of rules can change the behaviour of the system totally.

Other conventional methods, such as the Bayes classifier, are also unable to deal with most clinical decision-making problems. The choice of a method usually depends on the nature of the problem, such as classification, automatic diagnosis or decision support, and it may not be possible to solve the problem using one methodology alone. Hence the need arose to use different methodologies together in combination, chosen according to their appropriateness to the nature of the problem, and at that point the application of soft computing methodologies became inevitable. According to the MEDLINE database, the use of soft computing methodologies in the basic sciences of medicine is increasing significantly (Yardimci, 2009). In biochemistry, there is a variety of phenomena with many complex chemical reactions in which many genes and proteins affect the enzyme activity of others; many of these phenomena are difficult to analyse and estimate using conventional mathematical models. Artificial Neural Networks (ANN), Fuzzy-Artificial Neural Networks and Artificial Neural Network-Genetic Algorithms have therefore been applied for analysis in a variety of research fields. Soft computing techniques have been applied for analysis in almost all branches of basic science: biochemistry, biostatistics, genetics, physiology, cytology, histology and pathology (Szolovits et al., 1998).

1.2.1 Fuzzy Logic

The problem of handling systems with uncertainty, imprecision and vagueness has been discussed for many years in many fields, including philosophy. One might think that fuzzy logic is quite recent, but its origin dates back at least to the Greek philosophers, especially Plato (428-347 B.C.). It even seems plausible to trace its origin to India and China (Garrido, 2012), where thinkers were among the first to consider that things need not simply be of a certain type or not of that type, but that there is an intermediate possibility between the two. Recent research shows that, in principle, fuzzy logic can be used to model any continuous system, be it in Artificial Intelligence (AI), physics, biology or economics. Researchers in many fields feel that fuzzy logic models are more useful, and more accurate, than conventional mathematical models.

Logic is defined as the study of the structure and principles of correct reasoning. Propositions are descriptions of the world; they are affirmations or denials of events in the real world. There is a long philosophical tradition of distinguishing between necessary truths (a priori or logical) and contingent facts (a posteriori or factual) (Garrido, 2012). Both have led to two conceptions of logical truth which, without being opposed to each other, are quite different: the conception of truth as coherence and the conception of truth as correspondence. From the coherence point of view, a proposition is true or false depending on its relationship to a given set of propositions, that is, on whether the rules of that system have been consistently applied. Under the correspondence view, a proposition is true or false according to whether it agrees with the fact it refers to. There have been views that tried to overcome this dichotomy; among them, the semantic point of view raised by the Polish mathematician and philosopher Alfred Tarski (1902-1983) may be mentioned.

Zadeh continued to think about the basic issues in system analysis, especially the issue of the un-sharpness of class boundaries, and out of these thoughts the Fuzzy Set theory emerged (Zadeh, 1965). He introduced the concept of a fuzzy set, a class in which there may be a continuous infinity of membership grades, with the grade of membership of an object x in a fuzzy set A represented by a number f_A(x) in the interval [0, 1] (Zadeh, 1965; Seising, 2007). In 1968 he presented the paper 'Fuzzy Algorithms' in Information and Control, in which he generalised the concept of an algorithm through fuzzification (Zadeh, 1968). Everyday activities such as following a recipe while cooking, taking a prescription for an illness, or following instructions to park a car can be regarded as very crude forms of fuzzy algorithms.
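
As a simple illustration of a membership function (the fuzzy set 'tall' and its numeric boundaries are invented here for illustration), the Python sketch below assigns every height a grade f_A(x) in [0, 1], in contrast to the 0-or-1 characteristic function of a classical set:

    def membership_tall(height_cm):
        """Grade of membership f_A(x) in the illustrative fuzzy set 'tall'."""
        if height_cm <= 160.0:   # clearly outside the set
            return 0.0
        if height_cm >= 190.0:   # clearly inside the set
            return 1.0
        # linear transition: a continuum of grades between 0 and 1
        return (height_cm - 160.0) / 30.0

    for h in (155, 170, 180, 195):
        print(h, "->", round(membership_tall(h), 2))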

With the basic theory of Zadeh's fuzzy controllers, researchers began to apply fuzzy logic to various mechanical and industrial processes. In 1976, Professors Terano and Shibata in Tokyo, with Professors Tanaka and Asai in Osaka, made major contributions both to the development of fuzzy logic theory and to its applications (Jang et al., 1995, 1997). Mamdani designed the first fuzzy controller, for a steam engine, and fuzzy control was subsequently applied to a cement plant in Denmark (Mamdani, 1980). In 1987, Hitachi used a fuzzy controller for the Sendai train control, an innovative automated system. In the same year the company Omron developed the first commercial fuzzy controller, and 1987 is considered the year of the 'fuzzy boom' because of the large number of fuzzy-logic-based products brought to market. Fuji (1993) applied fuzzy logic to the control of chemical injection in water treatment plants for the first time in Japan. It was in Japan and South Korea that fuzzy logic reached its greatest heights, creating close partnerships between government, universities and industry.

Parallel to the study of the applications of fuzzy logic, Professors Takagi and Sugeno developed the first approach for constructing fuzzy rules from training data, and the applications of fuzzy logic in everyday life have since grown rapidly. The fuzzy rules of a fuzzy system define a set of overlapping patches that relate a full range of inputs to a full range of outputs. In that sense, a fuzzy system approximates a mathematical function or equation of cause and effect.
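
A minimal sketch of this patch view follows (the rules, linguistic terms and constants are invented for illustration, in the zero-order Takagi-Sugeno style mentioned above): two overlapping patches cover the input range, and the output is the firing-strength-weighted combination of the rule consequents, so the rule base approximates a smooth cause-effect function.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Two overlapping fuzzy patches (illustrative):
    #   IF temperature is LOW  THEN fan_speed = 20
    #   IF temperature is HIGH THEN fan_speed = 80
    def fan_speed(temp):
        w_low = tri(temp, 0.0, 15.0, 30.0)    # firing strength of rule 1
        w_high = tri(temp, 20.0, 35.0, 50.0)  # firing strength of rule 2
        if w_low + w_high == 0.0:
            return 0.0                        # outside every patch
        return (w_low * 20.0 + w_high * 80.0) / (w_low + w_high)

    for t in (10, 22, 28, 40):
        print(t, "->", round(fan_speed(t), 1))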

A very important result states that fuzzy systems can approximate any continuous mathematical function. Bart Kosko (1993) proved this uniform convergence theorem by showing that enough small fuzzy patches can sufficiently cover the graph of any function or input/output relation. The theorem also shows that we can fix in advance the maximum error of the approximation and be sure that there exists a finite number of fuzzy rules that achieves it (Garrido, 2012).

1.2.2 Artificial Neural Network

The discipline of neural networks originates from an understanding of the human brain. The starting point of modern neuroscience was the discovery that the microscopic structure of the central nervous system consists of nerve cells. Rashevsky (1924) developed a systematic approach to mathematical methods in biology, and in 1938 he published the first volume of Mathematical Biophysics.

Rashevsky's new theory was based on the abstract concept of the fundamental unit of life, the cell, which he described mathematically. Rashevsky's idea was a radically new approach to the investigation of the brain: he was interested in the phenomena of excitation and propagation in peripheral nerve fibers because he thought that data are stored in these nerve fibers. In 1943, McCulloch and Pitts published 'A Logical Calculus of the Ideas Immanent in Nervous Activity' in Rashevsky's Bulletin of Mathematical Biophysics. Similar to Rashevsky's theory, this original paper linked the activities of an abstract neural network of 'two-factor elements' with a complete logical calculus for time-dependent signals in electric circuits, where time is measured as synaptic delay. In contrast to Rashevsky, McCulloch and Pitts interpreted neurons as electric on-off switches. They showed that a system of such 'artificial neurons' could perform the same calculations and obtain the same results as an equivalent analog structure, since the 'all-or-none' law of nervous activity is sufficient to ensure that the activity of any neuron may be represented as a proposition. Psychological relations existing among nervous activities correspond to relations among the propositions, and the utility of the representation depends upon the identity of these relations with those of the logic of propositions; to each reaction of any neuron there is a corresponding assertion of a simple proposition. Because electric on-off switches can be interconnected so that every Boolean statement can be realized, McCulloch and Pitts thereby 'realized' the entire logical calculus of propositions with 'neuron nets'. They arrived at the following assumptions:

i) the activity of the neuron is an 'all-or-none' process;

ii) a certain fixed number of synapses must be excited within the period of latent addition in order to excite a neuron at any time, and this number is independent of previous activity and position on the neuron;

iii) the only significant delay within the nervous system is synaptic delay;

iv) the activity of any inhibitory synapse absolutely prevents excitation of the neuron at that time;

v) the structure of the net does not change with time (McCulloch & Pitts, 1943).

Every McCulloch-Pitts neuron is a threshold element: if the threshold value is exceeded, the neuron becomes active and 'fires'. By 'firing' or 'not firing', each neuron represents the logical truth value 'true' or 'false', and appropriately linked neurons thus carry out logical operations such as conjunction and disjunction.
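
Under the five assumptions listed above, a McCulloch-Pitts unit reduces to the small threshold element sketched below (a minimal illustration; the thresholds are chosen by hand, not learned): with a threshold of 2 it realizes conjunction, with a threshold of 1 disjunction, and any active inhibitory input vetoes firing, per assumption (iv).

    def mp_neuron(excitatory, inhibitory, threshold):
        """McCulloch-Pitts threshold unit: all-or-none output (0 or 1)."""
        if any(inhibitory):          # assumption iv: absolute inhibition
            return 0
        return 1 if sum(excitatory) >= threshold else 0

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2,
                  "AND:", mp_neuron([x1, x2], [], threshold=2),
                  "OR:", mp_neuron([x1, x2], [], threshold=1))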
Two years later the mathematician John von Neumann picked the paper up and used it in teaching the theory of computing machines, which may have initiated the research programme of 'Neuronal Information Processing', a collaboration involving psychology and sensory physiology in which other groups of researchers soon became interested. Some years later, von Neumann wrote his comparative view of the computer and the brain in an unfinished manuscript that was published posthumously after his premature death (Neumann, 1958). In the late 1940s, Hebb proposed the Hebbian rule to describe how learning affects the synaptic connection between two neurons. In 1951, the mathematician Marvin Minsky, working with Dean Edmonds at Princeton, developed the first neurocomputer, SNARC (Stochastic Neural-Analog Reinforcement Computer), which consisted of 3,000 tubes and 40 artificial 'neurons' whose connection weights could be varied automatically; SNARC, however, was never practically employed in AI.

In 1958, Frank Rosenblatt and Charles Wightman at Cornell University developed a first machine for pattern classification. Rosenblatt described this early artificial neural network, called the Mark I Perceptron, in an essay for the Psychological Review (Rosenblatt, 1958). The euphoria came to an abrupt halt in 1969, however, when Minsky and Seymour Papert completed their study of perceptron networks and published their findings in a book (Minsky & Papert, 1987).
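
Unlike the fixed-threshold McCulloch-Pitts unit, Rosenblatt's perceptron learns its weights from labelled examples. The following is a minimal sketch of the classical perceptron error-correction rule (the toy data are illustrative, not the Mark I experiments):

    import numpy as np

    def train_perceptron(X, y, epochs=20, lr=1.0):
        """Classical perceptron rule: on each error, shift the weights
        toward the misclassified example (labels must be +1 or -1)."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (w @ xi + b) <= 0:  # misclassified: correct weights
                    w += lr * yi * xi
                    b += lr * yi
        return w, b

    # Toy linearly separable problem: logical AND of two inputs.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1])
    w, b = train_perceptron(X, y)
    print("predictions:", np.sign(X @ w + b))  # -> [-1. -1. -1.  1.]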

In the 1970s, Grossberg (1976), von der Malsburg (1973) and Fukushima (1975) conducted pioneering work on competitive learning and self-organization, based on the connection patterns found in the visual cortex. Fukushima proposed his neocognitron model under the competitive learning paradigm (Fukushima, 1988); the neocognitron is a neural network specially designed for visual and character pattern recognition. Kohonen (1990) proposed his self-organizing map (SOM). The SOM algorithm adaptively transforms incoming signal patterns of arbitrary dimension into one- or two-dimensional discrete maps in a topologically ordered fashion; a Kohonen network is a structure of interconnected processing units that compete for the signal.
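
A minimal sketch of the SOM update rule (the grid size, learning-rate schedule and neighbourhood width are illustrative choices): each input is matched to its closest prototype, and the winner together with its neighbours on the map is pulled toward the input, which yields the topological ordering described above.

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.random((500, 2))    # 2-D input patterns
    nodes = rng.random((10, 2))    # prototypes of a 1-D map of 10 units

    for t in range(1000):
        x = data[rng.integers(len(data))]
        winner = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))  # best match
        lr = 0.5 * (1.0 - t / 1000.0)                # decaying learning rate
        for j in range(len(nodes)):
            h = np.exp(-((j - winner) ** 2) / 2.0)   # neighbourhood on the map
            nodes[j] += lr * h * (x - nodes[j])      # pull toward the input

    print(np.round(nodes, 2))  # adjacent units end up close in input space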

Grossberg and Carpenter contributed the Adaptive Resonance Theory (ART) model in the mid-1980s (Grossberg, 1976; Carpenter et al., 1987). ART networks, also based on competitive learning, are capable of stable clustering of an arbitrary sequence of input data in real time. Each cluster is represented by the weight vector of a prototype unit, and clusters are allocated incrementally by the network. The dynamics of the model are characterized by first-order differential equations. The ART model is recurrent and self-organizing, with two basic layers: the input (comparison) layer and the output (recognition) layer.

The modern era of neural-network research is commonly deemed to have started with the publication of the Hopfield network in the 1982 paper 'Neural networks and physical systems with emergent collective computational abilities' (Hopfield, 1982). The model works at the system level rather than at the level of a single neuron. It is an ANN working with the Hebbian rule, and it can be used as an associative memory for information storage and to solve optimization problems. The Boltzmann machine was introduced in 1985 as an extension of the Hopfield network incorporating stochastic neurons
(Ackley et al., 1985). Boltzmann learning is based on a method called Simulated Annealing (SA) (Kirkpatrick et al., 1983). These works revived interest in neural networks. Kosko extended the ideas of Grossberg and Hopfield and proposed the adaptive bidirectional associative memory (BAM) (Kosko, 1987). The Hamming network was proposed by Lippman in the mid-1980s (Lippman, 1987). It is composed of a similarity feed-forward subnet, with an n-node input layer and an m-neuron memory layer, and a winner-take-all (WTA) subnet with a fully connected m-neuron topology. The network is the most straightforward associative memory: it calculates the Hamming distance between the input pattern and each memory pattern and selects the memory with the smallest Hamming distance, which is declared the winner.
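
The recall step just described can be sketched directly (the stored bipolar patterns are invented for illustration); computing the Hamming distance to every memory and taking the minimum is exactly the outcome the WTA subnet settles to:

    def hamming_recall(probe, memories):
        """Return the stored pattern closest to the probe in Hamming distance
        (the winner-take-all result of the Hamming network, computed directly)."""
        def hamming(a, b):
            return sum(ai != bi for ai, bi in zip(a, b))
        return min(memories, key=lambda m: hamming(probe, m))

    memories = [(1, 1, 1, -1, -1), (-1, -1, 1, 1, 1), (1, -1, 1, -1, 1)]
    noisy = (1, 1, 1, 1, -1)  # first memory with its fourth bit flipped
    print(hamming_recall(noisy, memories))  # -> (1, 1, 1, -1, -1)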

The landmark of the field is the multilayer perceptron (MLP) model trained with the back-propagation (BP) learning algorithm, published in 1986 by Rumelhart et al. (1986).

Broomhead and Lowe (1988) proposed the radial basis function network (RBFN) model. The RBFN has capabilities equivalent to the MLP model, but with a much faster training speed. The cellular neural network (CNN) model, proposed by Chua and Yang (1988), has a unique network architecture. CNNs are especially useful in image and video processing, and are most suitable for very-large-scale integration (VLSI) implementation.

Vapnik (1995) invented a computationally powerful class of supervised learning networks called Support Vector Machines (SVM) for solving pattern recognition and regression problems.

Neural networks have certainly come a long way from the early days of McCulloch and Pitts, and they will continue to grow in theory, design and application.

1.2.3 Evolutionary Computation

The origin of evolutionary computation dates to the 1950s, through the contributions of researchers such as Bremermann (1962), Friedberg (1958, 1959) and Box (1957), but it gained momentum only during the 1970s with the fundamental work of Holland (1962), Rechenberg (1965), Schwefel (1968) and Fogel (1962). In 1973, Rechenberg expanded the evolutionary computing approach to deal with numerical optimization (Rechenberg, 1973). In 1975, Schwefel also contributed to the application of evolution strategies to numerical and parametric optimization, and formalized them as they are known nowadays (Schwefel, 1975). During the 1980s, thanks to advances in computing, the application of evolutionary algorithms to difficult real-world problems began to receive significant attention. Researchers in the various disciplines of evolutionary computation remained isolated from each other until the early 1990s (Belew et al., 1991). By the mid-1960s, the bases of the three main forms of Evolutionary Algorithms, namely Evolutionary Programming, Genetic Algorithms and Evolution Strategies, had been identified. The roots of Evolutionary Programming (EP) were laid by Lawrence Fogel et al. (1966), and those of Genetic Algorithms (GA) were developed at the University of Michigan by Holland (1967). On the other side of the Atlantic Ocean, Evolution Strategies (ES) were the joint development of a group of three students, Bienert, Rechenberg and Schwefel, in Berlin in 1965 (Jong, 1992). Holland showed how to use genetic search algorithms to solve real-world problems in his book Adaptation in Natural and Artificial Systems, published in 1975 (Holland, 1975). John Koza's book Genetic Programming: On the Programming of Computers by Means of Natural Selection was published in 1992 (Koza, 1992). In 1990, an organized effort provided a forum for interaction among the various EC research communities: the International Workshop on Parallel Problem Solving from Nature at Dortmund (Jong, 1992). From then on, the interaction and co-operation among EA researchers around the world continued to grow (Jong, 1992). The need for an organized EC handbook became evident owing to the dramatic growth of interest; the growth of the field is reflected in the many EC events and related activities each year, and also in the increasing number of books and articles about EC (Jong, 1992).
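
To make the genetic-algorithm idea concrete, the following is a minimal sketch in the spirit of Holland's scheme (the bit-string representation, the one-max fitness function and all parameters are illustrative): a population evolves under tournament selection, one-point crossover and bit-flip mutation.

    import random

    random.seed(1)
    LENGTH, POP, GENS = 20, 30, 40

    def fitness(bits):
        return sum(bits)  # one-max: maximise the number of 1-bits

    pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
    for _ in range(GENS):
        nxt = []
        while len(nxt) < POP:
            p1 = max(random.sample(pop, 3), key=fitness)  # tournament selection
            p2 = max(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, LENGTH)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(LENGTH):                       # bit-flip mutation
                if random.random() < 0.01:
                    child[i] = 1 - child[i]
            nxt.append(child)
        pop = nxt

    print("best fitness:", fitness(max(pop, key=fitness)), "of", LENGTH)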

1.3 AIM AND OBJECTIVES OF THE THESIS

The aim of the thesis is to develop new algorithms based on Soft Computing techniques for disease classification and prediction. The objectives of the thesis are:

a. to collect and unify the recent advances in Soft Computing techniques, with special reference to disease classification;

b. to propose and investigate new approaches for disease classification and prediction using Soft Computing techniques;

c. to compare the performance of the proposed methods with other existing methods;

d. to develop a unified modeling approach based on Soft Computing algorithms for disease classification and prediction, with special reference to HIV and TB.

1.4 DATABASE

The Tuberculosis and HIV disease databases were obtained from the clinical trial database at NIRT, ICMR, Chennai. Some real and synthetic databases were obtained from the UCI repository of the University of California, Irvine. The Lung Cancer and Prostate Cancer datasets were obtained from http://datam.i2r.a-star.edu.sg/datasets/krbd/

1.5 ORGANIZATION OF CHAPTERS

The thesis consists of six chapters:

Chapter – 1 – This chapter describes the concepts of artificial intelligence, machine learning and soft computing, as well as their historical development, covering primarily fuzzy logic, artificial neural networks and evolutionary computation.

Chapter – 2 – This chapter discusses fuzzy logic concepts, fuzzy set theory and fuzzy relations, as well as artificial neural networks and their various architectures. It also explains the techniques of evolutionary computing, particularly the genetic algorithm and its applications to biomedical databases.

Chapter – 3 – This chapter proposes new algorithms based on Artificial Neural Networks and Fuzzy Logic to provide human-like expertise for disease classification. It also deals with fusing Artificial Neural Networks and Fuzzy Logic to develop the hybrid algorithms Neuro-Fuzzy and Fuzzy-Neuro for better classification and prediction.

Chapter – 4 – This chapter focuses on combining evolutionary computation techniques with Artificial Neural Networks and Fuzzy Logic to develop expert systems, such as Evolutionary Fuzzy Neuro systems, so as to improve the efficiency and accuracy of the systems. It compares the performance of the new algorithms with other machine learning algorithms.

Chapter – 5 – This chapter deals with the application of soft computing hybrid algorithms to disease classification, centred on diagnosis problems in medical domains of high conceptual complexity. The reduction and simplification of domains, which usually help in developing more efficient systems, are also discussed. These algorithms are focused on general diseases, with reference to HIV and TB.

Chapter – 6 – This chapter presents a summary of the salient findings, conclusions and topics for future research.

1.6 REVIEW OF LITERATURE

Kermani et al. (1995) developed an ANN classifier for breast cancer classification. Mitra et al. (2000) have described a way of designing a hybrid decision support system in the soft computing paradigm for detecting the different stages of cervical cancer. Furey et al. (2000) have developed a method to analyse the data generated through DNA microarray experiments, consisting of thousands of gene expression measurements, which are used to gather information from tissue and cell samples for diagnosing disease. Futschik et al. (2003) have classified cancer tissue using fuzzy logic rules extracted from the trained network to infer knowledge about the classification process. Güler & Übeyli (2004) have developed an ANFIS model for the detection of electrocardiographic changes in patients with partial epilepsy. A neuro-fuzzy model for the diagnosis of psychosomatic disorders has been proposed by Aruna et al. A neural fuzzy decision support system for paediatric ALL cancer subtype identification using gene expression data was developed by Tung & Quek (2005). A model of feature selection based on the mutual information criteria of max-dependency, max-relevance and min-redundancy was developed by Peng et al. (2005). Temurtas et al. (2009) have compared the performance of a multilayer neural network trained by the Levenberg-Marquardt algorithm as a classifier for identifying diabetes with that of several other classifiers. Anitha et al. (2009) have proposed a hybrid neural network for abnormal retinal image classification. Avci & Turkoglu (2009) have used principal component analysis and ANFIS to diagnose heart valve disease. Ince et al. (2010) evaluated global and local training techniques over feed-forward neural network architecture spaces for computer-aided medical diagnosis. Karthik et al. (2011) have made an attempt at diagnosing liver disease and its types using Rough Sets. Karegowda et al. (2011) have proposed an application of genetic-algorithm-optimised neural network connection weights for the medical diagnosis of diabetes among the PIMA Indians. Uzoka et al. (2011) have used a clinical decision support system (DSS) in the diagnosis of malaria. Petković et al. (2013) studied how Autonomic Nervous System (ANS) branches affect the most relevant heart rate variability measures, using an ANFIS network. Castanho et al. (2013) have developed a fuzzy expert system for predicting the pathological stage of prostate cancer. Shilaskar & Ghatol (2013) have proposed a system for the medical diagnosis of cardiovascular diseases using feature selection. Samuel et al. (2013) have developed a web-based decision support system driven by fuzzy logic for the diagnosis of typhoid fever. Elshazly et al. (2013) have developed a hybrid system for lymphatic disease diagnosis. Seera & Lim (2014) have developed a hybrid intelligent system for medical data classification.
