SYNOPSIS ON
A Study of Fuzzy & Intuitionistic Fuzzy Information Measures with Its Applications in Decision Making
Submitted by
Pratiksha Tiwari
Under the guidance of
Introduction
Information theory deals with the study of problems concerning any system that involves information processing, storage, retrieval and decision making. In other words, information theory studies all problems related to the entity called a communication system, which is represented by the following block diagram:
Source of Messages
Encoder
Channel
Decoder
Destination
Noise
The source of messages can be a person or a machine that generates the messages; the encoder converts messages into an object suitable for transmission, such as a sequence of binary digits; the channel is the medium over which the coded message is transmitted; and the decoder tries to convert the received channel output back into the original message so that it can be delivered to the destination. This cannot be done with absolute consistency, owing to the existence of some disorder in the system, which is termed noise. The fundamental theorem of information theory states that it is possible to transmit information over a noisy channel at any rate less than the channel capacity with an arbitrarily small probability of error. We can reduce the probability of error to zero by reducing the transmission rate towards zero, but the essence of the theorem is that higher reliability does not require reducing the transmission rate to zero, only to below the channel capacity, and this can be achieved by means of coding. The encoder assigns to each group of messages a sequence of symbols, called a code word, proper for sending over the transmission channel, and the decoder converts the received sequence of symbols back into a message. Entropy (also termed the measure of uncertainty) makes this precise: information can be transmitted over a channel at any rate less than the channel capacity with an arbitrarily small probability of error (with the help of a coding system). For a probability distribution P = (p1, p2, ..., pn), the measure of uncertainty H satisfies the following four axioms:
Axiom 1: f(n) = H(1/n, 1/n, ..., 1/n) is a monotonically increasing function of n.
Axiom 2: f(mn) = f(m) + f(n) for positive integers m and n.
Axiom 3 (grouping): H(p1, p2, p3, ..., pn) = H(p1 + p2, p3, ..., pn) + (p1 + p2) H(p1/(p1 + p2), p2/(p1 + p2)).
Axiom 4: H(p, 1 - p) is a continuous function of p.
The only function satisfying these axioms is
H(P) = -C Σ pi log pi ... (1)
where C is an arbitrary positive number and the logarithm base is any number greater than 1. By taking C = 1 and logarithm base 2, equation (1) reduces to
H(P) = -Σ pi log2 pi ... (2)
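As a concrete illustration, equation (2) can be computed directly. The sketch below is in Python rather than the MATLAB mentioned later; the function name and the convention of treating 0 log 0 as 0 are illustrative choices, not part of the original text:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(P) = -sum p_i * log(p_i); terms with p_i = 0
    contribute nothing, following the usual 0*log(0) = 0 convention."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distribution on 4 outcomes: maximal uncertainty, H = log2(4) = 2 bits.
h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])
# Degenerate distribution: no uncertainty, H = 0.
h_certain = shannon_entropy([1.0, 0.0, 0.0, 0.0])
```

The uniform distribution maximizes H, in line with Axiom 1, while a degenerate distribution carries no uncertainty at all.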
Expression (2) is known as Shannon's entropy or measure of uncertainty. Information theory has advanced in a variety of ways in many disciplines, and its applications can be found in pattern recognition, the social sciences, the management sciences, operations research, etc. Analogous to information theory, which is based on probability theory, fuzzy set theory was developed by Zadeh [73]. Fuzzy set theory gave a new dimension to set theory, which holds that an element either belongs to a set or does not; in logic a statement can be either true or false; in operations research a solution can be feasible or not. These dichotomies are not well justified in reality, and fuzzy sets model reality better than the traditional theories. Zadeh [73] wrote: "The notion of a fuzzy set provides a convenient point of departure for the construction of a conceptual framework which parallels in many respects the framework used in the case of ordinary sets, but is more general than the latter and, potentially, may prove to have a much wider scope of applicability, particularly in the fields of pattern classification and information processing. Essentially, such a framework provides a natural way of dealing with problems in which the source of imprecision is the absence of sharply defined criteria of class membership rather than the presence of random variables." Fuzzy information theory revolutionized research because uncertainty and fuzziness are present in human thinking and in various practical problems.
A fuzzy set A in the universe of discourse X is characterized by a membership function μ_A: X → [0, 1], where μ_A(x) gives the degree of membership of each x ∈ X in A and 1 - μ_A(x) its degree of non-membership. Zadeh [74] introduced fuzzy entropy as a measure of fuzzy information based on Shannon's entropy. De Luca and Termini [14] characterized fuzzy entropy and introduced the following set of properties that a fuzzy entropy should satisfy:
1. Fuzzy entropy is minimum iff the set is crisp.
2. Fuzzy entropy is maximum when every membership value is 0.5.
3. Fuzzy entropy decreases if the set is sharpened.
4. Fuzzy entropy of a set is the same as that of its complement.
Bhandari and Pal [5] gave various measures of fuzzy entropy and a measure of fuzzy divergence of a fuzzy set relative to some other fuzzy set. Let X be a universal set and F(X) the set of all fuzzy subsets of X. A mapping D: F(X) × F(X) → R is called a divergence between two fuzzy subsets if it satisfies the following properties for any A, B, C ∈ F(X):
1. D(A, B) is non-negative.
2. D(A, B) = D(B, A).
3. D(A, B) = 0 if A = B.
4. Max{D(A ∪ C, B ∪ C), D(A ∩ C, B ∩ C)} ≤ D(A, B).
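The De Luca-Termini entropy and a Bhandari-Pal style directed divergence can be sketched as follows. This is a hedged illustration: the use of the natural logarithm, and the requirement that memberships passed to the divergence lie strictly inside (0, 1) so that every logarithm is defined, are assumptions of this sketch:

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set given by its membership
    values: H(A) = -sum[m ln m + (1-m) ln(1-m)], with 0 ln 0 taken as 0."""
    def term(m):
        h = 0.0
        if m > 0.0:
            h -= m * math.log(m)
        if m < 1.0:
            h -= (1.0 - m) * math.log(1.0 - m)
        return h
    return sum(term(m) for m in mu)

def fuzzy_divergence(mu_a, mu_b):
    """Directed divergence D(A, B) in the Bhandari-Pal style; all membership
    values are assumed to lie strictly in (0, 1)."""
    return sum(a * math.log(a / b)
               + (1.0 - a) * math.log((1.0 - a) / (1.0 - b))
               for a, b in zip(mu_a, mu_b))

# A crisp set has zero entropy; entropy peaks when all memberships equal 0.5.
h_crisp = fuzzy_entropy([0.0, 1.0])
h_max = fuzzy_entropy([0.5, 0.5])
```

The sketch lets the four De Luca-Termini properties and the divergence axioms be checked numerically: the entropy of a crisp set is 0, sharpening lowers it, and D(A, A) = 0.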
In fuzzy set theory the non-membership value of an element is the complement of its membership value from one, but in practice this need not hold; this is dealt with by the higher-order fuzzy sets proposed by Atanassov [1], termed intuitionistic fuzzy sets (IFSs). They are found to be highly useful in dealing with vagueness and the hesitancy originating from inadequate information, and are characterized by two functions, for membership and for non-membership. An intuitionistic fuzzy set (IFS) A in the universe of discourse X is given as A = {⟨x, μ_A(x), ν_A(x)⟩ : x ∈ X}, where μ_A(x) ∈ [0, 1] and ν_A(x) ∈ [0, 1] denote the degree of membership and the degree of non-membership of x respectively, subject to 0 ≤ μ_A(x) + ν_A(x) ≤ 1. The number π_A(x) = 1 - μ_A(x) - ν_A(x) is called the hesitancy degree of x.
Szmidt and Kacprzyk [54] extended the properties of fuzzy entropy proposed by De Luca and Termini [14] to an entropy measure E on intuitionistic fuzzy sets, which must satisfy the following properties:
1. Intuitionistic fuzzy entropy is zero iff the set is crisp.
2. Intuitionistic fuzzy entropy is one iff the membership value equals the non-membership value for every element.
3. Intuitionistic fuzzy entropy decreases as the set gets sharpened, i.e. E(A) ≤ E(B) if A is less fuzzy than B, that is, μ_A(x) ≤ μ_B(x) and ν_A(x) ≥ ν_B(x) when μ_B(x) ≤ ν_B(x), or μ_A(x) ≥ μ_B(x) and ν_A(x) ≤ ν_B(x) when μ_B(x) ≥ ν_B(x).
4. Intuitionistic fuzzy entropy of a set is the same as that of its complement.
In an IFS the membership and non-membership values of an element are real numbers; this was overcome by interval-valued intuitionistic fuzzy sets (IvIFSs), Atanassov [2], which are described by membership and non-membership functions whose values are intervals, in consequence of which IvIFSs are more flexible and realistic in handling uncertainty and fuzziness. An IvIFS A in a finite universe X is expressed as A = {⟨x, μ_A(x), ν_A(x)⟩ : x ∈ X}, where the membership interval μ_A(x) ⊆ [0, 1] and the non-membership interval ν_A(x) ⊆ [0, 1] satisfy sup μ_A(x) + sup ν_A(x) ≤ 1 for each x ∈ X. IvIFSs are extensively used in many fields, such as medical diagnosis, pattern recognition, various socio-economic problems and multiple attribute decision making.
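One simple intuitionistic fuzzy entropy satisfying the four Szmidt-Kacprzyk requirements is the ratio form E(A) = (1/n) Σ (min(μ, ν) + π)/(max(μ, ν) + π). The sketch below is an illustration of such a measure in the spirit of Szmidt and Kacprzyk [55], not necessarily their exact published formula:

```python
def if_entropy(ifs):
    """Entropy of an intuitionistic fuzzy set given as a list of (mu, nu)
    pairs with mu + nu <= 1, using the ratio form
    E = (1/n) * sum (min(mu, nu) + pi) / (max(mu, nu) + pi),
    where pi = 1 - mu - nu is the hesitancy degree."""
    total = 0.0
    for mu, nu in ifs:
        pi = 1.0 - mu - nu
        total += (min(mu, nu) + pi) / (max(mu, nu) + pi)
    return total / len(ifs)
```

The four axioms are easy to verify on examples: a crisp element (1, 0) contributes 0, an element with μ = ν contributes 1, sharpening lowers the value, and swapping μ with ν (complementation) leaves it unchanged.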
Literature Review
Shannon [51] defined a measure of uncertainty for discrete and continuous probability distributions and proved various mathematical properties for both cases. However, it is very probable that Shannon's work would not have become famous without the help of Weaver [65], whose popular text "The Mathematics of Communication" re-interpreted Shannon's work for broader scientific audiences; that represents the beginning of the then so-called information theory. In his work, Weaver [65] went over and above Shannon's mathematical theory, mentioning not only the technical but also the semantic and effectiveness problems of communication. Other roots of information theory can be found in the works of Wiener [67] and Kolmogorov [30]. Parallel to probability theory, Zadeh discovered fuzzy set theory in 1965, although already in 1962 he had described the requirement of a scientific tool to handle real-world problems: "we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions. Indeed, the need for such mathematics is becoming increasingly apparent even in the realm of inanimate systems, for in most practical cases the a priori data as well as the criteria by which the performance of a man-made system are judged are far from being precisely specified or having accurately-known probability distributions." Ishikawa and Mieno [23] applied fuzzy entropy to the seal impression problem to measure the subjective value of information under the condition of uncertainty. Successive and direct segmentation methods are introduced for recognizing the pattern, and the successive method is found to be 2.32 times more cost-effective. Taneja et al. [57] gave two parametric generalizations of directed divergence (Kullback &
Leibler [32]), Jensen difference divergence (Burbea & Rao [10], [11]; Rao [47]) and Jeffreys' invariant divergence (Jeffreys [24]). These generalizations are put into a combined expression and their properties are studied. Lin [36] introduced a class of information-theoretic divergence measures based on Shannon entropy, characterized by non-negativity, finiteness, semiboundedness and boundedness. Bhandari & Pal [5] defined a divergence measure between fuzzy sets along with a set of properties. They also gave an information-theoretic measure of discrimination between two sets which reduces to the non-probabilistic measure of entropy given by De Luca and Termini [14]. Further, Renyi's [48] probabilistic entropy of order α has been extended to define a non-probabilistic entropy of a fuzzy set with various properties, and applications of these measures to clustering, image processing, etc. are discussed. Pal & Bezdek [40] introduced two classes of measures for discrete fuzzy sets, the separable additive and multiplicative classes: the additive class uses non-negative concave functions and the multiplicative class non-negative, monotonically increasing concave functions; they also introduced the concept of weighted fuzzy entropy. Klir & Harmanec [29] said that "In the near future, most research in the area of generalized information theory will probably focus on the investigation of the various optimization problems whose formulation and efficient solution methods are essential to make the principles of minimum and maximum uncertainty operational for dealing with problems that are beyond the scope of classical information theory." Criado & Gachechiladze [13] introduced Zadeh's entropy, structurally related it to the weighted entropy of De Luca and Termini [14], and also established its connection with Shannon's entropy. Montes & Gil [38] studied a class of divergences between fuzzy sets and a class of divergences between fuzzy partitions. Yager [68] gave a measure of entropy associated with a fuzzy measure and described its manifestation for different fuzzy measures; the problem of uncertain decision making for the case in which the uncertainty is represented by a fuzzy measure is considered, and the Choquet integral is introduced as providing a generalization of the expected value to this environment. Szmidt & Kacprzyk [54] gave a non-probabilistic entropy measure for IFSs, with IFSs interpreted geometrically. Montes et al. [39] proposed an axiomatic framework to measure the difference between fuzzy sets and studied the case of local divergence. Fan & Ma [16] considered fuzzy entropy based on the axiomatic definition of fuzzy entropy and introduced some more general fuzzy entropy formulas; in addition, they studied fuzzy entropy induced by distance measures. Qing & Li [46] proved the equivalence of some fuzzy entropy formulae, discussed the relations between these formulae and distance measures, and further gave a class of fuzzy entropy formulae. Hooda [18] gave two generalized measures of fuzzy entropy and characterized a new measure of R-normed fuzzy entropy; some generalized measures of fuzzy directed and symmetric divergence are studied, and particular cases of the generalized and R-normed fuzzy entropies have been obtained. Kao & Lin [28] divided a fuzzy number into two parts, the position and the fuzziness: the former is represented by the elements with membership value 1 and the latter by the entropy of the fuzzy number; both have crisp values.
Given a set of fuzzy descriptive variables, the position and entropy of the projected fuzzy response are calculated from the regression model; via the one-to-one correspondence between a fuzzy number and its entropy, the projected fuzzy response is then obtained.
Szmidt & Kacprzyk [55], [56] proposed a measure of entropy for IFSs, simplified the distance calculated in Szmidt & Kacprzyk [53] using the Hamming distance, and proved properties of the formula when the Hamming distance is applied. Hung & Yang [22] presented two families of entropy for IFSs and their corresponding properties; further, they compared the proposed entropies with those given by Burillo and Bustince [9] and Szmidt and Kacprzyk [54] and found the proposed entropy measure more consistent in representing the degree of fuzziness. Parkash et al. [44] developed two new measures of fuzzy directed divergence and discussed their properties. Vlachos & Sergiadis [63] gave an information-theoretic approach to measuring discrimination between IFSs; a symmetric discrimination information measure was proposed and the notion of intuitionistic fuzzy cross-entropy introduced. Furthermore, a generalized version of the De Luca-Termini non-probabilistic entropy was derived for IFSs, and based on this generalization a connection between the concepts of fuzziness and entropy in the fuzzy and the intuitionistic fuzzy settings was established. Finally, they demonstrated the efficiency of the proposed symmetric discrimination information measure in the context of pattern recognition, medical diagnosis and image segmentation. Parkash et al. [45] gave two new measures of weighted fuzzy entropy, and the proposed measures were used for the study of optimization principles when certain partial information is available; further, the existing as well as the newly introduced weighted measures of fuzzy entropy were applied to study the maximum entropy principle. Chaira & Ray [12] defined a new intuitionistic fuzzy divergence and used the proposed distance measure for edge detection, with results found to be better than those of previous methods. Seising [50] dealt with the connection between information theory and fuzzy set theory.
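A symmetric discrimination measure of the kind discussed by Vlachos & Sergiadis [63] can be sketched as below. This is a hedged, J-divergence-style illustration over membership and non-membership values, not necessarily the exact published formula; the averaged denominator is used so that no division by zero can occur:

```python
import math

def _kl_term(a, b):
    # KL-type term with the average of a and b in the denominator, so the
    # argument of the logarithm is always well defined; 0 * log(...) := 0.
    return a * math.log(2.0 * a / (a + b)) if a > 0.0 else 0.0

def if_discrimination(A, B):
    """Directed discrimination between IFSs A and B, each a list of
    (mu, nu) pairs, summed over memberships and non-memberships."""
    return sum(_kl_term(ma, mb) + _kl_term(na, nb)
               for (ma, na), (mb, nb) in zip(A, B))

def symmetric_if_discrimination(A, B):
    """Symmetric discrimination: D(A, B) + D(B, A)."""
    return if_discrimination(A, B) + if_discrimination(B, A)
```

The symmetric form vanishes exactly when A = B and is positive otherwise, which is what makes such measures usable for pattern recognition and medical diagnosis.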
Hooda & Jain [20] defined and characterized two fuzzy information measures which are sub-additive in nature. Parkash & Gandhi [42] defined two fuzzy entropy measures involving trigonometric functions and applied them to results available in geometry. Verma & Sharma [60] introduced an exponential fuzzy entropy of order α for fuzzy sets, based on Pal & Pal [41] and De Luca and Termini [14], and proved some properties of this measure.
Bajaj & Hooda [4] proposed two generalized measures of fuzzy directed divergence and proved their validity. Ye [69] introduced two measures of intuitionistic fuzzy entropy on intuitionistic fuzzy information sets and proved their essential properties; finally, a numerical example shows, by comparison of the proposed and existing entropies, that the proposed intuitionistic fuzzy entropy measures are reasonable and effective. Hooda & Bajaj [19] developed a concept of useful fuzzy information, based on utility; a useful fuzzy directed divergence measure is validated, and constrained optimization of useful fuzzy entropy and useful fuzzy directed divergence is studied. Kumar et al. [34] introduced two generalized parametric measures of fuzzy directed divergence and studied their properties. Deshmukh et al. [15] proposed some generalized measures of fuzzy entropy based upon real parameters and discussed their properties. Jha et al. [25] developed measures of intuitionistic fuzzy directed divergence based on Havrda & Charvat [17] and Kullback [31], on the Renyi measure [48] of directed divergence, and also on Sharma and Mittal's [49] directed divergence. Parkash & Gandhi [43] defined and inspected two generalized fuzzy measures with essential properties. Kumar et al. [33] introduced two parametric generalizations of existing measures of fuzzy information and two parametric directed divergence measures with their validity; they studied some measures of total ambiguity and new generalized measures of fuzzy information improvement, and further discussed particular cases of the fuzzy entropy and directed divergence measures. Verma & Sharma [61] introduced a generalized divergence measure with a flexible parameter, based on the divergence measure proposed by Wei & Ye [66] for IFSs, proved properties of the corresponding measure, and demonstrated the role of the flexible parameter in multi-criteria decision making.
Bhatia & Singh [6] proposed arithmetic-geometric divergence, parametric, unified and generalized arithmetic-geometric measures of fuzzy directed divergence and proved various properties related to these measures; further, they normalized all the proposed measures.
Sun & Liu [52] developed two entropy measures for IvIFSs satisfying all the validating properties and also defined a similarity measure for IvIFSs. Wang [64] proposed and analyzed some entropy and similarity measures for intuitionistic fuzzy sets. Jha & Mishra [26] gave some trigonometric, hyperbolic and exponential measures of fuzzy entropy and directed divergence. Bhatia & Singh [8] developed an approach to defining new fuzzy directed divergences, defined a measure of fuzzy directed divergence with its mathematical properties, and applied it to image segmentation. Bhatia & Singh [7] presented three divergence measures between fuzzy sets, established their properties, and examined the relation of the divergence measures with aggregation operations. Ye [70] developed two multiple attribute group decision-making methods with unknown weights of both experts and attributes in the intuitionistic fuzzy and interval-valued intuitionistic fuzzy settings: based on entropy-weight algorithms, the weights of the experts and the attributes are derived from the decision matrices represented as IFSs or IvIFSs; further, two evaluation formulas for the weighted correlation coefficients between alternatives are introduced to rank the alternatives and select the most desirable one. Verma & Sharma [62] devised an exponential intuitionistic fuzzy entropy measure and proved various properties of the measure. Yu [71] investigated an intuitionistic fuzzy aggregation method introducing belief levels; the generalized belief intuitionistic fuzzy weighted averaging operator and ordered weighted averaging operator are introduced, with a real example of anonymous review at the National Science Foundation of China to show the effectiveness of the proposed method. Luo et al. [37] investigated multiple attribute decision making (MADM) problems with attribute values given in the form of intuitionistic fuzzy information and uncertain attribute weights.
They also suggested that the proposed model can be extended to solve MADM problems with interval-valued intuitionistic fuzzy information and uncertain attribute weights. Hu & Li [21] studied the relationship between entropy and similarity measures of interval-valued intuitionistic fuzzy sets, gave methods to describe the entropy of an interval-valued intuitionistic fuzzy set in terms of its similarity measure, and further investigated some sufficient conditions for transformations from an entropy measure to a similarity measure, and vice versa, for interval-valued intuitionistic fuzzy sets.
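The entropy-weight idea used in the decision-making methods above can be sketched as follows. This is a simplified illustration under stated assumptions: the ratio-form entropy and the weighting rule w_j = (1 - e_j) / Σ(1 - e_k) are common choices in this literature, not necessarily the exact formulas of Ye [70]:

```python
def if_entropy(column):
    """Ratio-form entropy of a column of (mu, nu) ratings,
    with pi = 1 - mu - nu the hesitancy degree."""
    total = 0.0
    for mu, nu in column:
        pi = 1.0 - mu - nu
        total += (min(mu, nu) + pi) / (max(mu, nu) + pi)
    return total / len(column)

def entropy_weights(decision_matrix):
    """decision_matrix[i][j] = (mu, nu) rating of alternative i on
    attribute j. Attributes whose ratings carry less entropy (i.e. more
    discriminating power) receive larger weights."""
    n_attr = len(decision_matrix[0])
    entropies = [if_entropy([row[j] for row in decision_matrix])
                 for j in range(n_attr)]
    divergences = [1.0 - e for e in entropies]
    total = sum(divergences)
    return [d / total for d in divergences]

# Two alternatives rated on two attributes: the first attribute's ratings
# are nearly crisp, the second's are highly fuzzy.
matrix = [[(0.9, 0.1), (0.5, 0.4)],
          [(0.8, 0.1), (0.4, 0.5)]]
weights = entropy_weights(matrix)
```

In this example the first attribute, whose ratings are the least fuzzy, receives the larger weight, which is exactly the behaviour that entropy-weight MADM methods rely on.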
Objectives
1. To study various entropy measures for fuzzy sets and intuitionistic fuzzy sets.
2. To compute various generalizations of entropy measures on intuitionistic fuzzy sets.
3. To find various applications of intuitionistic fuzzy information theory in decision making.
Methodology
After formulating the basic problem of fuzzy and intuitionistic fuzzy information theory, its generalizations will be employed depending on the nature of the problem. MATLAB ("The Language of Technical Computing") will be used for making graphical representations of the various entropies. No specific equipment or instrument is required for this thesis. Extensive use will be made of library facilities and of reviews of various research papers in this discipline.
REFERENCES
[1] Atanassov, K. T. (1986), Intuitionistic fuzzy sets, Fuzzy Sets and Systems, Vol. 20(1), 87-96. [2] Atanassov, K. T. (1999), Intuitionistic Fuzzy Sets: Theory and Applications, Physica-Verlag, Heidelberg, New York. [3] Bajaj, R. K. & Hooda, D. S. (2010), Generalized Measures of Fuzzy Directed-Divergence, Total Ambiguity and Information Improvement, Journal of Applied Mathematics, Statistics and Informatics (JAMSI), Vol. 6(2), 31-44. [4] Bajaj, R. K. & Hooda, D. S. (2010), On Some New Generalized Measures of Fuzzy Information, World Academy of Science, Engineering and Technology, Vol. 38, 747-752. [5] Bhandari, D. & Pal, N. R. (1993), Some new information measures for fuzzy sets, Information Sciences, Vol. 67, 204-228.
[6] Bhatia, P. K. & Singh, S. (2012), Three Families of Generalized Fuzzy Directed Divergence, AMO-Advanced Modeling and Optimization, Vol. 14(3), 599-614. [7] Bhatia, P. K. & Singh, S. (2013), On Some Divergence Measures between Fuzzy Sets and Aggregation Operations, AMO-Advanced Modeling and Optimization, Vol. 15(2), 235-248. [8] Bhatia, P. K. & Singh, S. (2013), A New Measure of Fuzzy Directed Divergence and Its Application in Image Segmentation, I.J. Intelligent Systems and Applications, Vol. 04, 81-89. [9] Burillo, P. & Bustince, H. (1996), Entropy on intuitionistic fuzzy sets and interval valued fuzzy sets, Fuzzy Sets and Systems, Vol. 78, 305-316. [10] Burbea, J. & Rao, C. R. (1982), Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach, J. Multi. Analy., Vol. 12, 575-596. [11] Burbea, J. & Rao, C. R. (1982), On the Convexity of Some Divergence Measures Based on Entropy Functions, IEEE Transactions on Information Theory, Vol. IT-28, 489-495. [12] Chaira, T. & Ray, A. K. (2008), A new measure using intuitionistic fuzzy set theory and its application to edge detection, Journal of Applied Soft Computing, Vol. 8(2), 919-927. [13] Criado, F. & Gachechiladze, T. (1997), Entropy of Fuzzy Events, Fuzzy Sets and Systems, Vol. 88, 99-106. [14] De Luca, A. & Termini, S. (1972), A Definition of a Non-probabilistic Entropy in the Setting of Fuzzy Sets, Information and Control, Vol. 20, 301-312. [15] Deshmukh, K. C., Khot, P. G. & Nikhil (2011), Generalized Measures of Fuzzy Entropy and their Properties, International Journal of Engineering and Natural Sciences, Vol. 5(3), 124-128. [16] Fan, J. L. & Ma, Y. L. (2002), Some New Fuzzy Entropy Formulas, Fuzzy Sets and Systems, Vol. 128, 277-284. [17] Havrda, J. & Charvat, F. (1967), Quantification methods of classification processes: Concept of structural entropy, Kybernetika, Vol. 3, 30-35. [18] Hooda, D. S. (2004), On Generalized Measures of Fuzzy Entropy, Mathematica Slovaca, Vol.
54 (3), 315-325.
[19] Hooda, D. S. & Bajaj, R. K. (2010), Useful Fuzzy Measures of Information, Integrated Ambiguity and Directed Divergence, International Journal of General Systems, Vol. 39(6), 647-658. [20] Hooda, D. S. & Jain, D. (2009), Sub Additive Measures of Fuzzy Information, Journal of Reliability and Statistical Studies, Vol. 2(2), 39-52. [21] Hu, K. & Li, J. (2013), The Entropy and Similarity Measure of Interval Valued Intuitionistic Fuzzy Sets and Their Relationship, International Journal of Fuzzy Systems, Vol. 15(3), 279-288. [22] Hung, W. L. & Yang, M. S. (2006), Fuzzy Entropy on Intuitionistic Fuzzy Sets, International Journal of Intelligent Systems, Vol. 21, 443-451. [23] Ishikawa, A. & Mieno, H. (1979), The Fuzzy Entropy Concept and Its Application, Fuzzy Sets and Systems, Vol. 2, 113-123. [24] Jeffreys, H. (1946), An Invariant Form of the Prior Probability in Estimation Problems, Proc. Royal Soc., Ser. A, Vol. 186, 453-461. [25] Jha, P., Jha, M. & Mishra, V. K. (2011), On Some New Measures of Intuitionistic Fuzzy Entropy and Directed Divergence, Global Journal of Mathematical Sciences: Theory and Practical, Vol. 3(5), 473-480. [26] Jha, P. & Mishra, V. K. (2012), Some New Trigonometric, Hyperbolic and Exponential Measures of Fuzzy Entropy and Fuzzy Directed Divergence, International Journal of Scientific & Engineering Research, Vol. 3(4), 1-5. [27] Jha, P. & Mishra, V. K. (2013), An Unorthodox Parametric Measure of Information and Corresponding Measure of Intuitionistic Fuzzy Information, Bulletin of Mathematical Sciences & Applications, Vol. 2(1), 29-32. [28] Kao, C. & Lin, P. H. (2005), Entropy for fuzzy regression analysis, International Journal of Systems Science, Vol. 36(14), 869-876. [29] Klir, G. J. & Harmanec, D. (1996), Generalized Information Theory: Recent Developments, Kybernetes, Vol. 25(7/8), 50-67. [30] Kolmogorov, A. N. (1941), Interpolation and Extrapolation, Bulletin de l'Académie des Sciences de l'URSS, Série Mathématique, Vol. 5, 3-14.
[31] Kullback, S. (1959), Information Theory and Statistics, Wiley and Sons, New Delhi.
[32] Kullback, S. & Leibler, R. A. (1951), On Information and Sufficiency, The Annals of Mathematical Statistics, Vol. 22(1), 79-86. [33] Kumar, T., Bajaj, R. K. & Gupta, N. (2011), On Some Parametric Generalized Measures of Fuzzy Information, Directed Divergence and Information Improvement, International Journal of Computer Applications, Vol. 30(9), 5-10. [34] Kumar, A., Mahajan, S. & Kumar, R. (2011), Some New Generalized Measures of Fuzzy Divergence, International Journal of Mathematical Sciences and Applications, Vol. 2, 1-9. [35] Lee, S. & Park, W. (2012), Uncertainty Evaluation via Fuzzy Entropy for Multiple Facts, International Journal on Information Management (IJIM), Vol. 1(1), 14-19. [36] Lin, J. (1991), Divergence Measures Based on Shannon Entropy, IEEE Transactions on Information Theory, Vol. 37(1), 145-151. [37] Luo, Y., Li, X., Yang, Y. & Liu, Z. (2013), Some Models for Multiple Attribute Decision Making with Intuitionistic Fuzzy Information and Uncertain Weights, International Journal of Computer Science Issues, Vol. 10, Issue 1(3), 262-266. [38] Montes, S. & Gil, P. (1998), Some Classes of Divergence Measures Between Fuzzy Subsets and Between Fuzzy Partitions, Mathware & Soft Computing, Vol. 5, 253-265. [39] Montes, S., Couso, I., Gil, P. & Bertoluzza, C. (2002), Divergence Measure Between Fuzzy Sets, International Journal of Approximate Reasoning, Vol. 30, 91-105. [40] Pal, N. R. & Bezdek, J. C. (1994), Measuring Fuzzy Uncertainty, IEEE Transactions on Fuzzy Systems, Vol. 2(2), 107-118. [41] Pal, N. R. & Pal, S. K. (1991), Entropy: A New Definition and Its Applications, IEEE Transactions on Systems, Man and Cybernetics, Vol. 21(5), 1260-1270. [42] Parkash, O. & Gandhi, C. P. (2010), Applications of Trigonometric Measures of Fuzzy Entropy to Geometry, World Academy of Science, Engineering and Technology, Vol. 37, 468-471. [43] Parkash, O. & Gandhi, C. P.
(2011), New Generalized Measure of Fuzzy Entropy and Their Properties, Journal of Informatics and Mathematical Sciences, Vol.3 (1), 19. [44] Parkash, O., Sharma, P. K. & Kumar, S. (2006), Two New Measures of Fuzzy Divergence and Their Properties SQU Journal for Science, Vol. 11, 69-77.
[45] Parkash, O., Sharma, P. K. & Mahajan, R. (2008), New Measures of Weighted Fuzzy Entropy and Their Applications for the Study of Maximum Weighted Fuzzy Entropy Principle, Information Sciences, Vol. 178, 2389-2395. [46] Qing, M. & Li, T. (2004), Some Properties and New Formulae of Fuzzy Entropy, Proceedings of the 2004 IEEE International Conference on Networking, Sensing & Control, Taipei, Taiwan, March 21-23, 401-406. [47] Rao, C. R. (1982), Diversity and Dissimilarity Coefficients: A Unified Approach, J. Theoret. Popul. Biology, Vol. 21, 24-43. [48] Renyi, A. (1961), On Measures of Entropy and Information, Proceedings of the Fourth Berkeley Symposium on Mathematics, Statistics and Probability, Vol. I, University of California Press, Berkeley, Calif., 541-561. [49] Sharma, B. D. & Mittal, D. P. (1975), New non-additive measures of entropy for discrete probability distributions, J. Math. Sci. (Calcutta), Vol. 10, 28-40. [50] Seising, R. (2009), 60 Years "A Mathematical Theory of Communication": Towards a Fuzzy Information Theory, IFSA-EUSFLAT 2009, 1332-1337. [51] Shannon, C. E. (1948), A Mathematical Theory of Communication, The Bell System Technical Journal, Vol. 27, 379-423. [52] Sun, M. & Liu, J. (2012), New Entropy and Similarity Measures for Interval-valued Intuitionistic Fuzzy Sets, Journal of Information & Computational Science, Vol. 9(18), 5799-5806. [53] Szmidt, E. & Kacprzyk, J. (2000), Distances between intuitionistic fuzzy sets, Fuzzy Sets and Systems, Vol. 114, 505-518. [54] Szmidt, E. & Kacprzyk, J. (2001), Entropy for Intuitionistic Fuzzy Sets, Fuzzy Sets and Systems, Vol. 118, 467-477. [55] Szmidt, E. & Kacprzyk, J. (2005), A New Measure of Entropy and Its Connection with a Similarity Measure for Intuitionistic Fuzzy Sets, EUSFLAT-LFA 2005, 461-466. [56] Szmidt, E. & Kacprzyk, J. (2005), New measures of entropy for intuitionistic fuzzy sets, Ninth Int. Conf. on IFSs, Sofia, 7-8 May 2005, NIFS, Vol. 11(2), 12-20. [57] Taneja, I. J., Pardo, L., Morales, D.
& Menendez, M. L. (1989), On Generalized Information Measures and Their Applications: A Brief Review, Vol. 13, 47-73.
[58] Tuli, R. K. & Sharma, C. S. (2012), Two New Weighted Measures of Fuzzy Entropy and Their Properties, American Jr. of Mathematics and Sciences, Vol. 1(1), 107-111. [59] Verma, R. K., Dewangan, C. L. & Jha, P. (2012), An Unorthodox Parametric Measures of Information and Corresponding Measures of Fuzzy Information, International Journal of Pure and Applied Mathematics, Vol. 76(4), 599-614. [60] Verma, R. & Sharma, B. D. (2011), On Generalized Exponential Fuzzy Entropy, World Academy of Science, Engineering and Technology, Vol. 60, 1402-1405. [61] Verma, R. & Sharma, B. D. (2012), On Generalized Intuitionistic Fuzzy Divergence (Relative Information) and Their Properties, Journal of Uncertain Systems, Vol. 6(4), 308-320. [62] Verma, R. & Sharma, B. D. (2013), Exponential entropy on intuitionistic fuzzy sets, Kybernetika, Vol. 49(1), 114-127. [63] Vlachos, I. K. & Sergiadis, G. D. (2007), Intuitionistic fuzzy information - Applications to pattern recognition, Pattern Recognition Letters, Vol. 28, 197-206. [64] Wang, H. (2012), Fuzzy Entropy and Similarity Measure for Intuitionistic Fuzzy Sets, International Conference on Mechanical Engineering and Automation, Advances in Biomedical Engineering, Vol. 10, 84-89. [65] Weaver, W. (1949), The Mathematics of Communication, Scientific American, Vol. 181, 11-15. [66] Wei, P. & Ye, J. (2010), Improved Intuitionistic Fuzzy Cross-Entropy and Its Application to Pattern Recognitions, International Conference on Intelligent Systems and Knowledge Engineering, 114-116. [67] Wiener, N. (1948), Cybernetics or Control and Communication in the Animal and the Machine, Cambridge, Massachusetts: MIT Press. [68] Yager, R. R. (2001), A General Approach to Uncertainty Representation Using Fuzzy Measures, FLAIRS-01 Proceedings, Association for the Advancement of Artificial Intelligence (aaai.org), 619-624. [69] Ye, J. (2010), Two Effective Measures of Intuitionistic Fuzzy Entropy, Computing, Vol. 87, 55-62.
[70] Ye, J. (2013), Multiple Attribute Group Decision-Making Methods with Completely Unknown Weights in Intuitionistic Fuzzy Setting and Interval-Valued Intuitionistic Fuzzy Setting, Group Decision and Negotiation, Vol. 22, 173-188. [71] Yu, D. (2013), Intuitionistic Fuzzy Information Aggregation and Its Application on Multi-Criteria Decision-Making, Journal of Industrial and Production Engineering, Vol. 30(4), 281-290. [72] Zadeh, L. A. (1962), From Circuit Theory to System Theory, Proceedings of the IRE, Vol. 50, 856-865. [73] Zadeh, L. A. (1965), Fuzzy Sets, Information and Control, Vol. 8, 338-353. [74] Zadeh, L. A. (1968), Probability Measures of Fuzzy Events, J. Math. Anal. Appl., Vol. 23, 421-427.