

Article
The Consistency between Cross-Entropy and
Distance Measures in Fuzzy Sets
Yameng Wang, Han Yang and Keyun Qin *
College of Mathematics, Southwest Jiaotong University, Chengdu 610031, China;
[email protected] (Y.W.); [email protected] (H.Y.)
* Correspondence: [email protected].

Received: 14 February 2019; Accepted: 12 March 2019; Published: 16 March 2019
Symmetry 2019, 11, 386; doi:10.3390/sym11030386

Abstract: The processing of uncertain information is increasingly becoming a hot topic in the artificial intelligence field, and the information measures of uncertain information processing are also becoming increasingly important. In the process of decision-making, decision-makers mostly make decisions according to information measures such as similarity, distance, entropy, and cross-entropy in order to choose the best alternative. However, we found that many researchers apply cross-entropy to multi-attribute decision-making according to the minimum principle, which is in accordance with the principle of distance measures: among all the alternatives, the one with the smallest cross-entropy (distance) from the ideal one is finally chosen. However, the relation between cross-entropy and distance measures in fuzzy sets or neutrosophic sets has not yet been verified. In this paper, we mainly consider the relation between the discrimination measure of fuzzy sets and distance measures. We found that the fuzzy discrimination satisfies all the conditions of a distance measure; that is to say, the fuzzy discrimination is consistent with distance measures. We also found that the improved fuzzy cross-entropy based on the fuzzy discrimination satisfies all the conditions of a distance measure, and we finally prove that cross-entropy, including fuzzy cross-entropy and neutrosophic cross-entropy, is also a distance measure.

Keywords: discrimination; cross-entropy; distance measure; consistency

1. Introduction
In the real world, there exists much uncertain, imprecise, and incomplete information, and many tools have been developed to handle it. Zadeh [1] first proposed the concept of a fuzzy set, which, defined by a membership function, is used to depict the membership value of one object to a set. Atanassov [2] proposed the intuitionistic fuzzy set (IFS), described by two functions: a membership function depicting the membership value, and a non-membership function depicting the non-membership value of one object to the intuitionistic fuzzy set. The intuitionistic fuzzy set is the extension of a fuzzy set obtained by adding a non-membership function, and it provides a more flexible mathematical framework for processing uncertain, imprecise, and incomplete information. Smarandache [3] first proposed the notions of neutrosophy and the neutrosophic set in 1998. The neutrosophic set is defined by a truth-membership function, an indeterminacy-membership function, and a falsity-membership function, and is thus comprised of a membership value, a non-membership value, and an indeterminacy value. Intuitionistic fuzzy information measures have been successfully applied to pattern recognition and image processing by Vlachos and Sergiadis [4]. Subsequently, Wang et al. [5] put forward the definition of the single-value neutrosophic set (SVNS) and some operations for better application in real scientific and engineering fields. A single-value neutrosophic set is an extension of a fuzzy set, and SVNS theory also provides us with a more convenient tool for uncertain information processing. Recently, some researchers have devoted themselves to the study of
the single-value neutrosophic set theory and its applications, and have achieved some successful results in several fields. Zhang et al. [6–10] have done a great deal of research on neutrosophic sets, proposing a new kind of inclusion relation and new operations on SVNSs; furthermore, they also discussed the algebraic structure and some applications in algebraic systems.
Information measures are essential to decision-making in information processing, and include the similarity function, the distance or divergence function, entropy, and cross-entropy. These information measures are widely applied in areas such as image processing, clustering, and pattern recognition. Liu et al. [11] applied single-value neutrosophic numbers to the Decision-Making Trial and Evaluation Laboratory method, and consequently presented the SVNN-DEMATEL (single-value neutrosophic number Decision-Making Trial and Evaluation Laboratory) model. Mukhametzyanov et al. [12] provided an analysis of some multi-criteria decision-making (MCDM) methods and the final selection, and presented a result consistency evaluation model. Tu et al. [13] introduced some simplified neutrosophic symmetry measures and applied them to decision-making. The similarity function is mainly used to measure the level of similarity between two objects. Entropy is usually used to depict the degree of uncertainty of one object, and is very important for measuring uncertain information. Cross-entropy can depict the discrimination degree of two objects, from which we can judge their relation. Therefore, cross-entropy has many applications in information measures, decision-making, pattern recognition, and so on. Zadeh [14] first proposed the entropy of fuzzy events based on Shannon entropy. Kullback [15] was concerned with an information measure known as "distance" or "divergence", depicting the relationship between two probability distributions; it can therefore serve as an information measure indicating the degree of discrimination. Furthermore, a new kind of information measure called the "cross-entropy distance" between two probability distributions was introduced by Kullback. De Luca and Termini [16] introduced the notion of fuzzy entropy and some axioms to express the fuzziness degree of a fuzzy set, according to Shannon's function. The fuzzy entropy was then generalized to interval-valued fuzzy sets and intuitionistic fuzzy sets by Burillo and Bustince [17]. Szmidt and Kacprzyk [18] defined intuitionistic fuzzy entropy in a new way. Wei et al. [19] proposed interval-valued intuitionistic fuzzy entropy. Since cross-entropy can be used to depict the degree of discrimination between two objects, many researchers have modified cross-entropy measures. For example, Lin [20] proposed a divergence based on Shannon entropy, which is a type of modified fuzzy cross-entropy. Bhandari and Pal [21] introduced the fuzzy divergence between two fuzzy sets. Shang and Jiang [22] put forward fuzzy cross-entropy and a symmetric discrimination measure, which improve on fuzzy divergence and can be used to describe the discrimination degree between two fuzzy sets. Vlachos and Sergiadis [4] presented intuitionistic fuzzy cross-entropy, and also found a connection between fuzzy entropy and intuitionistic fuzzy entropy in terms of fuzziness and intuitionism. Verma [23,24] introduced generalized divergence measures, which are information measures that can depict the discrimination degree. Cross-entropy measures were generalized to single-valued neutrosophic sets and applied to multi-criteria decision-making by Ye [25]. Since then, Şahin [26] has continued to generalize the cross-entropy measure to interval neutrosophic sets, and introduced its application in multi-criteria decision-making.
In our study, we found that the fuzzy discrimination proposed by Bhandari and Pal [21], the improved fuzzy cross-entropy of Shang and Jiang [22], and the neutrosophic cross-entropy introduced by Ye [25] all have properties similar to those of distance measures, such as non-negativity, symmetry, and the property that the cross-entropy (distance) between two fuzzy sets is 0 if, and only if, the two fuzzy sets are completely equal. Furthermore, the decision principles of cross-entropy and of distance applied in decision-making are also the same; that is to say, in the process of decision-making, among all the choices, we finally choose the one with the smallest cross-entropy (distance) from the ideal one. Based on the above analysis, we set out to study the relationship between cross-entropy and distance measures, about which there has been no previous research. We mainly prove that the fuzzy discrimination, the improved fuzzy cross-entropy, and the neutrosophic cross-entropy based on discrimination are in fact distance measures, and we present complete proofs that they satisfy all the conditions of a distance measure. In Section 2, we mainly introduce some relevant knowledge and prove that the fuzzy discrimination measure satisfies all the conditions of a distance measure, i.e., it is actually a kind of distance measure. In Section 3, we mainly prove that the fuzzy cross-entropy satisfies all the conditions of a distance measure, and in Section 4 we show that cross-entropy in single-value neutrosophic sets is also a kind of distance; that is to say, the cross-entropy measure is consistent with distance measures.

2. Fuzzy Discrimination Is Consistent with Distance Measure


Let $X$ be a universe of discourse. A fuzzy set $A$ is characterized by a membership function $\mu_A(x)$, which is used to express the degree of belongingness of a point $x \in X$ to the set $A$; for all $x \in X$, $\mu_A(x) \in [0, 1]$. When $\mu_A(x) \in \{0, 1\}$ for all $x \in X$, $A$ reduces to a crisp set.

Definition 1 ([1]). Let $X$ be a universe of discourse. A fuzzy set $\widetilde{A}$ defined on $X$ is given as:

$$\widetilde{A} = \{\langle x, \mu_{\widetilde{A}}(x)\rangle \mid x \in X\}$$

where $\mu_{\widetilde{A}} : X \to [0, 1]$, and every point has a membership value expressing its degree of belongingness to the set.

Let $FS(X)$ be the set of all fuzzy sets on $X$. The following are some properties of fuzzy sets $M, N, T \in FS(X)$:

(1) if $M \subseteq N$, then $\mu_M(x) \le \mu_N(x)$ for all $x \in X$;
(2) if $M \subseteq N$ and $N \subseteq M$, then $M = N$;
(3) if $M \subseteq N$ and $N \subseteq T$, then $M \subseteq T$.

Definition 2 ([21]). Bhandari and Pal proposed the fuzzy discrimination measure, which is used to express the degree of discrimination in favor of $M$ against $N$. For $M, N \in FS(X)$ with $X = \{x_1, x_2, \ldots, x_n\}$, it is defined as follows:

$$I(M, N) = \sum_{i=1}^{n}\left(\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\mu_N(x_i)} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\mu_N(x_i)}\right)$$

It is obvious that $I(M, N) \ge 0$, and $I(M, N) = 0$ if, and only if, $M = N$. Bhandari and Pal [21] also defined $E(M, N) = I(M, N) + I(N, M)$ for symmetry.
From the above, the fuzzy discrimination has properties similar to those of distance measures (except for one axiom of the distance measure), and it obeys a decision principle corresponding to the principle of minimum cross-entropy. In other words, there exists an ideal solution $A$; because it is generally unlikely to be attainable in a real situation, we consider the feasible solutions, denoted by $B, C, \ldots$, and compute their cross-entropies to $A$. We finally choose the smallest cross-entropy, and the solution corresponding to the smallest cross-entropy is the optimal one. The distance measure follows the same principle as the cross-entropy, as illustrated by the sketch below.
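To make this minimum principle concrete, the following is a minimal Python sketch (our own illustration, not part of the original paper); the membership vectors for the ideal solution and the candidates are hypothetical values chosen strictly inside (0, 1) so the logarithms stay finite:

```python
import numpy as np

def discrimination(m, n):
    """Fuzzy discrimination I(M, N) of Definition 2 for membership vectors m, n.
    Assumes all membership values lie strictly in (0, 1)."""
    m, n = np.asarray(m, float), np.asarray(n, float)
    return np.sum(m * np.log(m / n) + (1 - m) * np.log((1 - m) / (1 - n)))

def E(m, n):
    """Symmetric fuzzy discrimination E(M, N) = I(M, N) + I(N, M)."""
    return discrimination(m, n) + discrimination(n, m)

# Minimum principle: among all candidates, choose the one with the
# smallest symmetric discrimination (distance) from the ideal solution A.
A = [0.9, 0.85, 0.8]                                   # hypothetical ideal solution
candidates = {"B": [0.7, 0.6, 0.9], "C": [0.85, 0.8, 0.7]}
best = min(candidates, key=lambda name: E(A, candidates[name]))
print(best, {k: round(E(A, v), 4) for k, v in candidates.items()})
```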

Definition 3. A function $D : FS(X) \times FS(X) \to [0, 1]$ is called a distance measure on $FS(X)$ if the following conditions [10] are satisfied for $A, B, C \in FS(X)$:

(1) $0 \le D(A, B) \le 1$;
(2) $D(A, B) = 0$ if, and only if, $A = B$;
(3) $D(A, B) = D(B, A)$;
(4) if $A \subseteq B \subseteq C$, then $D(A, C) \ge D(A, B)$ and $D(A, C) \ge D(B, C)$.

Considering that the discrimination measure may take unbounded values, we redefine the fuzzy distance measure as follows:

Definition 4. A function $FD : FS(X) \times FS(X) \to [0, \infty)$ is called a fuzzy distance measure if the following conditions are satisfied for $A, B, C \in FS(X)$:

(1) $FD(A, B) \ge 0$;
(2) $FD(A, B) = 0$ if, and only if, $A = B$;
(3) $FD(A, B) = FD(B, A)$;
(4) if $A \subseteq B \subseteq C$, then $FD(A, C) \ge FD(A, B)$ and $FD(A, C) \ge FD(B, C)$.

It is obvious that the symmetric fuzzy discrimination satisfies the first three conditions of the fuzzy distance measure:

(1) $E(M, N) \ge 0$;
(2) $E(M, N) = 0$ if, and only if, $M = N$;
(3) $E(M, N) = E(N, M)$.

Thus, we just need to verify that the fuzzy discrimination satisfies condition (4) of the fuzzy distance measure.
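Before the formal proof, condition (4) can be spot-checked numerically. The following sketch (ours, not from the paper; it assumes membership values stay away from the endpoints 0 and 1 so the logarithms are finite) samples ordered triples $m \le n \le t$ and checks the claimed monotonicity of the single-point symmetric discrimination:

```python
import numpy as np

def I(a, b):
    """Single-point fuzzy discrimination of Definition 2."""
    return a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b))

def E(a, b):
    """Symmetric fuzzy discrimination."""
    return I(a, b) + I(b, a)

rng = np.random.default_rng(0)
for _ in range(10_000):
    m, n, t = np.sort(rng.uniform(0.01, 0.99, size=3))  # m <= n <= t
    assert E(m, t) >= E(m, n) - 1e-12                    # condition (4), first part
    assert E(m, t) >= E(n, t) - 1e-12                    # condition (4), second part
print("condition (4) held on all sampled triples")
```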

Theorem 1. Let $m, n, t$ be three numbers in $[0, 1]$. If $m \le n \le t$, then $E(m, t) \ge E(m, n)$ and $E(m, t) \ge E(n, t)$.

Proof. We adopt the conventions $0 \ln 0 = 0$ and $0 \ln \frac{0}{m} = 0$.

$$E(m, n) = I(m, n) + I(n, m) = (m - n)\ln\frac{m}{n} + (n - m)\ln\frac{1-m}{1-n} \tag{1}$$

$$E(n, t) = I(n, t) + I(t, n) = (n - t)\ln\frac{n}{t} + (t - n)\ln\frac{1-n}{1-t} \tag{2}$$

$$E(m, t) = I(m, t) + I(t, m) = (m - t)\ln\frac{m}{t} + (t - m)\ln\frac{1-m}{1-t} \tag{3}$$

Firstly, we need to prove that (3) ≥ (1).

$$(3) = (m-t)\ln\frac{m}{t} + (t-m)\ln\frac{1-m}{1-t} = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\right]$$

$$(1) = \ln\left[\left(\frac{m}{n}\right)^{m-n}\left(\frac{1-m}{1-n}\right)^{n-m}\right]$$

$$(3) - (1) = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m}\right]$$

Let $f(t) = \left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m}$ and set

$$A = \left(\frac{n}{m}\right)^{m-n}\left(\frac{1-n}{1-m}\right)^{n-m},\quad B = \left(\frac{m}{t}\right)^{m-t},\quad C = \left(\frac{1-m}{1-t}\right)^{t-m}.$$

Then $f(t) = A \cdot B \cdot C$ and $f'(t) = A(B'C + BC')$, where

$$\frac{\partial B}{\partial t} = \left(e^{(m-t)\ln\frac{m}{t}}\right)' = B\left(-\ln\frac{m}{t} - \frac{m-t}{t}\right)$$

$$\frac{\partial C}{\partial t} = \left(e^{(t-m)\ln\frac{1-m}{1-t}}\right)' = C\left(\ln\frac{1-m}{1-t} + \frac{t-m}{1-t}\right)$$

$$f'(t) = A \cdot B \cdot C \cdot \left(-\ln\frac{m}{t} - \frac{m-t}{t} + \ln\frac{1-m}{1-t} + \frac{t-m}{1-t}\right)$$

It is obvious that $A, B, C \ge 0$. Since $0 \le m \le n \le t$, we have $\frac{m}{t} \le 1$, so $\ln\frac{m}{t} \le 0$; $\frac{1-m}{1-t} \ge 1$, so $\ln\frac{1-m}{1-t} \ge 0$; and both $-\frac{m-t}{t} \ge 0$ and $\frac{t-m}{1-t} \ge 0$. That is, $f'(t) \ge 0$.

When $t = n$, $f(n) = 1$. Since $f$ is non-decreasing, for $t \ge n$ we have $f(t) \ge f(n) = 1$, so $\ln f(t) \ge 0$.

That is to say, $(3) - (1) \ge 0$, i.e., $E(m, t) \ge E(m, n)$ has been obtained.
From here, we continue to prove that (3) ≥ (2).

$$(2) = \ln\left[\left(\frac{n}{t}\right)^{n-t}\left(\frac{1-n}{1-t}\right)^{t-n}\right]$$

$$(3) - (2) = \ln\left[\left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n}\right]$$

Let $g(m) = \left(\frac{m}{t}\right)^{m-t}\left(\frac{1-m}{1-t}\right)^{t-m}\left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n}$ and set

$$D = \left(\frac{t}{n}\right)^{n-t}\left(\frac{1-t}{1-n}\right)^{t-n},\quad B = \left(\frac{m}{t}\right)^{m-t},\quad C = \left(\frac{1-m}{1-t}\right)^{t-m}.$$

Then $g(m) = D \cdot B \cdot C$ and $g'(m) = D(B'C + BC')$, where

$$\frac{\partial B}{\partial m} = \left(e^{(m-t)\ln\frac{m}{t}}\right)' = B\left(\ln\frac{m}{t} + \frac{m-t}{m}\right)$$

$$\frac{\partial C}{\partial m} = \left(e^{(t-m)\ln\frac{1-m}{1-t}}\right)' = C\left(-\ln\frac{1-m}{1-t} + \frac{m-t}{1-m}\right)$$

$$g'(m) = D \cdot B \cdot C \cdot \left(\ln\frac{m}{t} + \frac{m-t}{m} - \ln\frac{1-m}{1-t} + \frac{m-t}{1-m}\right)$$

It is obvious that $D, B, C \ge 0$. Since $0 \le m \le n \le t$, we have $\ln\frac{m}{t} \le 0$ and $\ln\frac{1-m}{1-t} \ge 0$, and both $\frac{m-t}{m} \le 0$ and $\frac{m-t}{1-m} \le 0$. That is, $g'(m) \le 0$.

When $m = n$, $g(n) = 1$. Since $g$ is non-increasing, for $m \le n$ we have $g(m) \ge g(n) = 1$, so $\ln g(m) \ge 0$.

That is to say, $(3) - (2) \ge 0$, i.e., $E(m, t) \ge E(n, t)$ has been obtained.

This completes the proof of the theorem.

Theorem 2. Let $X$ be a universe of discourse, and $M, N, T \in FS(X)$. If $M \subseteq N \subseteq T$, then $E(M, T) \ge E(M, N)$ and $E(M, T) \ge E(N, T)$.

Proof.

$$E(M, N) = I(M, N) + I(N, M) = \sum_{i=1}^{n}\left[(\mu_M(x_i) - \mu_N(x_i))\ln\frac{\mu_M(x_i)}{\mu_N(x_i)} + (\mu_N(x_i) - \mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\mu_N(x_i)}\right] \tag{4}$$

$$E(N, T) = I(N, T) + I(T, N) = \sum_{i=1}^{n}\left[(\mu_N(x_i) - \mu_T(x_i))\ln\frac{\mu_N(x_i)}{\mu_T(x_i)} + (\mu_T(x_i) - \mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\mu_T(x_i)}\right] \tag{5}$$

$$E(M, T) = I(M, T) + I(T, M) = \sum_{i=1}^{n}\left[(\mu_M(x_i) - \mu_T(x_i))\ln\frac{\mu_M(x_i)}{\mu_T(x_i)} + (\mu_T(x_i) - \mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\mu_T(x_i)}\right] \tag{6}$$

We need to prove that (6) ≥ (4) and (6) ≥ (5). Since $M \subseteq N \subseteq T$, we have $\mu_M(x_i) \le \mu_N(x_i) \le \mu_T(x_i)$ with $\mu(x_i) \in [0, 1]$ for all $x_i \in X$. From Theorem 1, the inequalities hold for every single membership value; summing over $i = 1, \ldots, n$ then yields the result, meaning that the proof is easily obtained from Theorem 1.

Example 1. Let $X$ be a universe of discourse and $M, N, T \in FS(X)$, where $M = \{\langle x, 0.5\rangle \mid x \in X\}$, $N = \{\langle x, 0.7\rangle \mid x \in X\}$, $T = \{\langle x, 0.9\rangle \mid x \in X\}$. Clearly, $M \subseteq N \subseteq T$, and we can get $E(M, N) = 0.1695$, $E(N, T) = 0.2700$, $E(M, T) = 0.8789$; that is, $E(M, T) \ge E(M, N)$ and $E(M, T) \ge E(N, T)$.
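The values in Example 1 can be reproduced with a few lines of Python (a minimal sketch of ours, taking $X$ to contain a single element):

```python
import math

def I(a, b):
    """Single-point fuzzy discrimination of Definition 2."""
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

def E(a, b):
    """Symmetric fuzzy discrimination E = I(a, b) + I(b, a)."""
    return I(a, b) + I(b, a)

m, n, t = 0.5, 0.7, 0.9
print(round(E(m, n), 4), round(E(n, t), 4), round(E(m, t), 4))
# -> 0.1695 0.27 0.8789, matching Example 1
```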

Theorem 3. The above-defined symmetric fuzzy discrimination is a distance measure.

From the above theorems, the symmetric fuzzy discrimination defined by Definition 2 satisfies all the conditions of the fuzzy distance measure; that is to say, it is consistent with distance measures.

3. Fuzzy Cross-Entropy Is Consistent with Distance Measure


Bhandari and Pal pointed out that the fuzzy discrimination has a defect: when $\mu_N(x_i)$ approaches 0 or 1, its value tends to infinity. Thus, it was modified on the basis of the directed divergence proposed by Lin [20], and further modified by Shang and Jiang [22] as follows:

Definition 5 ([22]). Let $M, N \in FS(X)$. The fuzzy cross-entropy is defined as:

$$I_2(M, N) = \sum_{i=1}^{n}\left(\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))}\right)$$

This measure is well-defined and bounded for every value of $\mu(x_i)$, and it can express the discrimination degree of $M$ from $N$.

It also has the same properties as the above discrimination measure: $I_2(M, N) \ge 0$, and $I_2(M, N) = 0$ if, and only if, $M = N$. Let $E_2(M, N) = I_2(M, N) + I_2(N, M)$; then symmetry is satisfied. Thus, we mainly consider whether the above-defined fuzzy cross-entropy $E_2(M, N)$ satisfies condition (4) of the distance measure.

$$E_2(M, N) = \sum_{i=1}^{n}\left[\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))}\right]$$
$$\qquad + \sum_{i=1}^{n}\left[\mu_N(x_i)\ln\frac{\mu_N(x_i)}{\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\frac{1}{2}(\mu_M(x_i)+\mu_N(x_i))}\right] \tag{7}$$

Theorem 4. Let $m, n, t$ be three numbers in $[0, 1]$. If $m \le n \le t$, then $E_2(m, t) \ge E_2(m, n)$ and $E_2(m, t) \ge E_2(n, t)$.

Proof. We adopt the conventions $0 \ln 0 = 0$ and $0 \ln \frac{0}{m} = 0$.

$$E_2(m, n) = I_2(m, n) + I_2(n, m) = m\ln\frac{m}{\frac12(m+n)} + (1-m)\ln\frac{1-m}{1-\frac12(m+n)} + n\ln\frac{n}{\frac12(m+n)} + (1-n)\ln\frac{1-n}{1-\frac12(m+n)} \tag{8}$$

$$E_2(n, t) = I_2(n, t) + I_2(t, n) = n\ln\frac{n}{\frac12(n+t)} + (1-n)\ln\frac{1-n}{1-\frac12(n+t)} + t\ln\frac{t}{\frac12(n+t)} + (1-t)\ln\frac{1-t}{1-\frac12(n+t)} \tag{9}$$

$$E_2(m, t) = I_2(m, t) + I_2(t, m) = m\ln\frac{m}{\frac12(m+t)} + (1-m)\ln\frac{1-m}{1-\frac12(m+t)} + t\ln\frac{t}{\frac12(m+t)} + (1-t)\ln\frac{1-t}{1-\frac12(m+t)} \tag{10}$$

We firstly prove that (10) ≥ (8).

$$(10) = \ln\left[\left(\frac{m}{\frac12(m+t)}\right)^{m}\left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m}\left(\frac{t}{\frac12(m+t)}\right)^{t}\left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}\right]$$

$$(8) = \ln\left[\left(\frac{m}{\frac12(m+n)}\right)^{m}\left(\frac{1-m}{1-\frac12(m+n)}\right)^{1-m}\left(\frac{n}{\frac12(m+n)}\right)^{n}\left(\frac{1-n}{1-\frac12(m+n)}\right)^{1-n}\right]$$

$$(10) - (8) = \ln\left[\left(\frac{m+n}{m+t}\right)^{m}\left(\frac{2t}{m+t}\right)^{t}\left(\frac{m+n}{2n}\right)^{n}\left(\frac{1-\frac12(m+n)}{1-\frac12(m+t)}\right)^{1-m}\left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}\left(\frac{1-\frac12(m+n)}{1-n}\right)^{1-n}\right]$$

Let $f(t)$ denote the expression inside the logarithm, and set

$$A = \left(\frac{m+n}{2n}\right)^{n}\left(\frac{1-\frac12(m+n)}{1-n}\right)^{1-n},\quad B = \left(\frac{m+n}{m+t}\right)^{m},\quad C = \left(\frac{2t}{m+t}\right)^{t},$$
$$D = \left(\frac{1-\frac12(m+n)}{1-\frac12(m+t)}\right)^{1-m},\quad E = \left(\frac{1-t}{1-\frac12(m+t)}\right)^{1-t}.$$

Then $f(t) = A \cdot B \cdot C \cdot D \cdot E$ and $f'(t) = A(B'CDE + BC'DE + BCD'E + BCDE')$, where

$$\frac{\partial B}{\partial t} = B\left(\frac{-m}{m+t}\right)$$

$$\frac{\partial C}{\partial t} = C\left(\ln\frac{2t}{m+t} + \frac{m}{m+t}\right)$$

$$\frac{\partial D}{\partial t} = D\left(\frac{1-m}{2-(m+t)}\right)$$

$$\frac{\partial E}{\partial t} = E\left(-\ln\frac{1-t}{1-\frac12(m+t)} + \frac{m-1}{2-(m+t)}\right)$$

$$f'(t) = A \cdot B \cdot C \cdot D \cdot E \cdot \left(\ln\frac{2t}{m+t} - \ln\frac{1-t}{1-\frac12(m+t)}\right)$$

It is clear that $A, B, C, D, E \ge 0$. Since $0 \le m \le n \le t$, we have $2t \ge m+t$, so $\ln\frac{2t}{m+t} \ge 0$, and $1-t \le 1-\frac12(m+t)$, so $\ln\frac{1-t}{1-\frac12(m+t)} \le 0$. That is, $f'(t) \ge 0$. When $t = n$, $f(n) = 1$; since $f$ is non-decreasing, for $t \ge n$ we have $f(t) \ge f(n) = 1$, so $\ln f(t) \ge 0$.

That is to say, $(10) - (8) \ge 0$, meaning that $E_2(m, t) \ge E_2(m, n)$ has been obtained.
From here, we continue to prove that (10) ≥ (9).

$$(9) = \ln\left[\left(\frac{n}{\frac12(n+t)}\right)^{n}\left(\frac{1-n}{1-\frac12(n+t)}\right)^{1-n}\left(\frac{t}{\frac12(n+t)}\right)^{t}\left(\frac{1-t}{1-\frac12(n+t)}\right)^{1-t}\right]$$

$$(10) - (9) = \ln\left[\left(\frac{n+t}{m+t}\right)^{t}\left(\frac{2m}{m+t}\right)^{m}\left(\frac{n+t}{2n}\right)^{n}\left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m}\left(\frac{1-\frac12(n+t)}{1-\frac12(m+t)}\right)^{1-t}\left(\frac{1-\frac12(n+t)}{1-n}\right)^{1-n}\right]$$

Let $g(m)$ denote the expression inside the logarithm, and set

$$M = \left(\frac{n+t}{2n}\right)^{n}\left(\frac{1-\frac12(n+t)}{1-n}\right)^{1-n},\quad N = \left(\frac{2m}{m+t}\right)^{m},\quad P = \left(\frac{n+t}{m+t}\right)^{t},$$
$$Q = \left(\frac{1-m}{1-\frac12(m+t)}\right)^{1-m},\quad S = \left(\frac{1-\frac12(n+t)}{1-\frac12(m+t)}\right)^{1-t}.$$

Then $g(m) = M \cdot N \cdot P \cdot Q \cdot S$ and $g'(m) = M(N'PQS + NP'QS + NPQ'S + NPQS')$, where

$$\frac{\partial N}{\partial m} = N\left(\ln\frac{2m}{m+t} + \frac{t}{m+t}\right)$$

$$\frac{\partial P}{\partial m} = P\left(-\frac{t}{m+t}\right)$$

$$\frac{\partial Q}{\partial m} = Q\left(-\ln\frac{1-m}{1-\frac12(m+t)} + \frac{t-1}{2-(m+t)}\right)$$

$$\frac{\partial S}{\partial m} = S\left(\frac{1-t}{2-(m+t)}\right)$$

$$g'(m) = M \cdot N \cdot P \cdot Q \cdot S \cdot \left(\ln\frac{2m}{m+t} - \ln\frac{1-m}{1-\frac12(m+t)}\right)$$

It is obvious that $M, N, P, Q, S \ge 0$. Since $0 \le m \le n \le t$, we have $2m \le m+t$, so $\ln\frac{2m}{m+t} \le 0$, and $1-m \ge 1-\frac12(m+t)$, so $\ln\frac{1-m}{1-\frac12(m+t)} \ge 0$. That is, $g'(m) \le 0$. When $m = n$, $g(n) = 1$; since $g$ is non-increasing, for $m \le n$ we have $g(m) \ge g(n) = 1$, so $\ln g(m) \ge 0$. That is to say, $(10) - (9) \ge 0$, meaning that $E_2(m, t) \ge E_2(n, t)$ has been obtained.

This completes the proof of the theorem.

Theorem 5. Let $X$ be a universe of discourse, and $M, N, T \in FS(X)$. If $M \subseteq N \subseteq T$, then $E_2(M, T) \ge E_2(M, N)$ and $E_2(M, T) \ge E_2(N, T)$.

Proof.

$$E_2(M, N) = I_2(M, N) + I_2(N, M) = \sum_{i=1}^{n}\left[\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))}\right]$$
$$\qquad + \sum_{i=1}^{n}\left[\mu_N(x_i)\ln\frac{\mu_N(x_i)}{\frac12(\mu_M(x_i)+\mu_N(x_i))} + (1-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\frac12(\mu_M(x_i)+\mu_N(x_i))}\right] \tag{11}$$

$$E_2(N, T) = I_2(N, T) + I_2(T, N) = \sum_{i=1}^{n}\left[\mu_N(x_i)\ln\frac{\mu_N(x_i)}{\frac12(\mu_N(x_i)+\mu_T(x_i))} + (1-\mu_N(x_i))\ln\frac{1-\mu_N(x_i)}{1-\frac12(\mu_N(x_i)+\mu_T(x_i))}\right]$$
$$\qquad + \sum_{i=1}^{n}\left[\mu_T(x_i)\ln\frac{\mu_T(x_i)}{\frac12(\mu_N(x_i)+\mu_T(x_i))} + (1-\mu_T(x_i))\ln\frac{1-\mu_T(x_i)}{1-\frac12(\mu_N(x_i)+\mu_T(x_i))}\right] \tag{12}$$

$$E_2(M, T) = I_2(M, T) + I_2(T, M) = \sum_{i=1}^{n}\left[\mu_M(x_i)\ln\frac{\mu_M(x_i)}{\frac12(\mu_M(x_i)+\mu_T(x_i))} + (1-\mu_M(x_i))\ln\frac{1-\mu_M(x_i)}{1-\frac12(\mu_M(x_i)+\mu_T(x_i))}\right]$$
$$\qquad + \sum_{i=1}^{n}\left[\mu_T(x_i)\ln\frac{\mu_T(x_i)}{\frac12(\mu_M(x_i)+\mu_T(x_i))} + (1-\mu_T(x_i))\ln\frac{1-\mu_T(x_i)}{1-\frac12(\mu_M(x_i)+\mu_T(x_i))}\right] \tag{13}$$

We need to prove that (13) ≥ (11) and (13) ≥ (12). Since $M \subseteq N \subseteq T$ implies $\mu_M(x_i) \le \mu_N(x_i) \le \mu_T(x_i)$ for every $x_i \in X$, applying Theorem 4 to each single membership value and summing over $i$, we can easily obtain the proof.

Example 2. Let $X$ be a universe of discourse and $M, N, T \in FS(X)$, where $M = \{\langle x, 0.5\rangle \mid x \in X\}$, $N = \{\langle x, 0.7\rangle \mid x \in X\}$, $T = \{\langle x, 0.9\rangle \mid x \in X\}$. Clearly, $M \subseteq N \subseteq T$, and we can get $E_2(M, N) = 0.0420$, $E_2(N, T) = 0.0648$, $E_2(M, T) = 0.2035$; that is, $E_2(M, T) \ge E_2(M, N)$ and $E_2(M, T) \ge E_2(N, T)$.
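These values can likewise be checked in a few lines of Python (a minimal sketch of ours, taking $X$ to contain a single element):

```python
import math

def I2(a, b):
    """Single-point fuzzy cross-entropy of Definition 5."""
    avg = 0.5 * (a + b)
    return a * math.log(a / avg) + (1 - a) * math.log((1 - a) / (1 - avg))

def E2(a, b):
    """Symmetric fuzzy cross-entropy E2 = I2(a, b) + I2(b, a)."""
    return I2(a, b) + I2(b, a)

m, n, t = 0.5, 0.7, 0.9
print(round(E2(m, n), 4), round(E2(n, t), 4), round(E2(m, t), 4))
# -> 0.042 0.0648 0.2035, matching Example 2
```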

Theorem 6. The above-defined symmetric fuzzy cross-entropy is a kind of distance measure.

4. Neutrosophic Cross-Entropy Is a Distance Measure


Smarandache [3,27] firstly proposed the definition of the neutrosophic set, which is an extension of the intuitionistic fuzzy set (IFS) and the interval-valued intuitionistic fuzzy set, as follows:

Definition 6 ([3]). Let $X$ be a universe of discourse. A neutrosophic set $A$ in $X$ is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$, where $T_A(x), I_A(x), F_A(x) : X \to ]0^-, 1^+[$. There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so $0^- \le \sup T_A(x) + \sup I_A(x) + \sup F_A(x) \le 3^+$.

Wang et al. [5] introduced the definition of the single-value neutrosophic set (SVNS) for better application in the engineering field. The SVNS is an extension of the IFS, and it also provides another way to express and process uncertain, incomplete, and inconsistent information in the real world.

Definition 7 ([5]). Let $X$ be a space of points. A single-value neutrosophic set $A$ in $X$ is characterized by a truth-membership function $T_A(x)$, an indeterminacy-membership function $I_A(x)$, and a falsity-membership function $F_A(x)$. For each point $x$ in $X$, $T_A(x), I_A(x), F_A(x) \in [0, 1]$. Therefore, an SVNS $A$ can be denoted by:

$$A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle \mid x \in X\}$$

There is no restriction on the sum of $T_A(x)$, $I_A(x)$, and $F_A(x)$, so $0 \le \sup T_A(x) + \sup I_A(x) + \sup F_A(x) \le 3$.

The following are some properties of SVNSs. Let $X$ be a universe of discourse, let $SVNS(X)$ be the set of all single-value neutrosophic sets on $X$, and let $M, N, T \in SVNS(X)$:

(1) $M \subseteq N$ if, and only if, $T_M(x) \le T_N(x)$, $I_M(x) \ge I_N(x)$, and $F_M(x) \ge F_N(x)$ for every $x$ in $X$ [5];
(2) $M = N$ if, and only if, $M \subseteq N$ and $N \subseteq M$ [5];
(3) if $M \subseteq N$ and $N \subseteq T$, then $M \subseteq T$.

Then, Ye [25] first generalized the fuzzy cross-entropy measure to SVNSs. The information measures of neutrosophic sets are composed of the information measures of the truth-membership, indeterminacy-membership, and falsity-membership in SVNSs. Let $M, N \in SVNS(X)$. Ye introduced the discrimination information of $T_M(x_i)$ from $T_N(x_i)$ for $i = 1, 2, \ldots, n$ on the basis of the definition of the fuzzy cross-entropy $I_2(M, N)$ as follows:

$$I_2^T(M, N) = \sum_{i=1}^{n}\left(T_M(x_i)\ln\frac{T_M(x_i)}{\frac12(T_M(x_i)+T_N(x_i))} + (1-T_M(x_i))\ln\frac{1-T_M(x_i)}{1-\frac12(T_M(x_i)+T_N(x_i))}\right)$$

We can define the following information in terms of the indeterminacy-membership function and the falsity-membership function in the same way:

$$I_2^I(M, N) = \sum_{i=1}^{n}\left(I_M(x_i)\ln\frac{I_M(x_i)}{\frac12(I_M(x_i)+I_N(x_i))} + (1-I_M(x_i))\ln\frac{1-I_M(x_i)}{1-\frac12(I_M(x_i)+I_N(x_i))}\right)$$

$$I_2^F(M, N) = \sum_{i=1}^{n}\left(F_M(x_i)\ln\frac{F_M(x_i)}{\frac12(F_M(x_i)+F_N(x_i))} + (1-F_M(x_i))\ln\frac{1-F_M(x_i)}{1-\frac12(F_M(x_i)+F_N(x_i))}\right)$$

Definition 8 ([25]). The single-value neutrosophic cross-entropy of $M, N \in SVNS(X)$ can be defined as follows:

$$I_3(M, N) = I_2^T(M, N) + I_2^I(M, N) + I_2^F(M, N)$$

It can also be used to express the degree of difference of $M$ from $N$. According to Shannon's inequality, it is clear that $I_3(M, N) \ge 0$, and $I_3(M, N) = 0$ if, and only if, $M = N$; that is, $T_M(x_i) = T_N(x_i)$, $I_M(x_i) = I_N(x_i)$, and $F_M(x_i) = F_N(x_i)$ for any $x_i \in X$. Then, the neutrosophic cross-entropy can be modified as $E_3(M, N) = I_3(M, N) + I_3(N, M)$ for symmetry.

Theorem 7. Let $X$ be a universe of discourse, and $M, N, T \in SVNS(X)$. If $M \subseteq N \subseteq T$, then $E_3(M, T) \ge E_3(M, N)$ and $E_3(M, T) \ge E_3(N, T)$.

Proof. According to the proof of Theorem 4, we can easily find that $E_2^T(M, T) \ge E_2^T(M, N)$ and $E_2^T(M, T) \ge E_2^T(N, T)$. In a similar way, $E_2^I(M, T) \ge E_2^I(M, N)$, $E_2^I(M, T) \ge E_2^I(N, T)$, $E_2^F(M, T) \ge E_2^F(M, N)$, and $E_2^F(M, T) \ge E_2^F(N, T)$; note that for the indeterminacy and falsity memberships the inclusion $M \subseteq N \subseteq T$ reverses the order of the membership values, but since $E_2$ is symmetric, Theorem 4 still applies. It follows that $E_3(M, T) \ge E_3(M, N)$ and $E_3(M, T) \ge E_3(N, T)$. Thus, we can easily obtain the proof.

Example 3. Let $X$ be a universe of discourse, and $M, N, T \in SVNS(X)$, where $M = \{\langle x, 0.5, 0.3, 0.7\rangle \mid x \in X\}$, $N = \{\langle x, 0.7, 0.2, 0.5\rangle \mid x \in X\}$, $T = \{\langle x, 0.8, 0.1, 0.1\rangle \mid x \in X\}$. It is clear that $M \subseteq N \subseteq T$, and we can obtain:

$E_2^T(M, N) = 0.0420$, $E_2^T(N, T) = 0.0134$, $E_2^T(M, T) = 0.1013$; that is, $E_2^T(M, T) \ge E_2^T(M, N)$ and $E_2^T(M, T) \ge E_2^T(N, T)$.

$E_2^I(M, N) = 0.0134$, $E_2^I(N, T) = 0.0199$, $E_2^I(M, T) = 0.0648$; that is, $E_2^I(M, T) \ge E_2^I(M, N)$ and $E_2^I(M, T) \ge E_2^I(N, T)$.

$E_2^F(M, N) = 0.0420$, $E_2^F(N, T) = 0.2035$, $E_2^F(M, T) = 0.4101$; that is, $E_2^F(M, T) \ge E_2^F(M, N)$ and $E_2^F(M, T) \ge E_2^F(N, T)$.

$E_3(M, T) = E_2^T(M, T) + E_2^I(M, T) + E_2^F(M, T) = 0.5762$, $E_3(M, N) = 0.0974$, $E_3(N, T) = 0.2368$.

Thus, $E_3(M, T) \ge E_3(M, N)$ and $E_3(M, T) \ge E_3(N, T)$.
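These values can be reproduced by applying the single-point fuzzy cross-entropy to each of the three membership functions; a minimal sketch of ours, with $X$ a single point and each SVNS represented as a $(T, I, F)$ triple:

```python
import math

def I2(a, b):
    """Single-point fuzzy cross-entropy of Definition 5."""
    avg = 0.5 * (a + b)
    return a * math.log(a / avg) + (1 - a) * math.log((1 - a) / (1 - avg))

def E2(a, b):
    """Symmetric fuzzy cross-entropy."""
    return I2(a, b) + I2(b, a)

def E3(p, q):
    """Symmetric neutrosophic cross-entropy: sum of E2 over the T, I, F components."""
    return sum(E2(a, b) for a, b in zip(p, q))

M, N, T = (0.5, 0.3, 0.7), (0.7, 0.2, 0.5), (0.8, 0.1, 0.1)
print(round(E3(M, N), 4), round(E3(N, T), 4), round(E3(M, T), 4))
# -> 0.0974 0.2368 0.5762, matching Example 3
```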

Theorem 8. The above-defined symmetric neutrosophic cross-entropy is a distance measure.

5. Conclusions
Cross-entropy and distance measures share similar properties, such as non-negativity and symmetry, and the cross-entropy (distance) between two fuzzy sets is 0 if, and only if, the two sets coincide. We also found that the decision principle of cross-entropy is consistent with the decision principle of distance measures in decision-making: among all the choices, we finally choose the one with the smallest cross-entropy (distance) from the ideal solution. Based on the above analysis, we studied their relationship. In this paper, we mainly proved that the fuzzy discrimination, the improved fuzzy cross-entropy, and the neutrosophic cross-entropy based on fuzzy discrimination are in fact distance measures. That is to say, the symmetric cross-entropy discussed in this paper is consistent with the distance measure. Because the cross-entropy formulas are composed of logarithmic functions, their calculation is complicated; in the future, we will try to simplify the formulas and propose improvements.

Author Contributions: All authors have contributed equally to this paper.


Funding: This work has been supported by the National Natural Science Foundation of China (Grant No.
61473239).
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
2. Atanassov, K. Intuitionistic fuzzy set. Fuzzy Sets Syst. 1986, 20, 87–96.
3. Smarandache, F. Neutrosophy: Neutrosophic Probability, Set, and Logic; American Research Press: Rehoboth,
DE, USA, 1998.
4. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information-applications to pattern recognition.
Pattern Recognit. Lett. 2007, 28, 197–206.
5. Wang, H.; Smarandache, F.; Zhang, Y.Q.; Sunderraman, R. Single Valued Neutrosophic Sets. Multispace
Multistructure 2010, 4, 410–413.

6. Zhang, X.H. Fuzzy anti-grouped filters and fuzzy normal filters in pseudo-BCI algebras. J. Intell. Fuzzy Syst. 2017, 33, 1767–1774.
7. Zhang, X.H.; Ma, Y.C.; Smarandache, F.; Dai, J.H. Neutrosophic regular filters and fuzzy regular filters in
pseudo-BCI algebras. Neutrosophic Sets Syst. 2017, 17, 10–15.
8. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Dai, J.H. New inclusion relation of neutrosophic sets with
applications and related lattice structure. Int. J. Mach. Learn. Cybern. 2018, 9, 1753–1763,
doi:10.1007/s13042-018-0817-6.
9. Zhang, X.H.; Bo, C.X.; Smarandache, F.; Park, C. New operations of totally dependent-neutrosophic sets and
totally dependent-neutrosophic soft sets. Symmetry 2018, 10, 187.
10. Zhang, X.H.; Yu, P.; Smarandache, F.; Park, C. Redefined neutrosophic filters in BE-algebras. Ital. J. Pure
Appl. Math. 2019, in press.
11. Liu, F.; Guan, A.W.; Lukovac, V.; Vukić, M. A multicriteria model for the selection of the transport service
provider: A single valued neutrosophic DEMATEL multicriteria model. Decis. Mak. Appl. Manag. Eng. 2018,
1, 121–130.
12. Mukhametzyanov, I.; Pamučar, D. A sensitivity analysis in MCDM problems: A statistical approach.
Decis. Mak. Appl. Manag. Eng. 2018, 1, 51–80, doi:10.3390/sym10060215.
13. Tu, A.; Ye, J.; Wang, B. Symmetry Measures of Simplified Neutrosophic Sets for Multiple Attribute
Decision-Making Problems. Symmetry 2018, 10, 144, doi:10.3390/sym10050144.
14. Zadeh, L.A. Probability measures of fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427.
15. Kullback, S. Information Theory and Statistics; Dover Publications: New York, NY, USA, 1997.
16. De Luca, A.S.; Termini, S. A definition of nonprobabilistic entropy in the setting of fuzzy sets theory.
Inf. Control 1972, 20, 301–312.
17. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316.
18. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477.
19. Wei, C.P.; Wang, P.; Zhang, Y.Z. Entropy, similarity measure of interval-valued intuitionistic fuzzy sets and
their applications. Inf. Sci. 2011, 181, 4273–4286.
20. Lin, J. Divergence measures based on Shannon entropy. IEEE Trans. Inf. Theory 1991, 37, 145–151.
21. Bhandari, D.; Pal, N.R. Some new information measures for fuzzy sets. Inf. Sci. 1993, 67, 209–228.
22. Shang, X.G.; Jiang, W.S. A note on fuzzy information measures. Pattern Recognit. Lett. 1997, 18, 425–432.
23. Verma, R. On generalized fuzzy divergence measure and their application to multicriteria decision-making. J. Comb. Inf. Syst. Sci. 2014, 39, 191–213.
24. Verma, R.; Sharma, B.D. On generalized intuitionistic fuzzy relative information and their properties.
J. Uncertain Syst. 2012, 6, 308–320.
25. Ye, J. Single valued neutrosophic cross-entropy for multicriteria decision making problems. Appl. Math.
Model. 2014, 38, 1170–1175.
26. Şahin, R. Cross-entropy measure on interval neutrosophic sets and its applications in multicriteria decision making. Neural Comput. Appl. 2017, 28, 1177–1187, doi:10.1007/s00521-015-2131-5.
27. Smarandache, F. A Unifying Field in Logics. Neutrosophy: Neutrosophic Probability, Set and Logic; American
Research Press: Rehoboth, DE, USA, 1999.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
