An extropy-based goodness-of-fit test for circular distributions
1 STATE OF THE ART AND PROPOSED PLAN
Directional statistics plays an important role in many areas of research, since directions are themselves important measurements. We often face problems where a best-fitting circular distribution, rather than a linear one, must be found for directional data. Monographs and related literature on circular statistics include [2] and [4]. In 2000, Lund and Jammalamadaka [4] proposed a widely known entropy-based goodness-of-fit test for the von Mises distribution. Later, in 2015, Lad et al. [3] introduced a complementary dual of entropy, which they named 'extropy'. In many scenarios, especially within the exponential family of distributions, extropy is easier to calculate than entropy because of the 'log' term that appears in the expression for entropy. Here we propose to work initially on an extropy-based goodness-of-fit test for the von Mises distribution; later we may work on other useful circular distributions. We observe that the von Mises distribution maximises extropy as well as entropy under suitable constraints, which motivates an extropy-based goodness-of-fit test for this distribution.
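For reference, the von Mises density with mean direction $\mu$ and concentration $\kappa$ (this summary notation is ours, as the density is not written out in this section) is
\[
f(\theta;\mu,\kappa) = \frac{\exp\{\kappa\cos(\theta-\mu)\}}{2\pi I_0(\kappa)}, \qquad \theta \in [0, 2\pi),
\]
where $I_0$ is the modified Bessel function of the first kind of order zero; it is the maximum-entropy circular density for a fixed first trigonometric moment.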
For a continuous probability distribution with density f, distribution function F, and support S, Lad et al. [3] define the extropy J(f) as
\[
J(f) = -\frac{1}{2}\int_S f^2(x)\,dx = -\frac{1}{2}\int_0^1 f\left(F^{-1}(u)\right)\,du. \tag{1}
\]
Back in 1948, Shannon [5] introduced the entropy of a continuous distribution as
\[
H(f) = -\int_S f(x)\log f(x)\,dx. \tag{2}
\]
As the two expressions already suggest, entropy is much more difficult to compute than extropy for a large number of distributions because of the 'log' term in (2). Since extropy is the complementary dual of entropy [3], it can find many applications in communication theory, information theory, statistics, and related areas.
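To make this concrete, the following minimal Python sketch (our illustration, assuming NumPy and SciPy are available) evaluates (1) numerically for a von Mises density and checks it against the closed form $J = -I_0(2\kappa)/(4\pi I_0(\kappa)^2)$, which follows by integrating $e^{2\kappa\cos\theta}$ over the circle:

```python
# Check the extropy of the von Mises distribution: numerical integration of
# J(f) = -(1/2) * integral of f^2 against the Bessel-function closed form.
import numpy as np
from scipy.integrate import quad
from scipy.special import i0  # modified Bessel function of the first kind, order 0

def vonmises_pdf(theta, mu=0.0, kappa=2.0):
    """von Mises density on the circle [0, 2*pi)."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * i0(kappa))

kappa = 2.0
J_numeric = -0.5 * quad(lambda t: vonmises_pdf(t, kappa=kappa) ** 2, 0.0, 2.0 * np.pi)[0]
J_closed = -i0(2.0 * kappa) / (4.0 * np.pi * i0(kappa) ** 2)
print(J_numeric, J_closed)  # the two values should agree to quadrature accuracy
```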
In [4], Lund and Jammalamadaka mention some ways to define order statistics and spacings for circular data. The concept of extropy had not yet been published at that time, so no goodness-of-fit test based on extropy is known. Their construction of order statistics for circular data is equivalent to Vasicek's definition for linear data [6], and the consistency of the sample entropy defined through order statistics is proved in [6]. Similarly, we can come up with a consistent sample extropy; one possible form is sketched below.
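The sketch below is a hypothetical spacings-based sample extropy, not the estimator we will ultimately analyse: the m-spacings form mirrors Vasicek's entropy estimator [6], and rotating the data so that the largest gap becomes the origin is just one plausible way to linearise the circle.

```python
# A hypothetical spacings-based sample extropy. Replacing f(X_(i)) in
# J(f) = -(1/2) E[f(X)] by the m-spacing density estimate
# (2m/n) / (X_(i+m) - X_(i-m)) mirrors Vasicek's construction for entropy [6].
import numpy as np

def sample_extropy(theta, m=None):
    theta = np.sort(np.asarray(theta) % (2.0 * np.pi))
    n = len(theta)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # heuristic window width, an assumption
    # Fix an origin on the circle: rotate so that the largest gap between
    # consecutive observations becomes the cut point (one plausible choice).
    gaps = np.diff(np.append(theta, theta[0] + 2.0 * np.pi))
    cut = (np.argmax(gaps) + 1) % n
    theta = np.sort((theta - theta[cut]) % (2.0 * np.pi))
    # Vasicek-style m-spacings, with X_(i) := X_(1) for i < 1 and
    # X_(i) := X_(n) for i > n at the boundaries.
    idx = np.arange(n)
    upper = theta[np.minimum(idx + m, n - 1)]
    lower = theta[np.maximum(idx - m, 0)]
    dens = (2.0 * m / n) / (upper - lower)  # crude density estimate at X_(i)
    return -0.5 * np.mean(dens)
```

For example, sample_extropy(np.random.default_rng(0).vonmises(0.0, 2.0, 200)) gives a spacings estimate of J(f) for a simulated von Mises sample.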
In 2022, Chaubey [1] worked on smoothing methods for density and distribution functions for circular data. He highlighted the usefulness of circular kernels for smooth density estimation, and observed that the wrapped Cauchy kernel appears to be a more natural candidate than the von Mises kernel. We propose to work on density estimation for the sample extropy using the wrapped Cauchy kernel.
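As an illustration of the kind of estimator involved, here is a minimal circular kernel density estimate with the wrapped Cauchy kernel (our own sketch; the concentration $\rho$ acts as the smoothing parameter, and the fixed default below is illustrative rather than a recommended bandwidth):

```python
# Circular kernel density estimation with the wrapped Cauchy kernel.
# The concentration rho in (0, 1) plays the role of a bandwidth:
# rho close to 1 means little smoothing.
import numpy as np

def wrapped_cauchy_kernel(theta, mu, rho):
    """Wrapped Cauchy density at theta with location mu and concentration rho."""
    return (1.0 - rho**2) / (2.0 * np.pi * (1.0 + rho**2 - 2.0 * rho * np.cos(theta - mu)))

def circular_kde(grid, data, rho=0.9):
    """Average wrapped Cauchy kernels centred at the observed angles."""
    grid = np.asarray(grid)[:, None]
    data = np.asarray(data)[None, :]
    return wrapped_cauchy_kernel(grid, data, rho).mean(axis=1)
```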
To find the critical values of our proposed goodness-of-fit test, we can use Monte Carlo simulation, since Lund and Jammalamadaka [4] likewise found complications in deriving a closed-form density for the sample entropy in their entropy-based test. A goodness-of-fit test is an important statistical method that compares observed values with the values predicted by a model or distribution; such tests support decision-making and model checking. Extropy is simple to calculate for a large family of distributions, so an extropy-based goodness-of-fit test should be easy to use in many cases, and work in this direction can help in all areas where a distribution is fitted to observed samples.
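A hedged sketch of the Monte Carlo calibration step follows. It assumes the concentration $\kappa$ is known and takes the test statistic as an argument (for instance, the sample_extropy sketch above); in practice $\kappa$ would be estimated from the data and the statistic adjusted accordingly:

```python
# Monte Carlo critical values under the von Mises null: simulate the test
# statistic repeatedly and read off an empirical quantile.
import numpy as np

def mc_critical_value(stat, n, kappa, alpha=0.05, reps=5000, seed=0):
    """Empirical lower alpha-quantile of `stat` over von Mises null samples."""
    rng = np.random.default_rng(seed)
    stats = np.empty(reps)
    for r in range(reps):
        x = rng.vonmises(0.0, kappa, size=n)  # draw a sample under the null
        stats[r] = stat(x)                    # statistic for this replicate
    # Small sample extropy is evidence against the von Mises null (which
    # maximises extropy under the constraints), hence the lower quantile.
    return np.quantile(stats, alpha)

# e.g. mc_critical_value(sample_extropy, n=100, kappa=2.0) with the
# spacings estimator sketched earlier.
```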
References
[1] Yogendra P Chaubey. On nonparametric density estimation for circular data: An overview. Directional Statistics for
Innovative Applications: A Bicentennial Tribute to Florence Nightingale, pages 351–378, 2022.
[2] S Rao Jammalamadaka and Ashis SenGupta. Topics in Circular Statistics, volume 5. World Scientific, 2001.
[3] Frank Lad, Giuseppe Sanfilippo, and Gianna Agrò. Extropy: complementary dual of entropy. Statistical Science, 30(1):40–58, 2015.
[4] Ulric Lund and S Rao Jammalamadaka. An entropy-based test for goodness of fit of the von Mises distribution. Journal of Statistical Computation and Simulation, 67(4):319–332, 2000.
[5] Claude E Shannon. A mathematical theory of communication. The Bell System Technical Journal, 27(3):379–423, 1948.
[6] Oldrich Vasicek. A test for normality based on sample entropy. Journal of the Royal Statistical Society: Series B
(Methodological), 38(1):54–59, 1976.