Extropy and Entropy Estimation Based on Uniformly and Log-Normally Distributed Data
Abstract
This paper proposes non-parametric estimates of the two information measures extropy and entropy when progressively Type-I interval censored data are available. Different non-parametric approaches are used to derive the estimates, namely the Moments, Linear, Kernel and Differential Approximation methods. Some properties of the proposed estimates are studied. The performance of the proposed estimates is examined under various censoring schemes via simulation studies, with the Uniform and Log-Normal distributions taken as the parent distributions of the data. The results indicate that the Moments Approximation (J1 and H1) and Linear Approximation (J2 and H2) estimates of the extropy and entropy have a smaller mean squared error than the competing estimates. A real data set is presented and analysed.
Keywords: Entropy; Extropy; Mean Square Error; Non-parametric statistics; Monte Carlo simulation; Type-I Interval Censoring.
2010 Mathematics Subject Classification. 26A25; 26A35
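For reference, the two information measures named in the abstract are taken here in their standard forms for a continuous random variable X with density f; the symbols H and J match the estimator labels (H1, H2, J1, J2) used in the abstract, and this recap assumes the usual definitions rather than any paper-specific variant:
\[
H(X) = -\int f(x)\,\log f(x)\,dx, \qquad J(X) = -\frac{1}{2}\int f^{2}(x)\,dx .
\]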