Information Measure in Terms of the Hazard Function and Its Estimate
Abstract
1. Introduction
2. Quantal Fisher Information and Quantal Kullback-Leibler Information
3. Quantal Fisher Information in Terms of the (Reversed) Hazard Function
4. Quantal KL Information and Choice of the Weight Function in Terms of Maximizing the Quantal Fisher Information
5. Estimation of the Quantal KL Information
- Symmetric alternatives: Logistic, t, Uniform, Beta(0.5,0.5), Beta(2,2);
- Asymmetric alternatives: Beta(2,5), Beta(5,2), Exponential, Lognormal(0,0.5), Lognormal(0,1).
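A power study of the kind summarized in the tables below can be sketched with a short Monte Carlo simulation. The sketch substitutes Vasicek's spacing-based entropy estimator, standardized by the sample standard deviation, as a generic entropy-type normality statistic — this is an illustrative stand-in, not the paper's quantal KL estimator — and estimates rejection rates at the 5% level against a few of the listed alternatives:

```python
import numpy as np

rng = np.random.default_rng(0)

def vasicek_entropy(x, m):
    """Vasicek (1976) m-spacing estimator of differential entropy."""
    n = len(x)
    xs = np.sort(x)
    lo = np.clip(np.arange(n) - m, 0, n - 1)   # boundary convention: X_(j) = X_(1) for j < 1
    hi = np.clip(np.arange(n) + m, 0, n - 1)   # and X_(j) = X_(n) for j > n
    return np.mean(np.log(n * (xs[hi] - xs[lo]) / (2 * m)))

def stat(x, m=3):
    # Scale-free statistic exp(H_mn)/s; the normal maximizes entropy for a
    # given variance, so small values point away from normality.
    return np.exp(vasicek_entropy(x, m)) / x.std(ddof=1)

n, reps = 20, 2000
# Monte Carlo critical value: lower 5% quantile of the null (N(0,1)) distribution.
null = np.sort([stat(rng.standard_normal(n)) for _ in range(reps)])
crit = null[int(0.05 * reps)]

alternatives = {
    "Uniform": lambda: rng.uniform(size=n),
    "Exponential": lambda: rng.exponential(size=n),
    "Lognormal(0,1)": lambda: rng.lognormal(0.0, 1.0, size=n),
}
powers = {name: np.mean([stat(draw()) < crit for _ in range(reps)])
          for name, draw in alternatives.items()}
for name, p in powers.items():
    print(f"{name:>15}: power = {p:.2f}")
```

The same loop, with the quantal KL estimator of Section 5 in place of `stat`, reproduces the design of the power tables; only the test statistic differs.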
6. Concluding Remarks
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Efron, B.; Johnstone, I. Fisher information in terms of the hazard rate. Ann. Stat. 1990, 18, 38–62.
- Park, S.; Shin, M. Kullback-Leibler information of a censored variable and its applications. Statistics 2014, 48, 756–765.
- Tsairidis, C.; Zografos, K.; Ferentinos, K.; Papaioannou, T. Information in quantal response data and random censoring. Ann. Inst. Stat. Math. 2001, 53, 528–542.
- Rao, M.; Chen, Y.; Vemuri, B.C.; Wang, F. Cumulative residual entropy: A new measure of information. IEEE Trans. Inf. Theory 2004, 50, 1220–1228.
- Di Crescenzo, A.; Longobardi, M. On cumulative entropies. J. Stat. Plan. Inference 2009, 139, 4072–4087.
- Gertsbakh, I. On the Fisher information in the type-I censored and quantal response data. Stat. Probab. Lett. 1995, 32, 297–306.
- Park, S. On the asymptotic Fisher information in order statistics. Metrika 2003, 57, 71–80.
- Chen, Z. The efficiency of ranked-set sampling relative to simple random sampling under multi-parameter families. Stat. Sin. 2000, 10, 247–263.
- Baratpour, S.; Rad, A.H. Testing goodness-of-fit for exponential distribution based on cumulative residual entropy. Commun. Stat. Theory Methods 2012, 41, 1387–1396.
- Zhang, J. Powerful goodness-of-fit tests based on the likelihood ratio. J. R. Stat. Soc. B 2002, 64, 281–294.
- Kullback, S. Information Theory and Statistics; Wiley: New York, NY, USA, 1959.
- Gupta, R.D.; Gupta, R.C.; Sankaran, P.G. Some characterization results based on factorization of the (reversed) hazard rate function. Commun. Stat. Theory Methods 2004, 33, 3009–3031.
- Pakyari, R.; Balakrishnan, N. A general purpose approximate goodness-of-fit test for progressively Type-II censored data. IEEE Trans. Reliab. 2012, 61, 238–244.
- Noughabi, H.A.; Arghami, N.R. Goodness-of-fit tests based on correcting moments of entropy estimators. Commun. Stat. Simul. Comput. 2013, 42, 499–513.
- Qiu, G.; Jia, K. Extropy estimators with applications in testing uniformity. J. Nonparametric Stat. 2018, 30, 182–196.
| Specification | | | | | |
|---|---|---|---|---|---|
| Distribution | Parameter | | | | |
Normal | Location | 0.1983 | 0.4205 | 0.4805 | 0.5513 |
Normal | Scale | 0.3497 | 0.3014 | 0.2701 | 0.1883 |
Normal | Generalized FI | 0.0693 | 0.1267 | 0.1298 | 0.1013 |
n | |||||
---|---|---|---|---|---|
10 | 0.4360 | 0.4350 | 0.4322 | 2.7395 | 0.2619 |
20 | 0.4131 | 0.4141 | 0.4115 | 3.9108 | 0.1920 |
30 | 0.4028 | 0.4025 | 0.4008 | 4.6310 | 0.1586 |
40 | 0.3977 | 0.3978 | 0.3960 | 5.1486 | 0.1385 |
50 | 0.3928 | 0.3924 | 0.3918 | 5.5000 | 0.1244 |
60 | 0.3901 | 0.3905 | 0.3892 | 5.8764 | 0.1138 |
70 | 0.3914 | 0.3914 | 0.3905 | 6.1723 | 0.1060 |
80 | 0.3872 | 0.3873 | 0.3867 | 6.3988 | 0.0991 |
90 | 0.3874 | 0.3874 | 0.3867 | 6.6307 | 0.0935 |
100 | 0.3858 | 0.3851 | 0.3851 | 6.8319 | 0.0888 |
Alternatives | |||||
---|---|---|---|---|---|
N(0,1) | 5.01 | 5.01 | 5.00 | 5.08 | 4.96 |
Logistic(0,1) | 10.54 | 10.35 | 10.48 | 12.34 | 8.55 |
t(5) | 16.98 | 16.86 | 17.04 | 19.69 | 13.15 |
t(3) | 32.15 | 31.96 | 32.35 | 34.56 | 26.03 |
t(1) | 88.06 | 88.04 | 88.23 | 86.49 | 84.63 |
Uniform | 16.57 | 16.40 | 16.78 | 13.57 | 9.71 |
Beta(0.5,0.5) | 60.66 | 60.52 | 61.10 | 66.55 | 31.82 |
Beta(1,1) | 16.73 | 16.55 | 16.91 | 13.53 | 9.89 |
Beta(2,2) | 5.52 | 5.41 | 5.52 | 3.26 | 5.08 |
Beta(2,5) | 11.53 | 17.47 | 14.64 | 17.26 | 11.54 |
Beta(5,2) | 17.92 | 11.48 | 14.77 | 17.62 | 11.51 |
Exponential(1) | 72.62 | 81.23 | 77.59 | 86.72 | 58.54 |
Log normal(0,0.5) | 40.64 | 51.40 | 46.64 | 53.98 | 34.29 |
Log normal(0,1) | 88.03 | 92.20 | 90.48 | 94.32 | 79.20 |
Alternatives | |||||
---|---|---|---|---|---|
N(0,1) | 5.06 | 5.12 | 5.05 | 5.16 | 5.05 |
Logistic(0,1) | 16.13 | 16.21 | 16.20 | 18.49 | 11.45 |
t(5) | 30.25 | 30.31 | 30.41 | 33.63 | 21.10 |
t(3) | 60.86 | 60.85 | 60.99 | 61.60 | 48.57 |
t(1) | 99.72 | 99.72 | 99.73 | 99.48 | 99.33 |
Uniform | 57.43 | 57.61 | 57.73 | 80.08 | 25.92 |
Beta(0.5,0.5) | 99.06 | 99.08 | 99.08 | 99.97 | 80.21 |
Beta(1,1) | 57.54 | 57.59 | 57.80 | 80.04 | 26.07 |
Beta(2,2) | 13.18 | 13.30 | 13.33 | 14.75 | 8.21 |
Beta(2,5) | 35.09 | 43.79 | 39.67 | 59.41 | 25.65 |
Beta(5,2) | 43.56 | 35.09 | 39.47 | 59.03 | 25.57 |
Exponential(1) | 99.50 | 99.76 | 99.65 | 99.99 | 96.05 |
Log normal(0,0.5) | 84.70 | 89.52 | 87.40 | 94.16 | 71.05 |
Log normal(0,1) | 99.94 | 99.97 | 99.96 | 100.00 | 99.52 |
Alternatives | |||||
---|---|---|---|---|---|
N(0,1) | 5.06 | 5.08 | 5.04 | 5.05 | 5.05 |
Logistic(0,1) | 24.15 | 24.16 | 24.19 | 24.99 | 15.57 |
t(5) | 48.18 | 48.23 | 48.26 | 50.34 | 33.23 |
t(3) | 84.94 | 84.97 | 84.97 | 83.52 | 73.09 |
t(1) | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
Uniform | 95.02 | 95.07 | 95.07 | 99.93 | 59.20 |
Beta(0.5,0.5) | 100.00 | 100.00 | 100.00 | 100.00 | 99.47 |
Beta(1,1) | 94.82 | 94.89 | 94.88 | 99.95 | 58.92 |
Beta(2,2) | 31.96 | 32.10 | 32.10 | 54.81 | 15.39 |
Beta(2,5) | 72.88 | 78.80 | 76.00 | 96.22 | 50.56 |
Beta(5,2) | 78.73 | 72.98 | 76.04 | 96.11 | 50.70 |
Exponential(1) | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
Log normal(0,0.5) | 99.28 | 99.59 | 99.47 | 99.94 | 95.07 |
Log normal(0,1) | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Park, S. Information Measure in Terms of the Hazard Function and Its Estimate. Entropy 2021, 23, 298. https://doi.org/10.3390/e23030298