New Family Of Estimators For The Loglinear Model Of Quasi-Independence Based On Power-Divergence Measures
dc.contributor.author | Felipe Ortega, Ángel | |
dc.contributor.author | Pardo Llorente, Leandro | |
dc.date.accessioned | 2023-06-20T09:33:21Z | |
dc.date.available | 2023-06-20T09:33:21Z | |
dc.date.issued | 2007-05 | |
dc.description | loglinear model, quasi-independence, maximum likelihood, minimum power-divergence estimator | |
dc.description.abstract | We study the minimum power-divergence estimator, introduced and studied by N. Cressie and T. R. C. Read [Multinomial goodness-of-fit tests. J. R. Stat. Soc., Ser. B 46, 440–464 (1984)], in the loglinear model of quasi-independence. A simulation study illustrates that the minimum chi-squared estimator and the Cressie-Read estimator are good alternatives to the classical maximum-likelihood estimator for this problem. The estimator obtained for λ = 2 is the most robust and efficient estimator among the family of the minimum power estimators. | |
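The abstract refers to the Cressie-Read power-divergence family indexed by a parameter λ, with λ = 1 giving Pearson's chi-square, λ = 2/3 the Cressie-Read statistic, and λ = 2 the case the abstract singles out. A minimal sketch of that family (not the authors' code; the function name and the example frequencies are illustrative assumptions):

```python
def power_divergence(observed, expected, lam):
    """Cressie-Read power divergence:
    2 / (lam * (lam + 1)) * sum(O * ((O / E)**lam - 1)).

    lam = 1 recovers Pearson's chi-square, lam = 2/3 the Cressie-Read
    statistic, and lam = 2 the case highlighted in the abstract.
    The limits lam -> 0 (likelihood ratio) and lam -> -1 (modified
    likelihood ratio) need separate logarithmic forms, omitted here.
    """
    if lam in (0, -1):
        raise ValueError("lam = 0 and lam = -1 require the limiting log forms")
    total = sum(o * ((o / e) ** lam - 1) for o, e in zip(observed, expected))
    return 2.0 * total / (lam * (lam + 1))

# With matching totals, lam = 1 reproduces Pearson's chi-square:
obs, exp = [30, 20, 50], [25, 25, 50]
print(power_divergence(obs, exp, 1.0))  # -> 2.0, same as sum((O-E)**2 / E)
```

The minimum power-divergence estimator picks the model parameters whose fitted expected frequencies minimize this quantity for a fixed λ; the paper compares members of this family within the quasi-independence loglinear model.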
dc.description.department | Depto. de Estadística e Investigación Operativa | |
dc.description.faculty | Fac. de Ciencias Matemáticas | |
dc.description.refereed | TRUE | |
dc.description.status | pub | |
dc.eprint.id | https://eprints.ucm.es/id/eprint/15086 | |
dc.identifier.doi | 10.1080/10629360600890154 | |
dc.identifier.issn | 0094-9655 | |
dc.identifier.officialurl | http://www.tandfonline.com/doi/pdf/10.1080/10629360600890154 | |
dc.identifier.relatedurl | http://www.tandfonline.com | |
dc.identifier.uri | https://hdl.handle.net/20.500.14352/49892 | |
dc.issue.number | 5 | |
dc.journal.title | Journal of Statistical Computation and Simulation | |
dc.language.iso | eng | |
dc.page.final | 420 | |
dc.page.initial | 407 | |
dc.publisher | Taylor & Francis | |
dc.rights.accessRights | restricted access | |
dc.subject.cdu | 517.9 | |
dc.subject.keyword | Loglinear Model | |
dc.subject.keyword | Quasi-Independence | |
dc.subject.keyword | Maximum Likelihood | |
dc.subject.keyword | Minimum Power-Divergence Estimator | |
dc.subject.keyword | Minimum | |
dc.subject.keyword | Distance | |
dc.subject.keyword | Computer Science | |
dc.subject.keyword | Interdisciplinary Applications | |
dc.subject.keyword | Statistics & Probability | |
dc.subject.ucm | Differential equations | |
dc.subject.unesco | 1202.07 Difference Equations | |
dc.title | New Family Of Estimators For The Loglinear Model Of Quasi-Independence Based On Power-Divergence Measures | |
dc.type | journal article | |
dc.volume.number | 77 | |
dcterms.references | Agresti, A., 2002, Categorical Data Analysis (2nd ed.) (Wiley). Powers, D.A. and Xie, Y., 2000, Statistical Methods for Categorical Data Analysis (Academic Press). Andersen, E.B., 1990, The Statistical Analysis of Categorical Data (Springer-Verlag). Kullback, S., 1985, Kullback information. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 4 (New York: John Wiley & Sons), pp. 421–425. Kullback, S., 1985, Minimum discrimination information (MDI) estimation. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 5 (New York: John Wiley), pp. 527–529. Cressie, N. and Pardo, L., 2000, Minimum φ-divergence estimator and hierarchical testing in loglinear models. Statistica Sinica, 10(3), 867–884. Neyman, J., 1949, Contribution to the theory of the χ2-test. Proceedings of the First Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley, pp. 239–273. Matusita, K., 1954, On the estimation by the minimum distance method. Annals of the Institute of Statistical Mathematics, 5, 59–65. Cressie, N. and Read, T.R.C., 1984, Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B, 46, 440–464. Pardo, L., 2006, Statistical Inference Based on Divergence Measures (New York: Chapman & Hall/CRC). Rao, C.R., 1961, Asymptotic efficiency and limiting information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 531–546. Rao, C.R., 1962, Efficient estimates and optimum inference procedures in large samples. Journal of the Royal Statistical Society, Series B, 24, 46–72. Rao, C.R., 1963, Criteria of estimation in large samples. Sankhya, Series A, 25, 189–206. Read, T.R.C. and Cressie, N.A.C., 1988, Goodness-of-Fit Statistics for Discrete Multivariate Data (New York: Springer). Lindsay, B.G., 1994, Efficiency versus robustness: the case for minimum Hellinger distance and related methods. Annals of Statistics, 22, 1081–1114. Berkson, J., 1980, Minimum chi-square, not maximum likelihood! Annals of Statistics, 8, 457–487. Parr, W.C., 1981, Minimum distance estimation: a bibliography. Communications in Statistics: Theory and Methods, 10, 1205–1224. Causey, B.D., 1983, Estimation of proportions for multinomial contingency tables subject to marginal constraints. Communications in Statistics: Theory and Methods, 12, 2581–2587. Harris, R.R. and Kanji, G.K., 1983, On the use of minimum chi-square estimation. The Statistician, 32, 379–394. Hodges, J.L. and Lehmann, E.L., 1970, Deficiency. Annals of Mathematical Statistics, 41, 783–801. Mitra, S., Basu, S. and Basu, A., 2000, Exact minimum disparity inference in complex multinomial models. Metron, 58, 167–185. Basu, A. and Basu, S., 1998, Penalized minimum disparity methods for multinomial models. Statistica Sinica, 8, 841–860. | |
dspace.entity.type | Publication | |
relation.isAuthorOfPublication | 72ddce0d-fbc4-4233-800c-cbd2cc36a012 | |
relation.isAuthorOfPublication | a6409cba-03ce-4c3b-af08-e673b7b2bf58 | |
relation.isAuthorOfPublication.latestForDiscovery | 72ddce0d-fbc4-4233-800c-cbd2cc36a012 |