New Family Of Estimators For The Loglinear Model Of Quasi-Independence Based On Power-Divergence Measures

dc.contributor.authorFelipe Ortega, Ángel
dc.contributor.authorPardo Llorente, Leandro
dc.date.accessioned2023-06-20T09:33:21Z
dc.date.available2023-06-20T09:33:21Z
dc.date.issued2007-05
dc.descriptionloglinear model, quasi-independence, maximum likelihood, minimum power-divergence estimator
dc.description.abstractWe study the minimum power-divergence estimator, introduced and studied by N. Cressie and T. R. C. Read [Multinomial goodness-of-fit tests. J. R. Stat. Soc., Ser. B 46, 440–464 (1984)], in the loglinear model of quasi-independence. A simulation study illustrates that the minimum chi-squared estimator and the Cressie–Read estimator are good alternatives to the classical maximum-likelihood estimator for this problem. The estimator obtained for λ = 2 is the most robust and efficient estimator within the family of minimum power-divergence estimators.
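As an illustration of the divergence measure that underlies this family of estimators, the Cressie–Read power divergence can be sketched in Python. This is an assumption-laden example, not code from the paper; the function name and the NumPy-based implementation are for illustration only:

```python
# Illustrative sketch (not from the paper): the Cressie-Read power divergence
# I_lambda(p : q) between probability vectors p and q. lam = 1 recovers half
# the Pearson chi-squared discrepancy; lam -> 0 recovers the Kullback-Leibler
# divergence. Minimising over q in a model family (e.g. the quasi-independence
# loglinear model) gives a minimum power-divergence estimator.
import numpy as np

def power_divergence(p, q, lam):
    """Cressie-Read power divergence between probability vectors p and q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if abs(lam) < 1e-12:        # limiting case lam -> 0: KL divergence
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    if abs(lam + 1) < 1e-12:    # limiting case lam -> -1: reversed KL
        mask = q > 0
        return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))
```

For example, `power_divergence([0.6, 0.4], [0.5, 0.5], 1.0)` evaluates to ≈ 0.02, which equals half the Pearson chi-squared discrepancy between the two vectors; λ = 2 is the member of the family singled out in the abstract.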
dc.description.departmentDepto. de Estadística e Investigación Operativa
dc.description.facultyFac. de Ciencias Matemáticas
dc.description.refereedTRUE
dc.description.statuspub
dc.eprint.idhttps://eprints.ucm.es/id/eprint/15086
dc.identifier.doi10.1080/10629360600890154
dc.identifier.issn0094-9655
dc.identifier.officialurlhttp://www.tandfonline.com/doi/pdf/10.1080/10629360600890154
dc.identifier.relatedurlhttp://www.tandfonline.com
dc.identifier.urihttps://hdl.handle.net/20.500.14352/49892
dc.issue.number5
dc.journal.titleJournal of Statistical Computation and Simulation
dc.language.isoeng
dc.page.final420
dc.page.initial407
dc.publisherTaylor & Francis
dc.rights.accessRightsrestricted access
dc.subject.cdu517.9
dc.subject.keywordLoglinear Model
dc.subject.keywordQuasi-Independence
dc.subject.keywordMaximum Likelihood
dc.subject.keywordMinimum Power-Divergence Estimator
dc.subject.keywordMinimum
dc.subject.keywordDistance
dc.subject.keywordComputer Science
dc.subject.keywordInterdisciplinary Applications
dc.subject.keywordStatistics & Probability
dc.subject.ucmDifferential equations
dc.subject.unesco1202.07 Difference equations
dc.titleNew Family Of Estimators For The Loglinear Model Of Quasi-Independence Based On Power-Divergence Measures
dc.typejournal article
dc.volume.number77
dcterms.references
Agresti, A., 2002, Categorical Data Analysis (2nd ed.) (New York: Wiley).
Powers, D.A. and Xie, Y., 2000, Statistical Methods for Categorical Data Analysis (Academic Press).
Andersen, E.B., 1990, The Statistical Analysis of Categorical Data (Springer-Verlag).
Kullback, S., 1985, Kullback information. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 4 (New York: John Wiley & Sons), pp. 421–425.
Kullback, S., 1985, Minimum discrimination information (MDI) estimation. In: S. Kotz and N.L. Johnson (Eds) Encyclopedia of Statistical Sciences, Vol. 5 (New York: John Wiley & Sons), pp. 527–529.
Cressie, N. and Pardo, L., 2000, Minimum φ-divergence estimator and hierarchical testing in loglinear models. Statistica Sinica, 10(3), 867–884.
Neyman, J., 1949, Contribution to the theory of the χ2-test. Proceedings of the First Symposium on Mathematical Statistics and Probability, University of Berkeley Press, Berkeley, pp. 239–273.
Matusita, K., 1954, On the estimation by the minimum distance method. Annals of the Institute of Statistical Mathematics, 5, 59–65.
Cressie, N. and Read, T.R.C., 1984, Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B, 46, 440–464.
Pardo, L., 2006, Statistical Inference Based on Divergence Measures (New York: Chapman & Hall/CRC).
Rao, C.R., 1961, Asymptotic efficiency and limiting information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 531–546.
Rao, C.R., 1962, Efficient estimates and optimum inference procedures in large samples. Journal of the Royal Statistical Society, Series B, 24, 46–72.
Rao, C.R., 1963, Criteria of estimation in large samples. Sankhya, Series A, 25, 189–206.
Read, T.R.C. and Cressie, N.A.C., 1988, Goodness-of-Fit Statistics for Discrete Multivariate Data (New York: Springer).
Lindsay, B.G., 1994, Efficiency versus robustness: the case for minimum Hellinger distance and related methods. Annals of Statistics, 22, 1081–1114.
Berkson, J., 1980, Minimum chi-square, not maximum likelihood! Annals of Statistics, 8, 457–487.
Parr, W.C., 1981, Minimum distance estimation: a bibliography. Communications in Statistics: Theory and Methods, 10, 1205–1224.
Causey, B.D., 1983, Estimation of proportions for multinomial contingency tables subject to marginal constraints. Communications in Statistics: Theory and Methods, 12, 2581–2587.
Harris, R.R. and Kanji, G.K., 1983, On the use of minimum chi-square estimation. The Statistician, 32, 379–394.
Hodges, J.L. and Lehmann, E.L., 1970, Deficiency. Annals of Mathematical Statistics, 41, 783–801.
Mitra, S., Basu, S. and Basu, A., 2000, Exact minimum disparity inference in complex multinomial models. Metron, 58, 167–185.
Basu, A. and Basu, S., 1998, Penalized minimum disparity methods for multinomial models. Statistica Sinica, 8, 841–860.
dspace.entity.typePublication
relation.isAuthorOfPublication72ddce0d-fbc4-4233-800c-cbd2cc36a012
relation.isAuthorOfPublicationa6409cba-03ce-4c3b-af08-e673b7b2bf58
relation.isAuthorOfPublication.latestForDiscovery72ddce0d-fbc4-4233-800c-cbd2cc36a012
