 

Minimum phi-divergence estimator in logistic regression models

dc.contributor.author: Pardo Llorente, Julio Ángel
dc.contributor.author: Pardo Llorente, Leandro
dc.contributor.author: Pardo Llorente, María del Carmen
dc.date.accessioned: 2023-06-20T09:43:09Z
dc.date.available: 2023-06-20T09:43:09Z
dc.date.issued: 2006-01
dc.description.abstract: A general class of minimum distance estimators for logistic regression models based on the phi-divergence measures is introduced: the minimum phi-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator. Its asymptotic properties are studied, as well as its behaviour in small samples, through a simulation study. (An illustrative sketch of the estimation criterion is given after the record fields below.)
dc.description.department: Depto. de Estadística e Investigación Operativa
dc.description.faculty: Fac. de Ciencias Matemáticas
dc.description.refereed: TRUE
dc.description.sponsorship: DGI
dc.description.status: pub
dc.eprint.id: https://eprints.ucm.es/id/eprint/17492
dc.identifier.doi: 10.1007/s00362-005-0274-7
dc.identifier.issn: 0932-5026
dc.identifier.officialurl: http://www.springerlink.com/content/hv6733q0308v37gn/fulltext.pdf
dc.identifier.relatedurl: http://www.springerlink.com
dc.identifier.uri: https://hdl.handle.net/20.500.14352/50241
dc.issue.number: 1
dc.journal.title: Statistical Papers
dc.language.iso: eng
dc.page.final: 108
dc.page.initial: 91
dc.publisher: Springer Verlag
dc.relation.projectID: BMF2003-00892
dc.rights.accessRights: restricted access
dc.subject.cdu: 519.22
dc.subject.keyword: Logistic regression model
dc.subject.keyword: Minimum phi-divergence estimator
dc.subject.keyword: Monte Carlo simulation
dc.subject.keyword: Newton-Raphson method
dc.subject.ucm: Estadística matemática (Matemáticas)
dc.subject.unesco: 1209 Estadística
dc.title: Minimum phi-divergence estimator in logistic regression models
dc.type: journal article
dc.volume.number: 47
dcterms.references:
Agresti, A. (1990). Categorical Data Analysis. John Wiley & Sons, New York.
Ali, S. M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another. Journal of the Royal Statistical Society, Series B, 28, 131-142.
Amemiya, T. (1972). Bivariate probit analysis: minimum chi-square methods. Journal of the American Statistical Association, 69, 940-944.
Amemiya, T. (1981). Qualitative response models: a survey. Journal of Economic Literature, XIX, 1483-1536.
Amemiya, T. (1985). Advanced Econometrics. Basil Blackwell, Oxford and New York.
Andersen, E. B. (1990). The Statistical Analysis of Categorical Data. Springer-Verlag, Heidelberg.
Andersen, E. B. (1996). Introduction to the Statistical Analysis of Categorical Data. Springer-Verlag.
Berkson, J. (1944). Application of logistic functions to bioassay. Journal of the American Statistical Association, 39, 357-365.
Berkson, J. (1953). A statistically precise and relatively simple method of estimating the bioassay with quantal response, based on the logistic function. Journal of the American Statistical Association, 48, 565-599.
Berkson, J. (1955). Maximum likelihood and minimum chi-square estimates of the logistic function. Journal of the American Statistical Association, 50, 130-162.
Cox, C. (1984). An elementary introduction to maximum likelihood estimation for multinomial models: Birch's theorem and the delta method. The American Statistician, 38, 283-287.
Cox, D. R. and Snell, E. J. (1989). Analysis of Binary Data. Chapman and Hall.
Cressie, N. and Read, T. R. C. (1984). Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B, 46, 440-464.
Cressie, N. and Pardo, L. (2000). Minimum phi-divergence estimator and hierarchical testing in loglinear models. Statistica Sinica, 10(3), 867-884.
Cressie, N., Pardo, L. and Pardo, M. C. (2003). Size and power considerations for testing loglinear models using phi-divergence test statistics. Statistica Sinica, 13(2), 555-570.
Csiszár, I. (1967). Information-type measures of difference of probability distributions and indirect observations. Studia Scientiarum Mathematicarum Hungarica, 2, 105-113.
Fleming, W. (1977). Functions of Several Variables, Second Edition. Springer-Verlag, New York.
Hosmer, D. W. and Lemeshow, S. (1989). Applied Logistic Regression. John Wiley, New York.
Jennings, D. E. (1986). Judging inference adequacy in logistic regression. Journal of the American Statistical Association, 81(396), 987-990.
Kullback, S. (1985). Kullback information. In Encyclopedia of Statistical Sciences, Volume 4 (editors S. Kotz and N. L. Johnson), 421-425. John Wiley, New York.
Morales, D., Pardo, L. and Vajda, I. (1995). Asymptotic divergence of estimates of discrete distributions. Journal of Statistical Planning and Inference, 48, 347-369.
Pardo, L. (1997). Statistical Information Theory (in Spanish). Hespérides, Spain.
Pardo, L. and Pardo, M. C. (2003). Minimum power-divergence in three-way contingency tables. Journal of Statistical Computation and Simulation, 73(11), 819-831.
Pardo, M. C. and Pardo, J. A. (1999). Small-sample comparisons for the Rukhin goodness-of-fit statistics. Statistical Papers, 40(2), 159-174.
Parr, W. C. (1981). Minimum distance estimation: a bibliography. Communications in Statistics (Theory and Methods), 12, 1205-1224.
Rukhin, A. L. (1994). Optimal estimator for the mixture parameter by the method of moments and information affinity. Transactions of the 12th Prague Conference on Information Theory, 214-219.
Wolfowitz, J. (1953). Estimation by the minimum distance method. Annals of the Institute of Statistical Mathematics, 5, 9-23.
dspace.entity.type: Publication
relation.isAuthorOfPublication: 5e051d08-2974-4236-9c25-5e14369a7b61
relation.isAuthorOfPublication: a6409cba-03ce-4c3b-af08-e673b7b2bf58
relation.isAuthorOfPublication.latestForDiscovery: 5e051d08-2974-4236-9c25-5e14369a7b61
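Illustrative sketch. The abstract describes an estimator obtained by minimizing a phi-divergence between observed and model-fitted class probabilities, which reduces to maximum likelihood for the Kullback-Leibler choice of phi. The Python sketch below is not the authors' implementation: it assumes grouped binomial data over covariate classes, uses the Cressie-Read power-divergence family as the phi function, weights classes by n_i/N, and relies on a generic numerical optimiser rather than the Newton-Raphson scheme named in the keywords. All function names and the simulated data are hypothetical.

# Minimal sketch of minimum phi-divergence estimation for a grouped-data
# logistic regression, under the assumptions stated above. With lambda -> 0
# the criterion becomes the Kullback-Leibler case, i.e. the ordinary MLE.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic function 1 / (1 + exp(-x))

def phi_cressie_read(t, lam):
    """Cressie-Read phi function; lam = 0 and lam = -1 are taken as limits."""
    t = np.clip(t, 1e-12, None)
    if np.isclose(lam, 0.0):        # Kullback-Leibler divergence
        return t * np.log(t) - t + 1.0
    if np.isclose(lam, -1.0):       # reverse Kullback-Leibler divergence
        return -np.log(t) + t - 1.0
    return (t ** (lam + 1.0) - t - lam * (t - 1.0)) / (lam * (lam + 1.0))

def phi_divergence_criterion(beta, X, successes, trials, lam):
    """Weighted phi-divergence between observed and fitted class probabilities."""
    pi = np.clip(expit(X @ beta), 1e-12, 1.0 - 1e-12)     # fitted P(Y=1 | x_i)
    p_hat = np.clip(successes / trials, 1e-12, 1.0 - 1e-12)
    w = trials / trials.sum()                              # class weights n_i / N
    d = pi * phi_cressie_read(p_hat / pi, lam) + \
        (1.0 - pi) * phi_cressie_read((1.0 - p_hat) / (1.0 - pi), lam)
    return np.sum(w * d)

def min_phi_divergence_fit(X, successes, trials, lam=2.0 / 3.0):
    """Numerically minimise the phi-divergence criterion over beta."""
    beta0 = np.zeros(X.shape[1])
    res = minimize(phi_divergence_criterion, beta0,
                   args=(X, successes, trials, lam), method="BFGS")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(8), np.linspace(-2, 2, 8)])  # intercept + dose
    beta_true = np.array([0.5, 1.2])
    trials = np.full(8, 50)
    successes = rng.binomial(trials, expit(X @ beta_true))
    print("lambda=0 (MLE-like):", min_phi_divergence_fit(X, successes, trials, 0.0))
    print("lambda=2/3         :", min_phi_divergence_fit(X, successes, trials, 2.0 / 3.0))

BFGS with numerical gradients is used here purely for brevity; a Newton-Raphson implementation, as pointed to by the keywords, would instead iterate on the analytic gradient and Hessian of the same criterion.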

Original bundle: PardoJulio11.pdf (712.5 KB, Adobe Portable Document Format)