 

Minimum K-phi-divergence estimator

dc.contributor.author: Pérez, T.
dc.contributor.author: Pardo Llorente, Julio Ángel
dc.date.accessioned: 2023-06-20T09:43:23Z
dc.date.available: 2023-06-20T09:43:23Z
dc.date.issued: 2004-04
dc.description.abstract: In the present work, the problem of estimating the parameters of statistical models for categorical data is analyzed. The minimum K-phi-divergence estimator is obtained by minimizing the K-phi-divergence measure between the theoretical and the empirical probability vectors. Its asymptotic properties are obtained. From a simulation study, the conclusion is that our estimator emerges as an attractive alternative to the classical maximum likelihood estimator.
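The abstract describes a general recipe: choose the parameter value that minimizes a divergence between the empirical probability vector and the model's probability vector. Below is a minimal, hypothetical sketch of that idea in Python. It uses the Kullback-Leibler divergence as an illustrative member of the phi-divergence family (the paper's K-phi-divergence itself is not defined in this record), together with a made-up binomial(2, θ) model for a three-cell categorical variable and invented counts:

```python
import math

def kl_divergence(p, q):
    # Kullback-Leibler divergence: one member of the phi-divergence family,
    # standing in for the paper's K-phi-divergence (not defined in this record).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def model_probs(theta):
    # Hypothetical parametric model: binomial(2, theta) probabilities
    # for the three cells {0, 1, 2 successes}.
    return [(1 - theta) ** 2, 2 * theta * (1 - theta), theta ** 2]

def minimum_divergence_estimate(empirical, divergence, grid_size=10_000):
    # Grid search over theta in (0, 1): pick the theta whose model
    # probability vector is closest to the empirical one in the
    # chosen divergence. (A sketch; the paper works analytically.)
    best_theta, best_val = None, float("inf")
    for k in range(1, grid_size):
        theta = k / grid_size
        val = divergence(empirical, model_probs(theta))
        if val < best_val:
            best_theta, best_val = theta, val
    return best_theta

# Invented cell counts for illustration only.
counts = [36, 48, 16]
n = sum(counts)
empirical = [c / n for c in counts]
theta_hat = minimum_divergence_estimate(empirical, kl_divergence)
```

With the Kullback-Leibler divergence in this orientation, minimizing the divergence is equivalent to maximizing the multinomial likelihood, so the sketch recovers the MLE; substituting other members of the phi-divergence family yields the alternative minimum-divergence estimators the paper compares against maximum likelihood.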
dc.description.department: Depto. de Estadística e Investigación Operativa
dc.description.faculty: Fac. de Ciencias Matemáticas
dc.description.refereed: TRUE
dc.description.status: pub
dc.eprint.id: https://eprints.ucm.es/id/eprint/17521
dc.identifier.doi: 10.1016/s0893-9659(04)00040-0
dc.identifier.issn: 0893-9659
dc.identifier.officialurl: http://www.sciencedirect.com/science/article/pii/S0893965904900766
dc.identifier.relatedurl: http://www.sciencedirect.com
dc.identifier.uri: https://hdl.handle.net/20.500.14352/50248
dc.issue.number: 4
dc.journal.title: Applied Mathematics Letters
dc.language.iso: eng
dc.page.final: 374
dc.page.initial: 367
dc.publisher: Pergamon-Elsevier Science
dc.relation.projectID: DGI BFM-2000-0800
dc.rights.accessRights: restricted access
dc.subject.cdu: 519.22
dc.subject.keyword: Categorical data
dc.subject.keyword: K-phi-divergence
dc.subject.keyword: Minimum K-phi-divergence estimator
dc.subject.keyword: Consistency
dc.subject.keyword: Simulation
dc.subject.ucm: Estadística matemática (Matemáticas)
dc.subject.unesco: 1209 Estadística
dc.title: Minimum K-phi-divergence estimator
dc.type: journal article
dc.volume.number: 17
dcterms.references: S. Kullback and R.A. Leibler, On information and sufficiency, Annals of Mathematical Statistics 22, 79-86, (1951). N. Cressie and T.R.C. Read, Multinomial goodness-of-fit tests, Journal of the Royal Statistical Society, Series B 46, 440-464, (1984). D. Morales, L. Pardo and I. Vajda, Asymptotic divergence of estimates of discrete distributions, Journal of Statistical Planning and Inference 48, 347-369, (1995). S.M. Ali and S.D. Silvey, A general class of coefficients of divergence of one distribution from another, Journal of the Royal Statistical Society, Series B 28, 131-142, (1966). I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publications of the Mathematical Institute of the Hungarian Academy of Sciences, Series A 8, 85-108, (1963). M.C. Pardo, Asymptotic behaviour of an estimator based on Rao's divergence, Kybernetika 33 (5), 489-504, (1997). M.C. Pardo, A comparison of some estimators of the mixture proportion of mixed normal distributions, Journal of Computational and Applied Mathematics 84, 207-217, (1997). J. Burbea and C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Transactions on Information Theory 28, 489-495, (1982). M.W. Birch, A new proof of the Pearson-Fisher theorem, Annals of Mathematical Statistics 35, 817-824, (1964).
dspace.entity.type: Publication
relation.isAuthorOfPublication: 5e051d08-2974-4236-9c25-5e14369a7b61
relation.isAuthorOfPublication.latestForDiscovery: 5e051d08-2974-4236-9c25-5e14369a7b61

Download

Original bundle

Name: PardoJulio16.pdf
Size: 379.13 KB
Format: Adobe Portable Document Format
