Minimum K-phi-divergence estimator
dc.contributor.author | Pérez, T. | |
dc.contributor.author | Pardo Llorente, Julio Ángel | |
dc.date.accessioned | 2023-06-20T09:43:23Z | |
dc.date.available | 2023-06-20T09:43:23Z | |
dc.date.issued | 2004-04 | |
dc.description.abstract | In the present work, the problem of estimating the parameters of statistical models for categorical data is analyzed. The minimum K-phi-divergence estimator is obtained by minimizing the K-phi-divergence measure between the theoretical and the empirical probability vectors. Its asymptotic properties are derived. From a simulation study, the conclusion is that our estimator emerges as an attractive alternative to the classical maximum likelihood estimator. | |
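The abstract describes an estimator obtained by minimizing a divergence between the empirical and theoretical probability vectors. The record does not reproduce the paper's specific K-phi-divergence, so the sketch below substitutes the classical Kullback-Leibler divergence and a hypothetical three-cell Hardy-Weinberg model; it illustrates the minimum-divergence idea, not the paper's exact estimator.

```python
# Illustrative sketch only: the Kullback-Leibler divergence stands in for the
# paper's K-phi-divergence, and the Hardy-Weinberg model is a hypothetical example.
import math

def model_probs(theta):
    """Hypothetical 3-cell Hardy-Weinberg model:
    p(theta) = (theta^2, 2*theta*(1-theta), (1-theta)^2)."""
    return (theta * theta, 2 * theta * (1 - theta), (1 - theta) * (1 - theta))

def kl_divergence(p_hat, p):
    """Kullback-Leibler divergence D(p_hat || p); empty cells contribute 0."""
    return sum(ph * math.log(ph / pi) for ph, pi in zip(p_hat, p) if ph > 0)

def minimum_divergence_estimate(counts, grid_size=999):
    """Minimum-divergence estimator: the theta whose model probability vector is
    closest, in KL divergence, to the empirical probability vector (grid search
    over (0, 1) keeps the sketch dependency-free)."""
    n = sum(counts)
    p_hat = tuple(c / n for c in counts)
    grid = (i / (grid_size + 1) for i in range(1, grid_size + 1))
    return min(grid, key=lambda t: kl_divergence(p_hat, model_probs(t)))

counts = (36, 48, 16)  # empirical cell counts consistent with theta = 0.6
theta_hat = minimum_divergence_estimate(counts)
print(round(theta_hat, 3))  # → 0.6
```

Replacing `kl_divergence` with the paper's K-phi-divergence (and the grid search with a numerical optimizer) would yield the estimator studied in the article; its asymptotic properties are the subject of the paper itself.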
dc.description.department | Depto. de Estadística e Investigación Operativa | |
dc.description.faculty | Fac. de Ciencias Matemáticas | |
dc.description.refereed | TRUE | |
dc.description.status | pub | |
dc.eprint.id | https://eprints.ucm.es/id/eprint/17521 | |
dc.identifier.doi | 10.1016/s0893-9659(04)00040-0 | |
dc.identifier.issn | 0893-9659 | |
dc.identifier.officialurl | http://www.sciencedirect.com/science/article/pii/S0893965904900766 | |
dc.identifier.relatedurl | http://www.sciencedirect.com | |
dc.identifier.uri | https://hdl.handle.net/20.500.14352/50248 | |
dc.issue.number | 4 | |
dc.journal.title | Applied Mathematics Letters | |
dc.language.iso | eng | |
dc.page.final | 374 | |
dc.page.initial | 367 | |
dc.publisher | Pergamon-Elsevier Science | |
dc.relation.projectID | DGI BFM-2000-0800 | |
dc.rights.accessRights | restricted access | |
dc.subject.cdu | 519.22 | |
dc.subject.keyword | Categorical data | |
dc.subject.keyword | K-phi-divergence | |
dc.subject.keyword | Minimum K-phi-divergence estimator | |
dc.subject.keyword | Consistency | |
dc.subject.keyword | Simulation | |
dc.subject.ucm | Estadística matemática (Matemáticas) | |
dc.subject.unesco | 1209 Estadística | |
dc.title | Minimum K-phi-divergence estimator | |
dc.type | journal article | |
dc.volume.number | 17 | |
dcterms.references | S. Kullback and R.A. Leibler, On information and sufficiency, Annals of Mathematical Statistics 22, 76-86, (1951). N. Cressie and T.R.C. Read, Multinomial goodness-of-fit tests, Journal of the Royal Statistical Society, Series B 46, 440-464, (1984). D. Morales, L. Pardo and I. Vajda, Asymptotic divergence of estimates of discrete distributions, Journal of Statistical Planning and Inference 48, 347-369, (1995). S.M. Ali and S.D. Silvey, A general class of coefficients of divergence of one distribution from another, Journal of the Royal Statistical Society, Series B 28, 131-142, (1966). I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publications of the Mathematical Institute of the Hungarian Academy of Sciences, Series A 8, 85-108, (1963). M.C. Pardo, Asymptotic behaviour of an estimator based on Rao's divergence, Kybernetika 33 (5), 489-504, (1997). M.C. Pardo, A comparison of some estimators of the mixture proportion of mixed normal distributions, Journal of Computational and Applied Mathematics 84, 207-217, (1997). J. Burbea and C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Transactions on Information Theory 28, 489-495, (1982). M.W. Birch, A new proof of the Pearson-Fisher theorem, Annals of Mathematical Statistics 35, 817-824, (1964). | |
dspace.entity.type | Publication | |
relation.isAuthorOfPublication | 5e051d08-2974-4236-9c25-5e14369a7b61 | |
relation.isAuthorOfPublication.latestForDiscovery | 5e051d08-2974-4236-9c25-5e14369a7b61 | |