 

Minimum (h, φ)-divergences estimators with weights

dc.contributor.author: Landaburu Jiménez, María Elena
dc.contributor.author: Pardo Llorente, Leandro
dc.date.accessioned: 2023-06-20T09:39:05Z
dc.date.available: 2023-06-20T09:39:05Z
dc.date.issued: 2003-07-30
dc.description.abstract: We consider experiments involving the observation of a discrete random variable, or a quantitative classification process, and we assume that, in addition to the probability of each value or class, we know its "utility" or "weight" (more precisely, that we can quantify the "nature" of each value or class). In this paper a procedure of minimum divergence estimation based on (h, φ)-divergences is analyzed for the considered experiments, and its asymptotic behaviour is studied. (C) 2002 Elsevier Science Inc. All rights reserved.
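For orientation, a minimal sketch of the kind of estimator the abstract describes, written in the standard (h, φ)-divergence notation; the exact placement of the utility weights u_i inside the divergence is an assumption made here for illustration and need not match the paper's definition:

\[
D_{\varphi}^{h,u}\bigl(\hat{p},\, p(\theta)\bigr)
  \;=\; h\!\left(\sum_{i=1}^{M} u_i \, p_i(\theta)\,
        \varphi\!\left(\frac{\hat{p}_i}{p_i(\theta)}\right)\right),
\qquad
\hat{\theta}_{h,\varphi,u}
  \;=\; \arg\min_{\theta \in \Theta}
        D_{\varphi}^{h,u}\bigl(\hat{p},\, p(\theta)\bigr),
\]

where \(\hat{p} = (\hat{p}_1, \dots, \hat{p}_M)\) are the observed relative frequencies of the M classes, \(p(\theta)\) the model probabilities, \(u_i > 0\) the utility (weight) attached to class i, \(\varphi\) a convex function with \(\varphi(1) = 0\), and h an increasing function with \(h(0) = 0\).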
dc.description.department: Depto. de Estadística e Investigación Operativa
dc.description.faculty: Fac. de Ciencias Matemáticas
dc.description.refereed: TRUE
dc.description.sponsorship: DGI
dc.description.status: pub
dc.eprint.id: https://eprints.ucm.es/id/eprint/16500
dc.identifier.doi: 10.1016/S0096-3003(02)00188-1
dc.identifier.issn: 0096-3003
dc.identifier.officialurl: http://www.sciencedirect.com/science/article/pii/S0096300302001881
dc.identifier.relatedurl: http://www.sciencedirect.com/
dc.identifier.uri: https://hdl.handle.net/20.500.14352/50114
dc.issue.number: 1
dc.journal.title: Applied Mathematics and Computation
dc.language.iso: eng
dc.page.final: 28
dc.page.initial: 15
dc.publisher: Elsevier
dc.relation.projectID: BFM2000-0800
dc.rights.accessRights: restricted access
dc.subject.cdu: 519.24
dc.subject.keyword: Weights
dc.subject.keyword: Asymptotic distributions
dc.subject.keyword: (h, φ)-divergences
dc.subject.keyword: Minimum (h, φ)-divergence weighted estimator
dc.subject.keyword: Renyi's divergence
dc.subject.ucm: Probabilidades (Matemáticas)
dc.title: Minimum (h, φ)-divergences estimators with weights
dc.type: journal article
dc.volume.number: 140
dcterms.references:
M.W. Birch, A new proof of the Pearson–Fisher theorem, Annals of Mathematical Statistics 35 (1964) 817–824.
C. Cox, An elementary introduction to maximum likelihood estimation for multinomial models: Birch's theorem and the delta method, The American Statistician 38 (1984) 283–287.
N. Cressie, T.R.C. Read, Multinomial goodness-of-fit tests, Journal of the Royal Statistical Society B 46 (1984) 440–464.
I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publications of the Mathematical Institute of the Hungarian Academy of Sciences 8 (Series A) (1963) 85–108.
M.A. Gil, R. Pérez, P. Gil, A family of measures of uncertainty involving utilities: definition, properties, applications and statistical inferences, Metrika 36 (1989) 129–147.
S. Guiasu, Grouping data by using the weighted entropy, Journal of Statistical Planning and Inference 15 (1986) 63–69.
F. Liese, I. Vajda, Convex Statistical Distances, Teubner, Leipzig, 1987.
E. Landaburu, L. Pardo, Goodness of fit tests with weights in the classes based on (h, φ)-divergences, Kybernetika 36 (2000) 589–602.
G. Longo, Quantitative and Qualitative Measure of Information, Springer, New York, 1970.
K. Matusita, On the estimation by minimum distance method, Annals of the Institute of Statistical Mathematics 24 (1954) 473–482.
M.L. Menéndez, D. Morales, L. Pardo, M. Salicrú, Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: a unified study, Statistical Papers 36 (1995) 1–29.
D. Morales, L. Pardo, I. Vajda, Asymptotic divergence of estimates of discrete distributions, Journal of Statistical Planning and Inference 48 (1995) 347–369.
J. Neyman, Contribution to the theory of the χ² test, in: Proceedings of the First Berkeley Symposium on Mathematical Statistics and Probability, 1949, pp. 239–273.
C.T. Taneja, On the mean and the variance of estimates of Kullback information and relative useful information measures, Aplikace Matematiky 30 (1985) 166–175.
I. Vajda, Theory of Statistical Inference and Information, Kluwer Academic Publishers, Dordrecht, 1989.
A.M. Yaglom, I.M. Yaglom, Probability and Information Theory, North-Holland, Amsterdam, 1983.
dspace.entity.type: Publication
relation.isAuthorOfPublication: 0cf1bfef-b105-422e-9f20-80ca13261ed7
relation.isAuthorOfPublication: a6409cba-03ce-4c3b-af08-e673b7b2bf58
relation.isAuthorOfPublication.latestForDiscovery: 0cf1bfef-b105-422e-9f20-80ca13261ed7

Original bundle
Name: Landaburu05.pdf
Size: 139.18 KB
Format: Adobe Portable Document Format