About distances of discrete distributions satisfying the data processing theorem of information theory

dc.contributor.author: Pardo Llorente, María del Carmen
dc.date.accessioned: 2023-06-20T17:08:50Z
dc.date.available: 2023-06-20T17:08:50Z
dc.date.issued: 1997-07
dc.description: This research was supported by DGICYT under Grant PB 93-0068 and GA CR under Grant 102/94/0320.
dc.description.abstract: Distances of discrete probability distributions are considered. Necessary and sufficient conditions for the validity of the data processing theorem of information theory are established. These conditions are applied to the Burbea–Rao divergences and Bregman distances. (Standard forms of these divergences are sketched below, after the record.)
dc.description.department: Depto. de Estadística e Investigación Operativa
dc.description.faculty: Fac. de Ciencias Matemáticas
dc.description.refereed: TRUE
dc.description.sponsorship: DGICYT
dc.description.sponsorship: GA CR
dc.description.status: pub
dc.eprint.id: https://eprints.ucm.es/id/eprint/17926
dc.identifier.doi: 10.1109/18.605597
dc.identifier.issn: 0018-9448
dc.identifier.officialurl: http://ieeexplore.ieee.org/Xplore/defdeny.jsp?url=http%3A%2F%2Fieeexplore.ieee.org%2Fstamp%2Fstamp.jsp%3Ftp%3D%26arnumber%3D605597%26userType%3Dinst&denyReason=-133&arnumber=605597&productsMatched=null&userType=inst
dc.identifier.relatedurl: http://ieeexplore.ieee.org/Xplore/home.jsp?tag=1
dc.identifier.uri: https://hdl.handle.net/20.500.14352/57842
dc.issue.number: 4
dc.journal.title: IEEE Transactions on Information Theory
dc.language.iso: eng
dc.page.final: 1293
dc.page.initial: 1288
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.projectID: PB 93-0068
dc.relation.projectID: 102/94/0320
dc.rights.accessRights: restricted access
dc.subject.cdu: 007
dc.subject.keyword: Bregman distance
dc.subject.keyword: Burbea–Rao divergence
dc.subject.keyword: Csiszár divergence
dc.subject.keyword: distance of probability measures
dc.subject.keyword: data processing theorem
dc.subject.ucm: Teoría de la información (Information theory)
dc.subject.unesco: 5910.01 Información (Information)
dc.title: About distances of discrete distributions satisfying the data processing theorem of information theory
dc.type: journal article
dc.volume.number: 43
dcterms.references: S. M. Ali and S. D. Silvey, "A general class of coefficients of divergence of one distribution from another," J. Roy. Statist. Soc. Ser. B, vol. 28, pp. 131–140, 1966.
dcterms.references: J. Aczél, Lectures on Functional Equations and Their Applications. New York: Academic, 1966.
dcterms.references: S. Amari, Differential-Geometrical Methods in Statistics, 2nd ed. New York: Springer, 1990.
dcterms.references: A. R. Barron, "The strong ergodic theorem for densities: Generalized Shannon–McMillan–Breiman theorem," Ann. Probab., vol. 13, pp. 1292–1303, 1985.
dcterms.references: A. Bhattacharyya, "On a measure of divergence between two statistical populations defined by their probability distributions," Bull. Calcutta Math. Soc., vol. 35, pp. 99–109, 1946.
dcterms.references: P. Billingsley, Convergence of Probability Measures. New York: Wiley, 1968.
dcterms.references: R. E. Blahut, Principles and Practice of Information Theory. Reading, MA: Addison-Wesley, 1987.
dcterms.references: L. M. Bregman, "The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming," USSR Comput. Math. and Math. Phys., vol. 7, pp. 200–217, 1967.
dcterms.references: J. Burbea and C. R. Rao, "On the convexity of some divergence measures based on entropy functions," IEEE Trans. Inform. Theory, vol. IT-28, pp. 489–495, 1982.
dcterms.references: B. S. Clarke and A. R. Barron, "Information-theoretic asymptotics of Bayes methods," IEEE Trans. Inform. Theory, vol. 36, pp. 453–471, 1990.
dcterms.references: J. E. Cohen, Y. Derriennic, and Gh. Zbăganu, "Majorization, monotonicity of relative entropy and stochastic matrices," Contemp. Math., vol. 149, pp. 251–259, 1993.
dcterms.references: T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.
dcterms.references: N. Cressie and T. R. C. Read, "Multinomial goodness-of-fit tests," J. Roy. Statist. Soc. Ser. B, vol. 46, pp. 440–464, 1984.
dcterms.references: I. Csiszár, "Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten," Publ. Math. Inst. Hungar. Acad. Sci. Ser. A, vol. 8, pp. 85–108, 1963.
dcterms.references: I. Csiszár, "Information-type measures of difference of probability distributions and indirect observations," Studia Sci. Math. Hungar., vol. 2, pp. 299–318, 1967.
dcterms.references: I. Csiszár, "Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems," Ann. Statist., vol. 19, pp. 2031–2066, 1991.
dcterms.references: I. Csiszár, "Maximum entropy and related methods," in Trans. 12th Prague Conf. on Information Theory, Statistical Decision Functions and Random Processes. Prague, Czech Republic: Czech Acad. Sci., 1994, pp. 58–62.
dcterms.references: I. Csiszár, "Generalized cutoff rates and Rényi's information measures," IEEE Trans. Inform. Theory, vol. 41, pp. 26–34, 1995.
dcterms.references: P. J. Huber, Robust Statistics. New York: Wiley, 1981.
dcterms.references: S. Kullback and R. A. Leibler, "On information and sufficiency," Ann. Math. Statist., vol. 22, pp. 79–86, 1951.
dcterms.references: S. Kullback, Information Theory and Statistics. New York: Wiley, 1959.
dcterms.references: F. Liese and I. Vajda, Convex Statistical Distances. Leipzig, Germany: Teubner, 1987.
dcterms.references: A. W. Marshall and I. Olkin, Inequalities: Theory of Majorization and Its Applications. New York: Academic, 1979.
dcterms.references: K. Matusita, "Distances and decision rules," Ann. Inst. Statist. Math., vol. 16, pp. 305–320, 1964.
dcterms.references: J. Neyman, "Contribution to the theory of the χ² test," in Proc. 1st Berkeley Symp. Math. Statist. Probab. Berkeley, CA: Univ. of California Press, 1949, pp. 239–273.
dcterms.references: F. Österreicher, "On a class of perimeter-type distances of probability distributions," Kybernetika, vol. 32, pp. 389–393, 1996.
dcterms.references: M. C. Pardo, "Minimum R-divergence estimators: Asymptotic behaviour and statistical applications," Ph.D. dissertation, Complutense University, Madrid, Spain, 1996.
dcterms.references: L. Pardo, M. Salicrú, M. L. Menéndez, and D. Morales, "Divergence measures based on entropy function and statistical inference," Sankhyā, Ser. B, vol. 57, pp. 315–337, 1995.
dcterms.references: C. R. Rao, "Asymptotic efficiency and limiting information," in Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability, vol. 1. Berkeley, CA: Univ. of Calif. Press, 1961, pp. 531–546.
dcterms.references: T. R. C. Read and N. Cressie, Goodness-of-Fit Statistics for Discrete Multivariate Data. New York: Springer, 1988.
dcterms.references: A. Rényi, "On measures of entropy and information," in Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability, vol. 1. Berkeley, CA: Univ. of Calif. Press, 1961, pp. 547–561.
dcterms.references: I. Vajda, Theory of Statistical Inference and Information. Boston, MA: Kluwer, 1989.
dcterms.references: I. Vajda and V. Kus, "Relation between divergences, total variation and Euclidean distances," Res. Rep. 1853, Institute of Inform. Theory, Prague, Czech Republic, 1995.
dspace.entity.type: Publication
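
As a minimal sketch of the quantities named in the abstract, and assuming the standard textbook definitions (the paper's exact normalizations may differ): for a convex function φ and discrete distributions p = (p_1, ..., p_k) and q = (q_1, ..., q_k), the Bregman distance and the Burbea–Rao (Jensen-type) divergence are

    B_\varphi(p, q) = \sum_{i=1}^{k} \bigl[\varphi(p_i) - \varphi(q_i) - \varphi'(q_i)\,(p_i - q_i)\bigr]
    R_\varphi(p, q) = \sum_{i=1}^{k} \Bigl[\tfrac{1}{2}\bigl(\varphi(p_i) + \varphi(q_i)\bigr) - \varphi\bigl(\tfrac{p_i + q_i}{2}\bigr)\Bigr]

and a distance D satisfies the data processing theorem when D(pW, qW) <= D(p, q) for every row-stochastic (Markov) matrix W. The Python snippet below is a numeric illustration only, not code from the paper (the function kl and the random test matrices are ours); it checks the inequality for the Kullback–Leibler divergence, a Csiszár divergence for which the theorem classically holds:

    import numpy as np

    def kl(p, q):
        # Kullback-Leibler divergence of discrete distributions p and q
        return float(np.sum(p * np.log(p / q)))

    rng = np.random.default_rng(0)
    p = rng.dirichlet(np.ones(4))          # random distribution on 4 points
    q = rng.dirichlet(np.ones(4))
    W = rng.dirichlet(np.ones(3), size=4)  # 4x3 row-stochastic (Markov) matrix

    # Data processing theorem: pushing p and q through the same channel W
    # cannot increase their divergence.
    assert kl(p @ W, q @ W) <= kl(p, q) + 1e-12
    print(kl(p, q), kl(p @ W, q @ W))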

Original bundle

Name: PardoCarmen17.pdf
Size: 218.44 KB
Format: Adobe Portable Document Format
