Title: Asymptotic distributions of weighted divergence between discrete distributions
Authors: Franck, Ove; Menéndez Calleja, María Luisa; Pardo Llorente, Leandro
Date issued: 1998
Date accessioned/available: 2023-06-20
ISSN: 0361-0926
DOI: 10.1080/03610929808832133
Handle: https://hdl.handle.net/20.500.14352/57877
Type: journal article
Publisher URL: http://www.tandfonline.com/doi/abs/10.1080/03610929808832133 (http://www.tandfonline.com)
Access: metadata only access
Classification: 519.2
Keywords: information; entropy; divergence; goodness-of-fit; asymptotic sampling distributions; statistics; goodness; tests; fit; applied statistics (Estadística aplicada)
Abstract: A divergence measure between discrete probability distributions introduced by Csiszár (1967) generalizes the Kullback-Leibler information and several other information measures considered in the literature. We introduce a weighted divergence which generalizes the weighted Kullback-Leibler information considered by Taneja (1985). The weighted divergence between an empirical distribution and a fixed distribution, and the weighted divergence between two independent empirical distributions, are investigated here for large simple random samples, and the asymptotic distributions are shown to be either normal or equal to the distribution of a linear combination of independent chi-squared variables.
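The weighted Kullback-Leibler information mentioned in the abstract (Taneja, 1985) can be sketched as below. This is an illustrative reading only: the function name `weighted_kl` and the convention that unit weights recover the ordinary Kullback-Leibler divergence are assumptions for the example, not notation taken from the paper.

```python
import math

def weighted_kl(p, q, w):
    """Weighted Kullback-Leibler divergence: sum_i w_i * p_i * log(p_i / q_i).

    p, q -- discrete probability distributions on the same support
    w    -- nonnegative weights, one per cell; w = (1, ..., 1) gives
            the ordinary Kullback-Leibler information (assumed convention)
    Terms with p_i = 0 contribute 0 (the usual 0 * log 0 = 0 convention).
    """
    return sum(
        wi * pi * math.log(pi / qi)
        for pi, qi, wi in zip(p, q, w)
        if pi > 0.0
    )

def empirical(counts):
    """Empirical distribution (relative frequencies) from sample counts."""
    n = sum(counts)
    return [c / n for c in counts]

# Weighted divergence between an empirical and a fixed distribution,
# the setting whose large-sample behavior the paper studies.
p_hat = empirical([30, 70])          # sample of size 100
q = [0.5, 0.5]                       # hypothesized fixed distribution
d = weighted_kl(p_hat, q, [1.0, 2.0])
```

With identical distributions the divergence is zero regardless of the weights, and with unit weights the value coincides with the standard Kullback-Leibler information, which is how this sketch connects back to the abstract's claim of generalization.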