Person:
Pardo Llorente, Leandro

First Name: Leandro
Last Name: Pardo Llorente
Affiliation: Universidad Complutense de Madrid
Faculty / Institute: Ciencias Matemáticas
Department: Estadística e Investigación Operativa
Area: Estadística e Investigación Operativa
Identifiers: UCM identifier, Scopus Author ID, Web of Science ResearcherID, Dialnet ID, Google Scholar ID

Search Results

Now showing 1 - 9 of 9
  • Item
    Informational distances and related statistics in mixed continuous and categorical variables
    (Journal of Statistical Planning and Inference, 1998) Morales González, Domingo; Pardo Llorente, Leandro; Zografos, Konstantinos
    A general class of dissimilarity measures among k ≥ 2 distributions and their sample estimators are considered for mixed continuous and categorical variables. The distributional properties are studied for the location model, and the asymptotic distributions are investigated in the general parametric case. The asymptotic distributions of the resulting statistics are used in various settings to test statistical hypotheses.
  • Item
    The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context
    (Information Sciences, 1993) Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Menéndez Calleja, María Luisa; Taneja, I.J.
    Taneja [14] studied a unified (r, s)-entropy that includes some of the known entropies as particular cases. Based on this unified (r, s)-entropy, Pardo et al. [8] defined the average amount of information provided by an experiment X over the unknown parameter θ with prior knowledge p(θ). By using the average amount of information in unified form, we compare experiments based on the Bayesian approach. Some connections with the criterion of Blackwell and Lehmann are also made. In this paper, an application of generalized entropy measures to the design and comparison of linear regression experiments is presented.
  • Item
    Asymptotic distributions of weighted divergence between discrete distributions
    (Communications in statistics. Theory and methods, 1998) Frank, Ove; Menéndez Calleja, María Luisa; Pardo Llorente, Leandro
    A divergence measure between discrete probability distributions introduced by Csiszár (1967) generalizes the Kullback-Leibler information and several other information measures considered in the literature. We introduce a weighted divergence which generalizes the weighted Kullback-Leibler information considered by Taneja (1985). The weighted divergence between an empirical distribution and a fixed distribution, and the weighted divergence between two independent empirical distributions, are investigated here for large simple random samples, and the asymptotic distributions are shown to be either normal or equal to the distribution of a linear combination of independent χ²-variables. (The standard form of Csiszár's φ-divergence is recalled in a note after this list.)
  • Item
    Comparison of experiments based on generalized entropy measures.
    (Communications in statistics. Theory and methods, 1993) Pardo Llorente, Julio Ángel; Menéndez Calleja, María Luisa; Taneja, I.J.; Pardo Llorente, Leandro
    Taneja (1989) studied a unified (r, s)-entropy which includes some of the known entropies as particular cases. Based on this unified (r, s)-entropy, we have defined the average amount of information provided by an experiment X over the unknown parameter θ with prior knowledge p(θ). By using the average amount of information in unified form, we have compared experiments based on the Bayesian approach. Some connections with the criteria of Blackwell and Lehmann are also made.
  • Item
    Statistical inference for finite Markov chains based on divergences
    (Statistics and probability letters, 1999) Menéndez Calleja, María Luisa; Morales González, Domingo; Pardo Llorente, Leandro; Zografos, Konstantinos
    We consider statistical data forming sequences of states of stationary finite irreducible Markov chains, and draw statistical inference about the transition matrix. The inference consists of estimating the parameters of the transition probabilities and testing simple and composite hypotheses about them. The inference is based on statistics which are suitable weighted sums of normed φ-divergences between theoretical row distributions, evaluated at suitable points, and observed empirical row distributions. The asymptotic distribution of minimum φ-divergence estimators is obtained, as well as critical values of asymptotically α-level tests.
  • Item
    Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
    (Fuzzy Sets and Systems, 1992) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro
    In previous papers we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure. In this paper we first introduce the generalized Jensen difference divergence measures when the available information from the experiment is not exact in order to compare fuzzy information systems. Then we establish the relationship between extended Fisher information and the generalized (r, s)-Jensen difference divergence measures.
  • Item
    Some new statistics for testing hypotheses in parametric models
    (Journal of multivariate analysis, 1997) Morales González, Domingo; Pardo Llorente, Leandro; Vajda, Igor
    The paper deals with simple and composite hypotheses in statistical models with i.i.d. observations and with arbitrary families dominated by finite measures and parametrized by vector-valued variables. It introduces φ-divergence testing statistics as alternatives to the classical ones: the generalized likelihood ratio and the statistics of Wald and Rao. It is shown that, under assumptions of a standard type about the hypotheses and model densities, the results about the asymptotic distribution of the classical statistics, established so far for counting and Lebesgue dominating measures (discrete and continuous models), remain true also in the general case. Further, these results are extended to the φ-divergence statistics with smooth convex functions φ. The choice of φ-divergence statistics that are optimal from the point of view of power is discussed and illustrated by several examples. (The general form of a φ-divergence goodness-of-fit statistic is recalled in a note after this list.)
  • Item
    φ-divergences and nested models.
    (Applied Mathematics Letters, 1997) Menéndez Calleja, María Luisa; Morales González, Domingo; Pardo Llorente, Leandro
    We consider a wide class of statistics, namely φ-divergences. We obtain asymptotic distributions of these statistics in nested models. Our result generalizes previous results in this field.
  • Item
    Asymptotic approximations for the distributions of the (h, φ)-divergence goodness-of-fit statistics: application to Rényi's statistic
    (Kybernetes, 1997) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Pardo Llorente, María del Carmen
    Read (1984) presented an asymptotic expansion for the distribution function of the power-divergence statistics whose speed of convergence depends on the parameter of the family. This paper generalizes that result by considering the family of (h, φ)-divergence measures. Two other, closer approximations to the exact distribution are also considered, and the three approximations are compared for Rényi's statistic in small samples. (The power-divergence family and Rényi's divergence are recalled in a note after this list.)
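
Several of the items above (the weighted-divergence, Markov-chain, parametric-testing and nested-model papers) build on Csiszár's φ-divergence. As background only, and not as part of any abstract, the standard definition for two discrete distributions P = (p_1, ..., p_M) and Q = (q_1, ..., q_M) is

\[
D_\phi(P, Q) \;=\; \sum_{i=1}^{M} q_i \, \phi\!\left(\frac{p_i}{q_i}\right),
\qquad \phi : (0, \infty) \to \mathbb{R} \text{ convex}, \quad \phi(1) = 0 .
\]

The choice \(\phi(t) = t \log t\) recovers the Kullback-Leibler information \(D_{\mathrm{KL}}(P, Q) = \sum_i p_i \log (p_i / q_i)\) mentioned in the abstracts.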
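
The goodness-of-fit use of such divergences, referred to in several abstracts, can be sketched as follows under the usual assumptions (a simple null hypothesis p_0 with positive cell probabilities, and φ twice differentiable at 1 with φ''(1) > 0); the statistics actually studied in the papers above are more general (composite hypotheses, Markov dependence, nested models):

\[
T_n^{\phi} \;=\; \frac{2n}{\phi''(1)} \, D_\phi(\hat{p}, p_0)
\;\xrightarrow[\;n \to \infty\;]{d}\; \chi^2_{M-1}
\qquad \text{under } H_0 : p = p_0 ,
\]

where \(\hat{p}\) is the empirical distribution of a sample of size n over M cells. This is the standard asymptotic result underlying the α-level tests mentioned above.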
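
The last item refers to Read's (1984) expansion for the power-divergence statistics and to Rényi's statistic. For orientation, the Cressie-Read power-divergence statistic and Rényi's divergence of order r are usually written as below; the exact normalization of Rényi's statistic used in the paper may differ:

\[
2 n I^{\lambda}(\hat{p} : p_0) \;=\; \frac{2n}{\lambda(\lambda + 1)} \sum_{i=1}^{M} \hat{p}_i \left[ \left( \frac{\hat{p}_i}{p_{0i}} \right)^{\lambda} - 1 \right],
\qquad
D_r(P, Q) \;=\; \frac{1}{r - 1} \log \sum_{i=1}^{M} p_i^{\,r} q_i^{\,1 - r}, \quad r > 0, \; r \neq 1 .
\]

The values λ = 1 and λ → 0 give Pearson's χ² and the likelihood-ratio statistic, respectively.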