Person:
Pardo Llorente, Leandro

First Name
Leandro
Last Name
Pardo Llorente
Affiliation
Universidad Complutense de Madrid
Faculty / Institute
Mathematical Sciences (Ciencias Matemáticas)
Department
Statistics and Operations Research (Estadística e Investigación Operativa)
Area
Statistics and Operations Research (Estadística e Investigación Operativa)
Identifiers
UCM identifier
Scopus Author ID
Web of Science ResearcherID
Dialnet ID
Google Scholar ID

Search Results

Now showing 1 - 5 of 5
  • Item
    On tests of independence based on minimum phi-divergence estimator with constraints: An application to modeling DNA
    (Computational Statistics and Data Analysis, 2006) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Zografos, Konstantinos
A new family of estimators, minimum phi-divergence estimators, is introduced for the problem of independence in a two-way contingency table, and their asymptotic properties are studied. Based on this new family of estimators, a new family of test statistics for the problem of independence is defined. This new family of test statistics yields the likelihood ratio test and the Pearson test statistic as special cases. A simulation study is presented to show that some of the new test statistics offer an attractive alternative to the classical Pearson and likelihood ratio test statistics for this problem. The procedures proposed in this paper can be used for testing positional independence of a DNA sequence, as illustrated by a numerical example.
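For context, the phi-divergence family underlying these estimators and test statistics is commonly defined as follows. These are the standard textbook definitions from the divergence literature (including the Cressie–Read power-divergence subfamily); the paper's exact notation may differ:

```latex
% Phi-divergence between discrete distributions p = (p_j) and q = (q_j),
% with phi convex on (0, \infty) and phi(1) = 0:
D_{\varphi}(p, q) = \sum_{j} q_j \, \varphi\!\left(\frac{p_j}{q_j}\right)

% Cressie--Read power-divergence subfamily, indexed by \lambda:
\varphi_{\lambda}(x) = \frac{x^{\lambda+1} - x - \lambda (x - 1)}{\lambda(\lambda+1)}

% \lambda = 1 recovers the Pearson chi-squared statistic, and the limit
% \lambda \to 0 recovers the likelihood ratio statistic, which is how both
% classical tests arise as special cases of the phi-divergence family.
```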
  • Item
    Order-Restricted Dose-Related Trend Phi-Divergence Tests For Generalized Linear Models
    (Journal of Applied Statistics, 2007) Felipe Ortega, Ángel; Menéndez Calleja, María Luisa; Pardo Llorente, Leandro
In this paper a new family of test statistics is presented for testing independence between a binary response Y and an ordered categorical explanatory variable X (doses), against the alternative hypothesis of an increasing dose-response relationship between Y and X. The properties of these test statistics are studied. This new family of test statistics is based on the family of φ-divergence measures and contains the likelihood ratio test as a particular case. We pay special attention to the family of test statistics associated with the power divergence family. A simulation study is included in order to analyze the behavior of the power divergence family of test statistics.
  • Item
Asymptotic approximations for the distributions of the (h, φ)-divergence goodness-of-fit statistics: application to Rényi's statistic
    (Kybernetes, 1997) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Pardo Llorente, María del Carmen
Read (1984) presented an asymptotic expansion for the distribution function of the power divergence statistics whose speed of convergence depends on the parameter of the family. This paper generalizes that result by considering the family of (h, φ)-divergence measures, and considers two other closer approximations to the exact distribution. The three approximations are compared for Rényi's statistic in small samples.
  • Item
    Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
    (Fuzzy Sets and Systems, 1992) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro
    In previous papers we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure. In this paper we first introduce the generalized Jensen difference divergence measures when the available information from the experiment is not exact in order to compare fuzzy information systems. Then we establish the relationship between extended Fisher information and the generalized (r, s)-Jensen difference divergence measures.
  • Item
    The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context
    (Information Sciences, 1993) Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Menéndez Calleja, María Luisa; Taneja, I.J.
Taneja [14] studied a unified (r, s)-entropy that includes some of the known entropies as particular cases. Based on this unified (r, s)-entropy, Pardo et al. [8] defined the average amount of information provided by an experiment X about the unknown parameter θ with prior knowledge p(θ). Using this average amount of information in unified form, we compare experiments under the Bayesian approach. Some connections with the criterion of Blackwell and Lehmann are also made. In this paper, an application of generalized entropy measures to the design and comparison of linear regression experiments is presented.