Person:
Pardo Llorente, Leandro

First Name
Leandro
Last Name
Pardo Llorente
Affiliation
Universidad Complutense de Madrid
Faculty / Institute
Ciencias Matemáticas
Department
Estadística e Investigación Operativa
Area
Estadística e Investigación Operativa

Search Results

Now showing 1 - 5 of 5
  • Item
    Comparison of experiments based on generalized entropy measures.
    (Communications in statistics. Theory and methods, 1993) Pardo Llorente, Julio Ángel; Menéndez, María Luisa; Taneja, I.J.; Pardo Llorente, Leandro
    Taneja (1989) studied a unified (r,s)-entropy which includes as particular cases some of the known entropies. Based on this unified (r,s)-entropy, we have defined the average amount of information provided by an experiment X over the unknown parameter θ with prior knowledge p(θ). Using this average amount of information in unified form, we compare experiments from a Bayesian point of view. Some connections with the criteria of Blackwell and Lehmann are also made.
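    For context, Rényi's entropy of order r is one of the special cases subsumed by a unified (r,s)-entropy family such as Taneja's. A minimal sketch of that special case (illustrative only; the function name and example distributions are assumptions, not taken from the paper):

    ```python
    import math

    def renyi_entropy(p, r):
        """Rényi entropy of order r; one special case of a unified (r,s)-entropy."""
        if r == 1.0:
            # Shannon entropy as the r -> 1 limit
            return -sum(pi * math.log(pi) for pi in p if pi > 0)
        return math.log(sum(pi ** r for pi in p)) / (1.0 - r)

    # A uniform prior carries more uncertainty than a peaked one, so an
    # experiment that resolves it provides more information on average.
    uniform = [0.25, 0.25, 0.25, 0.25]
    peaked = [0.7, 0.1, 0.1, 0.1]
    print(renyi_entropy(uniform, 2.0))  # log 4, about 1.386
    print(renyi_entropy(peaked, 2.0))   # smaller: the peaked prior is less uncertain
    ```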
  • Item
    Asymptotic approximations for the distributions of the (h, f)-divergence goodness-of-fit statistics: application to Renyi’s statistic
    (Kybernetes, 1997) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Pardo Llorente, María del Carmen
    Read (1984) presented an asymptotic expansion for the distribution function of the power divergence statistics whose speed of convergence depends on the parameter of the family. This paper generalizes that result by considering the family of (h, φ)-divergence measures, considers two other closer approximations to the exact distribution, and compares the three approximations for Rényi's statistic in small samples.
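    The power divergence statistics referred to above (the Cressie–Read family) can be sketched as follows; the function name and the counts are illustrative assumptions, not data from the paper:

    ```python
    import math

    def power_divergence(observed, expected, lam):
        """Cressie-Read power divergence statistic.

        lam = 1 gives Pearson's X^2; the lam -> 0 limit gives the
        likelihood-ratio statistic G^2.
        """
        if lam == 0:
            return 2.0 * sum(o * math.log(o / e)
                             for o, e in zip(observed, expected) if o > 0)
        return (2.0 / (lam * (lam + 1.0))) * sum(
            o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
        )

    # Hypothetical counts tested against a uniform null over 4 cells
    obs = [18, 22, 30, 30]
    exp = [25.0, 25.0, 25.0, 25.0]
    print(power_divergence(obs, exp, 1.0))  # Pearson's X^2 = 4.32
    ```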
  • Item
    A necessary power divergence-type family of tests for testing elliptical symmetry
    (Journal of Statistical Computation and Simulation, 2014) Batsidis, Apostolos; Martin, Nirian; Pardo Llorente, Leandro; Zografos, Konstantinos
    This paper presents a family of power divergence-type test statistics for testing the hypothesis of elliptical symmetry. We assess the performance of the new family of test statistics using Monte Carlo simulation. In this context, both the type I error rate and the power of the tests are studied. Specifically, for selected alternatives, we compare the power of the proposed procedure with that proposed by Schott [Testing for elliptical symmetry in covariance-matrix-based analyses, Stat. Probab. Lett. 60 (2002), pp. 395-404]. This last test statistic is easily computed, has a tractable null distribution, and shows very good power against various alternatives, as has been established in previously published simulation studies [F. Huffer and C. Park, A test for elliptical symmetry, J. Multivariate Anal. 98 (2007), pp. 256-281; L. Sakhanenko, Testing for ellipsoidal symmetry: A comparison study, Comput. Stat. Data Anal. 53 (2008), pp. 565-581]. Finally, a well-known real data set is used to illustrate the method developed in this paper.
  • Item
    Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
    (Fuzzy Sets and Systems, 1992) Menéndez Calleja, María Luisa; Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro
    In previous papers we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure. In this paper we first introduce the generalized Jensen difference divergence measures when the available information from the experiment is not exact in order to compare fuzzy information systems. Then we establish the relationship between extended Fisher information and the generalized (r, s)-Jensen difference divergence measures.
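    For orientation, the Jensen difference divergence built from Shannon entropy (the simplest member of the generalized (r, s)-family discussed above) measures how much the entropy of a mixture exceeds the average entropy of its components. A hedged sketch, with names chosen for illustration rather than taken from the paper:

    ```python
    import math

    def shannon(p):
        """Shannon entropy in nats."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    def jensen_difference(p, q):
        """Jensen difference divergence based on Shannon entropy.

        The generalized (r, s)-versions replace shannon() with a
        unified (r, s)-entropy.
        """
        mix = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
        return shannon(mix) - (shannon(p) + shannon(q)) / 2.0

    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(jensen_difference(p, p))  # 0.0 for identical distributions
    print(jensen_difference(p, q))  # positive, by concavity of entropy
    ```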
  • Item
    The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context
    (Information Sciences, 1993) Pardo Llorente, Julio Ángel; Pardo Llorente, Leandro; Menéndez Calleja, María Luisa; Taneja, I.J.
    Taneja [14] studied a unified (r, s)-entropy that includes as particular cases some of the known entropies. Based on this unified (r, s)-entropy, Pardo et al. [8] defined the average amount of information provided by an experiment X over the unknown parameter θ with prior knowledge p(θ). Using this average amount of information in unified form, we compare experiments based on the Bayesian approach. Some connections with the criteria of Blackwell and Lehmann are also made. In this paper, an application of generalized entropy measures to the design and comparison of linear regression experiments is presented.