Person:
Susi García, María Del Rosario

First Name
María Del Rosario
Last Name
Susi García
Affiliation
Universidad Complutense de Madrid
Faculty / Institute
Estudios estadísticos
Department
Estadística y Ciencia de los Datos
Area
Estadística e Investigación Operativa
Identifiers
UCM identifier, ORCID, Scopus Author ID, Web of Science ResearcherID, Dialnet ID, Google Scholar ID

Search Results

Now showing 1 - 4 of 4
  • Item
    Project number: PIMCD144/23-24
    Uso del cine y otros medios audiovisuales como herramienta en la docencia de Matemáticas y Estadística
    (2024) Cabrera Gómez, Gloria; Gamboa Pérez, María; Mateo Navas, Luis Miguel; Monge Romojaro, Mireya; Pérez Pérez, Teresa; Pons Bordería, María Jesús; Susi García, María Del Rosario; Taipe Hidalgo, Diana Paulina
    This teaching innovation project aims to use images, podcasts, videos, songs, film clips, television series, the digital press, radio programmes, or social media posts that feature mathematical or statistical content, and to insert them dynamically into the development of the class in order to improve motivation, attention, and learning.
  • Item
    Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions
    (Applied Mathematics and Computation, 2013) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    The multivariate exponential power family is considered for n-dimensional random variables, Z, with a known partition Z ≡ (Y, X) of dimensions p and n - p, respectively, with interest focusing on the conditional distribution Y | X. An infinitesimal variation of any parameter of the joint distribution produces perturbations in both the conditional and marginal distributions. The aim of the study was to determine the local effect of kurtosis deviations using the Kullback-Leibler divergence measure between probability distributions. The additive decomposition of this measure in terms of the conditional and marginal distributions, Y | X and X, is used to define a relative sensitivity measure of the conditional distribution family {Y | X = x}. Finally, simulated results suggest that for large dimensions, the measure is approximately equal to the ratio p/n, and then the effect of non-normality with respect to kurtosis depends only on the relative size of the variables considered in the partition of the random vector. (A schematic numerical sketch of this divergence decomposition appears after this list.)
  • Item
    The effect of block parameter perturbations in Gaussian Bayesian networks: Sensitivity and robustness
    (Information Sciences, 2013) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Susi García, María Del Rosario
    In this work we study the effects of model inaccuracies on the description of a Gaussian Bayesian network with a set of variables of interest and a set of evidential variables. Using the Kullback-Leibler divergence measure, we compare the output of two different networks after evidence propagation: the original network, and a network with perturbations representing uncertainties in the quantitative parameters. We describe two methods for analyzing the sensitivity and robustness of a Gaussian Bayesian network on this basis. In the sensitivity analysis, different expressions are obtained depending on which set of parameters is considered inaccurate. This fact makes it possible to determine the set of parameters that most strongly disturbs the network output. If all of the divergences are small, we can conclude that the network output is insensitive to the proposed perturbations. The robustness analysis is similar, but considers all potential uncertainties jointly. It thus yields only one divergence, which can be used to confirm the overall sensitivity of the network. Some practical examples of this method are provided, including a complex, real-world problem. (An evidence-propagation sketch related to this comparison appears after this list.)
  • Item
    Sensitivity to hyperprior parameters in Gaussian Bayesian networks
    (Journal of Multivariate Analysis, 2014) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, the subclass of Gaussian Bayesian networks (GBNs) can be used to model continuous variables with Gaussian distributions. Here we focus on the task of learning GBNs from data. Factorization of the multivariate Gaussian joint density according to a directed acyclic graph (DAG) provides an alternative and interchangeable representation of a GBN by using the Gaussian conditional univariate densities of each variable given its parents in the DAG. With this latter conditional specification of a GBN, the learning process involves determination of the mean vector, regression coefficients and conditional variance parameters. Some approaches have been proposed to learn these parameters from a Bayesian perspective using different priors, which requires tuning some hyperparameter values. Our goal is to deal with the usual prior distributions given by the normal/inverse gamma form and to evaluate the effect of prior hyperparameter choice on the posterior distribution. As usual in Bayesian robustness, a large class of priors expressed by many hyperparameter values should lead to a small collection of posteriors. From this perspective and using Kullback-Leibler divergence to measure prior and posterior deviations, a local sensitivity measure is proposed to make comparisons. If a robust Bayesian analysis is developed by studying the sensitivity of Bayesian answers to uncertain inputs, this method will also be useful for selecting robust hyperparameter values. (A conjugate-update sketch illustrating this kind of posterior comparison appears after this list.)
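
The Kullback-Leibler decomposition used in "Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions" can be illustrated in its Gaussian special case, where the divergence has a closed form. The sketch below is only illustrative: the paper works with the multivariate exponential power family and its own sensitivity measure, whereas the joint distributions, the perturbation, and the helper name kl_gaussian here are assumptions chosen for a minimal numeric demonstration of the chain-rule split into marginal (X) and conditional (Y | X) contributions.

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL divergence D( N(mu0, S0) || N(mu1, S1) )."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Illustrative joint vector Z = (Y, X) with dim(Y) = p and dim(X) = n - p.
p, n = 2, 6
rng = np.random.default_rng(0)
A = rng.normal(size=(n, n))
mu = np.zeros(n)
S = A @ A.T + n * np.eye(n)        # "original" joint covariance
S_pert = S + 0.2 * np.eye(n)       # small illustrative perturbation

kl_joint = kl_gaussian(mu, S, mu, S_pert)
# Marginal of X is the lower-right (n - p) x (n - p) block.
kl_marg_x = kl_gaussian(mu[p:], S[p:, p:], mu[p:], S_pert[p:, p:])

# Chain rule for KL: joint divergence = marginal-X divergence
#                    + expected divergence of the conditionals Y | X.
kl_cond = kl_joint - kl_marg_x
print(f"joint KL          : {kl_joint:.4f}")
print(f"marginal X part   : {kl_marg_x:.4f}")
print(f"conditional part  : {kl_cond:.4f}")
print(f"conditional share : {kl_cond / kl_joint:.4f}")
```

For Gaussian joints the chain rule guarantees that the printed conditional share lies between 0 and 1; the paper's simulated finding is that, for large dimensions, an analogous share behaves roughly like the ratio p/n.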
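In Gaussian Bayesian networks, evidence propagation reduces to conditioning the joint multivariate Gaussian on the evidential variables, which is the setting of "The effect of block parameter perturbations in Gaussian Bayesian networks". The following sketch uses assumed toy numbers (a 4-variable joint, an ad hoc perturbation, and helper names propagate_evidence / kl_gaussian invented for the example) to compare the conditioned output of an original and a perturbed network with a single Kullback-Leibler number, in the spirit of the robustness analysis described above; it does not reproduce the paper's block-parameter expressions.

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL divergence D( N(mu0, S0) || N(mu1, S1) )."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def propagate_evidence(mu, S, evid, values):
    """Condition N(mu, S) on X[evid] = values; return the mean and covariance
    of the remaining variables of interest."""
    idx = np.arange(len(mu))
    rest = np.setdiff1d(idx, evid)
    S_rr = S[np.ix_(rest, rest)]
    S_re = S[np.ix_(rest, evid)]
    S_ee = S[np.ix_(evid, evid)]
    K = S_re @ np.linalg.inv(S_ee)
    mu_c = mu[rest] + K @ (np.asarray(values) - mu[evid])
    S_c = S_rr - K @ S_re.T
    return mu_c, S_c

# Illustrative 4-node network given directly by its joint mean and covariance.
mu = np.array([1.0, 2.0, 0.5, 0.0])
S = np.array([[2.0, 0.8, 0.4, 0.2],
              [0.8, 1.5, 0.6, 0.3],
              [0.4, 0.6, 1.2, 0.5],
              [0.2, 0.3, 0.5, 1.0]])

# Perturbed network: uncertainty added to the means and the covariance.
mu_p = mu + np.array([0.1, 0.0, -0.1, 0.0])
S_p = S + 0.1 * np.eye(4)

evid, vals = np.array([2, 3]), [1.0, -0.5]      # observed evidential variables
out_orig = propagate_evidence(mu, S, evid, vals)
out_pert = propagate_evidence(mu_p, S_p, evid, vals)

# One number summarising how far the perturbations move the network output.
print("KL(original output || perturbed output):",
      kl_gaussian(*out_orig, *out_pert))
```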
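For the hyperprior sensitivity question in "Sensitivity to hyperprior parameters in Gaussian Bayesian networks", one concrete ingredient is the conjugate normal/inverse-gamma update for a single Gaussian variable. The sketch below is a generic illustration under stated assumptions, not the paper's local sensitivity measure: it updates two nearby hyperparameter choices on the same simulated data and reports the closed-form Kullback-Leibler divergence between the resulting inverse-gamma posteriors of the variance. The data, hyperparameter values, and helper names are all illustrative.

```python
import numpy as np
from scipy.special import gammaln, digamma

def nig_posterior(y, m0, lam0, a0, b0):
    """Conjugate normal/inverse-gamma update for one Gaussian variable:
    mu | s2 ~ N(m0, s2 / lam0), s2 ~ IG(a0, b0). Returns (mn, lamn, an, bn)."""
    n, ybar = len(y), np.mean(y)
    lamn = lam0 + n
    mn = (lam0 * m0 + n * ybar) / lamn
    an = a0 + n / 2
    bn = (b0 + 0.5 * np.sum((y - ybar) ** 2)
          + 0.5 * lam0 * n * (ybar - m0) ** 2 / lamn)
    return mn, lamn, an, bn

def kl_invgamma(a0, b0, a1, b1):
    """Closed-form KL divergence D( IG(a0, b0) || IG(a1, b1) )."""
    return ((a0 - a1) * digamma(a0) - gammaln(a0) + gammaln(a1)
            + a1 * (np.log(b0) - np.log(b1)) + a0 * (b1 - b0) / b0)

rng = np.random.default_rng(1)
y = rng.normal(loc=2.0, scale=1.5, size=30)          # illustrative data

# Two nearby hyperparameter choices for the same normal/inverse-gamma prior form.
post_ref = nig_posterior(y, m0=0.0, lam0=1.0, a0=2.0, b0=2.0)
post_alt = nig_posterior(y, m0=0.5, lam0=2.0, a0=3.0, b0=2.5)

# Compare the induced inverse-gamma posteriors of the variance parameter.
kl = kl_invgamma(post_ref[2], post_ref[3], post_alt[2], post_alt[3])
print("KL between variance posteriors under the two hyperpriors:", kl)
```

If the prior class is robust in the sense described in the abstract, this divergence stays small as the hyperparameters range over a wide set of values.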