Person:
Susi García, María Del Rosario

First Name
María Del Rosario
Last Name
Susi García
Affiliation
Universidad Complutense de Madrid
Faculty / Institute
Estudios estadísticos
Department
Estadística y Ciencia de los Datos
Area
Estadística e Investigación Operativa
Identifiers
UCM identifier, ORCID, Scopus Author ID, Web of Science ResearcherID, Dialnet ID, Google Scholar ID

Search Results

Now showing 1 - 4 of 4
  • Item
    Sensitivity to hyperprior parameters in Gaussian Bayesian networks
    (2010) Gómez Villegas, Miguel Á.; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    Our focus is on learning Gaussian Bayesian networks (GBNs) from data. In GBNs the multivariate normal joint distribution can alternatively be specified by the normal regression models of each variable given its parents in the DAG (directed acyclic graph). In the latter representation the parameters are the mean vector, the regression coefficients and the corresponding conditional variances. The problem of Bayesian learning in this context has been handled with different approximations, all of them concerning the use of different priors for the parameters considered. We work with the most usual prior, given by the normal/inverse gamma form. In this setting we are interested in evaluating the effect of the choice of prior hyperparameters on the posterior distribution. The Kullback-Leibler divergence measure is used as a tool to define local sensitivity, comparing the prior and posterior deviations. This method can be useful to decide the values to be chosen for the hyperparameters.
  • Item
    Perturbing the structure in Gaussian Bayesian networks
    (2009) Susi García, María Del Rosario; Navarro, H.; Main Yaque, Paloma; Gómez Villegas, Miguel Á.
    This paper introduces an n-way sensitivity analysis for Gaussian Bayesian networks that studies the joint effect of variations in a set of similar parameters. The aim is to determine the sensitivity of the model when the parameters that describe the quantitative part are given by the structure of the graph. This analysis therefore studies the effect of uncertainty about the regression coefficients and the conditional variances of the variables given their parents in the graph.
  • Item
    Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions
    (Applied Mathematics and Computation, 2013) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    The multivariate exponential power family is considered for n-dimensional random variables, Z, with a known partition Z ≡ (Y, X) of dimensions p and n - p, respectively, with interest focusing on the conditional distribution Y | X. An infinitesimal variation of any parameter of the joint distribution produces perturbations in both the conditional and marginal distributions. The aim of the study was to determine the local effect of kurtosis deviations using the Kullback-Leibler divergence measure between probability distributions. The additive decomposition of this measure in terms of the conditional and marginal distributions, Y | X and X, is used to define a relative sensitivity measure of the conditional distribution family {Y | X = x}. Finally, simulated results suggest that for large dimensions, the measure is approximately equal to the ratio p/n, and then the effect of non-normality with respect to kurtosis depends only on the relative size of the variables considered in the partition of the random vector.
  • Item
    Sensitivity to hyperprior parameters in Gaussian Bayesian networks
    (Journal of multivariate analysis, 2014) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, the subclass of Gaussian Bayesian networks (GBNs) can be used to model continuous variables with Gaussian distributions. Here we focus on the task of learning GBNs from data. Factorization of the multivariate Gaussian joint density according to a directed acyclic graph (DAG) provides an alternative and interchangeable representation of a GBN by using the Gaussian conditional univariate densities of each variable given its parents in the DAG. With this latter conditional specification of a GBN, the learning process involves determination of the mean vector, the regression coefficients and the conditional variances. Some approaches have been proposed to learn these parameters from a Bayesian perspective using different priors, and therefore some hyperparameter values are tuned. Our goal is to deal with the usual prior distributions given by the normal/inverse gamma form and to evaluate the effect of prior hyperparameter choice on the posterior distribution. As usual in Bayesian robustness, a large class of priors expressed by many hyperparameter values should lead to a small collection of posteriors. From this perspective and using Kullback-Leibler divergence to measure prior and posterior deviations, a local sensitivity measure is proposed to make comparisons. If a robust Bayesian analysis is developed by studying the sensitivity of Bayesian answers to uncertain inputs, this method will also be useful for selecting robust hyperparameter values.
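The local-sensitivity idea in the two "Sensitivity to hyperprior parameters" abstracts can be illustrated with a deliberately simplified conjugate model: a normal mean with known noise variance, rather than the papers' normal/inverse gamma setup. This is a hedged sketch, not the authors' method; all names and numerical values below are illustrative assumptions.

```python
import numpy as np

def kl_normal(mu_p, var_p, mu_q, var_q):
    """Closed-form KL( N(mu_p, var_p) || N(mu_q, var_q) ) for univariate normals."""
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def posterior(mu0, tau0, data, sigma2):
    """Conjugate update: N(mu0, tau0) prior on the mean, known noise variance sigma2."""
    n = len(data)
    var_post = 1.0 / (1.0 / tau0 + n / sigma2)
    mu_post = var_post * (mu0 / tau0 + data.sum() / sigma2)
    return mu_post, var_post

rng = np.random.default_rng(42)
data = rng.normal(loc=2.0, scale=1.0, size=30)   # toy data set
sigma2 = 1.0

# Local sensitivity: perturb a prior hyperparameter and compare how far the
# prior moves (KL between the two priors) against how far the posterior moves.
mu0, tau0 = 0.0, 4.0
mu0_eps, tau0_eps = 0.1, 4.0                     # small shift in the prior mean

kl_prior = kl_normal(mu0, tau0, mu0_eps, tau0_eps)
post_a = posterior(mu0, tau0, data, sigma2)
post_b = posterior(mu0_eps, tau0_eps, data, sigma2)
kl_post = kl_normal(post_a[0], post_a[1], post_b[0], post_b[1])

# Ratio below 1 means the posterior is less sensitive to the perturbation
# than the prior itself, i.e. the data dominate the hyperparameter choice.
sensitivity = kl_post / kl_prior
```

In this conjugate setting the ratio works out to `var_post / tau0`, which is always below 1 and shrinks as the sample size grows, matching the intuition that hyperparameter choice matters less when data are plentiful.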
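The additive decomposition used in the kurtosis abstract above, KL(joint) = KL(marginal of X) + E_x[KL(conditional Y | X = x)], is a general chain-rule identity. A minimal numerical check of it for the Gaussian case (the paper works with the exponential power family; the toy dimensions and random parameters here are illustrative assumptions) can be sketched as:

```python
import numpy as np

def kl_mvn(m0, S0, m1, S1):
    """Closed-form KL( N(m0, S0) || N(m1, S1) ) between multivariate normals."""
    k = m0.shape[0]
    S1inv = np.linalg.inv(S1)
    diff = m1 - m0
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1inv @ S0) + diff @ S1inv @ diff - k + ld1 - ld0)

def split(m, S, p):
    """Partition the parameters of Z = (Y, X), Y being the first p coordinates."""
    return m[:p], m[p:], S[:p, :p], S[:p, p:], S[p:, :p], S[p:, p:]

def decomposed_kl(m0, S0, m1, S1, p):
    """KL of the X marginals plus the expected KL of the Y | X conditionals."""
    mY0, mX0, SYY0, SYX0, SXY0, SXX0 = split(m0, S0, p)
    mY1, mX1, SYY1, SYX1, SXY1, SXX1 = split(m1, S1, p)
    kl_marginal = kl_mvn(mX0, SXX0, mX1, SXX1)
    # Regression matrices and conditional covariances (Schur complements);
    # for Gaussians the conditional covariance does not depend on x.
    B0 = SYX0 @ np.linalg.inv(SXX0)
    B1 = SYX1 @ np.linalg.inv(SXX1)
    C0 = SYY0 - B0 @ SXY0
    C1 = SYY1 - B1 @ SXY1
    C1inv = np.linalg.inv(C1)
    # The conditional mean difference d(x) = c + D x is affine in x, so the
    # expected quadratic form over x ~ N(mX0, SXX0) has a closed form.
    D = B0 - B1
    c = (mY0 - B0 @ mX0) - (mY1 - B1 @ mX1)
    md = c + D @ mX0
    quad = np.trace(C1inv @ D @ SXX0 @ D.T) + md @ C1inv @ md
    _, ldC0 = np.linalg.slogdet(C0)
    _, ldC1 = np.linalg.slogdet(C1)
    kl_conditional = 0.5 * (np.trace(C1inv @ C0) - p + quad + ldC1 - ldC0)
    return kl_marginal + kl_conditional

# Check the decomposition on a random 5-dimensional example with p = 2.
rng = np.random.default_rng(0)
n, p = 5, 2
A0, A1 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
S0, S1 = A0 @ A0.T + n * np.eye(n), A1 @ A1.T + n * np.eye(n)
m0, m1 = rng.normal(size=n), rng.normal(size=n)

direct = kl_mvn(m0, S0, m1, S1)
decomposed = decomposed_kl(m0, S0, m1, S1, p)
```

The paper's relative sensitivity measure is, in this notation, the conditional term's share of the total; the closed-form expectation over x avoids any Monte Carlo error in the check.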