Person:
Susi García, María Del Rosario

First Name
María Del Rosario
Last Name
Susi García
Affiliation
Universidad Complutense de Madrid
Faculty / Institute
Estudios estadísticos
Department
Estadística y Ciencia de los Datos
Area
Estadística e Investigación Operativa
Identifiers
UCM identifier, ORCID, Scopus Author ID, Web of Science ResearcherID, Dialnet ID, Google Scholar ID

Search Results

Now showing 1 - 7 of 7
  • Item
    Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions
    (Applied Mathematics and Computation, 2013) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    The multivariate exponential power family is considered for n-dimensional random variables, Z, with a known partition Z ≡ (Y, X) of dimensions p and n - p, respectively, with interest focusing on the conditional distribution Y | X. An infinitesimal variation of any parameter of the joint distribution produces perturbations in both the conditional and marginal distributions. The aim of the study was to determine the local effect of kurtosis deviations using the Kullback-Leibler divergence measure between probability distributions. The additive decomposition of this measure in terms of the conditional and marginal distributions, Y | X and X, is used to define a relative sensitivity measure of the conditional distribution family {Y | X = x}. Finally, simulated results suggest that for large dimensions the measure is approximately equal to the ratio p/n, so the effect of non-normality with respect to kurtosis depends only on the relative size of the variables considered in the partition of the random vector.
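
As a numerical aside on the decomposition above: in the Gaussian special case of the exponential power family, the chain rule of the Kullback-Leibler divergence can be checked directly. The sketch below is a minimal illustration with assumed toy parameters (mu_f, S_f, mu_g, S_g are not from the paper); it computes the joint divergence between two trivariate normals, subtracts the divergence between the X-marginals, and reports the conditional share, i.e. the relative sensitivity ratio of the family {Y | X = x}.

import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) )."""
    d = len(mu0)
    S1inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1inv @ S0) + diff @ S1inv @ diff
                  - d + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Toy joint over Z = (Y, X) with p = 1, n = 3 (assumed values).
n, p = 3, 1
mu_f = np.zeros(n)
S_f = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.0]])
mu_g = np.array([0.1, 0.0, 0.0])      # perturbed joint distribution
S_g = S_f + 0.2 * np.eye(n)

# Chain rule: KL(f||g) = KL(f_X||g_X) + E_{f_X}[ KL(f_{Y|X}||g_{Y|X}) ].
kl_joint = kl_gauss(mu_f, S_f, mu_g, S_g)
kl_marg_X = kl_gauss(mu_f[p:], S_f[p:, p:], mu_g[p:], S_g[p:, p:])
kl_cond = kl_joint - kl_marg_X        # expected conditional divergence
print("relative sensitivity of {Y|X}:", kl_cond / kl_joint)
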
  • Item
    Evaluating the difference between graph structures in Gaussian Bayesian networks
    (Expert Systems with Applications, 2011) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro Veguillas, Hilario; Susi García, María Del Rosario
    In this work, we evaluate the sensitivity of Gaussian Bayesian networks to perturbations or uncertainties in the regression coefficients of the network arcs and the conditional distributions of the variables. The Kullback–Leibler divergence measure is used to compare the original network to its perturbation. By setting the regression coefficients to zero or non-zero values, the proposed method can remove or add arcs, making it possible to compare different network structures. The methodology is implemented with some case studies.
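
A minimal sketch of this structure comparison, with assumed toy parameters rather than the paper's case studies: the joint covariance of a Gaussian Bayesian network follows from its regression coefficients B and conditional variances v as (I - B)^{-1} diag(v) (I - B)^{-T}, so zeroing a coefficient removes an arc, and the Kullback-Leibler divergence between the original and modified joint distributions quantifies the structural change.

import numpy as np

def gbn_covariance(B, v):
    """Joint covariance from the conditional specification
    X_i = sum_j B[i, j] * X_j + eps_i,  eps_i ~ N(0, v[i]),
    with B strictly lower triangular in a topological order."""
    A = np.linalg.inv(np.eye(len(v)) - B)
    return A @ np.diag(v) @ A.T

def kl_gauss0(S0, S1):
    """KL between zero-mean Gaussians N(0, S0) and N(0, S1)."""
    d = S0.shape[0]
    S1inv = np.linalg.inv(S1)
    return 0.5 * (np.trace(S1inv @ S0) - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Assumed three-node chain X1 -> X2 -> X3.
B = np.array([[0.0, 0.0, 0.0],
              [0.8, 0.0, 0.0],
              [0.0, 0.5, 0.0]])
v = np.array([1.0, 1.0, 1.0])

B_removed = B.copy()
B_removed[2, 1] = 0.0                 # delete the arc X2 -> X3
S, S_rm = gbn_covariance(B, v), gbn_covariance(B_removed, v)
print("KL(original || arc removed):", kl_gauss0(S, S_rm))
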
  • Item
    The effect of block parameter perturbations in Gaussian Bayesian networks: Sensitivity and robustness
    (Information Sciences, 2013) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Susi García, María Del Rosario
    In this work we study the effects of model inaccuracies on the description of a Gaussian Bayesian network with a set of variables of interest and a set of evidential variables. Using the Kullback-Leibler divergence measure, we compare the output of two different networks after evidence propagation: the original network, and a network with perturbations representing uncertainties in the quantitative parameters. We describe two methods for analyzing the sensitivity and robustness of a Gaussian Bayesian network on this basis. In the sensitivity analysis, different expressions are obtained depending on which set of parameters is considered inaccurate. This fact makes it possible to determine the set of parameters that most strongly disturbs the network output. If all of the divergences are small, we can conclude that the network output is insensitive to the proposed perturbations. The robustness analysis is similar, but considers all potential uncertainties jointly. It thus yields only one divergence, which can be used to confirm the overall sensitivity of the network. Some practical examples of this method are provided, including a complex, real-world problem.
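
In a Gaussian Bayesian network, evidence propagation reduces to conditioning a multivariate normal, so the comparison described above can be sketched in a few lines. The example below uses assumed toy parameters (mu, S, the perturbation, and the evidence value are illustrative only): the same evidence is propagated in the original and the perturbed network, and the Kullback-Leibler divergence between the two outputs is reported.

import numpy as np

def condition(mu, S, idx_e, x_e):
    """Posterior of the remaining variables given evidence x_e
    on the variables indexed by idx_e (Gaussian conditioning)."""
    idx_y = [i for i in range(len(mu)) if i not in idx_e]
    K = S[np.ix_(idx_y, idx_e)] @ np.linalg.inv(S[np.ix_(idx_e, idx_e)])
    m = mu[idx_y] + K @ (x_e - mu[idx_e])
    C = S[np.ix_(idx_y, idx_y)] - K @ S[np.ix_(idx_e, idx_y)]
    return m, C

def kl_gauss(mu0, S0, mu1, S1):
    d = len(mu0)
    S1inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1inv @ S0) + diff @ S1inv @ diff
                  - d + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu = np.zeros(3)
S = np.array([[1.0, 0.6, 0.3],
              [0.6, 2.0, 0.8],
              [0.3, 0.8, 1.5]])
S_pert = S.copy()
S_pert[0, 1] = S_pert[1, 0] = 0.7     # an inaccurate covariance parameter

# Propagate the same evidence X3 = 1.2 in both networks, then compare.
m0, C0 = condition(mu, S, [2], np.array([1.2]))
m1, C1 = condition(mu, S_pert, [2], np.array([1.2]))
print("divergence after propagation:", kl_gauss(m0, C0, m1, C1))
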
  • Item
    Calculando la matriz de covarianzas con la estructura de una red Bayesiana Gaussiana
    (2012) Gómez Villegas, Miguel Ángel; Susi García, María Del Rosario
    This work introduces a recursive formula for computing the covariance matrix of a Gaussian Bayesian network given the parameters of the conditional specification of the quantitative part of the model. The variances and covariances of the problem are also determined by considering the different paths that appear in the graph describing the qualitative part of the network.
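
The recursion described in this abstract can be sketched as follows (a minimal implementation under the usual zero-mean conditional specification; the chain network at the end is an assumed example). Each variance and covariance is obtained from earlier rows in a topological order, and the result agrees with the matrix identity Sigma = (I - B)^{-1} diag(v) (I - B)^{-T}; the entry for a pair of nodes accumulates products of regression coefficients along the directed paths connecting them.

import numpy as np

def gbn_covariance_recursive(B, v):
    """Covariance matrix of a Gaussian Bayesian network built row by
    row from X_i = sum_{j<i} B[i, j] X_j + eps_i, eps_i ~ N(0, v[i]):
        Cov(X_i, X_k) = sum_j B[i, j] Cov(X_j, X_k)   for k < i
        Var(X_i)      = v[i] + sum_j B[i, j] Cov(X_j, X_i)"""
    n = len(v)
    S = np.zeros((n, n))
    for i in range(n):
        for k in range(i):
            S[i, k] = S[k, i] = B[i, :i] @ S[:i, k]
        S[i, i] = v[i] + B[i, :i] @ S[:i, i]
    return S

# Chain X1 -> X2 -> X3: entry (0, 2) is the product of the
# coefficients along the single directed path, 0.8 * 0.5 = 0.4.
B = np.array([[0.0, 0.0, 0.0],
              [0.8, 0.0, 0.0],
              [0.0, 0.5, 0.0]])
v = np.array([1.0, 1.0, 1.0])
print(gbn_covariance_recursive(B, v))

# Same result via the matrix identity.
A = np.linalg.inv(np.eye(3) - B)
print(A @ np.diag(v) @ A.T)
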
  • Item
    Extreme inaccuracies in Gaussian Bayesian networks
    (Journal of Multivariate Analysis, 2008) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Susi García, María Del Rosario
    A sensitivity measure is introduced to evaluate the impact of model inaccuracies on the output of a Gaussian Bayesian network after evidence propagation. This sensitivity measure is the Kullback-Leibler divergence, and it yields different expressions depending on the type of parameter to be perturbed, i.e. on the inaccurate parameter. In this work, the behavior of this sensitivity measure is studied when model inaccuracies are extreme, i.e. when extreme perturbations of the parameters can exist. Moreover, the sensitivity measure is evaluated under extreme dependence between the main variables of the network, again in the presence of extreme inaccuracies. This analysis is performed to determine the effect of extreme uncertainty about the initial parameters of a Gaussian Bayesian network and about extreme values of evidence. These ideas and procedures are illustrated with an example.
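
The qualitative behavior under extreme perturbations can be previewed with a one-dimensional sketch (assumed values, far simpler than the networks treated in the paper): the Kullback-Leibler divergence from a reference normal grows without bound both as a mean perturbation becomes extreme and as a variance perturbation approaches zero or infinity.

import numpy as np

def kl_normal(m0, s0, m1, s1):
    """KL( N(m0, s0^2) || N(m1, s1^2) ) for univariate normals."""
    return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

# Increasingly extreme mean perturbations of a reference N(0, 1).
for delta in [0.1, 1.0, 10.0, 100.0]:
    print(f"mean shift {delta:7.1f}: KL = {kl_normal(0, 1, delta, 1):.4g}")

# Variance perturbations toward the extremes 0 and infinity.
for s in [0.01, 0.1, 10.0, 100.0]:
    print(f"std. dev.  {s:7.2f}: KL = {kl_normal(0, 1, 0, s):.4g}")
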
  • Item
    Sensitivity Analysis in Gaussian Bayesian Networks Using a Divergence Measure
    (Communications in Statistics - Theory and Methods, 2007) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Susi García, María Del Rosario
    This article develops a method for performing sensitivity analysis in a Gaussian Bayesian network. The measure presented is based on the Kullback-Leibler divergence and is useful for evaluating the impact of prior changes on the posterior marginal density of the target variable in the network. We find that some changes do not disturb the posterior marginal density of interest. Finally, we describe a method to compare the different sensitivity measures obtained depending on where the inaccuracy lies. An example is used to illustrate the concepts and methods presented.
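
The observation that some prior changes leave the posterior marginal untouched is easy to reproduce in the smallest possible network. The sketch below uses assumed toy numbers for a target T and an evidential variable E: perturbing the prior mean of E shifts the posterior marginal of T only through their covariance, so the divergence vanishes when T and E are uncorrelated.

import numpy as np

def posterior_T(mu, S, e):
    """Posterior marginal of the target T given evidence E = e
    in a bivariate Gaussian over (T, E)."""
    m = mu[0] + S[0, 1] / S[1, 1] * (e - mu[1])
    v = S[0, 0] - S[0, 1] ** 2 / S[1, 1]
    return m, np.sqrt(v)

def kl_normal(m0, s0, m1, s1):
    return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

mu = np.array([0.0, 0.0])
e, delta = 1.0, 0.3                   # evidence and prior-mean perturbation

S = np.array([[1.0, 0.5], [0.5, 2.0]])
m0, s0 = posterior_T(mu, S, e)
m1, s1 = posterior_T(mu + np.array([0.0, delta]), S, e)
print("KL, correlated case:  ", kl_normal(m0, s0, m1, s1))

S0 = np.array([[1.0, 0.0], [0.0, 2.0]])   # T and E uncorrelated
m0, s0 = posterior_T(mu, S0, e)
m1, s1 = posterior_T(mu + np.array([0.0, delta]), S0, e)
print("KL, uncorrelated case:", kl_normal(m0, s0, m1, s1))
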
  • Item
    Sensitivity to hyperprior parameters in Gaussian Bayesian networks
    (Journal of Multivariate Analysis, 2014) Gómez Villegas, Miguel Ángel; Main Yaque, Paloma; Navarro, H.; Susi García, María Del Rosario
    Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, the subclass of Gaussian Bayesian networks (GBNs) can be used to model continuous variables with Gaussian distributions. Here we focus on the task of learning GBNs from data. Factorization of the multivariate Gaussian joint density according to a directed acyclic graph (DAG) provides an alternative and interchangeable representation of a GBN by using the Gaussian conditional univariate densities of each variable given its parents in the DAG. With this latter conditional specification of a GBN, the learning process involves determining the mean vector, the regression coefficients, and the conditional variances. Some approaches have been proposed to learn these parameters from a Bayesian perspective using different priors, which requires tuning some hyperparameter values. Our goal is to deal with the usual prior distributions given by the normal/inverse gamma form and to evaluate the effect of prior hyperparameter choice on the posterior distribution. As usual in Bayesian robustness, a large class of priors expressed by many hyperparameter values should lead to a small collection of posteriors. From this perspective, and using the Kullback-Leibler divergence to measure prior and posterior deviations, a local sensitivity measure is proposed to make comparisons. If a robust Bayesian analysis is developed by studying the sensitivity of Bayesian answers to uncertain inputs, this method will also be useful for selecting robust hyperparameter values.
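
A minimal sketch of the kind of comparison this abstract describes, for a single Gaussian variable rather than a full network (data and hyperparameter values are assumed): two normal/inverse-gamma priors are updated on the same sample, and the Kullback-Leibler divergence between the resulting inverse-gamma posteriors for the variance measures how much the hyperparameter choice still matters after seeing the data.

import numpy as np
from scipy.special import digamma, gammaln

def nig_posterior(mu0, k0, a0, b0, x):
    """Conjugate normal/inverse-gamma update for an i.i.d. Gaussian sample."""
    n, xbar = len(x), np.mean(x)
    kn = k0 + n
    mun = (k0 * mu0 + n * xbar) / kn
    an = a0 + n / 2
    bn = b0 + np.sum((x - xbar) ** 2) / 2 + k0 * n * (xbar - mu0) ** 2 / (2 * kn)
    return mun, kn, an, bn

def kl_invgamma(a0, b0, a1, b1):
    """KL between inverse-gamma laws; equals the gamma-gamma divergence
    with the same shapes/rates, since KL is invariant under x -> 1/x."""
    return ((a0 - a1) * digamma(a0) - gammaln(a0) + gammaln(a1)
            + a1 * (np.log(b0) - np.log(b1)) + a0 * (b1 - b0) / b0)

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)     # synthetic sample (assumed)

# Two hyperparameter choices within the same prior family.
_, _, aA, bA = nig_posterior(0.0, 1.0, 2.0, 2.0, x)
_, _, aB, bB = nig_posterior(0.0, 0.1, 3.0, 1.0, x)
print("posterior divergence for the variance:", kl_invgamma(aA, bA, aB, bB))
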