Main Yaque, Paloma

Universidad Complutense de Madrid
Faculty / Institute: Ciencias Matemáticas (Estadística e Investigación Operativa)

Search Results (showing 1 - 10 of 21)
  • Publication
    Functional proteomics outlines the complexity of breast cancer molecular subtypes
    (Nature Publishing Group, 2017) Gamez-Pozo, A.; Trilla-Fuentes, L.; Berges-Soria, J.; Selevsek, N.; López-Vacas, R.; Díaz-Almiron, M.; Nanni, P.; Arevalillo, J. M.; Navarro, H.; Grossmann, J.; Moreno, F. G.; Rioja, R. G.; Prado-Vazquez, G.; Zapater-Moros, A.; Main Yaque, Paloma; Feliu, J.; del Prado, P.; Zamora, P.; Ciruelos, E.; Espinosa, E.; Vara, J. A. F.
    Breast cancer is a heterogeneous disease comprising a variety of entities with various genetic backgrounds. Estrogen receptor-positive, human epidermal growth factor receptor 2-negative tumors typically have a favorable outcome; however, some patients eventually relapse, which suggests some heterogeneity within this category. In the present study, we used proteomics and miRNA profiling techniques to characterize a set of 102 either estrogen receptor-positive (ER+)/progesterone receptor-positive (PR+) or triple-negative formalin-fixed, paraffin-embedded breast tumors. Protein expression-based probabilistic graphical models and flux balance analyses revealed that some ER+/PR+ samples had a protein expression profile similar to that of triple-negative samples and had a clinical outcome similar to those with triple-negative disease. This probabilistic graphical model-based classification had prognostic value in patients with luminal A breast cancer. This prognostic information was independent of that provided by standard genomic tests for breast cancer, such as MammaPrint, Oncotype DX and the 8-gene Score.
  • Publication
    Analyzing the effect of introducing a kurtosis parameter in Gaussian Bayesian networks
    (Elsevier Sci. Ltd., 2009-05) Main Yaque, Paloma; Navarro Veguillas, Hilario
    Gaussian Bayesian networks are graphical models that represent the dependence structure of a multivariate normal random variable with a directed acyclic graph (DAG). In Gaussian Bayesian networks the output is usually the conditional distribution of some unknown variables of interest given a set of evidential nodes whose values are known. Uncertainty about the normality assumption is very common in applications, so a sensitivity analysis of the effect of non-normality on the conclusions may be necessary. The aspect of non-normality considered here is tail behavior. In this direction, the multivariate exponential power distribution is a family indexed by a kurtosis parameter that ranges from leptokurtic to platykurtic distributions, with the normal as the mesokurtic case. A more general model can therefore be obtained by using the multivariate exponential power distribution to describe the joint distribution of a Bayesian network, with the kurtosis parameter reflecting deviations from the normal distribution. The sensitivity of the conclusions to this perturbation is analyzed using the Kullback-Leibler divergence measure, which provides a convenient formula to evaluate the effect.
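The kind of tail perturbation described in this abstract can be illustrated numerically. The sketch below (hypothetical code, not taken from the paper) computes the Kullback-Leibler divergence between the standard normal density and a unit-variance univariate exponential power density with kurtosis parameter beta, using simple trapezoidal quadrature; beta = 2 recovers the normal, so the divergence there is zero.

```python
import math

def epd_pdf(x, beta):
    """Density of a zero-mean, unit-variance exponential power variable.

    beta is the kurtosis parameter: beta = 2 is exactly the standard normal,
    beta < 2 is leptokurtic (heavier tails), beta > 2 is platykurtic.
    """
    # Scale alpha chosen so the variance equals 1.
    alpha = math.sqrt(math.gamma(1.0 / beta) / math.gamma(3.0 / beta))
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-((abs(x) / alpha) ** beta))

def kl_normal_vs_epd(beta, lo=-12.0, hi=12.0, n=20001):
    """KL(N(0,1) || EPD(beta)) by trapezoidal quadrature."""
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        p = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)  # N(0,1)
        q = epd_pdf(x, beta)
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * p * math.log(p / q)
    return total * h

print(kl_normal_vs_epd(2.0))  # ~0: beta = 2 is the Gaussian itself
print(kl_normal_vs_epd(1.0))  # > 0: Laplace-type (leptokurtic) perturbation
```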
  • Publication
    Sensitivity to hyperprior parameters in Gaussian Bayesian networks
    (2010-06-01) Gómez Villegas, Miguel Á.; Main Yaque, Paloma; Navarro, H.; Susi García, Rosario
    Our focus is on learning Gaussian Bayesian networks (GBNs) from data. In GBNs the multivariate normal joint distribution can alternatively be specified by the normal regression models of each variable given its parents in the DAG (directed acyclic graph). In the latter representation the parameters are the mean vector, the regression coefficients and the corresponding conditional variances. The problem of Bayesian learning in this context has been handled with different approximations, all of them concerning the use of different priors for the parameters considered; we work with the most usual prior, given by the normal/inverse gamma form. In this setting we are interested in evaluating the effect of the choice of prior hyperparameters on the posterior distribution. The Kullback-Leibler divergence measure is used as a tool to define local sensitivity, comparing the prior and posterior deviations. This method can be useful to decide the values to be chosen for the hyperparameters.
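The local-sensitivity idea, comparing a prior deviation with the posterior deviation it induces via Kullback-Leibler divergence, can be sketched in a deliberately simplified conjugate setting: a normal mean with known variance rather than the paper's normal/inverse gamma prior. All names and numbers below are illustrative.

```python
import math

def kl_normal(m0, v0, m1, v1):
    """KL(N(m0, v0) || N(m1, v1)) for univariate normals (v = variance)."""
    return 0.5 * (math.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

def posterior(m, tau2, xbar, n, sigma2=1.0):
    """Conjugate update for a normal mean with known variance sigma2:
    prior N(m, tau2), sample mean xbar from n observations."""
    v = 1.0 / (1.0 / tau2 + n / sigma2)
    return v * (m / tau2 + n * xbar / sigma2), v

m, tau2, xbar, n = 0.0, 1.0, 0.3, 25
delta = 0.05                       # small perturbation of the prior mean
post0 = posterior(m, tau2, xbar, n)
post1 = posterior(m + delta, tau2, xbar, n)
kl_prior = kl_normal(m, tau2, m + delta, tau2)
kl_post = kl_normal(post0[0], post0[1], post1[0], post1[1])
print(kl_post / kl_prior)  # < 1: the data damp the prior perturbation
```

In this toy case the ratio has a closed form (the posterior-to-prior variance ratio), so larger samples make the posterior progressively less sensitive to the hyperparameter choice.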
  • Publication
    Conditional Specification with Exponential Power Distributions
    (Marcel Dekker Inc., 2010-06-10) Main Yaque, Paloma; Navarro Veguillas, Hilario
    The problem of modeling Bayesian networks with continuous nodes deals with discrete approximations and conditional linear Gaussian models. In this article we consider the possibility of using the exponential power family as conditional probability densities. It is shown that for some platykurtic conditional distributions in this family, conditional regression functions are constant. These results give conditions to avoid compatibility problems when distributions with lighter tails than the normal are used in the description of conditional densities to specify joint densities, as in Bayesian networks.
  • Publication
    A Bayesian Analysis For The Multivariate Point Null Testing Problem
    (Taylor & Francis, 2009-08) Gómez Villegas, Miguel A.; Main Yaque, Paloma; Sanz San Miguel, Luis
    A Bayesian test for the point null testing problem in the multivariate case is developed. A procedure for obtaining the mixed distribution from the prior density is suggested. To compare the Bayesian and classical approaches, lower bounds on posterior probabilities of the null hypothesis, over some reasonable classes of prior distributions, are computed and compared with the p-value of the classical test. With this procedure a closer agreement is obtained, since the p-value falls within the range of the Bayesian measures of evidence.
  • Publication
    Extreme Inaccuracies In Gaussian Bayesian Networks
    (Elsevier, 2008) Gómez Villegas, Miguel A.; Main Yaque, Paloma; Susi García, Rosario
    To evaluate the impact of model inaccuracies on the network's output after evidence propagation in a Gaussian Bayesian network, a sensitivity measure is introduced. This sensitivity measure is the Kullback-Leibler divergence, and it yields different expressions depending on the type of parameter to be perturbed, i.e. on the inaccurate parameter. In this work, the behavior of this sensitivity measure is studied when model inaccuracies are extreme, i.e. when extreme perturbations of the parameters can exist. Moreover, the sensitivity measure is evaluated for extreme situations of dependence between the main variables of the network, and its behavior under extreme inaccuracies is studied. This analysis is performed to determine the effect, in a Gaussian Bayesian network, of extreme uncertainty about the initial parameters of the model and about extreme values of evidence. These ideas and procedures are illustrated with an example.
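A toy two-node network (hypothetical, not the example from the paper) shows how such a KL-based sensitivity measure behaves: the output for Y after observing X = x is univariate normal, and the divergence between the outputs of the original and a perturbed model grows without bound as the perturbation of the mean of X becomes extreme.

```python
import math

def kl_normal(m0, v0, m1, v1):
    """KL(N(m0, v0) || N(m1, v1)) for univariate normals (v = variance)."""
    return 0.5 * (math.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

def output_of_y(mu_x, mu_y, slope, cond_var, x_evidence):
    """Output of Y after evidence propagation in the toy GBN X -> Y,
    where Y | X = x ~ N(mu_y + slope * (x - mu_x), cond_var)."""
    return mu_y + slope * (x_evidence - mu_x), cond_var

base = output_of_y(mu_x=0.0, mu_y=1.0, slope=2.0, cond_var=0.5, x_evidence=1.5)
for delta in (0.1, 1.0, 10.0):     # increasingly extreme inaccuracies in mu_x
    pert = output_of_y(mu_x=delta, mu_y=1.0, slope=2.0,
                       cond_var=0.5, x_evidence=1.5)
    # Divergence grows quadratically in delta: unbounded for extreme errors.
    print(delta, kl_normal(base[0], base[1], pert[0], pert[1]))
```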
  • Publication
    On tail behavior in Bayesian location inference
    (Elsevier Science Bv., 1997-11-01) Main Yaque, Paloma; Navarro Veguillas, Hilario
    The asymptotic behavior in the right tail of the hazard rate function is considered to compare probability distributions. Using this tail ordering, the position of the posterior distribution with respect to the prior and the likelihood distributions is analyzed for a Bayesian location problem, and it is proved that, under rather general conditions, the posterior distribution is equivalent to the lightest-tailed distribution, except when both the likelihood and the prior are very heavy-tailed distributions. The relationship between the posterior distributions based on random samples of sizes n and 1, respectively, is also studied, as well as its dependence on the relative position of the prior distribution and the model for observations in the hazard rate scale.
  • Publication
    A suitable Bayesian approach in testing point null hypothesis: some examples revisited
    (Marcel Dekker, 2002) Gómez Villegas, Miguel A.; Main Yaque, Paloma; Sanz San Miguel, Luis
    In the problem of testing the point null hypothesis H0: θ = θ0 versus H1: θ ≠ θ0, with a previously given prior density π(θ) for the parameter θ, we propose the following methodology: fix an interval of radius ε around θ0 and assign to H0 a prior mass, π0, computed from the density π(θ) over the interval (θ0 - ε, θ0 + ε), spreading the remainder, 1 - π0, over H1 according to π(θ). It is shown that for Lindley's paradox, the normal model with several different priors, and the Darwin-Fisher example, this procedure makes the posterior probability of H0 and the p-value match better than when the prior mass assigned to H0 is 0.5.
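The ε-interval device can be sketched numerically. The code below is an illustrative reconstruction, not the authors' implementation: it takes a single observation from a normal model with a normal prior (both chosen here for convenience), computes π0 as the prior mass of (θ0 - ε, θ0 + ε), and returns the posterior probability of H0.

```python
import math

def norm_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2.0 * math.pi * var)

def posterior_prob_h0(x, theta0=0.0, eps=0.1, tau2=2.0, sigma2=1.0):
    """Posterior probability of H0 for one observation x ~ N(theta, sigma2),
    prior pi(theta) = N(theta0, tau2), using the epsilon-mass device:
    pi0 = prior mass of (theta0 - eps, theta0 + eps), placed on theta0,
    with the remaining mass spread over H1 according to pi."""
    # pi0 from the prior over the interval around theta0.
    z = eps / math.sqrt(tau2)
    pi0 = math.erf(z / math.sqrt(2.0))  # P(|theta - theta0| < eps) under pi
    # Marginal likelihood under H1: integrate f(x|theta) pi(theta) / (1 - pi0)
    # over theta outside the interval (simple trapezoidal quadrature).
    lo, hi, n = theta0 - 10.0, theta0 + 10.0, 20001
    h = (hi - lo) / (n - 1)
    m1 = 0.0
    for i in range(n):
        t = lo + i * h
        if abs(t - theta0) >= eps:
            w = 0.5 if i in (0, n - 1) else 1.0
            m1 += w * norm_pdf(x, t, sigma2) * norm_pdf(t, theta0, tau2)
    m1 *= h / (1.0 - pi0)
    m0 = norm_pdf(x, theta0, sigma2)      # likelihood at the point null
    return pi0 * m0 / (pi0 * m0 + (1.0 - pi0) * m1)

print(posterior_prob_h0(x=1.96))  # compare with the two-sided p-value ~0.05
```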
  • Publication
    Assessing the effect of kurtosis deviations from Gaussianity on conditional distributions
    (Elsevier, 2013-07-01) Gómez Villegas, Miguel A.; Main Yaque, Paloma; Navarro, H.; Susi, R.
    The multivariate exponential power family is considered for n-dimensional random variables, Z, with a known partition Z = (Y, X) of dimensions p and n - p, respectively, with interest focusing on the conditional distribution Y | X. An infinitesimal variation of any parameter of the joint distribution produces perturbations in both the conditional and marginal distributions. The aim of the study is to determine the local effect of kurtosis deviations using the Kullback-Leibler divergence measure between probability distributions. The additive decomposition of this measure in terms of the conditional and marginal distributions, Y | X and X, is used to define a relative sensitivity measure of the conditional distribution family {Y | X = x}. Finally, simulated results suggest that for large dimensions the measure is approximately equal to the ratio p/n, so the effect of non-normality with respect to kurtosis depends only on the relative size of the variables considered in the partition of the random vector.
  • Publication
    Simulación de Sucesos Discretos. Prácticas en casos reales con R. Competición de estudiantes de simulación
    (La autora, 2016) Main Yaque, Paloma
    Discrete Event Simulation (DES) is a methodology for applying stochastic simulation procedures to represent a system in which the component random variables are interrelated. This monograph collects several real-world cases in which DES can be applied, together with their implementation in R and the final results.
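As an illustrative sketch of the methodology (a hypothetical example, not one of the monograph's cases, and written in Python although the monograph works in R): an event-by-event simulation of a single-server M/M/1 queue, whose estimated mean waiting time can be checked against the closed-form value lam / (mu * (mu - lam)).

```python
import random

def mm1_mean_wait(lam=0.8, mu=1.0, n_customers=20000, seed=1):
    """Simulate a single-server M/M/1 queue (Poisson arrivals at rate lam,
    exponential services at rate mu, FIFO discipline) and return the mean
    time spent waiting in queue. Theory: W_q = lam / (mu * (mu - lam))."""
    rng = random.Random(seed)
    t, waits, server_free = 0.0, [], 0.0
    for _ in range(n_customers):
        t += rng.expovariate(lam)       # next arrival time
        start = max(t, server_free)     # service begins once the server frees
        waits.append(start - t)         # time this customer waited in queue
        server_free = start + rng.expovariate(mu)
    return sum(waits) / len(waits)

print(mm1_mean_wait())  # theory gives W_q = 0.8 / (1.0 * 0.2) = 4.0
```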