Robust Wald-type tests based on minimum Rényi pseudodistance estimators for the multiple linear regression model

Publisher: Taylor & Francis
We introduce a new family of Wald-type tests, based on minimum Rényi pseudodistance estimators, for testing general linear hypotheses and the residual variance in the multiple linear regression model. The classical Wald test, based on the maximum likelihood estimator, arises as a particular case within this family. Theoretical results, supported by an extensive simulation study, show that some tests in this family are more robust than the classical Wald test. Finally, we provide a data-driven procedure for choosing the optimal test for any given data set.
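The estimation step underlying these tests can be illustrated with a small numerical sketch. Assuming normal errors, minimizing the empirical Rényi pseudodistance of order α between the model and the data reduces, up to additive constants, to minimizing (2(1+α))⁻¹ log σ² − α⁻¹ log[n⁻¹ Σᵢ exp(−α(yᵢ − xᵢᵀβ)²/(2σ²))]; as α → 0 this recovers the maximum likelihood criterion. The exponential factor down-weights large residuals, which is the source of the robustness. The function names and the optimizer choice below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def mrp_objective(params, X, y, alpha):
    """Empirical Renyi-pseudodistance criterion (additive constants
    dropped) for the normal linear regression model, to be minimized."""
    beta, log_sigma = params[:-1], params[-1]
    sigma2 = np.exp(2.0 * log_sigma)
    r = y - X @ beta
    # Exponentially down-weighted residual terms, averaged on the log
    # scale with a max-shift for numerical stability.
    z = -alpha * r**2 / (2.0 * sigma2)
    m = z.max()
    log_avg = m + np.log(np.mean(np.exp(z - m)))
    return 0.5 / (1.0 + alpha) * np.log(sigma2) - log_avg / alpha

def mrp_fit(X, y, alpha=0.3):
    """Minimum RP estimate of (beta, sigma), warm-started at OLS."""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    s0 = np.std(y - X @ beta0)
    x0 = np.concatenate([beta0, [np.log(s0)]])
    res = minimize(mrp_objective, x0, args=(X, y, alpha),
                   method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-9, "fatol": 1e-12})
    return res.x[:-1], np.exp(res.x[-1])
```

On clean data the fit stays close to ordinary least squares; adding a handful of gross outliers to `y` moves the OLS coefficients noticeably while the minimum RP fit with a moderate α changes little.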