RT Journal Article
T1 Embedded Feature Selection for Robust Probability Learning Machines
A1 Carrasco, Miguel
A1 Ivorra, Benjamín Pierre Paul
A1 López, Julio
A1 Ramos Del Olmo, Ángel Manuel
AB Feature selection is essential for building effective machine learning models in binary classification. Eliminating unnecessary features can reduce the risk of overfitting and improve classification performance. Moreover, the data we handle always has a stochastic component, making it important to have robust models that are insensitive to data perturbations. Although there are numerous methods and tools for feature selection, relatively few works deal with embedded feature selection performed with robust classification models. In this work, we introduce robust classifiers with integrated feature selection capabilities, utilizing probability machines based on different penalization techniques, such as the L1-norm or the elastic-net, combined with a novel Direct Feature Elimination process. Numerical experiments on standard databases demonstrate the effectiveness and robustness of the proposed models in classification tasks with a reduced number of features, using original indicators. The study also discusses the trade-offs in combining different penalties to select the most relevant features while minimizing empirical risk.
YR 2024
FD 2024-07-19
LK https://hdl.handle.net/20.500.14352/107037
UL https://hdl.handle.net/20.500.14352/107037
LA eng
NO Ministerio de Ciencia e Innovación
DS Docta Complutense
RD 6 Oct 2024