%0 Journal Article %A Carrasco, Miguel %A Ivorra, Benjamín Pierre Paul %A López, Julio %A Ramos Del Olmo, Ángel Manuel %T Embedded Feature Selection for Robust Probability Learning Machines %D 2024 %U https://hdl.handle.net/20.500.14352/107037 %X Feature selection is essential for building effective machine learning models in binary classification. Eliminating unnecessary features can reduce the risk of overfitting and improve classification performance. Moreover, real-world data always carries a stochastic component, so it is important to build robust models that are insensitive to data perturbations. Although numerous methods and tools exist for feature selection, relatively few works address embedded feature selection performed with robust classification models. In this work, we introduce robust classifiers with integrated feature selection capabilities, using probability machines based on different penalization techniques, such as the L1-norm or the elastic net, combined with a novel Direct Feature Elimination process. Numerical experiments on standard databases demonstrate the effectiveness and robustness of the proposed models in classification tasks with a reduced number of features, using original indicators. The study also discusses the trade-offs involved in combining different penalties to select the most relevant features while minimizing empirical risk. %~