 

Embedded Feature Selection for Robust Probability Learning Machines

Publication date

2024

Abstract

Feature selection is essential for building effective machine learning models for binary classification. Eliminating unnecessary features can reduce the risk of overfitting and improve classification performance. Moreover, the data we handle always has a stochastic component, making it important to have robust models that are insensitive to data perturbations. Although there are numerous methods and tools for feature selection, relatively few works deal with embedded feature selection performed with robust classification models. In this work, we introduce robust classifiers with integrated feature selection capabilities, utilizing probability machines based on different penalization techniques, such as the L1-norm or the elastic-net, combined with a novel Direct Feature Elimination process. Numerical experiments on standard databases demonstrate the effectiveness and robustness of the proposed models in classification tasks with a reduced number of features, using original indicators. The study also discusses the trade-offs in combining different penalties to select the most relevant features while minimizing empirical risk.
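
To make the idea of embedded selection via penalized probability machines concrete, the following minimal Python sketch fits an elastic-net penalized logistic regression (a simple probability machine), drops features whose penalized coefficients shrink to near zero, and refits on the surviving features. This is an illustrative assumption-laden example, not the authors' Direct Feature Elimination procedure; the dataset, penalty mix (l1_ratio, C), and the threshold tol are choices made here purely for demonstration.

```python
# Illustrative sketch of embedded feature selection with an elastic-net
# penalized probability machine. Not the paper's method; all hyperparameters
# below are assumptions chosen for demonstration.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Elastic-net penalty: l1_ratio mixes L1 (sparsity) with L2 (stability).
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=0.5, max_iter=5000).fit(X_train, y_train)

# Embedded selection: keep only features with non-negligible coefficients.
tol = 1e-4
selected = np.flatnonzero(np.abs(clf.coef_.ravel()) > tol)

# Refit the probability machine on the reduced feature set.
clf_reduced = LogisticRegression(penalty="elasticnet", solver="saga",
                                 l1_ratio=0.5, C=0.5, max_iter=5000)
clf_reduced.fit(X_train[:, selected], y_train)

print(f"kept {selected.size} of {X.shape[1]} features, "
      f"test accuracy = {clf_reduced.score(X_test[:, selected], y_test):.3f}")
```

The design choice illustrated here is the trade-off the abstract mentions: the L1 component of the penalty drives coefficients exactly to zero (yielding the embedded selection), while the L2 component keeps the fit stable under correlated features and data perturbations.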
