Neuromorphic tuning of feature spaces to overcome the challenge of low-sample high-dimensional data
Publication date
2023
Citation
Zhou, Qinghua, Oliver J. Sutton, Yu-Dong Zhang, Alexander N. Gorban, Valeri A. Makarov, and Ivan Y. Tyukin. "Neuromorphic tuning of feature spaces to overcome the challenge of low-sample high-dimensional data." In 2023 International Joint Conference on Neural Networks (IJCNN), 1-8. Gold Coast, Australia: IEEE, 2023. https://doi.org/10.1109/IJCNN54540.2023.10191304.
Abstract
For learning algorithms, access to large volumes of annotated data is highly desirable but not always available, especially in real-world scenarios. Accordingly, learning in the high-dimensional and low-sample size (HDLS) domain is recognised as one of the core challenges for modern AI systems. In this work, we consider a particular but very practical scenario in the HDLS domain where the number of training samples is not limited to a mere few observations, yet is not large enough to reliably build models with high degrees of expressivity. To address the problem, we present a new neuromorphic algorithm capable of fine-tuning existing feature spaces by learning relevant associations in high-dimensional data with high probability. The algorithm is based on the idea of Concept Cells [1] and mimics properties attributed to memory and learning inherent to living neural systems. We demonstrate, through numerous numerical experiments, that the algorithm can "fine-tune" and "adapt" the feature space of pre-trained neural networks for better performance on new tasks in the HDLS domain. In addition, we study the impact of this "tuning" on quasi-orthogonality measures, which correlate with classification and calibration metrics.
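The abstract's reference to quasi-orthogonality rests on a well-known concentration-of-measure phenomenon: independent random vectors in high-dimensional spaces are almost orthogonal with high probability. The sketch below (an illustration of that general phenomenon, not the paper's algorithm or code) measures the largest pairwise cosine similarity among random unit vectors as the dimension grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_abs_cosine(dim, n_vectors=50):
    """Largest |cosine similarity| over all pairs of random unit vectors in R^dim."""
    v = rng.standard_normal((n_vectors, dim))
    v /= np.linalg.norm(v, axis=1, keepdims=True)  # project onto the unit sphere
    gram = np.abs(v @ v.T)                         # pairwise |cosine| matrix
    np.fill_diagonal(gram, 0.0)                    # ignore self-similarity
    return gram.max()

# As dimension increases, the worst-case pairwise alignment shrinks toward 0:
for dim in (3, 30, 300, 3000):
    print(f"dim={dim:5d}  max |cos| = {max_abs_cosine(dim):.3f}")
```

In low dimensions the 50 vectors are forced to align with one another, while in thousands of dimensions every pair is nearly orthogonal; it is this near-orthogonality of feature representations that the paper's quasi-orthogonality measures quantify.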