Blessing of dimensionality in spiking neural networks: the by-chance functional learning

Publication date

2025

Publisher

Frontiers Media
Abstract

Spiking neural networks (SNNs) hold significant promise for power-efficient neuromorphic AI. However, training them is challenging because most of the learning principles known from artificial neural networks are hardly applicable. Recently, the concept of the “blessing of dimensionality” has been used successfully to treat high-dimensional data and representations of reality. It exploits the fundamental trade-off between the complexity and the simplicity of statistical sets in high-dimensional spaces, without relying on global optimization techniques. We show that the frequency encoding of memories in SNNs can leverage this paradigm: it enables the detection and learning of arbitrary information items, provided that the networks operate in high dimensions. To illustrate the hypothesis, we develop a minimalist model of information processing in layered brain structures and study the emergence of extreme selectivity to multiple stimuli and of associative memories. Our results suggest that the global optimization of cost functions may be circumvented at different levels of information processing in SNNs and replaced with by-chance learning, greatly simplifying the design of AI devices.
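
The stochastic-separation idea behind such by-chance learning can be illustrated with a minimal, self-contained sketch that is not taken from the publication: in high dimensions, a new randomly drawn item is, with overwhelming probability, separated from a large set of previously stored random items by a single fixed linear functional, so no iterative optimization is required. The uniform distribution, the threshold of 0.8, and the set sizes below are illustrative assumptions.

```python
import numpy as np

def fisher_separable(x, background, alpha=0.8):
    """Return True if the single point x is separated from every point y in
    `background` by the linear functional l(y) = <x, y>, i.e.
    <x, y> < alpha * <x, x> for all y (Fisher-type separability)."""
    return bool(np.all(background @ x < alpha * (x @ x)))

rng = np.random.default_rng(0)

n_background = 10_000      # size of the random "memory" set (assumed)
trials = 200               # number of random repetitions per dimension

for dim in (3, 10, 100, 1000):   # dimensionality of the rate-code vector
    hits = 0
    for _ in range(trials):
        # Draw the stored items and one new item uniformly from the cube [-1, 1]^dim.
        data = rng.uniform(-1.0, 1.0, size=(n_background + 1, dim))
        x, background = data[0], data[1:]
        hits += fisher_separable(x, background)
    print(f"d={dim:5d}: new item separable by one linear unit "
          f"in {hits / trials:.0%} of trials")
```

Running the sketch shows the probability of separation by a single, optimization-free linear unit rising toward 1 as the dimension grows, which is the "blessing of dimensionality" effect the abstract invokes.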
