Enhanced representation learning with temporal coding in sparsely spiking neural networks - INRIA - Institut National de Recherche en Informatique et en Automatique
Journal article — Frontiers in Computational Neuroscience, 2023

Enhanced representation learning with temporal coding in sparsely spiking neural networks

Abstract

Current representation learning methods in Spiking Neural Networks (SNNs) rely on rate-based encoding, resulting in high spike counts, increased energy consumption, and slower information transmission. In contrast, our proposed method, Weight-Temporally Coded Representation Learning (W-TCRL), utilizes temporally coded inputs, leading to lower spike counts and improved efficiency. To address the challenge of extracting representations from a temporal code with low reconstruction error, we introduce a novel Spike-Timing-Dependent Plasticity (STDP) rule. This rule enables stable learning of relative latencies within the synaptic weight distribution and is locally implemented in space and time, making it compatible with neuromorphic processors. We evaluate the performance of W-TCRL on the MNIST and natural image datasets for image reconstruction tasks. Our results demonstrate relative improvements of 53% for MNIST and 75% for natural images in terms of reconstruction error compared to the SNN state of the art. Additionally, our method achieves significantly higher sparsity, up to 900 times greater, when compared to related work. These findings emphasize the efficacy of W-TCRL in leveraging temporal coding for enhanced representation learning in Spiking Neural Networks.
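The abstract's key mechanism is an STDP rule driven by spike timing rather than firing rate. The W-TCRL rule itself is defined in the paper; as a rough illustration of the general idea, the sketch below implements a standard pair-based STDP update, where a synapse is potentiated if its presynaptic spike precedes the postsynaptic spike and depressed otherwise. All function names, constants, and time values here are illustrative, not taken from the paper.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.05, a_minus=0.04, tau=20.0):
    """Generic pair-based STDP (illustrative, not the paper's W-TCRL rule).

    Synapses whose presynaptic spike arrives before the postsynaptic
    spike (t_pre < t_post) are potentiated; those firing after it are
    depressed. Both effects decay exponentially with the time difference.
    Weights are clipped to [0, 1].
    """
    dt = t_post - t_pre                        # dt > 0: pre fired before post
    dw = np.where(dt > 0,
                  a_plus * np.exp(-dt / tau),  # potentiation branch
                  -a_minus * np.exp(dt / tau)) # depression branch
    return np.clip(w + dw, 0.0, 1.0)

# Two synapses with equal initial weights: one fires 5 ms before the
# postsynaptic spike at t = 15 ms, the other 5 ms after it.
w = np.array([0.5, 0.5])
t_pre = np.array([10.0, 20.0])
w_new = stdp_update(w, t_pre, t_post=15.0)
# The early synapse is strengthened, the late one weakened.
```

In a temporally coded scheme like the one described above, such timing-dependent updates let the weight distribution absorb the relative latencies of the input spikes, which is what makes low-spike-count representation learning possible.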
Main file: fncom-17-1250908.pdf (4.16 MB)
Origin: Publication funded by an institution
License: CC BY - Attribution

Dates and versions

hal-04420023, version 1 (26-01-2024)



Cite

Adrien Fois, Bernard Girau. Enhanced representation learning with temporal coding in sparsely spiking neural networks. Frontiers in Computational Neuroscience, 2023, 17, ⟨10.3389/fncom.2023.1250908⟩. ⟨hal-04420023⟩