Neural Networks, volume 134, pages 64-75

Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network

Publication type: Journal Article
Publication date: 2021-02-01
Journal: Neural Networks
Quartile: Q1
SJR: 2.605
CiteScore: 13.9
Impact factor: 6
ISSN: 0893-6080, 1879-2782
Subject areas: Artificial Intelligence, Cognitive Neuroscience
Abstract
This work studies experimental and theoretical approaches to the search for effective local training rules for unsupervised pattern recognition by high-performance memristor-based Spiking Neural Networks (SNNs). First, the possibility of weight change using Spike-Timing-Dependent Plasticity (STDP) is demonstrated with a pair of hardware analog neurons connected through a (CoFeB)ₓ(LiNbO₃)₁₋ₓ nanocomposite memristor. Next, the convergence of learning to a solution of a binary clusterization task is analyzed over a wide range of memristive STDP parameters for a single-layer, fully connected feedforward SNN. The memristive STDP behavior that provides convergence in this simple task is shown to also provide it in the handwritten digit recognition domain for a more complex SNN architecture with Winner-Take-All competition between neurons. To investigate the basic conditions necessary for training convergence, an original probabilistic generative model of a rate-based single-layer network with independent or competing neurons is built and thoroughly analyzed. The main result is a statement of the “correlation growth-anticorrelation decay” principle, which prompts a near-optimal policy for configuring model parameters. This principle is in line with requiring convergence in binary clusterization, which can be regarded as a necessary condition for optimal learning and used as a simple benchmark for tuning the parameters of various neural network realizations with population-rate information coding. Finally, a heuristic algorithm is described for experimentally finding the convergence conditions in a memristive SNN, including robustness to device variability. Owing to its generality, the proposed approach can be applied to a wide range of memristors and neurons in software- or hardware-based rate-coding single-layer SNNs when searching for local rules that ensure unsupervised learning convergence in a pattern recognition task domain.
Highlights
• Supporting correlations in the activities of neurons is a near-optimal learning policy.
• Binary clusterization can serve as a benchmark for tuning the parameters of a rate-coding SNN.
• Shaping the memristive STDP window for binary clusterization helps in more complex tasks.
• Nanocomposite LiNbO₃-based memristors are suitable for always-on learning SNNs.
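The abstract refers to STDP-governed weight change in a memristive synapse. The following is a minimal illustrative sketch of a conventional pair-based exponential STDP update; the amplitudes, time constants, and weight bounds are assumed placeholder values, not the memristive STDP window measured in the paper, where the weight corresponds to device conductance and the window shape is set by the applied spike waveforms.

```python
# Minimal sketch of a pair-based exponential STDP weight update for one synapse.
# All parameter values below are illustrative assumptions, not the paper's data.
import math

A_PLUS, A_MINUS = 0.05, 0.055      # potentiation / depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # STDP time constants in ms (assumed)
W_MIN, W_MAX = 0.0, 1.0            # allowed weight (conductance) range (assumed)

def stdp_dw(dt_ms: float) -> float:
    """Weight change for a spike-time difference dt = t_post - t_pre, in ms."""
    if dt_ms >= 0:   # pre-synaptic spike before post-synaptic spike: potentiation
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    else:            # post-synaptic spike before pre-synaptic spike: depression
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)

def update_weight(w: float, t_pre: float, t_post: float) -> float:
    """Apply one pairwise STDP update and clip the weight to its bounds."""
    w += stdp_dw(t_post - t_pre)
    return min(max(w, W_MIN), W_MAX)

# Example: a pre-spike at 10 ms followed by a post-spike at 15 ms potentiates the synapse.
print(update_weight(0.5, t_pre=10.0, t_post=15.0))
```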

Top-30

Journals

Nanobiotechnology Reports - 10 publications (10.75%)
Studies in Computational Intelligence - 5 publications (5.38%)
Neural Networks - 4 publications (4.3%)
Chaos, Solitons and Fractals - 4 publications (4.3%)
Neural Computing and Applications - 3 publications (3.23%)
Sensors - 2 publications (2.15%)
Mathematics - 2 publications (2.15%)
Applied Surface Science - 2 publications (2.15%)
Journal of Alloys and Compounds - 2 publications (2.15%)
Journal of Physics D: Applied Physics - 2 publications (2.15%)
Neuromorphic Computing and Engineering - 2 publications (2.15%)
Advanced Intelligent Systems - 2 publications (2.15%)
ACS Applied Materials & Interfaces - 2 publications (2.15%)
Neurocomputing - 2 publications (2.15%)
Big Data and Cognitive Computing - 2 publications (2.15%)
Symmetry - 1 publication (1.08%)
Uspekhi Fizicheskih Nauk - 1 publication (1.08%)
Discrete Dynamics in Nature and Society - 1 publication (1.08%)
Journal of Low Power Electronics and Applications - 1 publication (1.08%)
Metals - 1 publication (1.08%)
Frontiers in Computational Neuroscience - 1 publication (1.08%)
Frontiers in Neuroscience - 1 publication (1.08%)
Nanomaterials - 1 publication (1.08%)
Scientific Reports - 1 publication (1.08%)
Journal of Materials Science and Technology - 1 publication (1.08%)
Thin Solid Films - 1 publication (1.08%)
Journal of Materiomics - 1 publication (1.08%)
AEU - International Journal of Electronics and Communications - 1 publication (1.08%)
Physica Status Solidi - Rapid Research Letters - 1 publication (1.08%)

Publishers

Elsevier - 21 publications (22.58%)
Springer Nature - 13 publications (13.98%)
Pleiades Publishing - 12 publications (12.9%)
MDPI - 11 publications (11.83%)
Institute of Electrical and Electronics Engineers (IEEE) - 10 publications (10.75%)
Wiley - 6 publications (6.45%)
IOP Publishing - 4 publications (4.3%)
American Chemical Society (ACS) - 3 publications (3.23%)
Hindawi Limited - 2 publications (2.15%)
Frontiers Media S.A. - 2 publications (2.15%)
Uspekhi Fizicheskikh Nauk Journal - 1 publication (1.08%)
Chinese Ceramic Society - 1 publication (1.08%)
Acta Physica Sinica, Chinese Physical Society and Institute of Physics, Chinese Academy of Sciences - 1 publication (1.08%)
Tyumen State University - 1 publication (1.08%)
American Institute of Mathematical Sciences (AIMS) - 1 publication (1.08%)
AIP Publishing - 1 publication (1.08%)
The Russian Academy of Sciences - 1 publication (1.08%)
Treatise - 1 publication (1.08%)
  • We do not take into account publications without a DOI.
  • Statistics are recalculated only for publications connected to researchers, organizations, and labs registered on the platform.
  • Statistics are recalculated weekly.

Cite this
GOST
Demin V. et al. Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network // Neural Networks. 2021. Vol. 134. pp. 64-75.
GOST (all authors)
Demin V., Nekhaev D. V., Surazhevsky I. A., Nikiruy K. E., Emelyanov A., Nikolaev S., Rylkov V., Kovalchuk M. V. Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network // Neural Networks. 2021. Vol. 134. pp. 64-75.
RIS
TY - JOUR
DO - 10.1016/j.neunet.2020.11.005
UR - https://doi.org/10.1016/j.neunet.2020.11.005
TI - Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network
T2 - Neural Networks
AU - Demin, Vyacheslav
AU - Nekhaev, D V
AU - Surazhevsky, I A
AU - Nikiruy, K E
AU - Emelyanov, Andrey
AU - Nikolaev, S.
AU - Rylkov, Vladimir
AU - Kovalchuk, M. V.
PY - 2021
DA - 2021/02/01
PB - Elsevier
SP - 64
EP - 75
VL - 134
SN - 0893-6080
SN - 1879-2782
ER -
BibTeX
@article{2021_Demin,
author = {Vyacheslav Demin and D V Nekhaev and I A Surazhevsky and K E Nikiruy and Andrey Emelyanov and S. Nikolaev and Vladimir Rylkov and M. V. Kovalchuk},
title = {Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network},
journal = {Neural Networks},
year = {2021},
volume = {134},
publisher = {Elsevier},
month = {feb},
url = {https://doi.org/10.1016/j.neunet.2020.11.005},
pages = {64--75},
doi = {10.1016/j.neunet.2020.11.005}
}