Scalable energy-efficient, low-latency implementations of trained spiking Deep Belief Networks on SpiNNaker

Publication type: Proceedings Article
Publication date: 2015-07-01
Abstract
Deep neural networks have become the state-of-the-art approach for classification in machine learning, and Deep Belief Networks (DBNs) are one of their most successful representatives. DBNs consist of many neuron-like units, which are connected only to neurons in neighboring layers. Larger DBNs have been shown to perform better, but scaling up poses problems for conventional CPUs, which calls for efficient implementations on parallel computing architectures, in particular ones that reduce the communication overhead. In this context, we introduce a realization of a spike-based variant of previously trained DBNs on the biologically inspired parallel SpiNNaker platform. The DBN on SpiNNaker runs in real time and achieves a classification performance of 95% on the MNIST handwritten digit dataset, only 0.06% less than that of a pure software implementation. Importantly, using a neurally inspired architecture yields additional benefits: during network run-time on this task, the platform consumes only 0.3 W, with classification latencies on the order of tens of milliseconds, making it suitable for implementing such networks on a mobile platform. The results in this paper also show how the power dissipation of the SpiNNaker platform and the classification latency of a network scale with the number of neurons and layers in the network and with the overall spike activity rate.
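To make the layered structure concrete, the sketch below shows, in rough outline and not as the authors' code, how such a feed-forward spiking network can be described with the PyNN API that SpiNNaker accepts. The 784-500-500-10 topology, the leaky integrate-and-fire cell model, the fixed synaptic weight, and the input firing rate are illustrative assumptions; in the actual work the weights come from a DBN trained offline and the inputs encode MNIST pixel intensities as spike rates.

```python
# Minimal sketch (assumptions noted in comments): a feed-forward spiking
# network described with PyNN, the interface used to configure SpiNNaker.
import pyNN.spiNNaker as sim  # sPyNNaker front-end; other PyNN backends (e.g. pyNN.nest) also accept this script

sim.setup(timestep=1.0)  # 1 ms resolution; SpiNNaker executes this in real time

layer_sizes = [784, 500, 500, 10]  # assumed example topology (MNIST input, two hidden layers, 10 outputs)

# Input layer: Poisson spike sources. In the real system each source's rate
# would encode one pixel's intensity rather than a common placeholder rate.
layers = [sim.Population(layer_sizes[0], sim.SpikeSourcePoisson(rate=20.0), label="input")]

# Hidden and output layers: leaky integrate-and-fire neurons.
for i, n in enumerate(layer_sizes[1:], start=1):
    layers.append(sim.Population(n, sim.IF_curr_exp(), label="layer_%d" % i))

# DBN-style connectivity: each layer projects only to the next one.
# The fixed 0.05 weight is a placeholder; the trained DBN weights would be used instead.
for pre, post in zip(layers[:-1], layers[1:]):
    sim.Projection(pre, post, sim.AllToAllConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.05, delay=1.0),
                   receptor_type="excitatory")

layers[-1].record("spikes")  # the most active output neuron gives the predicted digit
sim.run(1000.0)              # present one (placeholder) input for 1 s
output_spikes = layers[-1].get_data("spikes")
sim.end()
```

Because neighboring layers are fully connected while non-adjacent layers are not, the spike traffic between SpiNNaker cores stays local to consecutive layers, which is what keeps the communication overhead low at scale.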

Top-30

Journals

Neuromorphic Computing and Engineering - 2 publications, 3.77%
Lecture Notes in Computer Science - 2 publications, 3.77%
IEEE Transactions on Biomedical Circuits and Systems - 2 publications, 3.77%
IEEE Transactions on Multi-Scale Computing Systems - 2 publications, 3.77%
IEEE Transactions on Circuits and Systems I: Regular Papers - 2 publications, 3.77%
IEEE Access - 2 publications, 3.77%
Proceedings of the IEEE - 2 publications, 3.77%
ACM Transactions on Parallel Computing - 1 publication, 1.89%
ACM Transactions on Reconfigurable Technology and Systems - 1 publication, 1.89%
Operating Systems Review (ACM) - 1 publication, 1.89%
ACM SIGARCH Computer Architecture News - 1 publication, 1.89%
MATEC Web of Conferences - 1 publication, 1.89%
Nature Machine Intelligence - 1 publication, 1.89%
Synthesis Lectures on Computer Architecture - 1 publication, 1.89%
ACM SIGPLAN Notices - 1 publication, 1.89%
IEEE Transactions on Neural Networks and Learning Systems - 1 publication, 1.89%
IEEE Geoscience and Remote Sensing Letters - 1 publication, 1.89%
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems - 1 publication, 1.89%
IEEE Transactions on Image Processing - 1 publication, 1.89%
IEEE Transactions on Parallel and Distributed Systems - 1 publication, 1.89%
IEEE Circuits and Systems Magazine - 1 publication, 1.89%
IEEE Journal on Emerging and Selected Topics in Circuits and Systems - 1 publication, 1.89%
National Science Review - 1 publication, 1.89%
Biomimetics - 1 publication, 1.89%
Nature Communications - 1 publication, 1.89%
Mesoscience and Nanotechnology - 1 publication, 1.89%

Publishers

Institute of Electrical and Electronics Engineers (IEEE) - 35 publications, 66.04%
Association for Computing Machinery (ACM) - 7 publications, 13.21%
Springer Nature - 4 publications, 7.55%
IOP Publishing - 2 publications, 3.77%
EDP Sciences - 1 publication, 1.89%
Morgan & Claypool Publishers - 1 publication, 1.89%
Oxford University Press - 1 publication, 1.89%
MDPI - 1 publication, 1.89%
Treatise - 1 publication, 1.89%
  • Publications without a DOI are not taken into account.
  • Statistics are calculated only for publications connected to researchers, organizations, and labs registered on the platform.
  • Statistics are recalculated weekly.
