Document Type

Conference Proceeding

Publication Date

7-2020

Abstract

Deep Neural Networks (DNNs) have two key deficiencies: their dependence on high-precision computing and their inability to learn sequentially, that is, when a DNN is trained on a first task and then trained on a second task, it forgets the first. This forgetting of previously learned tasks is referred to as catastrophic forgetting. The mammalian brain, by contrast, outperforms DNNs in both energy efficiency and the ability to learn sequentially without catastrophic forgetting. Here, we use bio-inspired Spike Timing Dependent Plasticity (STDP) with instantaneous neurons in the feature extraction layers of the network to extract meaningful features. In the classification layers of the network, we use a modified synaptic intelligence, which we refer to as the cost-per-synapse metric, as a regularizer to immunize the network against catastrophic forgetting in the Single-Incremental-Task (SIT) scenario. In this study, we use the MNIST handwritten digit dataset divided into five sub-tasks.
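
The abstract does not spell out the regularizer itself, only that it modifies synaptic intelligence (Zenke et al., 2017). For orientation, below is a minimal sketch of the standard synaptic-intelligence penalty that family is built on: a per-synapse importance weight is accumulated along the training trajectory and then used to anchor important parameters when the next sub-task is trained. This is not the authors' implementation of the cost-per-synapse metric; all names (`w`, `omega`, `anchor`, `xi`, `c`) and the PyTorch framing are illustrative assumptions.

```python
# Illustrative sketch of a synaptic-intelligence-style regularizer
# (Zenke et al., 2017), the family the paper's "cost per synapse"
# metric modifies. Not the authors' code; all names are placeholders.
import torch
import torch.nn as nn

model = nn.Linear(784, 10)   # stand-in classifier head for MNIST-sized inputs
xi, c = 1e-3, 0.1            # damping term and penalty strength (assumed values)

# Per-parameter running path integral w_k and consolidated importance Omega_k.
w      = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
omega  = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
anchor = {n: p.detach().clone() for n, p in model.named_parameters()}

def penalty():
    """Quadratic surrogate loss that anchors synapses important to old tasks."""
    return c * sum((omega[n] * (p - anchor[n]).pow(2)).sum()
                   for n, p in model.named_parameters())

def accumulate(prev, grads):
    """After each optimizer step: w_k += -g_k * delta_theta_k (path integral).
    `prev` holds parameter values before the step, `grads` the gradients."""
    for n, p in model.named_parameters():
        w[n] += -grads[n] * (p.detach() - prev[n])

def consolidate():
    """At a sub-task boundary: fold w_k into Omega_k, reset anchors and w_k."""
    for n, p in model.named_parameters():
        delta = p.detach() - anchor[n]
        omega[n] += w[n] / (delta.pow(2) + xi)
        anchor[n] = p.detach().clone()
        w[n].zero_()
```

In training, `penalty()` would be added to the task loss for each new sub-task, so synapses that mattered for earlier MNIST sub-tasks resist change while unimportant ones remain free to learn.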

Copyright Statement

This is an author-produced, peer-reviewed version of this article. The final, definitive version of this document can be found online in ICONS 2020: International Conference on Neuromorphic Systems 2020, published by the Association for Computing Machinery. Copyright restrictions may apply. https://doi.org/10.1145/3407197.3407213. The content of this document may vary from the final published version.
