Publication Date
12-2024
Date of Final Oral Examination (Defense)
10-4-2024
Type of Culminating Activity
Dissertation
Degree Title
Doctor of Philosophy in Electrical and Computer Engineering
Department
Electrical and Computer Engineering
Supervisory Committee Chair
Kurtis D. Cantley, Ph.D.
Supervisory Committee Member
Benjamin C. Johnson, Ph.D.
Supervisory Committee Member
Nader Rafla, Ph.D.
Abstract
As Moore's law ends, the conventional von Neumann computer architecture, with its binary-coded data representation and separate computing and memory modules, has reached a bottleneck. Because this architecture processes data sequentially, it requires continuous power, making further efficiency gains difficult. The human brain, by contrast, can be regarded as the most energy-efficient computing architecture. In the brain, data is represented as small voltage pulses called spikes, which is why networks of such neurons are called spiking neural networks (SNNs). An SNN consumes energy only when a spike occurs, making it more energy efficient than the von Neumann architecture. Consequently, researchers have spent recent decades working to mimic the brain's data processing in electronic circuits. In biological SNNs, data propagates from one neuron to another through a synapse: an incoming spike changes the chemical weight of the synapse, and the signal is passed on to the next neuron. In an electronic neural network, a memristor, a two-terminal nonvolatile memory element, can emulate the function of a synapse by changing its conductance when it receives a pulse. The most common learning rule in spiking neural networks is Spike-Timing-Dependent Plasticity (STDP), in which the synaptic weight changes as a function of the time difference between pre- and post-synaptic spikes. Although electronic SNNs comprising memristors with the STDP learning rule have shown promising performance on various event-driven tasks, they remain limited by circuit complexity and by the lack of a third-order parameter in the synapse model. One solution is to use a simple circuit element to modulate the memristor response.
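The STDP rule summarized in the abstract is commonly modeled with exponential timing windows: pre-before-post spike pairs potentiate the synapse and post-before-pre pairs depress it. A minimal sketch of this standard pair-based form follows; the function name and all parameter values are illustrative assumptions, not taken from the dissertation:

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Synaptic weight change for spike-time difference delta_t = t_post - t_pre (ms).

    delta_t > 0 (pre fires before post) -> potentiation (positive change).
    delta_t < 0 (post fires before pre) -> depression (negative change).
    The magnitude decays exponentially with |delta_t|.
    """
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    elif delta_t < 0:
        return -a_minus * math.exp(delta_t / tau_minus)
    return 0.0
```

In a memristive implementation, this weight change would correspond to a conductance update driven by overlapping pre- and post-synaptic voltage pulses.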
DOI
https://doi.org/10.18122/td.2302.boisestate
Recommended Citation
Afrin, Farhana, "Analysis of Learning Mechanisms in Spiking Neural Networks with R(t) Elements and Memristive Synapses" (2024). Boise State University Theses and Dissertations. 2302.
https://doi.org/10.18122/td.2302.boisestate
Comments
ORCID: 0009-0008-3346-6217