Neuromorphic computers mimic how the brain works. This simplified overview explains the four key components - neurons, spike generators, artificial synapses, and learning mechanisms - and how they can be implemented electronically.
The Human Brain
The human brain contains billions of neurons. Without going deep into the science, here’s a much-simplified overview of how it works. Each neuron is a specialized cell that receives inputs from other neurons, integrates (adds up) those inputs and, if the resulting signal reaches a threshold, fires an electrical pulse (a spike).
Neurons have three main parts, each with different functions, but I am ignoring that for simplicity. Synapses connect neurons and either excite or inhibit the next neuron. Frequent activity can strengthen a synapse, while inactivity can weaken or remove it. These ongoing changes, helped by supporting glial cells (again ignored for simplicity), underlie learning and memory.
Neuromorphic Computer Systems
Neuromorphic computer systems attempt to mimic how the brain processes information. The key elements are:
Electronic Neurons
Spike Generators
Artificial Synapses
Learning Mechanisms
What follows is a (much simplified) top-level view of how each element can be implemented electronically.
Electronic Neurons
The electronic neuron consists of circuitry designed to replicate the biological neuron's ability to integrate input signals, which arrive via its connected synapses (see below). The circuitry can be analogue, digital or (more often) a mix of both.
Analogue variants incorporate capacitors (the input signal voltages accumulate across the capacitor) or transconductance amplifiers (which convert an input voltage to an output current).
Digital versions tend to use digital counters, shift registers, or arithmetic circuits that sum discrete input events. They are easier to implement (and scale) than analogue variants.
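As a much-simplified sketch, the digital approach amounts to summing discrete input events in an accumulator. The Python model below is purely illustrative (the class name and event values are my own, not any particular chip's design):

```python
# Illustrative model of digital input integration: each incoming spike
# event adds a discrete amount to an accumulator, playing the role the
# capacitor plays in the analogue variant.
class DigitalIntegrator:
    def __init__(self):
        self.accumulator = 0

    def receive(self, event_weight):
        # Sum discrete input events (counter-style integration).
        self.accumulator += event_weight

integrator = DigitalIntegrator()
for weight in [3, 1, 2]:  # three input events arrive
    integrator.receive(weight)
print(integrator.accumulator)  # 6
```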
Spike Generators
Spike generation involves input integration, leakage, and threshold-triggered firing. These spikes encode information in a biologically realistic, energy-efficient manner.
Inputs are accumulated over time, usually in a capacitor or digital counter (see above). A spike occurs when the input current or voltage reaches a defined threshold. The accumulated input signal can gradually decay if not reinforced, mimicking the biological neuron's leaky integration.
After each firing (in the simpler configurations), the neuron resets. The threshold can be preset (hard-wired) but is more often configurable: such a system uses registers or control memory to tune the threshold during training. In some systems, the threshold adapts based on recent activity (firing-rate control).
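Putting integration, leakage, threshold and reset together gives the classic leaky integrate-and-fire model. The sketch below is a minimal, assumed version (the threshold and leak values are arbitrary illustrations, not taken from real hardware):

```python
class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron (illustrative values)."""
    def __init__(self, threshold=4.0, leak=0.9):
        self.threshold = threshold  # configurable firing threshold
        self.leak = leak            # per-step decay: un-reinforced input fades
        self.potential = 0.0        # accumulated input

    def step(self, input_v):
        # Leak a little, then integrate the new input.
        self.potential = self.potential * self.leak + input_v
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # spike emitted
        return False

neuron = LIFNeuron()
spikes = [neuron.step(v) for v in [1.0, 1.0, 1.0, 1.0, 1.0, 0.0]]
print(spikes)  # fires on the fifth input, then resets
```

With a steady 1 V input, the leak means the potential approaches the threshold gradually rather than in four equal steps, which is exactly the "decay if not reinforced" behaviour described above.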
Electronic neurons, spike generators and synapses (see below) are often integrated on a single chip.
Artificial Synapses
Artificial synapses modulate signal transmission between electronic neurons, usually by adjusting a synaptic weight that determines the strength of the connection. They transmit a spike, the magnitude of which is moderated by the weight.
The weights are stored as analogue values or (more often) in digital memory. Synapses can adapt over time, adjusting weights through learning mechanisms (see below).
Analogue synapses tend to use memristors, although phase-change materials and floating-gate MOS transistors are other options. Weight changes occur continuously, closely mimicking the brain's synaptic activity. The weight is represented by the conductance of a memristor or the charge stored on a floating gate.
Digital synapses store weights in memory (non-volatile, SRAM or DRAM). The weights are adjusted via learning algorithms. Mixed approaches (analogue and digital combined) are common as they deliver accuracy and energy efficiency, balancing complexity with robustness.
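For the digital case, a synapse can be sketched as a stored weight that scales each spike it transmits, with an adjustment hook for the learning mechanism. The class, the 0 to 1 weight range, and the values are my own illustration:

```python
class DigitalSynapse:
    """Weight held in memory; transmitted spikes are scaled by it."""
    def __init__(self, weight=0.4):
        self.weight = weight

    def transmit(self, spike_v):
        # The output spike's magnitude is moderated by the stored weight.
        return spike_v * self.weight

    def adjust(self, delta):
        # A learning mechanism nudges the weight, clamped to [0, 1].
        self.weight = min(1.0, max(0.0, self.weight + delta))

synapse = DigitalSynapse(weight=0.4)
out_v = synapse.transmit(2.0)  # a 2 V spike arrives as 0.8 V
```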
Learning Mechanisms
Feedback mechanisms in neuromorphic systems allow networks to learn and adapt dynamically rather than being hard-wired. This is where things start to get complicated.
To summarise with a much-simplified (analogue) example: Neuron1 receives voltage spikes from its upstream neurons through synapses. Each synapse scales the incoming spike according to its weight, and these weighted pulses are summed at Neuron1. When the threshold is reached, Neuron1 fires a spike.
That spike is at a set level (let's say 2V) and is transmitted down every outgoing synapse. The outgoing synapses connect to many neurons; let's take just one, which connects to Neuron2. The synapse modifies the level of the spike according to its weight. In this example, it reduces the spike to 0.8V, which arrives at Neuron2 to be summed with the contributions of all other incoming synapses.
Now, the learning (or what mimics learning) part. Let's say that, to generate a spike, a threshold of 4V has to be reached at Neuron2. Fifty synapses connect to Neuron2, and at one moment in time the voltages from forty-nine of them sum to 3.2V. Synapse fifty then contributes its 0.8V, the threshold is reached, and the spike is generated. Because synapse fifty delivered its spike just before Neuron2 fired, the system interprets this as a meaningful contribution. The exact voltages matter less than the relative spike timing.
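The arithmetic of that moment can be written out directly, using the same values as the example above:

```python
threshold_v = 4.0          # Neuron2's firing threshold
other_synapses_v = 3.2     # summed contribution of synapses 1-49
synapse_fifty_v = 0.8      # spike arriving through synapse 50

# Synapse fifty's contribution tips Neuron2 over its threshold.
fires = (other_synapses_v + synapse_fifty_v) >= threshold_v
print(fires)  # True
```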
As a result of this spike timing, a voltage is applied across synapse fifty, which increases its weight (spike-timing-dependent plasticity, or STDP). When Neuron2 fires, that signal is routed back to its input synapses. Each synapse therefore locally sees both the pre-synaptic and post-synaptic spikes and uses their relative timing to determine whether, and by how much, to adjust its weight.
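A common way to write this down is the pair-based STDP rule: strengthen the synapse when the pre-synaptic spike precedes the post-synaptic one, weaken it when it follows. The function below is a generic textbook form with assumed learning-rate and time-constant values, not any particular chip's rule:

```python
import math

def stdp_update(weight, pre_t, post_t, lr=0.05, tau=20.0):
    """Pair-based STDP: adjust weight from relative spike timing (ms)."""
    dt = post_t - pre_t
    if dt > 0:
        # Pre fired just before post: strengthen (potentiation).
        weight += lr * math.exp(-dt / tau)
    elif dt < 0:
        # Pre fired just after post: weaken (depression).
        weight -= lr * math.exp(dt / tau)
    return min(1.0, max(0.0, weight))  # clamp to [0, 1]

# Synapse fifty's spike arrived 1 ms before Neuron2 fired, so its
# weight increases from 0.4.
w = stdp_update(weight=0.4, pre_t=9.0, post_t=10.0)
```

The exponential makes the adjustment largest for spikes that arrive closest together in time, which matches the intuition that synapse fifty's near-simultaneous spike was the meaningful contribution.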
In analogue implementations, the memristor's resistance or the charge on a floating gate changes. Digital implementations use hardware and software to log spike events and update the stored weights accordingly.
Neuromorphic systems attempt to mimic the brain's efficient, adaptive processing and are fundamentally different from traditional computing. The above is intended as a much-simplified overview of how a neuromorphic system works. Future posts will consider each element in more detail, including the current status of the technology.