Initial commit of the NeuroPack module
commit 74814be (0 parents), showing 19 changed files with 3,505 additions and 0 deletions.
@@ -0,0 +1,26 @@
.*.swp
*.pyc
*.tmp
build/*
dist/*
dist.win32/*
*.1.gz
*.8.gz
*.1
*.8
man/*.1.gz
man/*.8.gz
man/*.1
man/*.8
*.mo
*~
*.rej
*.orig
#*#
MANIFEST
tags
*.egg-info
deps/*
nn*.py
__pycache__/*
@@ -0,0 +1,63 @@
NeuroPack core library
======================

Place your NeuroPack cores in this folder. All Python modules whose
filename starts with "core_" will be regarded as a distinct core by ArC
ONE. You can also have additional modules if you so require (for
example, common code shared by different cores); as long as they are not
prefixed with "core_" they will be ignored by ArC ONE.
|
||
A core file consists of the three functions | ||
|
||
* The `init` function is initial setup of the network before any operation is | ||
done if required. For example you may want to introduce new fields on the | ||
network object or its state for subsequent use. | ||
* The `neurons` function implements the "evolution" of the network over each | ||
time step. No training is done in this step. | ||
* The `plast` (plasticity) function implements the "learning" capacity of the | ||
network for each time step. | ||
|
||
If the core produces additional data that need to be saved at the end of the | ||
execution an additional function must be implemented `additional_data` which | ||
returns a dictionary with the parameters that need to be saved in the output | ||
file. | ||
|
||
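The skeleton below is illustrative only (the module name and the extra
state field are hypothetical, not part of this commit):

    # core_skeleton.py -- hypothetical minimal core; the "core_" prefix
    # is what makes ArC ONE pick the module up.

    def init(net):
        # Optional one-off setup, e.g. attach extra fields for later use.
        net.state.custom_counter = 0

    def neurons(net, step):
        # Evolve the network by one timestep; no training happens here.
        pass

    def plast(net, step):
        # Apply plasticity ("learning") for this timestep.
        pass

    def additional_data(net):
        # Only needed if the core produces extra data to be saved.
        return {'custom_counter': net.state.custom_counter}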
The network
-----------

The common argument for all functions is the network itself (`net`).
The network should have the following fields defined:

* `LTP_V` and `LTP_pw`: Voltage and pulse width for potentiation
* `LTD_V` and `LTD_pw`: Voltage and pulse width for depression
* `epochs`: The total number of timesteps
* `NETSIZE`: The total number of neurons
* `LTPWIN` and `LTDWIN`: Windows for LTP and LTD
* `DEPTH`: Depth of the network
* `rawin`: The raw state of all neurons
* `stimin`: The stimulus input (see NeuroData/motif_stim.txt for an
  example)
* `ConnMat`: The connectivity matrix (see NeuroData/motif_connmat.txt
  for an example)
* `params`: A dict containing any user-defined parameters from the base
  configuration, excluding `NETSIZE`, `LTPWIN`, `LTDWIN` and `DEPTH`,
  which must *always* be defined. By using an alternate base
  configuration file (see NeuroData/Neurobase.json for the base
  configuration) additional parameters can be introduced; they will be
  available under the `params` dict.
* `state`: The internal state of the network (see below). Usually the
  state of the network is what is altered during the `neurons` and
  `plast` steps.

Network state
-------------

The network object has a `state` field defined. This variable describes
the current status of the network and has the following fields defined:

* `weights`: The weights of the neurons for all epochs. This should be
  altered during the plasticity step, as it is inherent to training the
  network.
* `NeurAccum`: Membrane capacitance of the network. Should be updated
  during the `neurons` step.
* `fireCells`: The neurons that should fire during the plasticity step.
  This is introduced from the stimulus file. `fireCells` should be
  updated during the `neurons` step.
* `fireHist`: History of firing neurons. It should be updated during
  the `neurons` step.
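For orientation, the sketch below shows how these state arrays appear
to be shaped, judging from their usage in the LIF core included in this
commit; the shapes are inferences, not a documented part of the
NeuroPack API:

    import numpy as np

    # Inferred shapes (assumptions): weights are indexed
    # [pre, post, epoch], NeurAccum and fireCells hold one row per
    # epoch, and fireHist keeps DEPTH+1 firing-time entries per neuron.
    NETSIZE, DEPTH, epochs = 7, 1, 100
    weights = np.zeros((NETSIZE, NETSIZE, epochs))
    NeurAccum = np.zeros((epochs, NETSIZE))
    fireCells = np.zeros((epochs, NETSIZE), dtype=int)
    fireHist = np.zeros((DEPTH + 1, NETSIZE))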
@@ -0,0 +1 @@
# NeuroPack core entry point
@@ -0,0 +1,190 @@
import numpy as np

# A NeuroPack core implements plasticity events and updates.
# Essentially NeuroPack will call the following functions
#
#   init(network)
#   for trial in trials:
#       neurons(network, step)
#       plast(network, step)

# This particular core implements a LIF network; the neurons function
# calculates which neurons are set to fire and the plast function
# implements plasticity event propagation.

# neurons and plast both take two arguments. The first one is the
# network itself (see `NeuroPack.Network`) and the second is the
# timestep.

# This core requires `NeuroData/SevenMotif.json`.

def normalise_weight(net, w):
    PCEIL = 1.0/net.params['PFLOOR']
    PFLOOR = 1.0/net.params['PCEIL']

    val = net.params['WEIGHTSCALE']*(float(w) - PFLOOR)/(PCEIL - PFLOOR)

    # Clamp weights in-between 0.0 and 1.0
    if val < 0.0:
        return 0.0
    elif val > 1.0:
        return 1.0
    else:
        return val
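# A quick sanity check of normalise_weight (hypothetical numbers, using
# the PFLOOR=4000, PCEIL=9300, WEIGHTSCALE=1.0 values from the
# seven-neuron configuration in this commit). Weights are conductances
# (1/R), which is why PFLOOR and PCEIL are inverted above:
#
#   normalise_weight(net, 1.0/5000)   # ~0.65, inside the window
#   normalise_weight(net, 1.0/10000)  # below the conductance floor -> 0.0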

def init(net):
    # Renormalise weights if needed
    if not net.params.get('NORMALISE', False):
        return

    for postidx in range(len(net.ConnMat)):
        # For every presynaptic input the neuron receives.
        for preidx in np.where(net.ConnMat[:, postidx, 0] != 0)[0]:
            old_weight = net.state.weights[preidx, postidx, 0]
            new_weight = normalise_weight(net, old_weight)
            net.state.weights[preidx, postidx, 0] = new_weight

def neurons(net, time):

    rawin = net.rawin             # Raw input
    stimin = net.stimin[:, time]  # Stimulus input for current timestep

    full_stim = np.bitwise_or([int(x) for x in rawin], [int(x) for x in stimin])
    net.log("**** FULL_STIM = ", full_stim)

    if time > 0:
        # if this isn't the first step copy the accumulators
        # from the previous step onto the new one
        net.state.NeurAccum[time] = net.state.NeurAccum[time-1]

        # reset the accumulators of neurons that have already fired
        for (idx, v) in enumerate(full_stim):
            if v != 0:
                net.state.NeurAccum[time][idx] = 0.0

    # For this example we'll make I&F neurons - if changing this file a
    # back-up is strongly recommended before proceeding.

    # -FIX- implementing 'memory' between calls to this function.
    # NeurAccum = len(net.ConnMat)*[0]  # Define neuron accumulators.

    # Neurons that, unless otherwise dictated to by the network or by
    # external input, will fire.
    wantToFire = len(net.ConnMat)*[0]

    # Gather/define other data pertinent to the function of the neuron.
    leakage = net.params.get('LEAKAGE', 1.0)
    bias = np.array(len(net.ConnMat)*[leakage])  # No active biases.

    # STAGE I: See what neurons do 'freely', i.e. without the constraints
    # of WTA or generally other neurons' activities.
    for postidx in range(len(net.state.NeurAccum[time])):
        # Unconditionally add the bias term
        net.state.NeurAccum[time][postidx] += bias[postidx]
        if net.state.NeurAccum[time][postidx] < 0.0:
            net.state.NeurAccum[time][postidx] = 0.0

        # For every presynaptic input the neuron receives.
        for preidx in np.where(net.ConnMat[:, postidx, 0] != 0)[0]:

            # Excitatory case
            if net.ConnMat[preidx, postidx, 2] > 0:
                # net.log("Excitatory at %d %d" % (preidx, postidx))
                # Accumulator increases as per standard formula.
                net.state.NeurAccum[time][postidx] += \
                    full_stim[preidx] * net.state.weights[preidx, postidx, time]

                net.log("POST=%d PRE=%d NeurAccum=%g full_stim=%g weight=%g" %
                        (postidx, preidx, net.state.NeurAccum[time][postidx],
                         full_stim[preidx],
                         net.state.weights[preidx, postidx, time]))

            # Inhibitory case
            elif net.ConnMat[preidx, postidx, 2] < 0:
                # Accumulator decreases as per standard formula.
                net.state.NeurAccum[time][postidx] -= \
                    full_stim[preidx]*net.state.weights[preidx, postidx, time]

    # Have neurons declare 'interest to fire'.
    for neuron in range(len(net.state.NeurAccum[time])):
        if net.state.NeurAccum[time][neuron] > net.params.get('FIRETH', 0.8):
            # Register 'interest to fire'.
            wantToFire[neuron] = 1

    # STAGE II: Implement constraints from net-level considerations.
    # Example: WTA. No restrictions from the net level yet. All neurons
    # that want to fire will fire.
    net.state.firingCells = wantToFire

    # Barrel-shift the firing history.
    net.state.fireHist[:-1, np.where(np.array(full_stim) != 0)[0]] = \
        net.state.fireHist[1:, np.where(np.array(full_stim) != 0)[0]]
    # Save last firing time for all cells that fired in this time step.
    net.state.fireHist[net.DEPTH, np.where(np.array(full_stim) != 0)[0]] = \
        time

    # Load 'NN'.
    net.state.fireCells[time] = full_stim

def plast(net, time):

    if time+2 > net.epochs:
        return

    rawin = net.rawin             # Raw input
    stimin = net.stimin[:, time]  # Stimulus input

    full_stim = np.bitwise_or([int(x) for x in rawin], [int(x) for x in stimin])

    net.state.weights[:, :, time+1] = net.state.weights[:, :, time]

    # For every neuron in the raw input
    for neuron in range(len(full_stim)):

        # If the neuron is not set to fire (full_stim[neuron] == 0)
        # just skip it
        if full_stim[neuron] == 0:
            continue

        # For every presynaptic input the neuron receives.
        for preidx in np.where(net.ConnMat[:, neuron, 0] != 0)[0]:
            w, b = net.ConnMat[preidx, neuron, 0:2]
            if (time - np.max(net.state.fireHist[:, preidx])) <= net.LTPWIN:
                # -FIX- parametrise learning step.
                # Actually, link it to bias for devices.
                p = 1.0/net.pulse(w, b, net.LTP_V, net.LTP_pw)
                if net.params.get('NORMALISE', False):
                    net.state.weights[preidx, neuron, time+1] = \
                        normalise_weight(net, p)
                else:
                    net.state.weights[preidx, neuron, time+1] = p
                net.log(" LTP --- spiking synapse %d -- %d" % (preidx, neuron))

        # For every postsynaptic neuron this neuron projects to.
        for postidx in np.where(net.ConnMat[neuron, :, 0] != 0)[0]:
            w, b = net.ConnMat[neuron, postidx, 0:2]
            if (time - np.max(net.state.fireHist[:, postidx])) <= net.LTDWIN:
                # -FIX- parametrise learning step.
                # Actually, link it to bias for devices.
                p = 1.0/net.pulse(w, b, net.LTD_V, net.LTD_pw)
                if net.params.get('NORMALISE', False):
                    net.state.weights[neuron, postidx, time+1] = \
                        normalise_weight(net, p)
                else:
                    net.state.weights[neuron, postidx, time+1] = p
                net.log(" LTD --- spiking synapse %d -- %d" % (neuron, postidx))

    # For every valid connection between neurons, find out which the
    # corresponding memristor is. Then, if the weight is still
    # uninitialised, take a reading and ensure that the weight has a
    # proper value.
    for preidx in range(len(rawin)):
        for postidx in range(len(rawin)):
            if net.ConnMat[preidx, postidx, 0] != 0:
                w, b = net.ConnMat[preidx, postidx, 0:2]
                if net.state.weights[preidx, postidx, time] == 0.0:
                    net.state.weights[preidx, postidx, time] = \
                        1.0/net.read(w, b, "NN")


def additional_data(net):
    # This function should return any additional data that might be
    # produced by this core. In this particular case there is none.
    return None
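The header comment of this core sketches how NeuroPack drives it: init
once, then neurons and plast per timestep. The harness below exercises
that sequence with a mock network object; MockNet, its parameter values
and its fake pulse/read responses are assumptions for illustration
only, not part of NeuroPack or ArC ONE:

    import numpy as np

    class _State(object):
        pass

    class MockNet(object):
        # Tiny stand-in for NeuroPack's Network object (hypothetical;
        # the real class ships with NeuroPack/ArC ONE).
        def __init__(self, netsize=3, epochs=5, depth=1):
            self.params = {'LEAKAGE': -0.2, 'FIRETH': 0.8}
            self.NETSIZE, self.DEPTH, self.epochs = netsize, depth, epochs
            self.LTPWIN = self.LTDWIN = 1
            self.LTP_V, self.LTP_pw = 1.0, 1e-6
            self.LTD_V, self.LTD_pw = -1.0, 1e-6
            self.rawin = netsize*[0]
            self.stimin = np.zeros((netsize, epochs), dtype=int)
            # ConnMat[pre, post] = (W, B, TYPE); all-to-all, excitatory.
            self.ConnMat = np.zeros((netsize, netsize, 3), dtype=int)
            for i in range(netsize):
                for j in range(netsize):
                    if i != j:
                        self.ConnMat[i, j] = (i+1, j+1, 1)
            self.state = _State()
            self.state.weights = np.full((netsize, netsize, epochs), 0.5)
            self.state.NeurAccum = np.zeros((epochs, netsize))
            self.state.fireCells = np.zeros((epochs, netsize), dtype=int)
            self.state.fireHist = np.zeros((depth+1, netsize))

        def log(self, *args):
            print(*args)

        def pulse(self, w, b, voltage, pw):
            return 5000.0  # fake post-pulse device resistance

        def read(self, w, b, tag):
            return 5000.0  # fake device read-out

    net = MockNet()
    net.stimin[0, 1] = 1  # force neuron 0 to spike at timestep 1
    init(net)
    for step in range(net.epochs):
        neurons(net, step)
        plast(net, step)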
@@ -0,0 +1,6 @@
{
    "LTDWIN": 1,
    "LTPWIN": 1,
    "NETSIZE": 5,
    "DEPTH": 1
}
@@ -0,0 +1,13 @@
{
    "LTDWIN": 1,
    "EXTIN": 3,
    "LTPWIN": 1,
    "NETSIZE": 7,
    "DEPTH": 1,
    "PFLOOR": 4000,
    "PCEIL": 9300,
    "WEIGHTSCALE": 1.0,
    "NORMALISE": true,
    "LEAKAGE": -0.2,
    "FIRETH": 0.8
}
@@ -0,0 +1,8 @@
# PREID, POSTID, W, B, TYPE
# TYPE can be either +1 (excitatory) or -1 (inhibitory)
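# (W and B appear to be the wordline/bitline address of the memristive
# device backing each synapse: the core unpacks them with
# `w, b = net.ConnMat[preidx, neuron, 0:2]` and passes them to
# net.pulse()/net.read(). This reading is an inference, not something
# documented in this commit.)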
1, 2, 1, 2, 1
2, 3, 2, 3, 1
3, 1, 3, 1, 1
1, 3, 1, 3, 1
3, 2, 3, 2, 1
2, 1, 2, 1, 1
@@ -0,0 +1,14 @@
# timestamp - comma-separated list of neurons forced to spike at this timestep
11 - 1
12 - 2
13 - 3
14 - 1
15 - 2
16 - 3
17 - 1
18 - 2
19 - 3
20 - 1
21 - 2
22 - 3
23 - 1, 2
@@ -0,0 +1,52 @@
# PREID, POSTID, W, B, TYPE
# TYPE can be either +1 (excitatory) or -1 (inhibitory)
# this file has been autogenerated with
#
# for i in range(1, 8):
#     for j in range(1, 8):
#         if i == j:
#             continue
#         print("%d, %d, %d, %d, 1" % (i, j, i, j))
#
1, 2, 1, 2, 1
1, 3, 1, 3, 1
1, 4, 1, 4, 1
1, 5, 1, 5, 1
1, 6, 1, 6, 1
1, 7, 1, 7, 1
2, 1, 2, 1, 1
2, 3, 2, 3, 1
2, 4, 2, 4, 1
2, 5, 2, 5, 1
2, 6, 2, 6, 1
2, 7, 2, 7, 1
3, 1, 3, 1, 1
3, 2, 3, 2, 1
3, 4, 3, 4, 1
3, 5, 3, 5, 1
3, 6, 3, 6, 1
3, 7, 3, 7, 1
4, 1, 4, 1, 1
4, 2, 4, 2, 1
4, 3, 4, 3, 1
4, 5, 4, 5, 1
4, 6, 4, 6, 1
4, 7, 4, 7, 1
5, 1, 5, 1, 1
5, 2, 5, 2, 1
5, 3, 5, 3, 1
5, 4, 5, 4, 1
5, 6, 5, 6, 1
5, 7, 5, 7, 1
6, 1, 6, 1, 1
6, 2, 6, 2, 1
6, 3, 6, 3, 1
6, 4, 6, 4, 1
6, 5, 6, 5, 1
6, 7, 6, 7, 1
7, 1, 7, 1, 1
7, 2, 7, 2, 1
7, 3, 7, 3, 1
7, 4, 7, 4, 1
7, 5, 7, 5, 1
7, 6, 7, 6, 1