Purpose: This demo uses the basal ganglia model to cycle through a 5-element sequence.
Comments: In this demo the basal ganglia is connected to a working memory, allowing it to update that memory according to its current input/action mappings. The mappings are defined in the code such that A->B, B->C, and so on up to E->A, completing a loop. The demo uses the ‘spa’ module from Nengo.
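The cyclic structure of those mappings can be sketched in plain Python (this is illustrative only, not Nengo code; the dictionary and variable names are our own):

```python
# The rules define a closed loop over five states: each state maps to the next,
# and E wraps back around to A.
mapping = {'A': 'B', 'B': 'C', 'C': 'D', 'D': 'E', 'E': 'A'}

state = 'D'          # the demo starts the model in state D
trace = [state]
for _ in range(5):   # five transitions return us to the starting state
    state = mapping[state]
    trace.append(state)

print(trace)  # ['D', 'E', 'A', 'B', 'C', 'D']
```

Because the mapping is a single closed cycle, the model never reaches a terminal state and the sequence repeats indefinitely.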
The ‘utility’ graph shows the utility of each rule going into the basal ganglia. The ‘rule’ graph shows which rule has been selected and is driving the thalamus.
Usage: When you run the network, it cycles through the sequence indefinitely. It is interesting to note the distance between the ‘peaks’ of the successively selected items: it is about 40 ms for this simple action, a timing we like to make a big deal of.
Output: See the screen capture below.
from spa import *

D=16

class Rules:  #Define the rules by specifying the start state and the
              #desired next state
    def A(state='A'):   #e.g. If in state A
        set(state='B')  #     then go to state B
    def B(state='B'):
        set(state='C')
    def C(state='C'):
        set(state='D')
    def D(state='D'):
        set(state='E')
    def E(state='E'):
        set(state='A')

class Sequence(SPA):  #Define an SPA model (cortex, basal ganglia, thalamus)
    dimensions=16

    state=Buffer()  #Create a working memory (recurrent network) object:
                    #i.e. a Buffer

    BG=BasalGanglia(Rules())  #Create a basal ganglia with the prespecified
                              #set of rules

    thal=Thalamus(BG)  #Create a thalamus for that basal ganglia (so it
                       #uses the same rules)

    input=Input(0.1,state='D')  #Define an input; set the input to
                                #state D for 100 ms

seq=Sequence()
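The basal ganglia/thalamus loop above amounts to utility-based action selection: each rule's utility measures how well its condition matches the current state, and the winning rule drives the next state. A schematic, non-neural sketch of that loop (our own simplification; in the neural model the utilities are graded similarity values, not exact 0/1 matches):

```python
def step(state, rules):
    # Idealized utility: 1.0 if the rule's condition matches the current
    # state, else 0.0. The basal ganglia then performs winner-take-all
    # selection over these utilities.
    utilities = {cond: (1.0 if cond == state else 0.0) for cond in rules}
    winner = max(utilities, key=utilities.get)  # the selected rule
    return rules[winner]  # the thalamus routes the rule's effect: next state

# Condition -> effect for each rule, matching the demo's A->B ... E->A loop.
rules = {'A': 'B', 'B': 'C', 'C': 'D', 'D': 'E', 'E': 'A'}

s = 'A'
for _ in range(3):
    s = step(s, rules)
print(s)  # 'D'
```

In the neural implementation each such selection takes roughly 40 ms, which is what produces the spacing between peaks in the ‘rule’ graph.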