Controlled Integrator

Purpose: This demo implements a controlled one-dimensional neural integrator.

Comments: This is the first example of a controlled dynamic network in the demos. It is the same as the integrator circuit, but we have now introduced a population that explicitly affects how well the circuit ‘remembers’ its past state. The ‘control’ input can be used to change the circuit from a pure communication channel (0) to an integrator (1), with various degrees of low-pass filtering in between.
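To see why the ‘control’ value determines how leaky the circuit is, it helps to look at the idealized (non-neural) dynamics the network approximates. Under the standard NEF mapping, a recurrent connection computing control*a with postsynaptic time constant tau, together with an input scaled by tau, gives roughly da/dt = (control - 1)*a/tau + input. The short NumPy sketch below simulates that equation directly; it is only an illustration, and the variable names and pulse timing are assumptions, not part of the demo script.

import numpy as np

tau = 0.1                 # matches the pstc used in the demo
dt = 0.001
t = np.arange(0, 1.5, dt)
u = np.where((t > 0.2) & (t < 0.3), 5.0, 0.0)  # a 5-unit pulse, like the demo's first input step

def ideal_trace(control):
    a, trace = 0.0, []
    for ui in u:
        # Euler step of da/dt = (control - 1)*a/tau + u
        a += dt * ((control - 1.0) * a / tau + ui)
        trace.append(a)
    return np.array(trace)

pure  = ideal_trace(1.0)   # ramps to 0.5 during the pulse, then holds that value
leaky = ideal_trace(0.7)   # ramps up, then decays away with time constant tau/(1 - 0.7)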

Usage: When you run this demo, it will automatically apply some step functions to the input, so you can see that the output is integrating (i.e. summing over time) the input. You can also provide your own input values. Like the integrator, the circuit is quite sensitive. However, if you reduce the ‘control’ input below 1, it will no longer continuously add its input, but will slowly let that input leak away. It is interesting to rerun the simulation from the start and decrease the ‘control’ input before the automatic input begins, to see the effects of ‘leaky’ integration.
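If you want the ‘leaky’ behaviour from the very start of the simulation, you can also give the ‘control’ input a lower default value in the script itself rather than adjusting it interactively (0.5 below is just an example value):

net.make_input('control',[0.5])  #Start the control signal at 0.5 instead of 1,
                                 #so the circuit leaks from the beginning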

Output: See the screen capture below.

[Screen capture: controlledintegrator.png]
Code:
import nef

net=nef.Network('Controlled Integrator') #Create the network object
net.make_input('input',{0.2:5, 0.3:0, 0.44:-10,
                            0.54:0, 0.8:5, 0.9:0} )  #Create a controllable input 
                                                     #function with a default function
                                                     #that goes to 5 at time 0.2s, to 0 
                                                     #at time 0.3s and so on

net.make_input('control',[1])  #Create a controllable input function
                               #with a starting value of 1
net.make('A',225,2,radius=1.5) #Make a population with 225 neurons, 2 dimensions, and a
                               #larger radius to accommodate large simultaneous inputs
                               
net.connect('input','A',transform=[0.1,0],pstc=0.1) #Connect the input to the first
                                                #dimension of 'A', scaled by 0.1;
                                                #the postsynaptic time constant
                                                #is 100ms
net.connect('control','A',transform=[0,1],pstc=0.1) #Route the control signal into
                                                    #the second dimension of 'A'
def feedback(x):
    return x[0]*x[1]
net.connect('A','A',transform=[1,0],func=feedback,pstc=0.1) #Create the recurrent
                                                        #connection, mapping the
                                                        #1D output of 'feedback'
                                                        #back into the first
                                                        #dimension of the 2D
                                                        #population 'A'
net.add_to_nengo()
