Learning to Compute the Square of a Vector

Purpose: This demo shows learning a nonlinear function of a vector.

Comments: The setup here is very similar to the Learning a Communication Channel demo. The main differences are that this demo works in a 2D vector space (instead of a scalar space), and that it learns to compute a nonlinear function (the element-wise square) of its input.

Usage: When you run the network, it automatically has a random white noise input injected into it in both dimensions.
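The injected signal is band-limited white noise. Outside Nengo, a similar signal can be approximated by summing random-phase sinusoids between a base and a cutoff frequency and rescaling to a target RMS; the helper below is a hypothetical stand-in (not Nengo's implementation) mirroring the script's base=0.1, high=8, power=0.4 parameters:

```python
import math
import random

def fourier_noise(base=0.1, high=8.0, rms=0.4, dt=0.001, T=1.0, seed=0):
    """Band-limited noise: random-phase sinusoids at multiples of `base`
    up to `high` Hz, rescaled so the signal's RMS equals `rms`."""
    rng = random.Random(seed)
    freqs = [base * k for k in range(1, int(round(high / base)) + 1)]
    phases = [rng.uniform(0, 2 * math.pi) for _ in freqs]
    n = int(T / dt)
    sig = [sum(math.sin(2 * math.pi * f * i * dt + p)
               for f, p in zip(freqs, phases)) for i in range(n)]
    scale = rms / math.sqrt(sum(s * s for s in sig) / n)
    return [s * scale for s in sig]

sig = fourier_noise()          # one second of noise for one dimension
print(len(sig))                # 1000 samples at dt=0.001
```

In the demo this is done independently for each of the two dimensions.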

Turn learning on: To allow the learning rule to work, you need to move the ‘switch’ to +1.
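Conceptually, the switch controls whether the error signal reaches the learned connection: in Nengo the gate population inhibits the error population, but the effect can be sketched as a simple multiplicative gate (illustrative only):

```python
def gated_error(error, switch):
    """Pass the error vector through only when the switch is on (+1).
    A zeroed error produces no weight updates, so learning stops."""
    return [e if switch >= 1 else 0.0 for e in error]

print(gated_error([0.3, -0.2], 0))   # switch at 0: learning off -> [0.0, 0.0]
print(gated_error([0.3, -0.2], 1))   # switch at +1: learning on -> [0.3, -0.2]
```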

Monitor the error: When the simulation starts and learning is on, the error is high. The average error slowly begins to decrease as the simulation continues. After 15s or so of simulation, it will do a reasonable job of computing the square, and the error in both dimensions should be quite small.

Is it working? To see whether the right function is being computed, compare the ‘pre’ and ‘post’ population value graphs. The ‘post’ output should look roughly like an absolute value of ‘pre’, though a bit squashed, since x² stays at or below |x| on the represented range. You can also check that, in each dimension, both graphs hit zero at about the same time.
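A quick plain-Python check of why ‘post’ resembles a squashed absolute value: on the [-1, 1] range the ensembles represent, x² never exceeds |x|, with equality only at 0 and ±1:

```python
samples = [-1.0, -0.5, -0.1, 0.0, 0.1, 0.5, 1.0]

for x in samples:
    sq, ab = x * x, abs(x)
    # x^2 <= |x| everywhere on [-1, 1]
    assert sq <= ab
    print(f"x={x:+.1f}  x^2={sq:.2f}  |x|={ab:.2f}")
```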

Output: See the screen capture below.

../_images/learn-square.png
Code:
N=60
D=2

import nef
import nef.templates.learned_termination as learning
import nef.templates.gate as gating
import random

random.seed(27)

net=nef.Network('Learn Square') #Create the network object

# Create input and output populations.
net.make('pre',N,D) #Make a population with 60 neurons in 2 dimensions
net.make('post',N,D) #Make a population with 60 neurons in 2 dimensions

# Create a random function input.
net.make_fourier_input('input', dimensions = D, base=0.1, high=8, power=0.4, seed=0)
               #Create a white noise input with a base frequency of 
               #.1 Hz, a maximum frequency of 8 Hz, and an RMS of .4; 
               #0 is the random number seed

net.connect('input','pre')

# Create a modulated connection between the 'pre' and 'post' ensembles.
learning.make(net,errName='error', N_err=100, preName='pre', postName='post',
    rate=5e-4) #Make an error population with 100 neurons, and a learning 
            #rate of 5e-4

# Set the modulatory signal to compute the desired function
def square(x):
    return [xx*xx for xx in x]

net.connect('pre', 'error', func=square)
net.connect('post', 'error', weight=-1)

# Add a gate to turn learning on and off.
net.make_input('switch',[0]) #Create a controllable one-dimensional input 
                             #function with a starting value of 0
gating.make(net,name='Gate', gated='error', neurons=40,
    pstc=0.01) #Make a gate population with 40 neurons, and a postsynaptic 
               #time constant of 10ms
net.connect('switch', 'Gate')

net.add_to_nengo()
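The learned termination adjusts the ‘pre’→‘post’ connection using the error signal, i.e. square(pre) minus the actual ‘post’ output. The same idea can be sketched outside Nengo as a delta-rule regression that learns the element-wise square of a 2-D input online; this is only a caricature of the neural learning rule, and the tanh basis, basis size, and learning rate below are invented for the sketch:

```python
import math
import random

random.seed(27)
D = 2          # dimensions, as in the demo
N = 60         # number of basis functions (a stand-in for the 60 neurons)
rate = 1e-2    # learning rate chosen for this sketch (not the demo's 5e-4)

# Fixed random saturating basis, a crude stand-in for neural tuning curves
enc = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(N)]
bias = [random.uniform(-1, 1) for _ in range(N)]
W = [[0.0] * N for _ in range(D)]  # decoding weights, learned online

def features(x):
    return [math.tanh(3 * sum(e[d] * x[d] for d in range(D)) + b)
            for e, b in zip(enc, bias)]

def post(x):
    a = features(x)
    return [sum(W[d][j] * a[j] for j in range(N)) for d in range(D)]

for _ in range(20000):
    x = [random.uniform(-1, 1) for _ in range(D)]    # noisy 'pre' sample
    a = features(x)
    y = [sum(W[d][j] * a[j] for j in range(N)) for d in range(D)]
    err = [x[d] * x[d] - y[d] for d in range(D)]     # square(pre) - post
    for d in range(D):
        for j in range(N):
            W[d][j] += rate * err[d] * a[j]          # delta-rule update

print(post([0.5, -0.7]))   # approximates the element-wise square
```

As in the demo, the error starts large and shrinks as training proceeds; after enough samples the decoded output tracks the element-wise square reasonably well.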

Nengo User Manual