Learning Multiplication

Purpose: This demo shows how to learn a familiar nonlinear function, multiplication.

Comments: The setup here is very similar to the other learning demos. The main difference is that this demo learns a nonlinear projection from a 2D space to a 1D space (i.e., multiplication).
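To see why this requires a nonlinear decoding, note that no fixed weighted sum of the two inputs reproduces their product. The short standalone sketch below (plain Python, not part of the demo script) compares a few candidate weight pairs against the product at the corners of the input space:

corners = [(1, 1), (1, -1), (-1, 1), (-1, -1)]

def linear(x, w1, w2):
    return w1*x[0] + w2*x[1]  #a purely linear read-out of the 2D input

def product(x):
    return x[0]*x[1]          #the target function

for w1, w2 in [(1, 1), (0.5, 0.5), (1, -1)]:
    worst = max(abs(linear(c, w1, w2) - product(c)) for c in corners)
    print(w1, w2, worst)      #every tested weight pair misses at least one corner badly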

Usage: When you run the network, a random white noise input is automatically injected into both dimensions.

Turn learning on: To allow the learning rule to work, you need to move the ‘switch’ to +1.

Monitor the error: When the simulation starts and learning is on, the error is high. After about 10s the network will do a reasonable job of computing the product, and the error should be quite small.

Is it working? To see if the right function is being computed, compare the ‘pre’ and ‘post’ population value graphs. Note that if either dimension of the input is small, the output will be small. Only when both dimensions have larger absolute values does the output move away from zero (see the screen capture below).
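As a quick sanity check of the expected behaviour (plain Python, independent of the simulation), the true product is small whenever either input is small:

def product(x):
    return x[0]*x[1]

print(product([0.1, 0.9]))   #~0.09: one small input keeps the output small
print(product([0.8, -0.9]))  #-0.72: both inputs are large, so the output is well away from zero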

Output: See the screen capture below.

[Screen capture: ../_images/learn-product.png]
Code:
N=60
D=2

import nef
import nef.templates.learned_termination as learning
import nef.templates.gate as gating
import random

random.seed(37)

net=nef.Network('Learn Product') #Create the network object

# Create input and output populations.
net.make('pre',N,D) #Make a population with 60 neurons, 2 dimensions
net.make('post',N,1) #Make a population with 60 neurons, 1 dimension

# Create a random function input.
net.make_fourier_input('input', dimensions = D, base=0.1, high=8, power=0.4, seed=0)
               #Create a white noise input function with .1 base freq, max
               #freq 8 rad/s, and RMS of .4; 0 is a seed

net.connect('input','pre')

# Create a modulated connection between the 'pre' and 'post' ensembles.
learning.make(net,errName='error', N_err=100, preName='pre', postName='post',
    rate=5e-4) #Make an error population with 100 neurons, and a learning 
               #rate of 5e-4

# Set the modulatory signal to compute the desired function
def product(x):
    product=1.0
    for xx in x: product*=xx
    return product

net.connect('pre', 'error', func=product)
net.connect('post', 'error', weight=-1)
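               #Together, these two connections make the 'error' ensemble
               #represent product(pre) - post, which modulates the learned connection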

# Add a gate to turn learning on and off.
net.make_input('switch',[0]) #Create a controllable input function with 
                            #a starting value of 0
gating.make(net,name='Gate', gated='error', neurons=40,
    pstc=0.01) #Make a gate population with 40 neurons, and a postsynaptic 
               #time constant of 10ms
net.connect('switch', 'Gate')

net.add_to_nengo()
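The learned_termination template hides the plasticity rule itself. As a rough conceptual illustration only, and not the rule the template actually applies to the connection weights, the plain-Python sketch below shows how an error-modulated delta rule over a fixed set of nonlinear, neuron-like responses can come to decode a product; the tanh responses, gains, biases, and learning rate here are all illustrative assumptions.

import math
import random

random.seed(37)

N = 60                             #number of basis "neurons" (illustrative)
encoders = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(N)]
biases = [random.uniform(-1, 1) for _ in range(N)]
decoders = [0.0]*N
rate = 2e-3                        #illustrative learning rate

def activities(x):
    #a simple saturating response to the encoded 2D input,
    #standing in for the neurons' tuning curves
    return [math.tanh(2.0*(e[0]*x[0] + e[1]*x[1]) + b)
            for e, b in zip(encoders, biases)]

for step in range(100000):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    a = activities(x)
    estimate = sum(d*ai for d, ai in zip(decoders, a))
    error = x[0]*x[1] - estimate   #same sign convention as the 'error' ensemble
    for i in range(N):
        decoders[i] += rate*error*a[i]  #error-modulated (delta-rule) update

test = (0.7, -0.6)
a = activities(test)
print(sum(d*ai for d, ai in zip(decoders, a)), test[0]*test[1])
                                   #compare the learned estimate with the
                                   #true product (-0.42)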
