
theta_neuron_network

PURPOSE

THETA_NEURON_NETWORK constructs a theta neuron network object

SYNOPSIS

function ThNN = theta_neuron_network(varargin)

DESCRIPTION

THETA_NEURON_NETWORK constructs a theta neuron network object

Description:
This function constructs a theta neuron network object.  Inputs,
all of which are optional, follow a Name/Value format.
Any invalid input causes -1 to be returned.

Syntax:
ThNN = THETA_NEURON_NETWORK('PropertyName',PropertyValue,...);

Input Parameters:
o SimulationMethod: Possible values are 'Event-Driven' or 'Numerical'.
    'Event-Driven' is the default value.
o NIPS2007Gradient: Boolean flag to indicate that the gradient should be
    calculated by the approximation method in "Temporal Coding using the
    Response Properties of Spiking Neurons" by T. Voegtlin. The default 
    value is zero.
o TimeStep: The time step to use if the simulation method is numerical
    integration.  The default value is 0.02.
o ReferenceTime: If a reference time is assigned and the layer array
    structure format is used, then a neuron with index 1 is
    automatically connected to all neurons in the network and made to fire
    at a time equal to the reference time. Reference time may be an array. The
    default value is 1.  If the reference time is assigned a negative
    number, then no reference neuron is assumed to exist.
o InitialWeightMethod: A cell array indicating the method by which initial
    weights are determined. Possible values are:
      {'Fixed', val} - val can be either a scalar or a matrix
      {'RandomNormal', mean, standard_deviation}
      {'RandomUniform', minimum, maximum}
    The default value is {'RandomNormal', 0.01, 0.001}.
o InitialDelayMethod: A cell array identical in format to the initial
    weight method, but which initializes the delays. The default is
    {'Fixed', 0}.
o StructureFormat: A cell array that determines the structure of the
    network. Possible values are:
      {'LayerArray', [NumNeuronsLayer1 NumNeuronsLayer2 ...]}
      {'ConnectionMatrix', ConnectionMatrix}
      {'ConnectionProbability', Probability, NumNeurons}
      {'GivenStructure', Weights, Delays}
    For the connection matrix, a 1 at index (i,j) indicates that neuron i
    has a directed connection to neuron j. For the connection probability,
    each possible directed connection among NumNeurons neurons is created
    independently with the given probability. For the given structure, the
    weight and delay initialization methods are overridden and the
    weights and delays are directly assigned.  The default value is
    {'LayerArray', [1 1]}.
o Alpha: Scaling constant for all neurons. Alpha may be a scalar or
    an array.  An array of size NumNeurons enables a different value to be
    assigned to each neuron. The default value is 1.
o Io: Baseline current for all neurons. Io may be a scalar or an
    array.  An array of size NumNeurons enables a different value to be
    assigned to each neuron. The default value is -0.005.
o Phases: A cell array in the case where all neurons are assigned the same
    initial phase, or a cell array of cells in the case where each neuron
    is assigned a different initial phase. Each member of the low-level
    array is a cell of size 1x1 or 1x2 that indicates the phase
    initialization method. Possible values are:
      {'PositiveFixedPoint'}
      {'NegativeFixedPoint'}
      {'PositiveFixedPointDelta' Delta}
      {'NegativeFixedPointDelta' Delta}
      {'GivenValue' Value}  
    When Io>0, the default is a GivenValue of 0. When Io<0, the default
    is a PositiveFixedPointDelta of 1e-4.
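For example, the initialization methods above can be combined as in this
minimal sketch (the variable names and numeric values are illustrative,
not the defaults):

```matlab
% Illustrative (non-default) weight and delay initialization choices:
ThNN_a = theta_neuron_network('InitialWeightMethod', {'Fixed', 0.05});
ThNN_b = theta_neuron_network('InitialWeightMethod', {'RandomNormal', 0.02, 0.005});
ThNN_c = theta_neuron_network('InitialWeightMethod', {'RandomUniform', 0, 0.02}, ...
                              'InitialDelayMethod',  {'RandomUniform', 1, 5});
```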

Output Parameter:
o ThNN: A theta neuron network object. This object can be run or trained.

Class Functions:
Type "methods('theta_neuron_network')"

Examples:
>> ThNN1=theta_neuron_network;

>> ThNN2=theta_neuron_network('ReferenceTime',-1, ...
     'StructureFormat', {'LayerArray', [1 1 1]}, 'Phases', ...
     {{'PositiveFixedPoint'},{'NegativeFixedPoint'},{'GivenValue' 1}}, ...
     'Alpha',[1 2 3]);
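The structure can also be supplied directly. The following sketches are
illustrative (CM, ThNN3, and ThNN4 are example names, not part of the
library); the 'ConnectionProbability' format is handled in the source below:

```matlab
>> CM = [0 0 1; 0 0 1; 0 0 0];   % a 1 at (i,j): neuron i drives neuron j
>> ThNN3 = theta_neuron_network('ReferenceTime', -1, ...
     'StructureFormat', {'ConnectionMatrix', CM});

>> ThNN4 = theta_neuron_network('StructureFormat', ...
     {'ConnectionProbability', 0.3, 10});  % each of the 10x10 possible
                                           % connections exists with p=0.3
```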

See also
    theta_neuron
    theta_neuron_network/apply_gradient
    theta_neuron_network/calculate_gradient
    theta_neuron_network/code_data
    theta_neuron_network/display
    theta_neuron_network/get_error
    theta_neuron_network/get_input_neurons
    theta_neuron_network/get_output_neurons
    theta_neuron_network/numerical_gradient
    theta_neuron_network/relate_q_to_k
    theta_neuron_network/run_network
    theta_neuron_network/run_networked_neuron
    theta_neuron_network/subsasgn
    theta_neuron_network/subsref
    theta_neuron_network/train
    theta_neuron_network/train_dataset
    theta_neuron_network/train_epoch

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SUBFUNCTIONS

function ConnectionMatrix = make_connection_matrix(Layers)
function ValMatrix = initialize_connections(ConnectionMatrix, InitialMethod)

SOURCE CODE

0001 function ThNN = theta_neuron_network(varargin)
0002 %THETA_NEURON_NETWORK constructs a theta neuron network object
0003 %
0004 %Description:
0005 %This function constructs a theta neuron network object.  Inputs,
0006 %all of which are optional, follow a Name/Value format.
0007 %Any invalid inputs cause -1 to be returned.
0008 %
0009 %Syntax:
0010 %ThNN = THETA_NEURON_NETWORK('PropertyName',PropertyValue,...);
0011 %
0012 %Input Parameters:
0013 %o SimulationMethod: Possible values are 'Event-Driven' or 'Numerical'.
0014 %    'Event-Driven' is the default value.
0015 %o NIPS2007Gradient: Boolean flag to indicate that the gradient should be
0016 %    calculated by the approximation method in "Temporal Coding using the
0017 %    Response Properties of Spiking Neurons" by T. Voegtlin. The default
0018 %    value is zero.
0019 %o TimeStep: The time step to use if the simulation method is numerical
0020 %    integration.  The default value is 0.02.
0021 %o ReferenceTime: If a reference time is assigned and the layer array
0022 %    structure format is used, then a neuron with index 1 is
0023 %    automatically connected to all neurons in the network and made to fire
0024 %    at a time equal to the reference time. Reference time may be an array. The
0025 %    default value is 1.  If the reference time is assigned a negative
0026 %    number, then no reference neuron is assumed to exist.
0027 %o InitialWeightMethod: A cell array indicating the method by which initial
0028 %    weights are determined. Possible values are:
0029 %      {'Fixed', val} - val can be either a scalar or a matrix
0030 %      {'RandomNormal', mean, standard_deviation}
0031 %      {'RandomUniform', minimum, maximum}
0032 %    The default value is {'RandomNormal', 0.01, 0.001}.
0033 %o InitialDelayMethod: A cell array identical in format to the initial
0034 %    weight method, but which initializes the delays. The default is
0035 %    {'Fixed', 0}.
0036 %o StructureFormat: A cell array that determines the structure of the
0037 %    network. Possible values are:
0038 %      {'LayerArray', [NumNeuronsLayer1 NumNeuronsLayer2 ...]}
0039 %      {'ConnectionMatrix', ConnectionMatrix}
0040 %      {'GivenStructure', Weights, Delays}
0041 %    For the connection matrix, a 1 at index (i,j) indicates that neuron i
0042 %    has a directed connection to neuron j. For given structure, the weight
0043 %    and delay initialization methods are overwritten and instead the
0044 %    weights and delays are directly assigned.  The default value is
0045 %    {'LayerArray' [1 1]}.
0046 %o Alpha: Scaling constant for all neurons. Alpha may be a scalar or
0047 %    an array.  An array of size NumNeurons enables a different value to be
0048 %    assigned to each neuron. The default value is 1.
0049 %o Io: Baseline current for all neurons. Io may be a scalar or an
0050 %    array.  An array of size NumNeurons enables a different value to be
0051 %    assigned to each neuron. The default value is -0.005.
0052 %o Phases: A cell array in the case where all neurons are assigned the same
0053 %    initial phase, or a cell array of cells in the case where each neuron
0054 %    is assigned a different initial phase. Each member of the low-level
0055 %    array is a cell of size 1x2 or 1x1 that indicates the phase
0056 %    initialization method. Possible Values are:
0057 %      {'PositiveFixedPoint'}
0058 %      {'NegativeFixedPoint'}
0059 %      {'PositiveFixedPointDelta' Delta}
0060 %      {'NegativeFixedPointDelta' Delta}
0061 %      {'GivenValue' Value}
0062 %    When Io>0, the default is a GivenValue of 0. When Io<0, the default
0063 %    is a PositiveFixedPointDelta of 1e-4.
0064 %
0065 %Output Parameter:
0066 %o ThNN: A theta neuron network object. This object can be run or trained.
0067 %
0068 %Class Functions:
0069 %Type "methods('theta_neuron_network')"
0070 %
0071 %Examples:
0072 %>> ThNN1=theta_neuron_network;
0073 %
0074 %>> ThNN2=theta_neuron_network('ReferenceTime',-1, ...
0075 %     'StructureFormat', {'LayerArray', [1 1 1]}, 'Phases', ...
0076 %     {{'PositiveFixedPoint'},{'NegativeFixedPoint'},{'GivenValue' 1}}, ...
0077 %     'Alpha',[1 2 3]);
0078 %
0079 %See also
0080 %    theta_neuron
0081 %    theta_neuron_network/apply_gradient
0082 %    theta_neuron_network/calculate_gradient
0083 %    theta_neuron_network/code_data
0084 %    theta_neuron_network/display
0085 %    theta_neuron_network/get_error
0086 %    theta_neuron_network/get_input_neurons
0087 %    theta_neuron_network/get_output_neurons
0088 %    theta_neuron_network/numerical_gradient
0089 %    theta_neuron_network/relate_q_to_k
0090 %    theta_neuron_network/run_network
0091 %    theta_neuron_network/run_networked_neuron
0092 %    theta_neuron_network/subsasgn
0093 %    theta_neuron_network/subsref
0094 %    theta_neuron_network/train
0095 %    theta_neuron_network/train_dataset
0096 %    theta_neuron_network/train_epoch
0097 
0098 
0099 %Copyright (C) 2008 Sam McKennoch <Samuel.McKennoch@loria.fr>
0100 
0101 %Check to see if formatting is correct
0102 if mod(nargin,2)~=0
0103     disp('Error in theta neuron network constructor: There must be an even number of inputs');
0104     ThNN=-1;
0105     return;
0106 end
0107 
0108 %Assign Default Values, Alpha, Io, Phases left unassigned for default values
0109 ThNN.SimulationMethod='Event-Driven';
0110 ThNN.NIPS2007Gradient=0;
0111 ThNN.TimeStep=0.02;
0112 ThNN.ReferenceTime=1; %Assign this to a negative value if don't want to use a reference time and are using layers
0113 ThNN.InitialWeightMethod = {'RandomNormal', 0.01, 0.001};
0114 ThNN.InitialDelayMethod = {'Fixed', 0};
0115 ThNN.StructureFormat = {'LayerArray', [1 1]};
0116 
0117 %Cycle through inputs and assign naively
0118 for j=1:(nargin/2)
0119     switch varargin{2*j-1}
0120         case {'Alpha','Io'}
0121             eval([varargin{2*j-1},'=[',num2str(varargin{2*j}),'];']);
0122         case 'Phases'
0123             if ~iscell(varargin{2*j}{1})
0124                 if length(varargin{2*j})==1
0125                     eval([varargin{2*j-1},'={{''',varargin{2*j}{1},'''}};']);
0126                 else
0127                     eval([varargin{2*j-1},'={{''',varargin{2*j}{1},''',', num2str(varargin{2*j}{2}),'}};']);
0128                 end
0129             else
0130                 Phases=varargin{2*j};
0131             end
0132         case fields(ThNN)
0133             ThNN.(varargin{2*j-1})=varargin{2*j};
0134         otherwise
0135             disp(['Warning: Value ', varargin{2*j-1}, ' is unknown; ignoring...']);
0136     end
0137 end
0138 
0139 switch ThNN.StructureFormat{1}
0140         
0141         %Given Connections By Stating Numbers of Neurons in Each Layer
0142         %Default values used for parameters plus weight and delay values
0143         %'StructureFormat',{'LayerArray',LayerArray} (Alpha, Io, Phases passed as separate Name/Value pairs)
0144     case 'LayerArray' 
0145         Layers = ThNN.StructureFormat{2};
0146         if ThNN.ReferenceTime<0
0147             Layers(1)=Layers(1)-1;
0148         end
0149         ConnectionMatrix=make_connection_matrix(Layers);
0150         if ThNN.ReferenceTime<0 && length(Layers)>2
0151             ConnectionMatrix(1,(Layers(1)+Layers(2)+2):(sum(Layers)+1))=0;
0152         end
0153         ThNN.Weights = initialize_connections(ConnectionMatrix,ThNN.InitialWeightMethod);
0154         ThNN.Delays = initialize_connections(ConnectionMatrix,ThNN.InitialDelayMethod);
0155         
0156         %Given Connections Through a Connection Matrix
0157         %Default values used for parameters plus weight and delay values
0158         %'StructureFormat',{'ConnectionMatrix',ConnectionMatrix}
0159     case 'ConnectionMatrix' 
0160         ConnectionMatrix=ThNN.StructureFormat{2};
0161         ThNN.Weights = initialize_connections(ConnectionMatrix,ThNN.InitialWeightMethod);
0162         ThNN.Delays = initialize_connections(ConnectionMatrix,ThNN.InitialDelayMethod);
0163 
0164         %Given Connections Through a Connection Probability
0165         %Each possible connection is created independently with the given probability
0166         %'StructureFormat',{'ConnectionProbability',ConnectionProbability,NumNeurons}
0167     case 'ConnectionProbability' 
0168         ConnectionProbability=ThNN.StructureFormat{2};
0169         NumNeurons=ThNN.StructureFormat{3};
0170         ConnectionMatrix=rand(NumNeurons)<ConnectionProbability;
0171         ThNN.Weights = initialize_connections(ConnectionMatrix,ThNN.InitialWeightMethod);
0172         ThNN.Delays = initialize_connections(ConnectionMatrix,ThNN.InitialDelayMethod);
0173 
0174         %Explicitly Given Weights and Delays
0175         %Default values used for parameters
0176         %'StructureFormat',{'GivenStructure',Weights,Delays}
0177     case 'GivenStructure' %Explicitly Given Weights and Delays
0178         ThNN.Weights=ThNN.StructureFormat{2};
0179         ThNN.Delays=ThNN.StructureFormat{3};
0180         ConnectionMatrix = ((ThNN.Weights~=0) | (ThNN.Delays~=0));
0181         
0182     otherwise
0183         disp('Error in theta neuron network constructor: StructureFormat not Recognized');
0184         ThNN=-1;
0185         return;
0186 end
0187 
0188 %Placeholder Fields to Be updated by update_structure function (to save on computation later)
0189 ThNN.InputNeurons=0;
0190 ThNN.OutputNeurons=0;
0191 ThNN.RecursionFlag=0;
0192 ThNN.NeuronQueue=[];
0193 ThNN.RelativeInputNeurons={};
0194 ThNN.RelativeOutputNeurons={};
0195 
0196 %Use Neuron Defaults For Alpha, Io, InitialPhase unless specified
0197 ThNN.Neurons=theta_neuron('Size',max(size(ConnectionMatrix,1)));
0198 if exist('Alpha')==1 || exist('Io')==1 || exist('Phases')==1
0199     for j=1:max(size(ConnectionMatrix))
0200         ParamString='';
0201         if exist('Alpha')==1
0202             if isscalar(Alpha) %Matrix or Scalar?
0203                 CurrentAlpha=Alpha;
0204                 ParamString=[ParamString, '''Alpha'', CurrentAlpha'];                
0205             else
0206                 CurrentAlpha=Alpha(j);
0207                 ParamString=[ParamString, '''Alpha'', CurrentAlpha'];
0208             end
0209             if exist('Io')==1 || exist('Phases')==1
0210                 ParamString=[ParamString, ','];
0211             end
0212         end
0213         if exist('Io')==1
0214             if isscalar(Io) %Matrix or Scalar?
0215                 CurrentIo=Io;
0216                 ParamString=[ParamString, '''Io'', CurrentIo'];
0217             else
0218                 CurrentIo=Io(j);
0219                 ParamString=[ParamString, '''Io'', CurrentIo'];
0220             end
0221             if exist('Phases')==1
0222                 ParamString=[ParamString, ','];
0223             end            
0224         end
0225         if exist('Phases')==1
0226             if length(Phases)==1 %Matrix or Scalar?
0227                 CurrentPhase=Phases{1};
0228                 ParamString=[ParamString, '''InitialPhaseMethod'', CurrentPhase'];
0229             else
0230                 CurrentPhase=Phases{j};
0231                 ParamString=[ParamString, '''InitialPhaseMethod'', CurrentPhase'];                
0232             end
0233         end
0234         eval(['ThNN.Neurons(j)=theta_neuron(',ParamString,');']);
0235     end
0236 end
0237 ThNN = class(ThNN,'theta_neuron_network');
0238 
0239 ThNN=update_structure(ThNN);
0240 
0241 function ConnectionMatrix=make_connection_matrix(Layers)
0242 
0243 ConnectionMatrix=zeros(sum(Layers)+1); %+1 for the reference neuron
0244 ConnectionMatrix(1,((Layers(1)+2):(sum(Layers)+1)))=1;
0245 for m=1:(length(Layers)-1)
0246 %    CurrentLayer=[2 Layers(1)+1];
0247 %    NextLayer=[Layers(1)+2 sum(Layers(1:2))+1];
0248     CurrentLayer=[sum(Layers(1:(m-1)))+2 sum(Layers(1:m))+1];
0249     NextLayer=[sum(Layers(1:m))+2 sum(Layers(1:(m+1)))+1];
0250     for j=CurrentLayer(1):CurrentLayer(2)
0251         for k=NextLayer(1):NextLayer(2)
0252             ConnectionMatrix(j,k)=1;
0253         end
0254     end
0255 end
0256 
0257 function ValMatrix = initialize_connections(ConnectionMatrix,InitialMethod)
0258 %InitialMethods
0259 %  {'Fixed', val}
0260 %  {'RandomNormal', mean, std}
0261 %  {'RandomUniform', min, max}
0262 
0263 
0264 switch InitialMethod{1}
0265     case 'Fixed' %Can handle scalar or matrix
0266             ValMatrix = ConnectionMatrix.*InitialMethod{2};            
0267     case 'RandomNormal'
0268         ValMatrix = ConnectionMatrix.*((InitialMethod{3}*randn(size(ConnectionMatrix)))+InitialMethod{2});
0269     case 'RandomUniform'
0270         ValMatrix = ConnectionMatrix.*((rand(size(ConnectionMatrix))*(InitialMethod{3}-InitialMethod{2}))+InitialMethod{2});
0271 end
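To make the two subfunctions above concrete, here is a hedged walk-through
(assuming the reference neuron is retained; the variable names and the
RandomNormal parameters are illustrative): with Layers = [1 2],
make_connection_matrix returns a 4x4 matrix in which the reference neuron
(index 1) drives every non-input neuron and the input neuron drives the
output layer; initialize_connections then masks random values with that
matrix, so absent connections keep a value of exactly zero.

```matlab
% Worked example of the subfunctions:
Layers = [1 2];                      % one input neuron, two output neurons
CM = make_connection_matrix(Layers); % 4x4: +1 row/column for the reference neuron
% CM = [0 0 1 1    % reference neuron (1) -> all non-input neurons
%       0 0 1 1    % input neuron (2) -> output layer (3,4)
%       0 0 0 0
%       0 0 0 0]

% Weights drawn from a normal distribution (mean 0.01, std 0.001),
% but only where CM is 1; all other entries remain zero.
W = initialize_connections(CM, {'RandomNormal', 0.01, 0.001});
```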

Generated on Wed 02-Apr-2008 15:16:32 by m2html © 2003