
plot_extra

PURPOSE

PLOT_EXTRA plots data on the GUI specific to the dataset type

SYNOPSIS

function plot_extra(TrainingParams,tiOrder,tiAll,TrainingResults,tiAllTest,TestingResults)

DESCRIPTION

PLOT_EXTRA plots data on the GUI specific to the dataset type

Description:
Function to plot data on the GUI specific to the dataset type. For
example, for Classification, the Classification Error is plotted.

Syntax:
PLOT_EXTRA(TrainingParams,tiOrder,tiAll,TrainingResults);
PLOT_EXTRA(TrainingParams,tiOrder,tiAll,TrainingResults,tiAllTest,TestingResults);

Input Parameters:
o TrainingParams: A structure that contains information for training a
    theta neuron network, such as the learning method. This structure is
    generated by get_training_params.
o tiOrder: The order in which the input patterns were processed during this
    epoch.  The order is random when using online learning.
o tiAll: A cell array of length a, the number of input patterns, where
    each cell is a qx2 array containing the neuron indices and the
    corresponding input spike times.  q may vary from cell to cell.  (A
    made-up example is sketched after this parameter list.)
o TrainingResults: A structure containing error gradient with respect to
    the weights (DEDW) and delays (DEDTau).  The structure also includes
    the SSE, NonFireFlag to indicate if any neurons are not firing, along
    with NonFireCount, an array used to keep track of how many times a
    neuron has not fired over multiple input patterns.  This structure is
    generated by calculate_gradient. If passed as an input, certain
    results are appended.
o tiAllTest: A cell array of input test patterns in the same format as 
    tiAll.
o TestingResults: A structure containing results generated on the testing
    data in the same format as TrainingResults.
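
Sketch of the tiAll format referenced above. This is a minimal, made-up
example: the pattern count, neuron indices, and spike times are purely
illustrative, and the column order of neuron index followed by spike time is
assumed from the description:

>> tiAll=cell(2,1);                  %a=2 input patterns
>> tiAll{1}=[1 0.0; 2 5.0; 3 12.5];  %Pattern 1: q=3 input spikes
>> tiAll{2}=[1 2.5; 3 7.0];          %Pattern 2: q=2 input spikes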

Output Parameters:
o None

Example:
>> %Assumes valid simulation training results are loaded below
>> GUIHandle=start_ThNN;
>> Temp=get(GUIHandle,'UserData');
>> Handles=Temp{2};
>> cd('Results');
>> [File,PathLoad] = uigetfile('*.mat','Load Simulation...');
>> cd('..');
>> load([PathLoad, File]);
>> TrainingParams.Handles=Handles;
>> set_params(Handles,ThNN,TrainingParams);
>> plot_extra(TrainingParams,tiOrder,tiAll,TrainingResults);
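>> %If the loaded simulation also includes testing data, the testing results
>> %can be plotted alongside the training results; the variable names
>> %tiAllTest and TestingResults are assumed to have been saved in the file:
>> plot_extra(TrainingParams,tiOrder,tiAll,TrainingResults,tiAllTest,TestingResults);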

See also theta_neuron_network

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

function plot_extra(TrainingParams,tiOrder,tiAll,TrainingResults,tiAllTest,TestingResults)
%PLOT_EXTRA plots data on the GUI specific to the dataset type
%
%Description:
%Function to plot data on the GUI specific to the dataset type. For
%example, for Classification, the Classification Error is plotted.
%
%Syntax:
%PLOT_EXTRA(TrainingParams,tiOrder,tiAll,TrainingResults);
%PLOT_EXTRA(TrainingParams,tiOrder,tiAll,TrainingResults,tiAllTest,TestingResults);
%
%Input Parameters:
%o TrainingParams: A structure that contains information for training a
%    theta neuron network, such as the learning method. This structure is
%    generated by get_training_params.
%o tiOrder: The order in which the input patterns were processed during this
%    epoch.  The order is random when using online learning.
%o tiAll: A cell array of length a, the number of input patterns, where
%    each cell is a qx2 array containing the neuron indices and the
%    corresponding input spike times.  q may vary from cell to cell.
%o TrainingResults: A structure containing error gradient with respect to
%    the weights (DEDW) and delays (DEDTau).  The structure also includes
%    the SSE, NonFireFlag to indicate if any neurons are not firing, along
%    with NonFireCount, an array used to keep track of how many times a
%    neuron has not fired over multiple input patterns.  This structure is
%    generated by calculate_gradient. If passed as an input, certain
%    results are appended.
%o tiAllTest: A cell array of input test patterns in the same format as
%    tiAll.
%o TestingResults: A structure containing results generated on the testing
%    data in the same format as TrainingResults.
%
%Output Parameters:
%o None
%
%Example:
%>> %Assumes valid simulation training results are loaded below
%>> GUIHandle=start_ThNN;
%>> Temp=get(GUIHandle,'UserData');
%>> Handles=Temp{2};
%>> cd('Results');
%>> [File,PathLoad] = uigetfile('*.mat','Load Simulation...');
%>> cd('..');
%>> load([PathLoad, File]);
%>> TrainingParams.Handles=Handles;
%>> set_params(Handles,ThNN,TrainingParams);
%>> plot_extra(TrainingParams,tiOrder,tiAll,TrainingResults);
%
%See also theta_neuron_network

%Copyright (C) 2008 Sam McKennoch <Samuel.McKennoch@loria.fr>


if nargin==4
    TestFlag=0;
else
    TestFlag=1;
end

if strcmp(TrainingParams.Type,'Classification')
    if TestFlag
        plot(TrainingParams.Handles.axes1,[0 1 TrainingParams.TestingFrequency*(1:(length(TrainingResults.ClassErr)-2))],TrainingResults.ClassErr,...
            [0 1 TrainingParams.TestingFrequency*(1:(length(TestingResults.ClassErr)-2))],TestingResults.ClassErr);
        legend(TrainingParams.Handles.axes1,'% Training Classification Error','% Testing Classification Error','Location','NorthWest');
    else
        plot(TrainingParams.Handles.axes1,[0 1 TrainingParams.TestingFrequency*(1:(length(TrainingResults.ClassErr)-2))],TrainingResults.ClassErr);
        legend(TrainingParams.Handles.axes1,'% Training Classification Error','Location','NorthWest');
    end


elseif strcmp(TrainingParams.Type,'Regression') %&& testing_valid
    Coding=TrainingParams.Coding;
    InputNeurons=[];
    OutputNeurons=[];
    %Note: I guess this could cause problems if DEDW goes to exactly zero
    %for some connection, in which case we could report extra input or output
    %neurons.  A better solution is needed eventually.
    for j=1:size(TrainingResults.DEDW,1)
        if sum(TrainingResults.DEDW(:,j)~=0)==0
            InputNeurons=[InputNeurons; j];
        end
        if sum(TrainingResults.DEDW(j,:)~=0)==0
            OutputNeurons=[OutputNeurons; j];
        end
    end

    Inputs=code_data(InputNeurons, 'DecodeInputs', tiAll, Coding);
    Inputs=Inputs(tiOrder);
    Outputs=code_data(OutputNeurons, 'EncodeOutputs', TrainingResults.tsCurrent, Coding);
    if TestFlag
        TestingInputs=code_data(InputNeurons, 'DecodeInputs', tiAllTest, Coding);
        TestingOutputs=code_data(OutputNeurons, 'EncodeOutputs', TestingResults.tsCurrent, Coding);
    end

    if size(Inputs,2)>1
        figure;
        [X,Y]=meshgrid(Coding.InputRange(1):0.1:Coding.InputRange(2),Coding.InputRange(1):0.1:Coding.InputRange(2));
        RegressionOutput=TrainingParams.FunctionHandle(X,Y);
        surf(X,Y,RegressionOutput);
        hold on;
        if TestFlag
            plot3(TrainingParams.Handles.axes1,Inputs(:,1),Inputs(:,2),Outputs,'o',TestingInputs(:,1),TestingInputs(:,2),TestingOutputs,'x');
            legend(TrainingParams.Handles.axes1,'ThNN Train','ThNN Test','Location','North');
        else
            plot3(TrainingParams.Handles.axes1,Inputs(:,1),Inputs(:,2),Outputs,'o');
            legend(TrainingParams.Handles.axes1,'ThNN Train','Location','North');
        end
    else
        RegressionOutput=TrainingParams.FunctionHandle(Coding.InputRange(1):0.1:Coding.InputRange(2));
        if TestFlag
            plot(TrainingParams.Handles.axes1,Inputs,Outputs,'o',TestingInputs,TestingOutputs,'x',Coding.InputRange(1):0.1:Coding.InputRange(2),RegressionOutput);
            legend(TrainingParams.Handles.axes1,'ThNN Train','ThNN Test','Actual','Location','North');
        else
            plot(TrainingParams.Handles.axes1,Inputs,Outputs,'o',Coding.InputRange(1):0.1:Coding.InputRange(2),RegressionOutput);
            legend(TrainingParams.Handles.axes1,'ThNN Train','Actual','Location','North');
        end
        set(TrainingParams.Handles.axes1,'YLim',[min(RegressionOutput) max(RegressionOutput)]);
    end

end

grid(TrainingParams.Handles.axes1,'on');
pause(0.1);
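
Note on the Regression branch above: input and output neurons are inferred
from the structure of TrainingResults.DEDW, where an all-zero column marks an
input neuron and an all-zero row marks an output neuron (presumably because
input neurons have no trainable incoming weights and output neurons no
outgoing ones).  Below is a minimal sketch of that heuristic on a made-up 3x3
gradient matrix for an assumed feedforward chain 1->2->3; the find/all form is
only an equivalent vectorized restatement of the loop in the source, not code
from the toolbox:

>> DEDW=[0 0.2 0; 0 0 0.7; 0 0 0];       %Hypothetical error gradients
>> InputNeurons=find(all(DEDW==0,1))'    %Column 1 is all zeros, so neuron 1 is an input
>> OutputNeurons=find(all(DEDW==0,2))    %Row 3 is all zeros, so neuron 3 is an output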
