Home > machines > prt_machine_sMKL_cla.m

prt_machine_sMKL_cla

PURPOSE ^

Run L1-norm MKL - wrapper for simpleMKL

SYNOPSIS ^

function output = prt_machine_sMKL_cla(d,args)

DESCRIPTION ^

 Run L1-norm MKL - wrapper for simpleMKL
 FORMAT output = prt_machine_sMKL_cla(d,args)
 Inputs:
   d         - structure with data information, with mandatory fields:
    .train      - training data (cell array of matrices of row vectors,
                  each [Ntr x D]). Each matrix contains one representation
                  of the data. This is useful for approaches such as
                  multiple kernel learning.
    .test       - testing data (cell array of matrices of row vectors,
                  each [Nte x D])
    .tr_targets - training labels (for classification) or values (for
                  regression) (column vector, [Ntr x 1])
    .use_kernel - flag, is the data in the form of kernel matrices (true)
                  or in the form of features (false)
    args     - simpleMKL arguments
 Output:
    output  - output of machine (struct).
     * Mandatory fields:
      .predictions - predictions of classification or regression [Nte x 1]
     * Optional fields:
      .func_val - value of the decision function
      .type     - which type of machine this is (here, 'classifier')
__________________________________________________________________________
 Copyright (C) 2011 Machine Learning & Neuroimaging Laboratory
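
A minimal usage sketch, assuming simpleMKL's mklsvm is on the MATLAB path. The data, kernels, and label assignments below are purely illustrative; only the field names follow the description above:

```matlab
% Hypothetical two-class problem with two precomputed linear kernels
Ntr = 20; Nte = 10; D = 5;
Xtr1 = randn(Ntr,D); Xtr2 = randn(Ntr,D);   % two feature representations
Xte1 = randn(Nte,D); Xte2 = randn(Nte,D);

d.train      = {Xtr1*Xtr1', Xtr2*Xtr2'};    % [Ntr x Ntr] train kernels
d.test       = {Xte1*Xtr1', Xte2*Xtr2'};    % [Nte x Ntr] test kernels
d.tr_targets = [ones(Ntr/2,1); 2*ones(Ntr/2,1)];  % labels must be in {1,2}
d.te_targets = [ones(Nte/2,1); 2*ones(Nte/2,1)];
d.use_kernel = true;                         % data are kernel matrices

args   = 1;                                  % soft-margin parameter C
output = prt_machine_sMKL_cla(d,args);
disp(output.beta);                           % learned kernel weights
disp(output.predictions);                    % predicted labels in {1,2}
```

Note that d.use_kernel is true here because d.train and d.test already hold kernel matrices rather than feature matrices, and that the function also reads d.te_targets internally (to size the test kernel).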

CROSS-REFERENCE INFORMATION ^

This function calls:
This function is called by:

SOURCE CODE ^

0001 function output = prt_machine_sMKL_cla(d,args)
0002 % Run L1-norm MKL - wrapper for simpleMKL
0003 % FORMAT output = prt_machine_sMKL_cla(d,args)
0004 % Inputs:
0005 %   d         - structure with data information, with mandatory fields:
0006 %     .train      - training data (cell array of matrices of row vectors,
0007 %                   each [Ntr x D]). Each matrix contains one representation
0008 %                   of the data. This is useful for approaches such as
0009 %                   multiple kernel learning.
0010 %     .test       - testing data (cell array of matrices of row vectors, each
0011 %                   [Nte x D])
0012 %     .tr_targets - training labels (for classification) or values (for
0013 %                   regression) (column vector, [Ntr x 1])
0014 %     .use_kernel - flag, is the data in the form of kernel matrices (true)
0015 %                or in the form of features (false)
0016 %    args     - simpleMKL arguments
0017 % Output:
0018 %    output  - output of machine (struct).
0019 %     * Mandatory fields:
0020 %      .predictions - predictions of classification or regression [Nte x 1]
0021 %     * Optional fields:
0022 %      .func_val - value of the decision function
0023 %      .type     - which type of machine this is (here, 'classifier')
0024 %
0025 %__________________________________________________________________________
0026 % Copyright (C) 2011 Machine Learning & Neuroimaging Laboratory
0027 
0028 % Written by J. Mourao-Miranda
0029 
0030 def = prt_get_defaults;
0031 
0032 %------------------------------------------------------
0033 % configure simpleMKL options
0034 %------------------------------------------------------
0035 verbose=0;
0036 options.algo='svmclass'; % Choice of algorithm in mklsvm can be either
0037 % 'svmclass' or 'svmreg'
0038 
0039 %------------------------------------------------------
0040 % choosing the stopping criterion
0041 %------------------------------------------------------
0042 options.stopvariation=0; % use variation of weights for stopping criterion
0043 options.stopKKT=0;       % set to 1 if you use KKTcondition for stopping criterion
0044 options.stopdualitygap=1; % set to 1 for using duality gap for stopping criterion
0045 
0046 %------------------------------------------------------
0047 % choosing the stopping criterion value
0048 %------------------------------------------------------
0049 options.seuildiffsigma=1e-2;        % stopping criterion for weight variation
0050 options.seuildiffconstraint=0.1;    % stopping criterion for KKT
0051 options.seuildualitygap=0.01;       % stopping criterion for duality gap
0052 
0053 %------------------------------------------------------
0054 % Setting some numerical parameters
0055 %------------------------------------------------------
0056 options.goldensearch_deltmax=1e-1; % initial precision of golden section search
0057 options.numericalprecision=1e-8;   % numerical precision: weights below this
0058 % value are set to zero
0059 options.lambdareg = 1e-8;          % ridge added to kernel matrix
0060 
0061 %------------------------------------------------------
0062 % some algorithm parameters
0063 %------------------------------------------------------
0064 options.firstbasevariable='first'; % tie breaking method for choosing the base
0065 % variable in the reduced gradient method
0066 options.nbitermax=def.model.l1MKLmaxitr;  % maximal number of iteration
0067 options.seuil=0;                   % force weights below this value to zero,
0068 options.seuilitermax=10;           % but only for iterations up to this one
0069 
0070 options.miniter=0;                 % minimal number of iterations
0071 options.verbosesvm=0;              % verbosity of inner svm algorithm
0072 options.efficientkernel=0;         % use efficient storage of kernels
0073 
0074 %------------------------------------------------------
0075 % Sanity check
0076 %------------------------------------------------------
0077 SANITYCHECK=true; % can turn off for "speed". Expert only.
0078 
0079 if SANITYCHECK==true
0080     % args should be a number (the soft-margin hyperparameter C)
0081     if ~isnumeric(args)
0082         error('prt_machine_sMKL_cla:MKLargsNotNumber',['Error: L1_MKL'...
0083             ' args should be a number. ' ...
0084             ' SOLUTION: Please change args to a numeric value']);
0085     end
0086     
0087     % check it is indeed a two-class classification problem
0088     uTL=unique(d.tr_targets(:));
0089     nC=numel(uTL);
0090     if nC>2
0091         error('prt_machine_sMKL_cla:problemNotBinary',['Error:'...
0092             ' This machine is only for two-class problems but the' ...
0093             ' current problem has ' num2str(nC) ' classes! ' ...
0094             'SOLUTION: Please select another machine than ' ...
0095             'prt_machine_sMKL_cla']);
0096     end
0097     % check it is indeed labelled correctly (labels must be exactly {1,2})
0098     if ~all(uTL==[1 2]')
0099         error('prt_machine_sMKL_cla:LabellingIncorect',['Error:'...
0100             ' This machine needs labels to be in {1,2} ' ...
0101             ' but they are ' mat2str(uTL) ' ! ' ...
0102             'SOLUTION: Please relabel your classes by changing the '...
0103             ' ''tr_targets'' argument to prt_machine_sMKL_cla']);
0104     end
0105 end
0106 
0107 %--------------------------------------------------------------------------
0108 % Run simpleMKL
0109 %--------------------------------------------------------------------------
0110 C_opt = args;
0111 options.sigmainit = 1/size(d.train,2)*ones(1,size(d.train,2)); %initialize kernel weights
0112 
0113 % change targets from {1,2} to {+1,-1}
0114 tr_targets = d.tr_targets;
0115 c1PredIdx = tr_targets==1;
0116 tr_targets(c1PredIdx)  = 1;  % class 1 -> +1
0117 tr_targets(~c1PredIdx) = -1; % class 2 -> -1
0118 
0119 %reshape previously normalized kernel
0120 ktrain = zeros(size(d.train{1},1),size(d.train{1},1),size(d.train,2));
0121 ktest = zeros(size(d.test{1},1),size(d.train{1},1),size(d.train,2));
0122 for k = 1:size(d.train,2)
0123     if sum(sum(isnan(d.train{k})))==0
0124         ktrain(:,:,k) = d.train{k};
0125     end
0126     if sum(sum(isnan(d.test{k})))==0
0127         ktest(:,:,k) = d.test{k};
0128     end
0129 end
0130 
0131 [beta,alpha_sv,b,pos] = mklsvm(ktrain,tr_targets,C_opt,options,verbose);
0132 
0133 alpha = zeros(length(d.tr_targets),1);
0134 alpha(pos) = alpha_sv;
0135 
0136 ktest_final = zeros(length(d.te_targets),length(d.tr_targets));
0137 
0138 for i = 1:size(d.train,2)
0139     ktest_final = ktest_final + beta(i)*ktest(:,:,i);
0140 end
0141 
0142 func_val = (ktest_final*alpha)+b;
0143 
0144 predictions = sign(func_val);
0145 
0146 
0147 % Outputs
0148 %--------------------------------------------------------------------------
0149 % change predictions from 1/-1 to 1/2
0150 c1PredIdx               = predictions==1; 
0151 predictions(c1PredIdx)  = 1; %positive values = 1
0152 predictions(~c1PredIdx) = 2; %negative values = 2
0153 
0154 output.predictions = predictions;
0155 output.func_val    = func_val;
0156 output.type        = 'classifier';
0157 output.alpha       = alpha;
0158 output.b           = b;
0159 output.totalSV     = length(alpha_sv);
0160 output.beta        = beta; %kernel weights
0161 
0162 end
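
For reference, the prediction step in the listing reduces to a weighted sum of the base test kernels followed by the usual SVM decision function, with sign(f) in {-1,+1} mapped back to class labels {1,2}. A standalone toy sketch of just that arithmetic (all numbers hypothetical, independent of mklsvm):

```matlab
% f = (sum_k beta(k) * Ktest_k) * alpha + b, then map sign to {1,2}
beta  = [0.7 0.3];                       % example kernel weights (sum to 1)
Kte   = cat(3, [1 0; 0 1], [2 1; 1 2]);  % two toy [Nte x Ntr] test kernels
alpha = [0.5; -0.5];                     % toy dual coefficients
b     = 0.1;                             % toy bias

Kmix = zeros(size(Kte,1), size(Kte,2));
for k = 1:numel(beta)
    Kmix = Kmix + beta(k)*Kte(:,:,k);    % weighted kernel combination
end
f = Kmix*alpha + b;                      % decision values

labels = ones(size(f));
labels(f<0) = 2;                         % negative side -> class 2
```

With these toy values, f = [0.6; -0.4], so the two test points are assigned classes 1 and 2 respectively, mirroring the 1/-1 to 1/2 remapping at the end of the function.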

Generated on Tue 10-Feb-2015 18:16:33 by m2html © 2005