
prt_machine_sMKL_reg

PURPOSE

Run L1-norm MKL - wrapper for simpleMKL

SYNOPSIS

function output = prt_machine_sMKL_reg(d,args)

DESCRIPTION

 Run L1-norm MKL - wrapper for simpleMKL
 FORMAT output = prt_machine_sMKL_reg(d,args)
 Inputs:
   d         - structure with data information, with mandatory fields:
     .train      - training data (cell array of matrices of row vectors,
                   each [Ntr x D]). Each matrix contains one representation
                   of the data. This is useful for approaches such as
                   multiple kernel learning.
     .test       - testing data (cell array of matrices of row vectors,
                   each [Nte x D])
     .tr_targets - training labels (for classification) or values (for
                   regression) (column vector, [Ntr x 1])
     .use_kernel - flag: is the data in the form of kernel matrices (true)
                   or in the form of features (false)
    args     - simpleMKL arguments (here, the SVM C hyper-parameter)
 Output:
    output  - output of machine (struct).
     * Mandatory fields:
      .predictions - predictions of classification or regression [Nte x 1]
     * Optional fields:
      .func_val - value of the decision function
      .type     - which type of machine this is (here, 'regression')
__________________________________________________________________________
 Copyright (C) 2011 Machine Learning & Neuroimaging Laboratory
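
A minimal usage sketch, assuming two precomputed linear kernels built from
synthetic data and the SimpleMKL function mklsvm on the MATLAB path. Field
names follow the input description above; d.te_targets is also set because
the source below uses it to size the final test kernel.

 % synthetic data: two feature representations of the same samples
 Xtr1 = randn(40,100); Xte1 = randn(10,100);
 Xtr2 = randn(40,200); Xte2 = randn(10,200);

 d.use_kernel = true;
 d.train      = {Xtr1*Xtr1', Xtr2*Xtr2'};  % [Ntr x Ntr] training kernels
 d.test       = {Xte1*Xtr1', Xte2*Xtr2'};  % [Nte x Ntr] test kernels
 d.tr_targets = randn(40,1);               % continuous training targets
 d.te_targets = randn(10,1);               % used only to size ktest_final

 args   = 1;                               % SVM C hyper-parameter (C_opt)
 output = prt_machine_sMKL_reg(d,args);    % output.predictions is [Nte x 1]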

CROSS-REFERENCE INFORMATION

This function calls: prt_get_defaults, mklsvm
This function is called by:

SOURCE CODE

function output = prt_machine_sMKL_reg(d,args)
% Run L1-norm MKL - wrapper for simpleMKL
% FORMAT output = prt_machine_sMKL_reg(d,args)
% Inputs:
%   d         - structure with data information, with mandatory fields:
%     .train      - training data (cell array of matrices of row vectors,
%                   each [Ntr x D]). Each matrix contains one representation
%                   of the data. This is useful for approaches such as
%                   multiple kernel learning.
%     .test       - testing data (cell array of matrices of row vectors,
%                   each [Nte x D])
%     .tr_targets - training labels (for classification) or values (for
%                   regression) (column vector, [Ntr x 1])
%     .use_kernel - flag: is the data in the form of kernel matrices (true)
%                   or in the form of features (false)
%    args     - simpleMKL arguments (here, the SVM C hyper-parameter)
% Output:
%    output  - output of machine (struct).
%     * Mandatory fields:
%      .predictions - predictions of classification or regression [Nte x 1]
%     * Optional fields:
%      .func_val - value of the decision function
%      .type     - which type of machine this is (here, 'regression')
%__________________________________________________________________________
% Copyright (C) 2011 Machine Learning & Neuroimaging Laboratory

% Written by J. Mourao-Miranda

def = prt_get_defaults;

%------------------------------------------------------
% configure simpleMKL options
%------------------------------------------------------
verbose=0;
options.algo='svmreg';    % choice of algorithm in mklsvm: either
                          % 'svmclass' or 'svmreg'

%------------------------------------------------------
% choosing the stopping criterion
%------------------------------------------------------
options.stopvariation=0;  % use variation of weights for stopping criterion
options.stopKKT=0;        % set to 1 to use the KKT condition as stopping criterion
options.stopdualitygap=1; % set to 1 to use the duality gap as stopping criterion

%------------------------------------------------------
% choosing the stopping criterion value
%------------------------------------------------------
options.seuildiffsigma=1e-2;     % stopping criterion for weight variation
options.seuildiffconstraint=0.1; % stopping criterion for KKT
options.seuildualitygap=0.01;    % stopping criterion for duality gap

%------------------------------------------------------
% Setting some numerical parameters
%------------------------------------------------------
options.goldensearch_deltmax=1e-1; % initial precision of golden section search
options.numericalprecision=1e-8;   % numerical precision: weights below this
                                   % value are set to zero
options.lambdareg = 1e-8;          % ridge added to kernel matrix

%------------------------------------------------------
% some algorithm parameters
%------------------------------------------------------
options.firstbasevariable='first'; % tie-breaking method for choosing the base
                                   % variable in the reduced gradient method
options.nbitermax=def.model.l1MKLmaxitr; % maximal number of iterations
options.seuil=0;                   % force to zero weights lower than this
options.seuilitermax=10;           % value, for iterations lower than this one

options.miniter=0;                 % minimal number of iterations
options.verbosesvm=0;              % verbosity of inner svm algorithm
options.efficientkernel=0;         % use efficient storage of kernels
options.svmreg_epsilon=0.01;       % width of the epsilon-insensitive tube (SVR)
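
% With the flags above only the duality-gap test is active, so mklsvm stops
% once the duality gap falls below options.seuildualitygap (0.01) or when
% options.nbitermax iterations are reached, whichever comes first.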

% Run simpleMKL
%--------------------------------------------------------------------------
C_opt = args;
options.sigmainit = 1/size(d.train,2)*ones(1,size(d.train,2)); % initialise kernel weights uniformly
m = mean(d.tr_targets);        % mean of the training targets
tr_targets = d.tr_targets - m; % mean-centre targets

% reshape previously normalised kernels into 3-D arrays
ktrain = zeros(size(d.train{1},1),size(d.train{1},1),size(d.train,2));
ktest  = zeros(size(d.test{1},1),size(d.train{1},1),size(d.train,2));
for k = 1:size(d.train,2)
    if sum(sum(isnan(d.train{k})))==0
        ktrain(:,:,k) = d.train{k};
    end
    if sum(sum(isnan(d.test{k})))==0
        ktest(:,:,k) = d.test{k};
    end
end
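
% ktrain is now an [Ntr x Ntr x Nk] stack of training kernels and ktest an
% [Nte x Ntr x Nk] stack of test kernels, with Nk = numel(d.train); any
% kernel containing NaNs was skipped above and remains an all-zero slice,
% so it contributes nothing to the weighted combination.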

[beta,alpha_sv,b,pos,history,obj,status] = mklsvm(ktrain,tr_targets,C_opt,options,verbose);

alpha = zeros(length(d.tr_targets),1);
alpha(pos) = alpha_sv;
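
% mklsvm returns dual coefficients alpha_sv only for the support vectors
% (training indices in pos); scattering them into a full [Ntr x 1] vector
% gives zero coefficients to all non-support vectors.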

ktest_final = zeros(length(d.te_targets),length(d.tr_targets));

for i = 1:size(d.train,2)
    ktest_final = ktest_final + beta(i)*ktest(:,:,i); % weighted sum of test kernels
end

func_val = ((ktest_final*alpha)+b)+m; % add back the mean of the training targets
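
% Decision function: f(x) = sum_k beta(k)*K_k(x,Xtr)*alpha + b, i.e. the
% SVR prediction under the learned kernel combination, plus m to undo the
% target centring applied before training.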

predictions = func_val;

% Outputs
%-------------------------------------------------------------------------
output.predictions = predictions;
output.func_val    = func_val;
output.type        = 'regression';
output.alpha       = alpha;
output.b           = b;
output.totalSV     = length(alpha_sv);
output.beta        = beta; % kernel weights

end
