
NUSVC

### Trainable classifier: Support Vector Machine, nu-algorithm

[W,J,NU] = NUSVC(A,KERNEL,NU)
[W,J,NU] = A*NUSVC([],KERNEL,NU)
[W,J,NU] = A*NUSVC(KERNEL,NU)

 Input
   A      Dataset
   KERNEL Untrained mapping to compute the kernel by A*(A*KERNEL) during
          training, or B*(A*KERNEL) during testing with dataset B.
          Alternatively, a string to compute kernel matrices by
          FEVAL(KERNEL,B,A).
          Default: linear kernel (PROXM([],'p',1))
   NU     Regularisation parameter (0 < NU < 1): expected fraction of
          support vectors (optional; default: max(leave-one-out 1-NN
          error, 0.01))

 Output
   W  Mapping: support vector classifier
   J  Object indices of the support objects
   NU Actual NU value used

### Description

Optimises a support vector classifier for the dataset A by quadratic  programming. The difference with the standard SVC routine is the use and  interpretation of the regularisation parameter NU, which is an upper bound  on the expected classification error. By default NU is estimated by the  leave-one-out error of the 1-NN rule. For NU = NaN an automatic optimisation  is performed using REGOPTC.
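A minimal training sketch, assuming PRTools is on the MATLAB path (GENDATB and TESTC are standard PRTools routines; the dataset sizes and NU value are illustrative):

```matlab
% Train a nu-SVM on a toy two-class banana set and estimate its error.
A = gendatb([50 50]);                     % two-class training set
[W,J,NU] = nusvc(A,proxm([],'p',1),0.1);  % linear kernel, nu = 0.1
disp(numel(J));                           % number of support objects
B = gendatb([20 20]);                     % independent test set
E = B*W*testc;                            % classification error estimate
```

Since NU bounds the expected fraction of support vectors, numel(J) is typically close to 0.1 times the training-set size here.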

If KERNEL = 0 it is assumed that A is already the (square) kernel matrix.  In that case a kernel matrix B should also be supplied at evaluation by  B*W or PRMAP(B,W).
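A sketch of the KERNEL = 0 case, following the A*(A*KERNEL) and B*(A*KERNEL) conventions stated above (the radial basis width and NU value are illustrative assumptions):

```matlab
% Train on a precomputed kernel matrix instead of an untrained mapping.
A  = gendatb([50 50]);
U  = proxm([],'r',2);    % radial basis kernel, width 2
K  = A*(A*U);            % square kernel matrix of the training set
W  = nusvc(K,0,0.1);     % KERNEL = 0: K is taken as the kernel matrix
B  = gendatb([20 20]);
KB = B*(A*U);            % test kernel: test objects vs. training objects
E  = KB*W*testc;         % evaluate on the test kernel matrix
```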

There are several ways to define KERNEL, e.g. PROXM([],'r',1) for a  radial basis kernel, or USERKERNEL for a user-defined kernel.

NUSVC is basically a two-class classifier. Multi-class problems are solved  in a one-against-rest fashion by MCLASSC. The resulting base classifiers  are combined by the maximum confidence rule. A better, non-linear  combiner might be QDC, e.g. W = A*(NUSVC*QDC([],[],1e-6))
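The two multi-class routes above can be sketched as follows, assuming GENDATM (the PRTools multi-class data generator) is available; the kernel and NU choices are illustrative:

```matlab
% Multi-class problem: MCLASSC is applied implicitly by the classifier.
A  = gendatm(100);                       % multi-class 2D dataset
W1 = A*nusvc(proxm([],'r',1),0.1);       % one-against-rest, max confidence
% Non-linear combining of the base classifiers by a regularised QDC:
W2 = A*(nusvc(proxm([],'r',1),0.1)*qdc([],[],1e-6));
```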