A Matlab toolbox for pattern recognition. Imported pages from 37Steps.


DisTools examples: Combining dissimilarity matrices

Different dissimilarity measures generate different representations, which can be combined in various ways. Some examples are shown here. It is assumed that readers are familiar with PRTools and will consult its documentation where needed.

If a set of dissimilarity measures has been computed for the same set of objects, the following options may be considered to combine them:

  1. Select the best one by cross-validation.
  2. Normalize and average them.
  3. Weight and sum them; the optimization is similar to what is studied in metric learning.
  4. Concatenate all dissimilarity spaces. This is realized by horizontal concatenation of the dissimilarity matrices.
  5. Determine the best classifier for every dissimilarity measure (matrix) and combine these classifiers.
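Options 2 and 4 might be sketched in PRTools/DisTools style as follows. This is a sketch under assumptions: `D1` and `D2` are hypothetical square dissimilarity datasets (prdataset objects) measured on the same objects, and the DisTools routine `disnorm` is assumed to return a normalization mapping that is applied by matrix multiplication, as usual in PRTools.

```matlab
% Sketch, assuming D1 and D2 are dissimilarity datasets (prdataset)
% on the same labeled objects, and disnorm returns a scaling mapping.
D1n = D1*disnorm(D1);     % normalize each measure separately
D2n = D2*disnorm(D2);
Davg = (D1n + D2n)/2;     % option 2: normalize and average
Dcat = [D1n D2n];         % option 4: horizontal concatenation of the
                          %           dissimilarity spaces
```

Normalizing first matters: without it, a measure with a larger numeric range would dominate both the average and any distance-based classifier in the concatenated space.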

Below are some examples. In PRDisData the following datasets consist of a set of dissimilarity matrices of the same objects: chickenpieces (44 sets), flowcytodis (4 sets), mfeatdis (6 sets), cover80, covers_beethoven, covers_beatles, covers_songs. Moreover, the separately mentioned datasets coildelftdiff, coildelftsame and coilyork refer to the same set of images in the COIL database, and polydish57 and polydism57 are based on the same set of polygons.


  1. Take one of the multi-dismat problems.
  2. Normalize all matrices by disnorm.
  3. Decide on one or more classifiers to use: knnc, fisherc, svc, loglc.
  4. Determine the performance for every dismat by cross-validation.
  5. Average all matrices and determine, as above, the performance of your classifier(s).
  6. Concatenate all matrices and determine, as above, the performance of your classifier(s).
  7. Are you able to combine the classifiers and determine the performance of the combiner?
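The exercise above might be sketched as below. This is not a definitive implementation: it assumes a hypothetical cell array `Ds` holding the dissimilarity matrices of one multi-dismat problem as prdataset objects, and it assumes the PRTools routine `crossval` returns a cross-validation error estimate for a given dataset, untrained classifier and number of folds.

```matlab
% Sketch, assuming Ds is a cell array of dissimilarity datasets
% (prdataset) of the same objects, e.g. one of the multi-dismat
% problems, and crossval(D,w,nfolds) yields a cross-validation error.
n = numel(Ds);
e = zeros(1,n);
for i = 1:n
  Ds{i} = Ds{i}*disnorm(Ds{i});        % step 2: normalize every matrix
  e(i)  = crossval(Ds{i},fisherc,10);  % step 4: per-dismat performance
end
Davg = Ds{1};                          % step 5: average all matrices
for i = 2:n, Davg = Davg + Ds{i}; end
Davg = Davg/n;
eavg = crossval(Davg,fisherc,10);
Dcat = [Ds{:}];                        % step 6: concatenate all matrices
ecat = crossval(Dcat,fisherc,10);
```

Comparing `e`, `eavg` and `ecat` shows whether averaging or concatenation improves on the best single measure; for step 7 a combined classifier trained per dismat could be evaluated in the same way.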
