A Generalized Kernel Approach to Dissimilarity-based Classification

EM Pekalska, P Paclik, RPW Duin

    Research output: Contribution to journal › Article › Scientific › peer-review

    Abstract

    Usually, objects to be classified are represented by features. In this paper, we discuss an alternative object representation based on dissimilarity values. If such distances separate the classes well, the nearest neighbor method offers a good solution. However, dissimilarities used in practice are usually far from ideal, and the performance of the nearest neighbor rule suffers from its sensitivity to noisy examples. We show that in such cases other, more global classification techniques are preferable to the nearest neighbor rule. For classification purposes, two different ways of using generalized dissimilarity kernels are considered. In the first, distances are isometrically embedded in a pseudo-Euclidean space and the classification task is performed there. In the second, classifiers are built directly on distance kernels. Both approaches are described theoretically and then compared in experiments with different dissimilarity measures and datasets, including degraded data simulating the problem of missing values.

    Keywords: dissimilarity, embedding, pseudo-Euclidean space, nearest mean classifier, support vector classifier, Fisher linear discriminant
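
    As a rough illustration of the first approach, the sketch below shows the standard double-centering construction for isometrically embedding a symmetric dissimilarity matrix into a pseudo-Euclidean space, with positive and negative eigenvalues split into the signature (p, q). The function names, tolerance, and NumPy usage are illustrative choices, not taken from the paper.

        import numpy as np

        def pseudo_euclidean_embedding(D, tol=1e-8):
            """Embed a symmetric n x n dissimilarity matrix D (zero diagonal)
            into a pseudo-Euclidean space R^(p,q) via double-centering.
            Illustrative sketch; names and tolerance are not from the paper."""
            n = D.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
            B = -0.5 * J @ (D ** 2) @ J                  # "Gram" matrix of the embedding
            evals, evecs = np.linalg.eigh(B)             # B is symmetric
            keep = np.abs(evals) > tol                   # drop (near-)zero eigenvalues
            evals, evecs = evals[keep], evecs[:, keep]
            order = np.argsort(-np.abs(evals))           # largest |eigenvalue| first
            evals, evecs = evals[order], evecs[:, order]
            X = evecs * np.sqrt(np.abs(evals))           # embedded coordinates
            signature = (int(np.sum(evals > 0)), int(np.sum(evals < 0)))  # (p, q)
            return X, signature, np.sign(evals)

        def pe_squared_distance(x, y, signs):
            """Squared pseudo-Euclidean distance: the q negative directions
            contribute with a minus sign; this reproduces the original
            squared dissimilarities between embedded training objects."""
            d = x - y
            return float(np.sum(signs * d * d))

    In the second approach sketched in the abstract, no embedding is performed: roughly speaking, each object is represented by its vector of dissimilarities to a set of prototype objects, and a global classifier such as the Fisher linear discriminant or the nearest mean classifier is trained directly on that representation.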
    Original language: Undefined/Unknown
    Pages (from-to): 175-211
    Number of pages: 37
    Journal: Journal of Machine Learning Research
    Volume: 2
    Issue number: 2
    Publication status: Published - 2002

    Bibliographical note

    Special Issue on Kernel Methods

    Keywords

    • academic journal papers
