Limited Rank Matrix Learning, discriminative dimension reduction and visualization

K Bunte, Petra Schneider, B Hammer, FM Schleif, T Villmann, M Biehl

    Research output: Contribution to journal › Article

    77 Citations (Scopus)

    Abstract

    We present an extension of the recently introduced Generalized Matrix Learning Vector Quantization algorithm. In the original scheme, adaptive square matrices of relevance factors parameterize a discriminative distance measure. We extend the scheme to matrices of limited rank, corresponding to low-dimensional representations of the data. This allows us to incorporate prior knowledge of the intrinsic dimension and to reduce the number of adaptive parameters efficiently. In particular, for very high-dimensional data, limiting the rank can reduce computation time and memory requirements significantly. Furthermore, two- or three-dimensional representations constitute an efficient visualization method for labeled data sets. The identification of a suitable projection is not treated as a pre-processing step but as an integral part of the supervised training. Several real-world data sets serve as illustrations and demonstrate the usefulness of the suggested method. (c) 2011 Elsevier Ltd. All rights reserved.
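
    A minimal sketch of the limited-rank distance, assuming the standard GMLVQ notation (prototypes w and inputs x in R^N, rectangular matrix Omega) rather than quoting the article itself:

    \[
    d_{\Lambda}(\mathbf{w},\mathbf{x}) \;=\; (\mathbf{x}-\mathbf{w})^{\top}\,\Lambda\,(\mathbf{x}-\mathbf{w}),
    \qquad
    \Lambda \;=\; \Omega^{\top}\Omega,
    \qquad
    \Omega \in \mathbb{R}^{M\times N},\; M \le N,
    \]

    so that rank(Lambda) <= M, and the mapping x -> Omega x yields an M-dimensional (e.g. two- or three-dimensional) representation of the data that can be used directly for visualization.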
    Original language: English
    Pages (from-to): 159-173
    Number of pages: 15
    Journal: Neural Networks
    Volume: 26
    DOIs
    Publication status: Published - 1 Feb 2012

    Keywords

    • Classification
    • Learning Vector Quantization
    • Visualization
    • Dimension reduction
    • Adaptive metrics
