Finding small sets of random Fourier features for shift-invariant kernel approximation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

External organisations

  • University of Applied Sciences Würzburg-Schweinfurt
  • Computational Intelligence Group
  • University of Applied Sciences Mittweida


Kernel-based learning is very popular in machine learning, but many classical methods have at least quadratic runtime complexity. Random Fourier features are very effective for approximating shift-invariant kernels by an explicit kernel expansion. This permits the use of efficient linear models with much lower runtime complexity. As one key approach to kernelize algorithms with linear models, they are successfully used in different methods. However, the number of features needed to approximate the kernel is in general still quite large, with substantial memory and runtime costs. Here, we propose a simple test with linear costs to identify a small set of random Fourier features, substantially reducing the number of generated features for low-rank kernel matrices while widely keeping the same representation accuracy. We also provide generalization bounds for the proposed approach.
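As a minimal illustration of the technique the abstract builds on, the following NumPy sketch implements the standard random Fourier feature map for the Gaussian (RBF) kernel, i.e. the explicit expansion whose feature count the proposed test would then reduce. The selection test itself is not reproduced here; function names and parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, seed=None):
    """Map X to an explicit feature space whose inner products
    approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies are drawn from the kernel's Fourier transform,
    # which for this RBF kernel is a Gaussian with variance 2 * gamma.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the explicit linear approximation against the exact kernel matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = random_fourier_features(X, n_features=2000, gamma=0.5, seed=1)
K_approx = Z @ Z.T                                   # inner products in feature space
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)                    # exact RBF kernel, gamma = 0.5
max_err = np.abs(K_approx - K_exact).max()
```

With `Z` in hand, any linear model trained on these features approximates its kernelized counterpart, and the approximation error shrinks as `n_features` grows; the paper's contribution is a cheap test for keeping only a small subset of such features when the kernel matrix has low rank.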


Original language: English
Title of host publication: Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings
Publication status: E-pub ahead of print - 9 Sep 2016
Event: 7th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2016 - Ulm, Germany
Duration: 28 Sep 2016 - 30 Sep 2016

Publication series

Name: Lecture Notes in Computer Science
Volume: 9896 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 7th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2016