Finding small sets of random Fourier features for shift-invariant kernel approximation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

Finding small sets of random Fourier features for shift-invariant kernel approximation. / Schleif, Frank M.; Kaban, Ata; Tino, Peter.

Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings. Vol. 9896 LNAI Springer Verlag, 2016. p. 42-54 (Lecture Notes in Computer Science; Vol. 9896 LNAI).


Harvard

Schleif, FM, Kaban, A & Tino, P 2016, Finding small sets of random Fourier features for shift-invariant kernel approximation. in Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings. vol. 9896 LNAI, Lecture Notes in Computer Science, vol. 9896 LNAI, Springer Verlag, pp. 42-54, 7th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2016, Ulm, Germany, 28/09/16. https://doi.org/10.1007/978-3-319-46182-3_4

APA

Schleif, F. M., Kaban, A., & Tino, P. (2016). Finding small sets of random Fourier features for shift-invariant kernel approximation. In Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings (Vol. 9896 LNAI, pp. 42-54). (Lecture Notes in Computer Science; Vol. 9896 LNAI). Springer Verlag. https://doi.org/10.1007/978-3-319-46182-3_4

Vancouver

Schleif FM, Kaban A, Tino P. Finding small sets of random Fourier features for shift-invariant kernel approximation. In Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings. Vol. 9896 LNAI. Springer Verlag. 2016. p. 42-54. (Lecture Notes in Computer Science). https://doi.org/10.1007/978-3-319-46182-3_4

Author

Schleif, Frank M. ; Kaban, Ata ; Tino, Peter. / Finding small sets of random Fourier features for shift-invariant kernel approximation. Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings. Vol. 9896 LNAI Springer Verlag, 2016. pp. 42-54 (Lecture Notes in Computer Science).

Bibtex

@inproceedings{efceb5c2501344418f8e054eb5b7c30a,
title = "Finding small sets of random Fourier features for shift-invariant kernel approximation",
abstract = "Kernel-based learning is very popular in machine learning, but many classical methods have at least quadratic runtime complexity. Random Fourier features are very effective for approximating shift-invariant kernels by an explicit kernel expansion. This permits the use of efficient linear models with much lower runtime complexity. As a key approach for kernelizing algorithms with linear models, they are successfully used in a variety of methods. However, the number of features needed to approximate the kernel is in general still quite large, with substantial memory and runtime costs. Here, we propose a simple test to identify a small set of random Fourier features with linear costs, substantially reducing the number of generated features for low-rank kernel matrices while largely retaining the same representation accuracy. We also provide generalization bounds for the proposed approach.",
author = "Schleif, {Frank M.} and Ata Kaban and Peter Tino",
year = "2016",
month = sep,
day = "9",
doi = "10.1007/978-3-319-46182-3_4",
language = "English",
isbn = "9783319461816",
volume = "9896 LNAI",
series = "Lecture Notes in Computer Science",
publisher = "Springer Verlag",
pages = "42--54",
booktitle = "Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings",
note = "7th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2016 ; Conference date: 28-09-2016 Through 30-09-2016",

}
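For context, the abstract above builds on the standard random Fourier feature (RFF) construction for shift-invariant kernels. The NumPy sketch below illustrates that baseline construction for the Gaussian RBF kernel k(x, y) = exp(-gamma ||x - y||^2); it is not the paper's feature-selection test, and the function name `rff_features` and the chosen `D` and `gamma` values are illustrative only.

```python
import numpy as np

def rff_features(X, D, gamma, rng):
    """Map X (n x d) to an explicit D-dimensional RFF embedding Z
    such that Z @ Z.T approximates the RBF kernel matrix."""
    d = X.shape[1]
    # The spectral density of exp(-gamma * ||x - y||^2) is Gaussian
    # with per-dimension standard deviation sqrt(2 * gamma).
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
gamma = 0.5

Z = rff_features(X, D=2000, gamma=gamma, rng=rng)
K_approx = Z @ Z.T

# Exact RBF kernel matrix for comparison.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq_dists)
err = np.abs(K - K_approx).max()
```

With D = 2000 features the entrywise error is small (it shrinks at roughly O(1/sqrt(D))); the paper's contribution is a linear-cost test for discarding most of these D features when the kernel matrix is low rank.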

RIS

TY - GEN

T1 - Finding small sets of random Fourier features for shift-invariant kernel approximation

AU - Schleif, Frank M.

AU - Kaban, Ata

AU - Tino, Peter

PY - 2016/9/9

Y1 - 2016/9/9

N2 - Kernel-based learning is very popular in machine learning, but many classical methods have at least quadratic runtime complexity. Random Fourier features are very effective for approximating shift-invariant kernels by an explicit kernel expansion. This permits the use of efficient linear models with much lower runtime complexity. As a key approach for kernelizing algorithms with linear models, they are successfully used in a variety of methods. However, the number of features needed to approximate the kernel is in general still quite large, with substantial memory and runtime costs. Here, we propose a simple test to identify a small set of random Fourier features with linear costs, substantially reducing the number of generated features for low-rank kernel matrices while largely retaining the same representation accuracy. We also provide generalization bounds for the proposed approach.

AB - Kernel-based learning is very popular in machine learning, but many classical methods have at least quadratic runtime complexity. Random Fourier features are very effective for approximating shift-invariant kernels by an explicit kernel expansion. This permits the use of efficient linear models with much lower runtime complexity. As a key approach for kernelizing algorithms with linear models, they are successfully used in a variety of methods. However, the number of features needed to approximate the kernel is in general still quite large, with substantial memory and runtime costs. Here, we propose a simple test to identify a small set of random Fourier features with linear costs, substantially reducing the number of generated features for low-rank kernel matrices while largely retaining the same representation accuracy. We also provide generalization bounds for the proposed approach.

U2 - 10.1007/978-3-319-46182-3_4

DO - 10.1007/978-3-319-46182-3_4

M3 - Conference contribution

AN - SCOPUS:84987935094

SN - 9783319461816

VL - 9896 LNAI

T3 - Lecture Notes in Computer Science

SP - 42

EP - 54

BT - Artificial Neural Networks in Pattern Recognition - 7th IAPR TC3 Workshop, ANNPR 2016, Proceedings

PB - Springer Verlag

T2 - 7th IAPR TC3 Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2016

Y2 - 28 September 2016 through 30 September 2016

ER -