Abstract
Deep latent generative models have attracted increasing attention because they combine the strengths of deep learning and probabilistic models in an elegant way. The data representations learned with these models are often continuous and dense. In many applications, however, sparse representations are desired, such as learning a sparse high-dimensional embedding of data in an unsupervised setting, or learning multiple labels from thousands of candidate tags in a supervised setting. In some scenarios there is a further restriction on the degree of sparsity: the number of non-zero features of a representation cannot exceed a pre-defined threshold L0. In this paper we propose a sparse deep latent generative model, SDLGM, which explicitly models the degree of sparsity and can thus learn the sparse structure of the data under a quantified sparsity constraint. The resulting sparsity of a representation is not fixed, but adapts to the observation itself under the pre-defined restriction. In particular, we introduce for each observation i an auxiliary random variable Li, which models the sparsity of its representation. The sparse representations are then generated with a two-step sampling process via two Gumbel-Softmax distributions. For inference and learning, we develop an amortized variational method based on a Monte Carlo gradient estimator. The resulting sparse representations are differentiable with backpropagation. Experimental evaluation on multiple datasets for unsupervised and supervised learning problems shows the benefits of the proposed method.
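The abstract describes a two-step sampling scheme: first draw the sparsity degree Li from a Gumbel-Softmax over the degrees {1, ..., L0}, then draw the active feature positions from a second Gumbel-Softmax. Below is a minimal PyTorch sketch of that idea; the function name, the use of hard (straight-through) sampling, and the elementwise-max combination of the per-position draws are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def sample_sparse_mask(logits_degree, logits_features, tau=0.5):
    """Hypothetical sketch: draw a sparse mask for one observation.

    logits_degree:   (L0,) logits over the sparsity degree Li in {1, ..., L0}
    logits_features: (D,)  logits over which of the D features are active
    """
    # Step 1: Gumbel-Softmax draw of the sparsity degree Li.
    # hard=True returns a one-hot vector via the straight-through trick.
    degree_onehot = F.gumbel_softmax(logits_degree, tau=tau, hard=True)
    degree = int(degree_onehot.argmax().item()) + 1  # Li in {1, ..., L0}

    # Step 2: Li independent Gumbel-Softmax draws over the D feature
    # positions (one relaxed one-hot per row), combined by elementwise
    # max so at most Li entries of the mask are non-zero.
    draws = F.gumbel_softmax(
        logits_features.expand(degree, -1), tau=tau, hard=True)  # (Li, D)
    mask = draws.max(dim=0).values  # (D,) with at most Li ones
    return mask
```

As a usage example, `sample_sparse_mask(torch.randn(8), torch.randn(100))` returns a length-100 mask with at most 8 non-zero entries, which could be multiplied elementwise with a dense latent code to obtain a sparse representation; the straight-through sampling keeps the per-position draws differentiable, in the spirit of the backpropagation-friendly representations the abstract mentions.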
| Original language | English |
| --- | --- |
| Title of host publication | 2021 International Joint Conference on Neural Networks (IJCNN) |
| Publisher | IEEE |
| Pages | 1-9 |
| Number of pages | 9 |
| ISBN (Electronic) | 9781665439008, 9780738133669 |
| ISBN (Print) | 9781665445979 (PoD) |
| DOIs | |
| Publication status | Published - 20 Sept 2021 |
| Event | 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China. Duration: 18 Jul 2021 → 22 Jul 2021 |
Publication series
| Name | International Joint Conference on Neural Networks (IJCNN) |
| --- | --- |
| Publisher | IEEE |
| ISSN (Print) | 2161-4393 |
| ISSN (Electronic) | 2161-4407 |
Conference
| Conference | 2021 International Joint Conference on Neural Networks (IJCNN) |
| --- | --- |
| Period | 18/07/21 → 22/07/21 |
Bibliographical note
Funding Information: The work of NEC Laboratories Europe was partially supported by the H2020 MonB5G project (grant agreement no. 871780). The research of G. Serra was supported by the H2020 ECOLE project (grant agreement no. 766186).
Publisher Copyright:
© 2021 IEEE.
Keywords
- Visualization
- Supervised learning
- Neural networks
- Memory
- Probabilistic logic
- Particle measurements
- Data models
- Amortized Variational Inference
- Sparsity of Representation
- Deep Latent Generative Models
ASJC Scopus subject areas
- Software
- Artificial Intelligence