Probabilistic modelling of general noisy multi-manifold data sets

Marco Canducci, Peter Tino, Michele Mastropietro

Research output: Contribution to journal › Article › peer-review



The intrinsic nature of noisy and complex data sets is often concealed in low-dimensional structures embedded in a higher-dimensional space. A number of methodologies have been developed to extract and represent such structures in the form of manifolds (i.e. geometric structures that locally resemble continuously deformable intervals of ℝ^j). Usually, a priori knowledge of the manifold's intrinsic dimensionality is required. Additionally, their performance can often be hampered by the presence of significant high-dimensional noise aligned along the low-dimensional core manifold. In real-world applications, the data can contain several low-dimensional structures of different dimensionalities. We propose a framework for dimensionality estimation and reconstruction of multiple noisy manifolds embedded in a noisy environment. To the best of our knowledge, this work represents the first attempt at detection and modelling of a set of coexisting general noisy manifolds by uniting two aspects of multi-manifold learning: the recovery and approximation of core noiseless manifolds and the construction of their probabilistic models. The easy-to-understand hyper-parameters can be manipulated to obtain an emerging picture of the multi-manifold structure of the data. We demonstrate the workings of the framework on two synthetic data sets.
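To illustrate the problem setting described in the abstract (not the paper's own algorithm), the sketch below builds a toy multi-manifold data set — a noisy 1-D circle and a noisy 2-D plane coexisting in ℝ^3 — and estimates the intrinsic dimensionality at each point with plain local PCA, keeping the smallest number of principal components that explain a fixed fraction of the local variance. The data set, the `local_dim` helper, and the `k`/`var_ratio` thresholds are all illustrative assumptions, not quantities from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two manifolds of different intrinsic dimensionality embedded in R^3
# with additive Gaussian noise: a 1-D circle (z ≈ 0) and a 2-D plane (z ≈ 2).
# (Toy data, not the paper's synthetic data sets.)
t = rng.uniform(0, 2 * np.pi, 400)
circle = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)]
circle += 0.01 * rng.standard_normal((400, 3))
uv = rng.uniform(-1, 1, (400, 2))
plane = np.c_[uv[:, 0], uv[:, 1], 2 + 0.01 * rng.standard_normal(400)]
X = np.vstack([circle, plane])

def local_dim(X, i, k=20, var_ratio=0.9):
    """Estimate intrinsic dimension at point i via local PCA: the
    smallest number of principal components explaining var_ratio of
    the variance among the k nearest neighbours of X[i]."""
    d2 = np.sum((X - X[i]) ** 2, axis=1)
    nbrs = X[np.argsort(d2)[:k]]
    cov = np.cov(nbrs - nbrs.mean(axis=0), rowvar=False)
    ev = np.sort(np.linalg.eigvalsh(cov))[::-1]   # eigenvalues, descending
    frac = np.cumsum(ev) / ev.sum()               # cumulative variance ratio
    return int(np.searchsorted(frac, var_ratio) + 1)

dims = np.array([local_dim(X, i) for i in range(len(X))])
print(dims[:400].mean(), dims[400:].mean())  # circle points near 1, plane points near 2
```

This local estimator already separates the two structures by dimensionality, but it is fragile exactly where the abstract says existing methods struggle: strong noise aligned along a core manifold inflates the trailing eigenvalues and biases the estimate upward, which is the gap the proposed probabilistic framework addresses.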
Original language: English
Article number: 103579
Number of pages: 29
Journal: Artificial Intelligence
Early online date: 31 Aug 2021
Publication status: Published - Jan 2022


  • Density estimation
  • Dimensionality estimation
  • Generative topographic mapping
  • Latent variable models
  • Multi-manifold learning
  • Probabilistic modelling
  • Riemannian manifolds


