Probabilistic Nonlinear Dimensionality Reduction

Date
2022-11-04
Abstract
High-dimensional datasets arise across scientific disciplines, and their analysis requires dimensionality reduction methods whose model parameters admit clear interpretation. Principal components analysis (PCA) has long been a preferred method for linear dimensionality reduction, but it is not recommended for data lying on or near low-dimensional nonlinear manifolds. Neural networks, on the other hand, have been used for dimensionality reduction, but their model parameters have no clear interpretation. The main contribution of the current work is the introduction of probabilistic piecewise PCA, an interpretable model for approximating nonlinear manifolds embedded in high-dimensional space. Probabilistic piecewise PCA serves as a bridge between linear PCA and highly nonlinear neural network approaches to dimensionality reduction. Our model extends probabilistic PCA and may be used when assuming any member of the natural exponential family of distributions on the observations. The model is explicitly defined for Gaussian and Poisson distributions, and posterior distributions for prediction and sampling are computed. A full comparative study of probabilistic piecewise PCA and existing dimensionality reduction methods is presented on a real-world bibliometric dataset.
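For context, the probabilistic PCA model that the thesis extends is the standard Tipping–Bishop formulation; a brief sketch of its Gaussian generative form is given below, with notation assumed here rather than taken from the thesis:

\[ \mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_q), \qquad \mathbf{x} \mid \mathbf{z} \sim \mathcal{N}(\mathbf{W}\mathbf{z} + \boldsymbol{\mu}, \, \sigma^2 \mathbf{I}_d), \]

where \(\mathbf{x} \in \mathbb{R}^d\) is an observation, \(\mathbf{z} \in \mathbb{R}^q\) (with \(q < d\)) is the latent representation, \(\mathbf{W}\) is a \(d \times q\) loading matrix, and \(\boldsymbol{\mu}\) and \(\sigma^2\) are the mean and noise variance. The exponential-family and piecewise extensions described in the abstract build on this linear-Gaussian base model.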
Keywords
machine learning, Bayesian methods, principal components analysis, dimension reduction, variational methods
Citation
Adams, M. (2022). Probabilistic nonlinear dimensionality reduction (Doctoral thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca.