Probabilistic Nonlinear Dimensionality Reduction

dc.contributor.advisor: Rios, Cristian
dc.contributor.author: Adams, Matthew
dc.contributor.committeemember: Ware, Antony
dc.contributor.committeemember: Greenberg, Matthew
dc.date: 2023-02
dc.date.accessioned: 2022-11-07T19:16:15Z
dc.date.available: 2022-11-07T19:16:15Z
dc.date.issued: 2022-11-04
dc.description.abstract: High-dimensional datasets are present across scientific disciplines. In the analysis of such datasets, dimensionality reduction methods that provide clear interpretations of their model parameters are required. Principal components analysis (PCA) has long been a preferred method for linear dimensionality reduction, but it is not recommended for data lying on or near low-dimensional nonlinear manifolds. Neural networks, on the other hand, have been used for dimension reduction, but their model parameters have no clear interpretation. The main contribution of the current work is the introduction of probabilistic piecewise PCA, an interpretable model for approximating nonlinear manifolds embedded in high-dimensional space. Probabilistic piecewise PCA serves as a bridge between linear PCA and highly nonlinear neural network approaches to dimensionality reduction. Our model is an extension of probabilistic PCA and may be used when assuming any member of the natural exponential family of distributions on the observations. The model is explicitly defined for Gaussian and Poisson distributions, and posterior distributions for prediction and sampling are computed. A full comparative study of probabilistic piecewise PCA and existing dimensionality reduction methods is presented on a real-world bibliometric dataset.
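The record contains no code, but as context for the base model the abstract says the thesis extends, the following is a minimal sketch of classical probabilistic PCA in the Gaussian case (the closed-form maximum-likelihood fit of Tipping and Bishop). It does not reproduce the piecewise, exponential-family model from the thesis, and the function and variable names (fit_ppca, posterior_mean, latent_dim) are illustrative only.

    import numpy as np

    def fit_ppca(X, latent_dim):
        """Closed-form ML fit of probabilistic PCA (Gaussian case).

        Model: x = W z + mu + eps, with z ~ N(0, I) and eps ~ N(0, sigma^2 I).
        """
        n, d = X.shape
        mu = X.mean(axis=0)
        Xc = X - mu
        # Eigendecomposition of the sample covariance, sorted in descending order.
        evals, evecs = np.linalg.eigh((Xc.T @ Xc) / n)
        evals, evecs = evals[::-1], evecs[:, ::-1]
        # Noise variance: average of the discarded eigenvalues.
        sigma2 = evals[latent_dim:].mean()
        # ML loading matrix (up to an arbitrary rotation, taken here as the identity).
        W = evecs[:, :latent_dim] * np.sqrt(evals[:latent_dim] - sigma2)
        return W, mu, sigma2

    def posterior_mean(X, W, mu, sigma2):
        """Posterior mean E[z | x], the probabilistic analogue of projecting onto the principal subspace."""
        M = W.T @ W + sigma2 * np.eye(W.shape[1])
        return (X - mu) @ W @ np.linalg.inv(M)

    # Toy usage: reduce 10-dimensional data to 2 latent dimensions.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 10))
    W, mu, sigma2 = fit_ppca(X, latent_dim=2)
    Z = posterior_mean(X, W, mu, sigma2)
    print(Z.shape)  # (500, 2)

Because the latent variables and noise are Gaussian here, the fit has a closed form; the exponential-family and piecewise extensions described in the abstract require approximate (e.g., variational) posterior inference instead.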
dc.identifier.citation: Adams, M. (2022). Probabilistic nonlinear dimensionality reduction (Doctoral thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca
dc.identifier.uri: http://hdl.handle.net/1880/115428
dc.identifier.uri: https://dx.doi.org/10.11575/PRISM/40401
dc.language.iso: eng
dc.publisher.faculty: Science
dc.publisher.institution: University of Calgary
dc.rights: University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.subject: machine learning
dc.subject: Bayesian methods
dc.subject: principal components analysis
dc.subject: dimension reduction
dc.subject: variational methods
dc.subject.classification: Education--Mathematics
dc.subject.classification: Artificial Intelligence
dc.title: Probabilistic Nonlinear Dimensionality Reduction
dc.type: doctoral thesis
thesis.degree.discipline: Mathematics & Statistics
thesis.degree.grantor: University of Calgary
thesis.degree.name: Doctor of Philosophy (PhD)
ucalgary.item.requestcopy: true
Files
Original bundle
Name: ucalgary_2022_adams_matthew.pdf
Size: 4.52 MB
Format: Adobe Portable Document Format
Description:
License bundle
Name: license.txt
Size: 2.62 KB
Format: Item-specific license agreed upon to submission
Description: