Schulz, Alexander: Discriminative dimensionality reduction: variations, applications, interpretations. 2017
Contents
- Introduction
- Motivation
- Scientific contributions and structural overview
- Publications in the context of this thesis
- Discriminative dimensionality reduction
- Motivation
- Kernel t-SNE
- t-distributed stochastic neighbor embedding (t-SNE)
- Assessing the quality of dimensionality reduction mappings
- Parametric extension of dimensionality reduction
- Illustration
- Definition of the Fisher metric
- Metrics
- Fisher metric as a special case of the Riemannian metric
- Approximation of the shortest paths
- Example
- Discriminative dimensionality reduction for classification tasks
- Discriminative dimensionality reduction in kernel space
- Discriminative dimensionality reduction for regression tasks
- Gaussian Processes for regression
- Estimating the Fisher matrix based on a Gaussian Process
- Justification for discriminative DR
- Experiments
- Conclusion
- Discussion
- Visualization of functions in high-dimensional spaces
- Motivation
- Dimensionality reduction techniques
- Inverse dimensionality reduction
- General framework
- Experiments with classification functions
- Experiments with regression functions
- Discussion
- Interpretation of data mappings
- Motivation
- Estimating interpretable components for nonlinear DR
- Neighborhood Retrieval Optimizer
- Feature selection for DR
- Relevance learning for DR
- Metric learning for DR
- Experiments
- Valid interpretation of feature relevance for linear data mappings
- Definition and measure of feature relevance
- Linear bounds
- Metric learning as linear data transformation
- Experiments for linear regression
- Experiments for metric learning
- Discussion
- Dimensionality reduction for transfer learning
- Conclusion
- Mathematical derivations
- The Fisher information matrix for a discrete auxiliary variable
- The Fisher information matrix for a continuous auxiliary variable
- Publications in the context of this thesis
- Bibliography
