- Notations, 13
- Acronyms, 15
- High-Dimensional Data, 16
  - Practical motivations, 16
    - Fields of application, 17
    - The goals to be reached, 18
  - Theoretical motivations, 18
    - How can we visualize high-dimensional spaces?, 19
    - Curse of dimensionality and empty space phenomenon, 21
  - Some directions to be explored, 24
    - Relevance of the variables, 25
    - Dependencies between the variables, 25
  - About topology, spaces, and manifolds, 26
  - Two benchmark manifolds, 29
  - Overview of the next chapters, 31
- Characteristics of an Analysis Method, 32
  - Purpose, 32
  - Expected functionalities, 33
    - Estimation of the number of latent variables, 33
    - Embedding for dimensionality reduction, 34
    - Embedding for latent variable separation, 35
  - Internal characteristics, 37
    - Underlying model, 37
    - Algorithm, 38
    - Criterion, 38
  - Example: Principal component analysis, 39
    - Data model of PCA, 39
    - Criteria leading to PCA, 41
    - Functionalities of PCA, 44
    - Algorithms, 46
    - Examples and limitations of PCA, 48
  - Toward a categorization of DR methods, 52
    - Hard vs. soft dimensionality reduction, 53
    - Traditional vs. generative model, 54
    - Linear vs. nonlinear model