Dimensionality reduction: Difference between revisions
Revision as of 00:53, 24 March 2020
'''Dimensionality reduction''' is one of the main applications of [[unsupervised learning]]. It can be understood as the process of reducing the number of random variables under consideration by obtaining a set of principal variables.<ref name="pythonistaplanet.com">{{cite web |title=Real World Applications of Unsupervised Learning |url=https://pythonistaplanet.com/applications-of-unsupervised-learning/ |website=pythonistaplanet.com |accessdate=23 March 2020}}</ref>
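The idea can be sketched in a few lines of NumPy. The synthetic data, the five observed variables and the choice of keeping two directions are illustrative assumptions for this sketch, not details taken from the cited source.

<syntaxhighlight lang="python">
# Toy sketch: project five correlated random variables onto the two
# "principal" directions found from the data itself (all values synthetic).
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))            # two hidden factors
mixing = rng.normal(size=(2, 5))              # spread over five observed variables
X = latent @ mixing + 0.05 * rng.normal(size=(200, 5))

Xc = X - X.mean(axis=0)                       # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                             # keep the two leading principal directions

print(X.shape, "->", Z.shape)                 # (200, 5) -> (200, 2)
print("variance retained:", (s[:2] ** 2).sum() / (s ** 2).sum())
</syntaxhighlight>

The two retained columns of <code>Z</code> play the role of the principal variables: linear combinations of the original variables that preserve most of the variance.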
== Categories ==
Dimensionality reduction can be divided into two subcategories (a short code sketch contrasting them follows the list below):<ref name="cognitive class">{{cite web |title=Machine Learning - Dimensionality Reduction - Feature Extraction & Selection |url=https://www.youtube.com/watch?v=AU_hBML2H1c |website=youtube.com |accessdate=24 March 2020}}</ref>
* Feature selection:
** Wrappers
** Filters
** Embedded
* Feature extraction:
** [[Principal component analysis]]
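As a rough illustration of the distinction, the sketch below selects three of the original columns with a simple filter and, separately, extracts three new components with principal component analysis. It assumes scikit-learn is available; the dataset, the scoring function and the value of <code>k</code> are arbitrary choices made for the example.

<syntaxhighlight lang="python">
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data chosen only for illustration.
X, y = make_classification(n_samples=300, n_features=10, n_informative=3, random_state=0)

# Feature selection: keep a subset of the original columns (a filter method here).
selector = SelectKBest(score_func=f_classif, k=3)
X_selected = selector.fit_transform(X, y)        # three of the original features

# Feature extraction: build new variables that mix information from all columns.
extractor = PCA(n_components=3)
X_extracted = extractor.fit_transform(X)         # three new principal components

print(X_selected.shape, X_extracted.shape)       # (300, 3) (300, 3)
</syntaxhighlight>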
== Algorithms ==
Some of the most common dimensionality reduction algorithms in machine learning are listed as follows (a brief code sketch applying them appears after the list):<ref name="pythonistaplanet.com"/>
* Principal Component Analysis
* [[Kernel principal component analysis]] (Kernel PCA)
* [[Locally-Linear Embedding]]
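A minimal sketch applying the three algorithms, assuming scikit-learn is available; the digits dataset and the parameter values (<code>n_components=2</code>, <code>n_neighbors=10</code>, the RBF kernel) are illustrative guesses rather than settings taken from the cited source.

<syntaxhighlight lang="python">
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA
from sklearn.manifold import LocallyLinearEmbedding

X, _ = load_digits(return_X_y=True)              # 1797 samples, 64 dimensions each

models = {
    "PCA": PCA(n_components=2),
    "Kernel PCA": KernelPCA(n_components=2, kernel="rbf"),
    "Locally-Linear Embedding": LocallyLinearEmbedding(n_components=2, n_neighbors=10),
}

# Fit each model and report how the dimensionality changes.
for name, model in models.items():
    Z = model.fit_transform(X)
    print(name, X.shape, "->", Z.shape)          # 64 dimensions reduced to 2
</syntaxhighlight>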