Dimensionality reduction
Dimensionality reduction is one of the main applications of unsupervised learning. It can be understood as the process of reducing the number of random variables under consideration by obtaining a set of principal variables.[1] High dimensionality carries many costs: redundant and irrelevant features that degrade the performance of some algorithms, difficulty in interpretation and visualization, and infeasible computation.[2]
Categories
Dimensionality reduction can be divided into two subcategories[3] (a minimal filter-method sketch follows the list below):
- Feature selection:
  - Wrappers
  - Filters
  - Embedded
- Feature extraction
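To make the distinction concrete, here is a minimal sketch of the filter approach to feature selection, scoring each feature by its variance; the data, threshold, and variable names are invented for illustration and are not taken from the cited sources.

```python
import numpy as np

# Hypothetical data: 100 samples, 5 features, one of them nearly constant.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 3] = 0.001 * rng.normal(size=100)  # low-variance, likely irrelevant feature

# Filter method: score each feature independently of any learning algorithm
# (here by its variance) and keep only features above a chosen threshold.
variances = X.var(axis=0)
keep = variances > 0.1          # threshold chosen arbitrarily for this example
X_selected = X[:, keep]

print("kept feature indices:", np.flatnonzero(keep))  # feature 3 is dropped
print("shape before/after:", X.shape, X_selected.shape)
```

By contrast, a wrapper method would score candidate feature subsets by training the target model on each of them, and an embedded method performs the selection as part of model training itself.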
Algorithms
Some of the most common dimensionality reduction algorithms in machine learning are the following[1] (a brief usage sketch follows the list below):
- Principal component analysis (PCA)
- Kernel principal component analysis (Kernel PCA)
- Locally-linear embedding (LLE)
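As a rough illustration of how these algorithms are typically applied, here is a minimal sketch using scikit-learn; the synthetic data, component counts, kernel choice, and neighbour count are assumptions made for the example rather than settings recommended by the cited sources.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical high-dimensional data: 200 samples with 50 features.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 50))

# Principal component analysis: linear projection onto the directions
# of maximum variance.
X_pca = PCA(n_components=2).fit_transform(X)

# Kernel PCA: PCA in a feature space induced by a kernel (here RBF),
# which can capture non-linear structure.
X_kpca = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)

# Locally-linear embedding: preserves local neighbourhood relations.
X_lle = LocallyLinearEmbedding(n_components=2, n_neighbors=10).fit_transform(X)

print(X_pca.shape, X_kpca.shape, X_lle.shape)  # each is (200, 2)
```

Each estimator maps the 50-dimensional input down to 2 dimensions; PCA does so with a linear projection, while Kernel PCA and LLE can also follow non-linear structure in the data.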