Dimensionality Reduction Based on ICA for Regression Problems
When handling data in tasks such as supervised learning, we often extract new features from the original features in order to reduce the dimensionality of the feature space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be applied to extract features for regression problems. The advantage is that general ICA algorithms become available for feature extraction in regression by maximizing the joint mutual information between the target variable and the new features. Using the new features, we can greatly reduce the dimensionality of the feature space without degrading regression performance.
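As a rough illustration of the idea described above, the following sketch uses FastICA (a standard ICA algorithm) to extract components from the input features, ranks them by an estimated mutual information with the target, and fits a regression model in the reduced space. This is a minimal approximation assuming scikit-learn's `FastICA` and `mutual_info_regression`; the paper's actual algorithm and objective may differ.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic setting: non-Gaussian independent sources, of which
# only two actually drive the target variable.
n, d = 1000, 8
S_true = rng.uniform(-1, 1, size=(n, d))
y = 2.0 * S_true[:, 0] + S_true[:, 1] + rng.normal(scale=0.05, size=n)

# The observed features are an unknown linear mixture of the sources.
A = rng.normal(size=(d, d))
X = S_true @ A.T

# Step 1: unmix the observed features with a standard ICA algorithm.
ica = FastICA(n_components=d, random_state=0)
S_est = ica.fit_transform(X)

# Step 2: rank components by estimated mutual information with the
# target and keep only the top k, discarding the rest of the space.
mi = mutual_info_regression(S_est, y, random_state=0)
k = 2
top = np.argsort(mi)[::-1][:k]
S_reduced = S_est[:, top]

# Step 3: fit the regression model in the reduced feature space.
model = LinearRegression().fit(S_reduced, y)
print(f"R^2 with {k} of {d} features: {model.score(S_reduced, y):.3f}")
```

Under these assumptions, the two retained components should recover the informative sources, so most of the regression accuracy survives a 4x reduction in dimensionality.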
Keywords: Feature Extraction, Mutual Information, Independent Component Analysis, Learning Rule, Regression Problem