Sparse Gaussian Processes Using Backward Elimination
Gaussian Processes (GPs) achieve state-of-the-art performance in regression. However, GPs require all basis functions for prediction, so their test-time speed is slower than that of other learning algorithms such as support vector machines (SVMs), the relevance vector machine (RVM), and adaptive sparseness (AS). To overcome this limitation, we present a backward elimination algorithm, called GPs-BE, that recursively removes basis functions from the GP until a stopping criterion is satisfied. By integrating rank-1 updates, GPs-BE can be implemented at reasonable cost. Extensive empirical comparisons confirm the feasibility and validity of the proposed algorithm.
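To make the idea concrete, here is a minimal sketch of backward elimination for a sparse GP regressor. The abstract does not specify the selection criterion or the rank-1 update formulas, so this sketch uses an assumed naive criterion: greedily drop the basis function whose removal least increases training error, refitting from scratch each time (the paper's rank-1 downdates would make each trial removal much cheaper). All function names and parameters here are illustrative, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Radial basis function kernel between row-vector sets A (m,d) and B (n,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gp_fit(X, y, noise=1e-2, gamma=1.0):
    # Standard GP regression weights: alpha = (K + noise*I)^{-1} y.
    K = rbf_kernel(X, X, gamma) + noise * np.eye(len(X))
    return np.linalg.solve(K, y)

def gp_predict(X_basis, alpha, X_test, gamma=1.0):
    # Prediction is a kernel expansion over the retained basis points.
    return rbf_kernel(X_test, X_basis, gamma) @ alpha

def backward_eliminate(X, y, n_keep, noise=1e-2, gamma=1.0):
    # Start with every training point as a basis function; greedily remove
    # the one whose deletion least increases mean squared training error.
    # Note: this naive version refits the full model per candidate removal;
    # rank-1 updates (as in GPs-BE) avoid that cost.
    idx = list(range(len(X)))
    while len(idx) > n_keep:
        best_err, best_i = None, None
        for i in range(len(idx)):
            trial = idx[:i] + idx[i + 1:]
            a = gp_fit(X[trial], y[trial], noise, gamma)
            err = np.mean((gp_predict(X[trial], a, X, gamma) - y) ** 2)
            if best_err is None or err < best_err:
                best_err, best_i = err, i
        del idx[best_i]
    return idx
```

A stopping criterion based on a target basis count is used here for simplicity; the paper's criterion could equally be an error-increase threshold.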
Keywords: Support Vector Machine · Basis Function · Gaussian Process · Forward Selection · Radial Basis Function Network