Multi-view Restricted Boltzmann Machines with Posterior Consistency
Restricted Boltzmann machines (RBMs) have proven to be powerful tools in many specific applications, such as representational learning and document modelling. However, extensions of RBMs are rarely used in the field of multi-view learning. In this paper, we present a new multi-view RBM model, named the RBM with posterior consistency, for multi-view classification. The RBM with posterior consistency computes multiple representations by regularizing the marginal likelihood function with a consistency constraint among the representations from different views. Compared with existing multi-view classification methods, such as the multi-view Gaussian process with posterior consistency (MvGP) and consensus and complementarity based maximum entropy discrimination (MED-2C), the RBM with posterior consistency achieves satisfactory results on two-class and multi-class classification datasets.
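At a high level, the objective described above can be sketched as follows for two views; the trade-off weight \lambda and the squared-norm form of the consistency penalty are illustrative assumptions, not details taken from the abstract:

\max_{\theta_1,\theta_2}\;
\log p\big(\mathbf{v}^{(1)};\theta_1\big)
+ \log p\big(\mathbf{v}^{(2)};\theta_2\big)
- \lambda \,\Big\| \mathbb{E}\big[\mathbf{h}^{(1)} \mid \mathbf{v}^{(1)}\big]
- \mathbb{E}\big[\mathbf{h}^{(2)} \mid \mathbf{v}^{(2)}\big] \Big\|^2

Here \mathbf{v}^{(k)} and \mathbf{h}^{(k)} denote the visible and hidden units of the RBM for view k, so the first two terms are the per-view marginal likelihoods and the last term penalizes disagreement between the posterior hidden representations of the two views.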
Keywords: Restricted Boltzmann machines · Representational learning · Multi-view learning
This work is supported by the National Natural Science Foundation of China under Grant no. 61672522 and no. 61379101.