Distributed Minimum Temperature Prediction Using Mixtures of Gaussian Processes
Abstract
Agricultural producers require minimum temperature predictions in order to assess the magnitude of potential frost events. Several regression models can be used for the estimation problem at a single location, but a common difficulty is the amount of data required for training, testing, and validation. Sensor networks can now be used to gather environmental data from multiple locations. To reduce the amount of data needed to model a single site, we combine information from the different sources and then assess the performance of the estimator on held-out test sites. We propose a mixture of Gaussian Processes (MGP) model for the distributed estimation problem, together with an efficient Hybrid Monte Carlo approach for estimating the model parameters.
Keywords
Sensor Node · Wireless Sensor Network · Root Mean Square · Gaussian Process · Support Vector Regression
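To make the mixture-of-Gaussian-Processes idea in the abstract concrete, below is a minimal, self-contained sketch in Python. It is not the paper's method: it uses scikit-learn's GaussianProcessRegressor for the experts and a simple KMeans gate on synthetic single-feature data, rather than the Hybrid Monte Carlo parameter estimation the paper proposes. All data, names, and the gating rule are illustrative assumptions.

```python
# Minimal mixture-of-GP-experts sketch (illustrative only; the paper's
# Hybrid Monte Carlo estimation of the MGP parameters is not reproduced here).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-site temperature readings:
# X = a single input feature, y = noisy minimum temperature.
X = rng.uniform(0.0, 10.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

K = 3  # number of GP experts
gate = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X)

# Fit one GP expert per cluster (hard assignments for training).
experts = []
for k in range(K):
    mask = gate.labels_ == k
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                  normalize_y=True)
    experts.append(gp.fit(X[mask], y[mask]))

def mgp_predict(X_new):
    """Gated mixture mean: blend expert predictions with soft gate weights."""
    d = gate.transform(X_new)                       # (n, K) distances to centres
    w = np.exp(-d)
    w /= w.sum(axis=1, keepdims=True)               # softmax-style responsibilities
    mu = np.column_stack([e.predict(X_new) for e in experts])
    return (w * mu).sum(axis=1)

print(mgp_predict(np.array([[2.5], [7.0]])))
```

Each expert models one region of the input space, and the soft gate blends the expert means at prediction time; that gated combination is the core of the mixture construction, while the training scheme here (hard clustering plus maximum-likelihood GP fits) is only a simple stand-in for the paper's Bayesian treatment.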