Using Advanced Regression Models for Determining Optimal Soil Heterogeneity Indicators

  • Georg Ruß
  • Rudolf Kruse
  • Martin Schneider
  • Peter Wagner
Conference paper
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)

Abstract

In present-day agriculture, GPS-based vehicles and sensor-aided fertilization generate large amounts of data. As the need for effective and sustainable farming becomes ever more apparent, these data have to be turned into information, which is clearly a data analysis task. In addition, novel soil sensors are available which may indicate a field’s heterogeneity; these sensors have to be evaluated and their potential usefulness assessed. Our approach consists of two stages, of which the first is presented in this article. The data attributes are comparable to those described in Ruß (2008). In the first stage, we build and evaluate models for the given data sets and compare results obtained with neural networks, regression trees and SVM regression; results for an MLP neural network have been published in Ruß et al. (2008). In a future second stage, we will use the model information to evaluate and classify new sensor data and assess their usefulness for the purpose of (yield) optimization.
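The abstract does not detail the evaluation itself; as a rough, non-authoritative sketch of the kind of comparison described (MLP neural network, regression tree and SVM regression assessed by cross-validated root mean square error), the following Python/scikit-learn fragment may serve as an illustration. The synthetic data, feature count and hyperparameters are placeholders and do not reflect the authors’ actual attributes or experimental protocol.

    # Hypothetical sketch: comparing MLP, regression-tree and SVM regression
    # models by cross-validated RMSE. Data and settings are placeholders.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import cross_val_score, KFold
    from sklearn.neural_network import MLPRegressor
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for per-site field attributes (e.g. previous yield,
    # fertilizer dressings, sensor readings) with current yield as target.
    X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)

    models = {
        "MLP": make_pipeline(StandardScaler(),
                             MLPRegressor(hidden_layer_sizes=(16, 16),
                                          max_iter=2000, random_state=0)),
        "RegressionTree": DecisionTreeRegressor(max_depth=6, random_state=0),
        "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    }

    cv = KFold(n_splits=10, shuffle=True, random_state=0)
    for name, model in models.items():
        # scikit-learn returns the negated RMSE; flip the sign for reporting.
        scores = -cross_val_score(model, X, y,
                                  scoring="neg_root_mean_squared_error", cv=cv)
        print(f"{name}: RMSE = {scores.mean():.2f} +/- {scores.std():.2f}")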

Keywords

Root Mean Square Error · Radial Basis Function · Regression Tree · Yield Prediction · Fertilization Strategy

References

  1. Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A training algorithm for optimal margin classifiers. In Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory (pp. 144–152). New York: ACM Press.
  2. Breiman, L., Friedman, J., Olshen, R., & Stone, C. (1984). Classification and regression trees. Monterey, CA: Wadsworth and Brooks.
  3. Collobert, R., Bengio, S., & Williamson, C. (2001). SVMTorch: Support vector machines for large-scale regression problems. Journal of Machine Learning Research, 1, 143–160.
  4. Drummond, S., Joshi, A., & Sudduth, K. A. (1998). Application of neural networks: Precision farming. In International Joint Conference on Neural Networks, IEEE World Congress on Computational Intelligence (Vol. 1, pp. 211–215).
  5. Gunn, S. R. (1998). Support vector machines for classification and regression. Technical report, School of Electronics and Computer Science, University of Southampton, Southampton, UK.
  6. Hagan, M. T. (1995). Neural network design (Electrical engineering). Thomson Learning.
  7. Haykin, S. (1998). Neural networks: A comprehensive foundation (2nd ed.). Englewood Cliffs, NJ: Prentice Hall.
  8. Hecht-Nielsen, R. (1990). Neurocomputing. Reading, MA: Addison-Wesley.
  9. Mejía-Guevara, I., & Kuri-Morales, A. (2007). Evolutionary feature and parameter selection in support vector regression. In Lecture Notes in Computer Science (Vol. 4827, pp. 399–408). Berlin, Heidelberg: Springer.
  10. Mitchell, T. M. (1997). Machine learning. New York: McGraw-Hill.
  11. Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1(1), 81–106.
  12. Quinlan, J. R. (1993). C4.5: Programs for machine learning (Morgan Kaufmann Series in Machine Learning). Los Altos, CA: Morgan Kaufmann.
  13. Ruß, G., Kruse, R., Schneider, M., & Wagner, P. (2008). Estimation of neural network parameters for wheat yield prediction. In M. Bramer (Ed.), Artificial Intelligence in Theory and Practice II, IFIP International Federation for Information Processing (Vol. 276, pp. 109–118). Berlin: Springer.
  14. Ruß, G., Kruse, R., Schneider, M., & Wagner, P. (2008). Optimizing wheat yield prediction using different topologies of neural networks. In J. L. Verdegay, M. Ojeda-Aciego, & L. Magdalena (Eds.), Proceedings of IPMU-08 (pp. 576–582). University of Málaga.
  15. Ruß, G., Kruse, R., Wagner, P., & Schneider, M. (2008). Data mining with neural networks for wheat yield prediction. In P. Perner (Ed.), Advances in Data Mining (Proc. ICDM 2008) (pp. 47–56). Berlin, Heidelberg: Springer.
  16. Schneider, M., & Wagner, P. (2006). Prerequisites for the adoption of new technologies – the example of precision agriculture. In Agricultural Engineering for a Better World. Düsseldorf: VDI Verlag GmbH.
  17. Serele, C. Z., Gwyn, Q. H. J., Boisvert, J. B., Pattey, E., Mclaughlin, N., & Daoust, G. (2000). Corn yield prediction with artificial neural network trained using airborne remote sensing and topographic data. In 2000 IEEE International Geoscience and Remote Sensing Symposium (Vol. 1, pp. 384–386).
  18. Smola, A. J., & Schölkopf, B. (1998). A tutorial on support vector regression. Technical report, Statistics and Computing.
  19. Stein, M. L. (1999). Interpolation of spatial data: Some theory for kriging (Springer Series in Statistics). Berlin: Springer.
  20. Weigert, G. (2006). Data Mining und Wissensentdeckung im Precision Farming – Entwicklung von ökonomisch optimierten Entscheidungsregeln zur kleinräumigen Stickstoff-Ausbringung. PhD thesis, TU München.

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Georg Ruß (1)
  • Rudolf Kruse
  • Martin Schneider
  • Peter Wagner

  1. Otto-von-Guericke-Universität Magdeburg, Magdeburg, Germany
