Abstract
One of the main factors that affects the performance of MLP neural networks trained using the backpropagation algorithm in mineral-potential mapping is the paucity of deposit training patterns relative to barren ones. To overcome this problem, random noise is added to the original training patterns to create additional synthetic deposit training data. Experiments on the effect of the number of deposits available for training in the Kalgoorlie Terrane orogenic gold province show that both the classification performance of a trained network and the quality of the resultant prospectivity map increase significantly with increased numbers of deposit patterns. Experiments are conducted to determine the optimum amount of noise using both uniformly and normally distributed random noise. Through the addition of noise to the original deposit training data, the number of deposit training patterns is increased from approximately 50 to 1000. The percentage of correct classifications improves significantly for the independent test set as a whole, as well as for the deposit patterns within it. For example, using ±40% uniform random noise, test-set classification performance increases from 67.9% to 72.8% overall, and from 68.0% to 77.1% for deposit patterns. Indices of the quality of the resultant prospectivity map (i.e., D/A and D × (D/A), where D is the percentage of deposits and A is the percentage of the total area in the highest prospectivity map-class, and the area under an ROC curve) also increase from 8.2, 105, and 0.79 to 17.9, 226, and 0.87, respectively. Increasing the size of the training-stop data set results in a further improvement, to 73.5% (test-set overall), 77.4% (test-set deposit patterns), 14.7 (D/A), 296 (D × (D/A)), and 0.87 (area under the ROC curve).
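The augmentation idea summarized above, i.e. replicating the scarce deposit-class patterns with random perturbations until the class is large enough to train on, can be illustrated with a short sketch. This is an assumed implementation, not the authors' code: the function name, the resampling-with-replacement step, and the multiplicative form of the ±40% perturbation are illustrative choices.

```python
import numpy as np

def augment_with_noise(patterns, target_count, noise_frac=0.40,
                       mode="uniform", rng=None):
    """Grow a minority-class training set by adding random noise to the
    original patterns (a sketch of the data-augmentation idea; the exact
    scheme used in the paper may differ).

    patterns     -- (n, d) array of original deposit feature vectors
    target_count -- desired total number of deposit patterns (e.g. 1000)
    noise_frac   -- +/- fraction for uniform noise, or sigma for normal noise
    """
    rng = np.random.default_rng(rng)
    n, _ = patterns.shape
    # Draw originals with replacement until the target count is reached.
    idx = rng.integers(0, n, size=target_count - n)
    base = patterns[idx]
    if mode == "uniform":
        # Perturb each feature by up to +/- noise_frac of its value.
        noise = rng.uniform(-noise_frac, noise_frac, size=base.shape)
    else:
        # Normally distributed perturbation with std = noise_frac.
        noise = rng.normal(0.0, noise_frac, size=base.shape)
    synthetic = base * (1.0 + noise)
    return np.vstack([patterns, synthetic])

# Example: expand ~50 deposit patterns to 1000, as in the experiments above.
deposits = np.random.default_rng(1).random((50, 8))
augmented = augment_with_noise(deposits, 1000)
```

The originals are kept unchanged and only the synthetic copies are perturbed, so the augmented set still contains every real deposit pattern.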
Brown, W.M., Gedeon, T.D. & Groves, D.I. Use of Noise to Augment Training Data: A Neural Network Method of Mineral–Potential Mapping in Regions of Limited Known Deposit Examples. Natural Resources Research 12, 141–152 (2003). https://doi.org/10.1023/A:1024218913435