Abstract
Algorithms based on Nested Generalized Exemplar (NGE) theory (Salzberg, 1991) classify new data points by computing their distance to the nearest “generalized exemplar” (i.e., either a point or an axis-parallel rectangle). They combine the distance-based character of nearest neighbor (NN) classifiers with the axis-parallel rectangle representation employed in many rule-learning systems. An implementation of NGE was compared to the k-nearest neighbor (kNN) algorithm in 11 domains and found to be significantly inferior to kNN in 9 of them. Several modifications of NGE were studied to understand the cause of its poor performance. These show that its performance can be substantially improved by preventing NGE from creating overlapping rectangles, while still allowing complete nesting of rectangles. Performance can be further improved by modifying the distance metric to allow weights on each of the features (Salzberg, 1991). Best results were obtained in this study when the weights were computed using mutual information between the features and the output class. The best version of NGE developed is a batch algorithm (BNGE FWMI) that has no user-tunable parameters. BNGE FWMI's performance is comparable to the first-nearest neighbor algorithm (also incorporating feature weights). However, the k-nearest neighbor algorithm is still significantly superior to BNGE FWMI in 7 of the 11 domains, and inferior to it in only 2. We conclude that, even with our improvements, the NGE approach is very sensitive to the shape of the decision boundaries in classification problems. In domains where the decision boundaries are axis-parallel, the NGE approach can produce excellent generalization with interpretable hypotheses. In all domains tested, NGE algorithms require much less memory to store generalized exemplars than is required by NN algorithms.
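The core operation described above, measuring the distance from a query point to an axis-parallel rectangle, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name is hypothetical, and the weighted-Euclidean combination of per-feature gaps is one common way to realize such a metric (the per-feature distance is zero when the query value falls inside the rectangle's interval, and the feature weights stand in for quantities such as the mutual-information weights discussed in the abstract).

```python
import math

def rect_distance(x, lower, upper, weights=None):
    """Weighted Euclidean distance from point x to the axis-parallel
    hyperrectangle [lower, upper]; returns 0.0 if x lies inside it.
    A point exemplar is the degenerate case lower == upper."""
    if weights is None:
        weights = [1.0] * len(x)
    total = 0.0
    for xf, lo, hi, w in zip(x, lower, upper, weights):
        # Per-feature gap: 0 inside the interval, otherwise the
        # distance to the nearest face of the rectangle.
        gap = max(lo - xf, 0.0, xf - hi)
        total += (w * gap) ** 2
    return math.sqrt(total)
```

A classifier in this style would compute `rect_distance` to every stored generalized exemplar and predict the class of the nearest one.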
References
Aha, D.W. (1990). A Study of Instance-Based Algorithms for Supervised Learning Tasks. Technical Report, University of California, Irvine.
Bakiri, G. (1991). Converting English Text to Speech: A Machine Learning Approach. Ph.D. Thesis, Oregon State University, Corvallis, Oregon.
Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J.H., & Rosen, D.B. (1992). Fuzzy ARTMAP: A Neural Network Architecture for Incremental Supervised Learning of Analog Multidimensional Maps. IEEE Transactions on Neural Networks, 3, 698–713.
Dasarathy, B.V. (1991). Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press.
Detrano, R., Janosi, A., Steinbrunn, W., Pfisterer, M., Schmid, K., Sandhu, S., Guppy, K., Lee, S., & Froelicher, V. (1989). International application of a new probability algorithm for the diagnosis of coronary artery disease. American Journal of Cardiology, 64, 304–310.
Duda, R.O., & Hart, P.E. (1973). Pattern Classification and Scene Analysis. New York: Wiley.
Holte, R.C. (1993). Very Simple Classification Rules Perform Well on Most Commonly Used Datasets. Machine Learning, 11, 63–90.
Murphy, P.M., & Aha, D.W. (1994). UCI Repository of Machine Learning Databases [Machine-Readable Data Repository]. Technical Report, University of California, Irvine.
Quinlan, J.R. (1992). C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann Publishers, Inc.
Salzberg, S. (1991). A Nearest Hyperrectangle Learning Method. Machine Learning, 6, 277–309.
Simpson, P.K. (1992). Fuzzy Min-Max Neural Networks: 1. Classification. IEEE Transactions on Neural Networks, 3, 776–786.
Weiss, S.M., & Kulikowski, C.A. (1991). Computer Systems That Learn. San Mateo, CA: Morgan Kaufmann Publishers, Inc.
Wettschereck, D. (1994). A Hybrid Nearest-Neighbor and Nearest-Hyperrectangle Algorithm. Proceedings of the 7th European Conference on Machine Learning. In press.
Cite this article
Wettschereck, D., Dietterich, T.G. An experimental comparison of the nearest-neighbor and nearest-hyperrectangle algorithms. Mach Learn 19, 5–27 (1995). https://doi.org/10.1007/BF00994658