Abstract
Scope classification is a new instance-based learning (IBL) technique with a rule-based characterisation. In the scope approach, the classification of an object o is based on the examples that are closer to o than every example labelled with another class. In contrast to standard distance-based IBL classifiers, scope classification relies on partial preorderings ≤o between examples, indexed by objects. Interestingly, the notion of closeness to o used here characterises exactly the classes predicted by all the rules that cover o and are relevant and consistent with the training set. Since these rules never have to be generated explicitly, the scope approach applies to classification problems where the number of rules prevents them from being computed exhaustively.
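The decision rule described above can be illustrated with a minimal sketch. Note the simplifying assumption: the paper works with partial preorderings ≤o rather than a numeric metric, so the `dist` function, the `scope_classify` name, and the majority-vote tie-breaking below are illustrative choices, not the authors' exact formulation.

```python
def scope_classify(o, examples, dist):
    """Classify object o from a list of (features, label) examples.

    Sketch of the scope idea: an example supports its class only if it
    is strictly closer to o than every example bearing a different
    label; o is assigned the class with the most supporting examples.
    A numeric distance stands in for the paper's partial preorderings.
    """
    votes = {}
    for x, c in examples:
        # x lies in the "scope" of o for class c if no counterexample
        # (an example of another class) is at least as close to o
        if all(dist(o, x) < dist(o, y) for y, c2 in examples if c2 != c):
            votes[c] = votes.get(c, 0) + 1
    # Predict the best-supported class; None if no example qualifies
    return max(votes, key=votes.get) if votes else None
```

For instance, with one-dimensional examples `[((0.0,), 'a'), ((1.0,), 'a'), ((10.0,), 'b')]` and absolute difference as the distance, an object near 0 is classified 'a': both 'a' examples are strictly closer to it than the lone 'b' example, so both fall in its scope.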
Currently at Department of Computer Science, University of Bristol, Merchant Venturers Building, Woodland Road, Bristol BS8 1UB, United Kingdom
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Lachiche, N., Marquis, P. (1998). Scope classification: An instance-based learning algorithm with a rule-based characterisation. In: Nédellec, C., Rouveirol, C. (eds) Machine Learning: ECML-98. ECML 1998. Lecture Notes in Computer Science, vol 1398. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0026697
Print ISBN: 978-3-540-64417-0
Online ISBN: 978-3-540-69781-7