Abstract
Classification assigns objects to classes on the basis of measured features, which together form a feature vector in feature space. Selecting the most informative features, and/or combining features, is important for successful classification. Typically a sample of objects (the training set) is used to train the classifier, which is then applied to other objects (the test set). Supervised learning uses a labeled training set, in which the class of each object is known, and is an inductive reasoning process. There are a variety of approaches to classification; statistical approaches, characterized by an underlying probability model, are particularly important. We consider a number of robust features based on shape, size, and topology, and use them in examples to classify various objects.
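As a concrete illustration of this workflow, the minimal sketch below trains a statistical classifier (a Gaussian naive Bayes model, one example of a classifier with an underlying probability model) on a labeled training set and evaluates it on a held-out test set. The feature vectors here are invented for illustration (hypothetical area/perimeter measurements for two object classes); they are not data from the chapter.

```python
# A minimal sketch of supervised classification, assuming hypothetical
# size/shape features. GaussianNB fits class-conditional Gaussians, i.e.
# a classifier with an underlying probability model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Hypothetical feature vectors: [area, perimeter] for two object classes.
class0 = rng.normal(loc=[4.0, 8.0], scale=0.5, size=(50, 2))
class1 = rng.normal(loc=[6.0, 11.0], scale=0.5, size=(50, 2))
X = np.vstack([class0, class1])
y = np.array([0] * 50 + [1] * 50)  # known labels make this supervised learning

# Split the labeled sample into a training set and a held-out test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = GaussianNB().fit(X_train, y_train)  # induce the model from labeled data
print("test accuracy:", clf.score(X_test, y_test))  # apply it to unseen objects
```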
Copyright information
© 2013 Springer Science+Business Media New York
About this chapter
Cite this chapter
Dougherty, G. (2013). Classification. In: Pattern Recognition and Classification. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-5323-9_2
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-5322-2
Online ISBN: 978-1-4614-5323-9