Abstract
Linear classifier techniques identify the hyperplane that separates the classes. In the Nearest Mean technique, the test vector is assigned to class k if it is nearest to the centroid of the k-th class. In the Nearest Neighbour technique, the r training vectors nearest to the test vector are considered; the test vector is assigned to the class (say the k-th class) that contributes the maximum number of these r vectors. The Support Vector Machine (SVM) identifies the equation of the hyperplane that partitions two classes. This is achieved using vectors mapped to a higher-dimensional space, without performing the mapping explicitly, via the "kernel trick." With modifications to the objective function and the constraints, the SVM is also used for regression. In the Relevance Vector Machine (RVM), the coefficients describing the equation of the hyperplane are modeled as Gaussian distributed with a non-identical diagonal covariance matrix and are estimated accordingly; this yields sparse coefficients. The RVM is also used for classification.
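The Nearest Mean and Nearest Neighbour rules described above can be sketched as follows. This is an illustrative Python sketch, not code from the chapter (the book itself uses Matlab); the function names, the toy two-class data, and the choice r = 3 are assumptions made here for demonstration.

```python
import numpy as np

# Hypothetical two-class training data in 2-D (not from the chapter):
# class 0 clusters near (0, 0); class 1 clusters near (4, 4).
train = np.array([[0.0, 0.2], [0.3, -0.1], [-0.2, 0.1],
                  [4.1, 3.9], [3.8, 4.2], [4.0, 4.1]])
labels = np.array([0, 0, 0, 1, 1, 1])

def nearest_mean(x, train, labels):
    """Assign x to the class whose centroid is closest (Nearest Mean)."""
    classes = np.unique(labels)
    centroids = np.array([train[labels == c].mean(axis=0) for c in classes])
    return classes[np.argmin(np.linalg.norm(centroids - x, axis=1))]

def nearest_neighbour(x, train, labels, r=3):
    """Assign x to the majority class among its r nearest training vectors."""
    idx = np.argsort(np.linalg.norm(train - x, axis=1))[:r]
    vals, counts = np.unique(labels[idx], return_counts=True)
    return vals[np.argmax(counts)]

x = np.array([3.5, 3.7])
print(nearest_mean(x, train, labels))       # nearest centroid belongs to class 1
print(nearest_neighbour(x, train, labels))  # majority of the 3 neighbours is class 1
```

Both rules reduce classification to distance comparisons in the feature space; the SVM and RVM techniques mentioned in the abstract instead fit an explicit hyperplane.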
© 2020 Springer Nature Switzerland AG
Cite this chapter
Gopi, E.S. (2020). Linear Classifier Techniques. In: Pattern Recognition and Computational Intelligence Techniques Using Matlab. Transactions on Computational Science and Computational Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-22273-4_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22272-7
Online ISBN: 978-3-030-22273-4
eBook Packages: Engineering (R0)