
Joint IAPR International Workshops on Statistical Techniques in Pattern Recognition (SPR) and Structural and Syntactic Pattern Recognition (SSPR)

SSPR/SPR 2012: Structural, Syntactic, and Statistical Pattern Recognition, pp 60–69


Learning Sparse Kernel Classifiers in the Primal

Zhouyu Fu, Guojun Lu, Kai-Ming Ting & Dengsheng Zhang

Conference paper

Part of the Lecture Notes in Computer Science book series (LNIP, volume 7626)

Abstract

The growing number of classification applications involving large data sets demands classifiers that are efficient not only in training but also in prediction. In this paper, we address the problem of learning kernel classifiers with reduced complexity and improved prediction efficiency compared to those trained by standard methods. We formulate a single optimisation problem for classifier learning that jointly optimises both the classifier weights and the eXpansion Vectors (XVs) defining the classification function. Unlike the existing approach of Wu et al. [1], which performs the optimisation in the dual formulation, our approach solves the primal problem directly. The primal problem is much more efficient to solve, as each iteration reduces to training a linear classifier, which scales linearly with the size of the data set and the number of expansions. This makes the primal approach highly desirable for large-scale applications, where the dual approach is inadequate and prohibitively slow because each of its iterations requires solving a cubic-time kernel SVM. Experimental results demonstrate the efficiency and effectiveness of the proposed primal approach for learning sparse kernel classifiers, which clearly outperform the alternatives.
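The alternating scheme the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the RBF kernel, the squared-loss surrogate, the deterministic XV initialisation, the gradient step on the XVs, and the names `fit_sparse_kernel`/`predict` are all assumptions made for the sketch. The key structural point it mirrors is that, with the XVs held fixed, the weight update is a cheap linear (here, ridge-regularised least-squares) problem over the kernel features, while the XVs themselves are refined by gradient descent in the primal.

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    # RBF kernel matrix between data X (n, d) and expansion vectors Z (m, d)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_sparse_kernel(X, y, m=5, iters=50, lr=0.1, lam=1e-2, gamma=1.0):
    # Initialise the m expansion vectors with evenly spaced training points
    # (a simple deterministic choice; any seeding heuristic could be used).
    Z = X[:: max(1, len(X) // m)][:m].copy()
    for _ in range(iters):
        K = rbf(X, Z, gamma)                      # (n, m) explicit feature map
        # Linear step: ridge-regularised least squares over the current features,
        # i.e. training a linear classifier -- linear in n and m.
        beta = np.linalg.solve(K.T @ K + lam * np.eye(m), K.T @ y)
        # Primal step: gradient descent on the XVs under a squared-loss surrogate.
        r = K @ beta - y                          # residuals, shape (n,)
        for j in range(m):
            # d k(x_i, z_j) / d z_j = 2 * gamma * (x_i - z_j) * k(x_i, z_j)
            g = 2.0 * gamma * (X - Z[j]) * K[:, j:j + 1]
            Z[j] -= lr * ((r * beta[j]) @ g) / len(X)
    return beta, Z

def predict(X, beta, Z, gamma=1.0):
    # Classify by the sign of the sparse kernel expansion
    return np.sign(rbf(X, Z, gamma) @ beta)
```

Because the number of XVs `m` is fixed and small, prediction costs only `m` kernel evaluations per test point, which is the source of the efficiency gain over a dense support-vector expansion.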


References

  1. Wu, M., Schölkopf, B., Bakir, G.: A direct method for building sparse kernel learning algorithms. Journal of Machine Learning Research 7, 603–624 (2006)

  2. Platt, J.: Fast training of support vector machines using sequential minimal optimization. In: Advances in Kernel Methods – Support Vector Learning (1998)

  3. Lee, Y.J., Mangasarian, O.L.: RSVM: Reduced support vector machines. In: SIAM Data Mining Conf. (2001)

  4. Schölkopf, B., Smola, A.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press (2002)

  5. Keerthi, S., Chapelle, O., DeCoste, D.: Building support vector machines with reduced classifier complexity. Journal of Machine Learning Research 7, 1493–1515 (2006)

  6. Bonnans, J.F., Shapiro, A.: Optimization problems with perturbations: A guided tour. SIAM Review 40(2), 202–227 (1998)


Author information

Authors and Affiliations

  1. School of Computing, University of Western Sydney, Penrith, NSW, 2750, Australia

    Zhouyu Fu

  2. Gippsland School of IT, Monash University, Churchill, VIC, 3842, Australia

    Guojun Lu, Kai-Ming Ting & Dengsheng Zhang


Editor information

Editors and Affiliations

  1. Department of Computer Science, University of Auckland, Private Bag 92019, 1142, Auckland, New Zealand

    Georgy Gimel’farb

  2. Department of Computer Science, University of York, Deramore Lane, YO10 5GH, York, UK

    Edwin Hancock

  3. Institute of Media and Information Technology, Chiba University, Yayoi-cho 1-33, 263-8522, Inage-ku, Chiba, Japan

    Atsushi Imiya

  4. Technische Universität/Fraunhofer IGD, Fraunhoferstraße 5, 64283, Darmstadt, Germany

    Arjan Kuijper

  5. Graduate School of Information Science and Technology, Hokkaido University, 060-0814, Sapporo, Japan

    Mineichi Kudo

  6. Graduate School of Engineering, Tohoku University, 6-6-05 Aoba, Aramaki, Aoba-ku, 980-8579, Sendai, Miyagi, Japan

    Shinichiro Omachi

  7. Centre for Vision, Speech and Signal Processing, University of Surrey, GU2 7XH, Guildford, Surrey, UK

    Terry Windeatt

  8. C&C Innovation Research Laboratories, NEC Corporation, 8916-47 Takayama-cho, Ikoma-Shi, Nara, Japan

    Keiji Yamada


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Fu, Z., Lu, G., Ting, K.M., Zhang, D. (2012). Learning Sparse Kernel Classifiers in the Primal. In: Gimel’farb, G., et al. Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2012. Lecture Notes in Computer Science, vol 7626. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34166-3_7

  • DOI: https://doi.org/10.1007/978-3-642-34166-3_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-34165-6

  • Online ISBN: 978-3-642-34166-3

  • eBook Packages: Computer Science; Computer Science (R0)


Published in cooperation with The International Association for Pattern Recognition, http://www.iapr.org/
