Semi-supervised Dynamic Counter Propagation Network

  • Yao Chen
  • Yuntao Qian
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4093)

Abstract

Semi-supervised classification exploits a large amount of unlabeled data to complement a small amount of labeled data when designing classifiers, and it offers good potential and performance when labeled data are difficult to obtain. This paper discusses semi-supervised classification based on the Counter-Propagation Network (CPN). CPN and its revised models have merits such as a simple structure, fast training, and high accuracy. In particular, its training scheme combines supervised and unsupervised learning, which makes it well suited to extension to the semi-supervised classification problem. Based on the characteristics of CPN, we propose a semi-supervised dynamic CPN and compare it with two other semi-supervised CPN models that use Expectation-Maximization and Co-Training/Self-Training techniques, respectively. The experimental results show the effectiveness of CPN-based semi-supervised classification methods.
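The hybrid training scheme the abstract refers to, where a competitive (Kohonen) layer is trained without labels on all available inputs while an outstar (Grossberg) layer is trained only on the labeled subset, can be sketched as below. This is not the authors' dynamic CPN; the class name, learning rates, hidden-layer size, and the self-training confidence threshold are illustrative assumptions, and the wrapper merely mirrors the Self-Training variant mentioned in the abstract.

```python
import numpy as np

class SimpleCPN:
    """Minimal counter-propagation network sketch: a Kohonen (competitive)
    layer followed by a Grossberg (outstar) layer. Illustrative only; this
    is not the dynamic CPN proposed in the paper."""

    def __init__(self, n_hidden, n_classes, lr_kohonen=0.1, lr_grossberg=0.1,
                 epochs=20, seed=0):
        self.n_hidden = n_hidden
        self.n_classes = n_classes
        self.lr_k = lr_kohonen
        self.lr_g = lr_grossberg
        self.epochs = epochs
        self.rng = np.random.default_rng(seed)

    def _winner(self, x):
        # Winner-take-all: the hidden unit whose weight vector is closest to x.
        return int(np.argmin(np.linalg.norm(self.W_k - x, axis=1)))

    def fit(self, X_labeled, y_labeled, X_unlabeled=None):
        n_features = X_labeled.shape[1]
        self.W_k = self.rng.normal(size=(self.n_hidden, n_features))
        self.W_g = np.zeros((self.n_hidden, self.n_classes))

        # Unsupervised stage: competitive learning on labeled + unlabeled inputs.
        X_all = X_labeled if X_unlabeled is None else np.vstack([X_labeled, X_unlabeled])
        for _ in range(self.epochs):
            for x in self.rng.permutation(X_all):
                j = self._winner(x)
                self.W_k[j] += self.lr_k * (x - self.W_k[j])

        # Supervised stage: outstar learning on the labeled data only.
        targets = np.eye(self.n_classes)[y_labeled]
        for _ in range(self.epochs):
            for x, t in zip(X_labeled, targets):
                j = self._winner(x)
                self.W_g[j] += self.lr_g * (t - self.W_g[j])
        return self

    def predict_proba(self, X):
        # Each sample inherits the outstar row of its winning hidden unit,
        # normalised into rough class scores.
        rows = np.array([self.W_g[self._winner(x)] for x in X])
        rows = np.clip(rows, 1e-9, None)
        return rows / rows.sum(axis=1, keepdims=True)

    def predict(self, X):
        return self.predict_proba(X).argmax(axis=1)


def self_training_cpn(X_l, y_l, X_u, n_classes, rounds=3, confidence=0.9):
    """Hypothetical self-training wrapper: repeatedly pseudo-label the most
    confident unlabeled samples and fold them into the labeled pool."""
    model = SimpleCPN(n_hidden=10, n_classes=n_classes).fit(X_l, y_l, X_u)
    for _ in range(rounds):
        if len(X_u) == 0:
            break
        proba = model.predict_proba(X_u)
        conf_mask = proba.max(axis=1) >= confidence
        if not conf_mask.any():
            break
        X_l = np.vstack([X_l, X_u[conf_mask]])
        y_l = np.concatenate([y_l, proba[conf_mask].argmax(axis=1)])
        X_u = X_u[~conf_mask]
        model = SimpleCPN(n_hidden=10, n_classes=n_classes).fit(X_l, y_l, X_u)
    return model
```

The split mirrors why CPN extends naturally to the semi-supervised setting: unlabeled samples can still shape the competitive layer's codebook vectors, while the mapping from hidden units to class labels is learned from, or bootstrapped by, the labeled pool.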

Keywords

Class label, labeled data, unlabeled data, handwritten digit, supervised algorithm



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yao Chen (1)
  • Yuntao Qian (1)

  1. College of Computer Science, Zhejiang University, Hangzhou, P.R. China
