Text Categorization Based on Artificial Neural Networks

  • Cheng Hua Li
  • Soon Choel Park
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4234)

Abstract

This paper describes two kinds of neural networks for text categorization, multi-output perceptron learning (MOPL) and the back-propagation neural network (BPNN), and then proposes a novel algorithm based on an improved back-propagation neural network. The proposed algorithm overcomes shortcomings of the traditional back-propagation network, such as slow training and a tendency to become trapped in local minima. We compared the training time and performance of the three methods on the standard Reuters-21578 collection. The results show that the proposed algorithm achieves high categorization effectiveness as measured by precision, recall and F-measure.
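The abstract refers to back-propagation training over term-weight vectors and to evaluation by precision, recall and F-measure. The sketch below is a minimal, hypothetical illustration rather than the authors' algorithm: a one-hidden-layer back-propagation classifier in which a momentum term stands in for the kind of convergence improvement the paper targets, followed by the standard precision/recall/F1 computation. All names, dimensions and parameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): one-hidden-layer
# back-propagation text classifier plus precision/recall/F1.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyBPNN:
    def __init__(self, n_in, n_hidden, n_out, lr=0.5, momentum=0.9):
        # momentum is an assumed, commonly used remedy for slow training
        # and local minima; the paper's actual improvement may differ.
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
        self.lr, self.momentum = lr, momentum
        self.vW1 = np.zeros_like(self.W1)
        self.vW2 = np.zeros_like(self.W2)

    def forward(self, x):
        h = sigmoid(x @ self.W1)          # hidden activations
        return h, sigmoid(h @ self.W2)    # category scores

    def train_step(self, x, t):
        h, y = self.forward(x)
        d_out = (y - t) * y * (1 - y)               # output delta (squared error)
        d_hid = (d_out @ self.W2.T) * h * (1 - h)   # back-propagated hidden delta
        self.vW2 = self.momentum * self.vW2 - self.lr * np.outer(h, d_out)
        self.vW1 = self.momentum * self.vW1 - self.lr * np.outer(x, d_hid)
        self.W2 += self.vW2
        self.W1 += self.vW1

def f_measure(y_true, y_pred):
    # Micro-averaged precision, recall and F1 over all category assignments.
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy usage: four "documents" as term-weight vectors, two categories.
X = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 0, 0], [0, 0, 1, 1]], float)
T = np.array([[1, 0], [0, 1], [1, 0], [0, 1]], float)
net = TinyBPNN(4, 3, 2)
for _ in range(2000):
    for x, t in zip(X, T):
        net.train_step(x, t)
pred = (net.forward(X)[1] > 0.5).astype(int)
print(f_measure(T.astype(int), pred))
```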

Keywords

Neural Network · Learning Phase · Text Categorization · Term Weight · Training Document

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Cheng Hua Li (1)
  • Soon Choel Park (1)

  1. Division of Electronics and Information Engineering, Chonbuk National University, Jeonju, Korea
