In: Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003, pp. 359–366
Transformations of Symbolic Data for Continuous Data Oriented Models
Conference paper
Abstract
Most Computational Intelligence models (e.g. neural networks or distance based methods) are designed to operate on continuous data and provide no tools to adapt their parameters to data described by symbolic values. Two new conversion methods which replace symbolic attributes with continuous ones are presented and compared to two commonly known methods. The advantages of the continuousification are illustrated with the results obtained with neural network, SVM and kNN systems on the converted data.
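The abstract does not spell out the two new conversion methods, so the sketch below illustrates only one standard continuousification that the paper's baseline methods resemble: replacing each symbolic value with its vector of class-conditional probabilities, the encoding underlying the Value Difference Metric. The function name vdm_continuousify and the toy data are hypothetical, not taken from the paper.

```python
import numpy as np

def vdm_continuousify(symbolic_col, labels):
    """Replace each symbolic value v with the vector of class-conditional
    probabilities P(class = c | value = v).  This VDM-style encoding is one
    common way to turn a symbolic attribute into continuous features; it is
    an illustrative sketch, not the method proposed in the paper."""
    classes = sorted(set(labels))
    mapping = {}
    for v in set(symbolic_col):
        with_v = [c for x, c in zip(symbolic_col, labels) if x == v]
        mapping[v] = np.array([with_v.count(c) / len(with_v) for c in classes])
    # Each row of the result is the continuous representation of one sample.
    return np.vstack([mapping[x] for x in symbolic_col]), mapping

# Toy usage: a 'color' attribute and binary class labels (hypothetical data).
colors = ["red", "blue", "red", "green", "blue", "red"]
labels = [0, 1, 0, 1, 1, 0]
encoded, value_map = vdm_continuousify(colors, labels)
print(value_map["red"])   # -> [1.0, 0.0]: 'red' always co-occurs with class 0
```

An encoding of this kind lets distance based methods such as kNN compare two symbolic values by how similarly they distribute over the classes, which is the property that makes converted data usable by continuous-data models.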
Copyright information
© Springer-Verlag Berlin Heidelberg 2003