TTLSC – Transductive Total Least Square Model for Classification and Its Application in Medicine

  • Qun Song
  • Tian Min Ma
  • Nikola Kasabov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4093)

Abstract

This paper introduces a novel classification method, the transductive total least square classification method (TTLSC). While inductive approaches are concerned with developing a model that approximates the data in the whole problem space (induction) and then using this model to calculate the output value(s) for a new input vector (deduction), transductive systems develop a local model for every new input vector, based on the training data closest to that vector. The total least square (TLS) method is an optimal fitting method for curve and surface fitting that outperforms the commonly used least square fitting methods in resisting both normal noise and outliers. TTLSC is illustrated by a case study: a real medical decision support problem of estimating the survival of haemodialysis patients. This personalized modelling approach can also be applied to other classification or clustering problems.
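The following is a minimal sketch, not the authors' implementation, of the transductive TLS idea summarised above: for each new input vector, select its nearest training vectors, fit a total least square model to that neighbourhood via the SVD of the augmented data matrix, and threshold the local model's output. The function names, the Euclidean neighbourhood, the size k, the 0/1 labels, and the 0.5 threshold are illustrative assumptions, not details taken from the paper.

import numpy as np

def tls_fit(X, y):
    # Total least square fit: minimise orthogonal distances by taking the
    # right singular vector of the augmented matrix [X | y] associated with
    # the smallest singular value (assumes the neighbourhood has more
    # samples than features).
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                       # singular vector of the smallest singular value
    if abs(v[-1]) < 1e-12:
        raise ValueError("TLS solution is degenerate for this neighbourhood")
    return -v[:-1] / v[-1]           # coefficients w such that X @ w approximates y

def ttlsc_predict(x_new, X_train, y_train, k=20, threshold=0.5):
    # Transductive step: build a local TLS model from the k training vectors
    # nearest to x_new (Euclidean distance), then threshold its linear output
    # to obtain a 0/1 class label.
    dists = np.linalg.norm(X_train - x_new, axis=1)
    idx = np.argsort(dists)[:k]      # indices of the nearest neighbours
    w = tls_fit(X_train[idx], y_train[idx])
    return int(x_new @ w >= threshold)

Because a separate local model is fitted for every new vector, nothing is learned globally in advance; this is the personalized, per-sample modelling the abstract refers to.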

Keywords

Support Vector Machine, Total Least Square, Steep Descent Algorithm, Transductive Support Vector Machine, Practice Pattern Study
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Qun Song (1)
  • Tian Min Ma (1)
  • Nikola Kasabov (1)
  1. Knowledge Engineering & Discovery Research Institute, Auckland University of Technology, Auckland, New Zealand