Other Applications: Sequential Dependence Modelling and Data Mining

  • Addisson Salazar
Part of the Springer Theses book series (Springer Theses, volume 4)


This chapter presents two diverse applications: diagnosis of sleep disorders (apnea) and data mining of web data from a virtual campus. The first application presents a procedure for extending ICA mixture models (ICAMM) to the case of sequential dependence in the record of feature observations. We call this extension sequential ICAMM (SICAMM). We present the algorithm, essentially a sequential Bayes processor, which sequentially classifies each input feature vector among a given set of possible classes. Estimates of the class-transition probabilities are used in conjunction with the classical ICAMM parameters: mixture matrices, centroids, and source probability densities. These parameters were estimated using the Mixca algorithm proposed in Chap. 3. Some simulations are presented to verify the improvement of SICAMM with respect to ICAMM. Moreover, a real data case is considered: the computation of hypnograms to help in the diagnosis of sleep disorders. Both simulated and real data analyses demonstrate the potential interest of including sequential dependence in the implementation of an ICAMM classifier.
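The recursion described above can be sketched as a forward Bayes filter: the class posterior at time t is the class likelihood under each ICA model, weighted by the transition-propagated posterior from time t − 1. The following is a minimal illustrative sketch, not the chapter's implementation; the function names and the Gaussian source density used in the usage example are assumptions (Mixca estimates the source densities nonparametrically).

```python
import numpy as np

def icamm_log_likelihood(x, W, centroid, log_source_pdf):
    """Log-likelihood of x under one ICAMM class.
    Model: x = A s + centroid with W = A^{-1}, so
    log p(x | class) = log|det W| + sum_i log p_i(s_i), s = W (x - centroid)."""
    s = W @ (x - centroid)
    return np.log(np.abs(np.linalg.det(W))) + np.sum(log_source_pdf(s))

def sicamm_step(prior, x, Ws, centroids, log_source_pdf, P_trans):
    """One step of the sequential Bayes recursion:
    p(c_t | x_{1:t}) ∝ p(x_t | c_t) * sum_{c'} P(c_t | c' ) p(c' | x_{1:t-1}).
    prior   : posterior over classes at t-1
    P_trans : class-transition matrix, P_trans[i, j] = P(c_t = j | c_{t-1} = i)."""
    predicted = P_trans.T @ prior  # propagate previous posterior through transitions
    loglik = np.array([icamm_log_likelihood(x, W, m, log_source_pdf)
                       for W, m in zip(Ws, centroids)])
    post = predicted * np.exp(loglik - loglik.max())  # shift logs for stability
    return post / post.sum()
```

Setting P_trans to a uniform matrix recovers the memoryless ICAMM classifier, which is why the gain of SICAMM grows with the persistence of the classes (as in sleep stages of a hypnogram).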


Keywords: Independent Component Analyzer · Learning Style · Observation Vector · Sequential Dependence · Exercise Practice



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  1. Department of Communications, School of Telecommunication Engineering, Polytechnic University of Valencia, Valencia, Spain
