
Orthogonal Decision Trees for Resource-Constrained Physiological Data Stream Monitoring Using Mobile Devices

  • Haimonti Dutta
  • Hillol Kargupta
  • Anupam Joshi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3769)

Abstract

This paper considers the problem of monitoring physiological data streams obtained from resource-constrained wearable sensing devices for pervasive health-care management. It considers orthogonal decision trees (ODTs), which offer an effective way to construct a redundancy-free, accurate, and meaningful representation of the large decision tree ensembles often produced by popular techniques such as Bagging, Boosting, and Random Forests, as well as by many distributed and data stream mining algorithms. ODTs are functionally orthogonal to each other and correspond to the principal components of the underlying function space. The paper reports experimental results documenting the performance of ODTs in terms of accuracy, model complexity, and resource consumption.
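
To make the construction concrete, the sketch below approximates the ODT idea with off-the-shelf tools: it treats each tree in an ensemble as a function sampled at the training points, extracts orthogonal components of those sampled functions with PCA, and checks how much of the ensemble a few components capture. The library calls (scikit-learn, NumPy), the synthetic dataset, and the component count are illustrative assumptions; the paper itself derives orthogonality from the Fourier spectra of the trees rather than from sampled predictions.

```python
# Minimal sketch (not the authors' implementation): PCA over the
# outputs of an ensemble of decision trees, illustrating how a few
# orthogonal components can summarize a redundant ensemble.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a windowed physiological data stream.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# A redundant ensemble of shallow decision trees.
forest = RandomForestClassifier(n_estimators=30, max_depth=4,
                                random_state=0).fit(X, y)

# One column of 0/1 predictions per tree: each tree viewed as a
# function sampled at the training points.
F = np.column_stack([tree.predict(X) for tree in forest.estimators_])

# Orthogonal components of the ensemble's (sampled) function space.
pca = PCA(n_components=5).fit(F)
print("variance captured by 5 orthogonal components:",
      pca.explained_variance_ratio_.sum())

# Reconstruct each tree's output from the leading components and
# majority-vote the reconstructions; accuracy tends to stay close to
# the full ensemble despite the much smaller representation.
F_hat = pca.inverse_transform(pca.transform(F))
vote = (F_hat.mean(axis=1) > 0.5).astype(int)
print("full ensemble accuracy:", forest.score(X, y))
print("reconstructed-vote accuracy:", (vote == y).mean())
```

Because a small number of orthogonal components typically explains most of the variation across the ensemble members, the compressed representation is what makes monitoring on memory- and power-limited mobile devices plausible.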

Keywords

Decision Tree · Random Forest · Data Stream · West Nile Virus · Fourier Spectrum



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Haimonti Dutta 1
  • Hillol Kargupta 1
  • Anupam Joshi 1
  1. Department of Computer Science and Electrical Engineering, University of Maryland Baltimore County, Baltimore, USA
