
Active improvement of hierarchical object features under budget constraints

  • Research Article
  • Published in Frontiers of Computer Science

Abstract

In a supervised learning setting, an object is usually perceived as a collection of fixed attribute values. Although this setting is well suited for many classification tasks, we propose a new object representation and, with it, a new challenge in data mining: an object is no longer described by a single set of attributes but is represented by a hierarchy of attribute sets at different levels of quality. Obtaining a more detailed representation of an object incurs a cost. This raises the interesting question of which objects to enhance under a given budget and cost model. This setting is useful whenever resources such as computing power, memory, or time are limited. We propose a new active adaptive algorithm (AAA) that improves objects in an iterative fashion. We demonstrate how to create a hierarchical object representation and show the effectiveness of our selection algorithm on such datasets.
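The setting described above can be sketched in code. The following is a minimal, hypothetical illustration of budget-constrained iterative improvement — not the paper's AAA algorithm: each object carries a hierarchy of feature vectors from coarse to fine, each refinement has a fixed cost, and a simple placeholder uncertainty score decides which object to refine next until the budget is spent. All names and the scoring rule are assumptions for illustration.

```python
# Hypothetical sketch of budget-constrained object improvement.
# This is NOT the paper's AAA algorithm; the class names, the fixed
# cost model, and the uncertainty score are illustrative assumptions.

class HierarchicalObject:
    def __init__(self, levels):
        self.levels = levels   # list of feature vectors, coarse -> fine
        self.current = 0       # index of the level currently available

    def features(self):
        return self.levels[self.current]

    def can_refine(self):
        return self.current < len(self.levels) - 1

    def refine(self):
        self.current += 1


def uncertainty(obj):
    # Placeholder score: spread of the current feature vector.
    feats = obj.features()
    return max(feats) - min(feats)


def active_improve(objects, budget, cost_per_refinement=1.0):
    """Greedily refine the most 'uncertain' refinable object until the
    budget is exhausted. Returns the total cost spent."""
    spent = 0.0
    while spent + cost_per_refinement <= budget:
        candidates = [o for o in objects if o.can_refine()]
        if not candidates:
            break
        target = max(candidates, key=uncertainty)
        target.refine()
        spent += cost_per_refinement
    return spent
```

In a realistic variant, the uncertainty score would come from a classifier (e.g. distance to a decision boundary) and the cost could vary per hierarchy level rather than being constant.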



Author information


Correspondence to Nicolas Cebron.

Additional information

Nicolas Cebron obtained his diploma in Computer Science from Ostfalia University of Applied Sciences in Braunschweig, Germany, and his PhD in Computer Science from the University of Konstanz, Germany. He spent one year as a postdoctoral researcher in the European Union research project “Bisociation Networks for Creative Information Discovery” and one year as a postdoctoral researcher at the International Computer Science Institute at the University of California, Berkeley. He has received grants from the German Research Foundation and from the German Academic Exchange Service. Nicolas is currently working as a researcher and lecturer in the field of active machine learning and image classification at the University of Augsburg, Germany.


About this article

Cite this article

Cebron, N. Active improvement of hierarchical object features under budget constraints. Front. Comput. Sci. 6, 143–153 (2012). https://doi.org/10.1007/s11704-012-2857-5

