
Bagging based ensemble transfer learning

  • Original Research
  • Published in the Journal of Ambient Intelligence and Humanized Computing

Abstract

Transfer learning is an active research area in machine learning that helps label data at low cost. In this paper, we propose a novel bagging-based ensemble transfer learning (BETL) framework. BETL comprises three operations: Initiate, Update, and Integrate. In the Initiate operation, we use bootstrap sampling to divide the source data into multiple subsets and add the labeled target-domain data to each subset so that the source and target data reach a reasonable ratio; we then train one initial classifier per ensemble member. In the Update operation, we use the initial classifiers together with an updateable classifier to iteratively label the still-unlabeled data in the target domain, and we add the newly labeled data back into the target domain to retrain the updateable classifier. In the Integrate operation, we collect the updated classifier from each iteration into a pool and predict the labels of the test data by majority vote. To demonstrate the effectiveness of our method, we conduct classification experiments on a UCI data set, a real-world data set, and a text data set. The results show that our method effectively labels the unlabeled data in the target domain, which greatly enhances classification performance in the target domain.
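The three operations described above can be sketched in code. This is a minimal illustration, not the authors' implementation: a nearest-centroid classifier stands in for the base learners, the source subsets are sized to match the labeled target set (one reading of the "reasonable ratio"), and all function names and parameters (`betl`, `n_estimators`, `n_iters`) are hypothetical.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    # Stand-in base learner: one centroid per class.
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(model, X):
    classes, centroids = model
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

def majority_vote(votes):
    # votes: (n_models, n_samples) matrix of integer labels.
    return np.array([np.bincount(col).argmax() for col in votes.T])

def betl(Xs, ys, Xt_l, yt_l, Xt_u, Xtest, n_estimators=5, n_iters=3, seed=0):
    rng = np.random.default_rng(seed)
    # Initiate: bootstrap-sample source subsets, mix in the labeled target
    # data, and train one initial classifier per ensemble member.
    initial = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(Xs), size=max(len(Xt_l), 1))
        Xb = np.vstack([Xs[idx], Xt_l])
        yb = np.concatenate([ys[idx], yt_l])
        initial.append(nearest_centroid_fit(Xb, yb))
    # Update: iteratively pseudo-label the unlabeled target data and
    # retrain the updateable classifier on the enlarged target set.
    X_lab, y_lab = Xt_l, yt_l
    pool = []
    for _ in range(n_iters):
        updateable = nearest_centroid_fit(X_lab, y_lab)
        votes = np.array([nearest_centroid_predict(m, Xt_u)
                          for m in initial + [updateable]])
        pseudo = majority_vote(votes)
        X_lab = np.vstack([Xt_l, Xt_u])
        y_lab = np.concatenate([yt_l, pseudo])
        pool.append(nearest_centroid_fit(X_lab, y_lab))
    # Integrate: majority vote of the per-iteration classifiers on test data.
    votes = np.array([nearest_centroid_predict(m, Xtest) for m in pool])
    return majority_vote(votes)
```

On two well-separated Gaussian clusters, the pool trained this way labels held-out target points correctly even when only a couple of target examples are labeled, which is the scenario the abstract targets.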

Notes

  1. http://archive.ics.uci.edu/ml/datasets.html.

  2. http://www.cs.ust.hk/~qyang/.

  3. http://www.cs.ust.hk/~qyang/.


Acknowledgments

We thank the reviewers for their helpful comments. This work was supported by the Fundamental Research Funds for the Central Universities (Grant No. G1323511315), the Key Project of the Natural Science Foundation of Hubei Province, China (Grant No. 2013CFA004), and the National Natural Science Foundation of China (Grant No. 61403351).

Author information

Corresponding author

Correspondence to Xiaobo Liu.

About this article

Cite this article

Liu, X., Wang, G., Cai, Z. et al. Bagging based ensemble transfer learning. J Ambient Intell Human Comput 7, 29–36 (2016). https://doi.org/10.1007/s12652-015-0296-5
