Incremental Learning on a Budget and Its Application to Power Electronics

  • Koichiro Yamauchi
  • Yusuke Kondo
  • Akinari Maeda
  • Kiyotaka Nakano
  • Akihisa Kato
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8227)

Abstract

In this paper, we present an incremental learning method on a budget for embedded systems and discuss its application to two power systems: a micro-converter for photovoltaics and a step-down DC-DC converter. The learning method is a variation of the general regression neural network, but it is able to continue incremental learning on a bounded support set. The method basically learns new instances by adding new kernels. However, when the number of kernels reaches a predefined upper bound, the method selects the most effective learning option from several candidates: replacing the most ineffective kernel with a new kernel, modifying the parameters of existing kernels, or ignoring the new instance.
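The following Python sketch only illustrates the overall budgeted learning loop described above; it is not the paper's algorithm. It assumes Gaussian kernels of fixed width, a Nadaraya-Watson (GRNN-style) output, kernel activation on the new input as a crude proxy for a kernel's "effectiveness," and the one-step error on the new instance as the criterion for choosing among the options. The class name BudgetedGRNN and all parameter values are hypothetical.

    # Minimal sketch of a GRNN-like incremental learner on a budget (illustrative only;
    # the paper's actual replacement, modification, and selection criteria may differ).
    import numpy as np

    class BudgetedGRNN:
        def __init__(self, budget=50, sigma=0.5):
            self.budget = budget   # upper bound on the number of kernels
            self.sigma = sigma     # Gaussian kernel width (assumed fixed here)
            self.centers = []      # stored kernel centers (inputs)
            self.targets = []      # stored kernel targets (outputs)

        def _k(self, x, c):
            return np.exp(-np.sum((np.asarray(x) - c) ** 2) / (2.0 * self.sigma ** 2))

        def predict(self, x):
            if not self.centers:
                return 0.0
            k = np.array([self._k(x, c) for c in self.centers])
            # Nadaraya-Watson (GRNN-style) weighted average of stored targets
            return float(k @ np.array(self.targets) / (k.sum() + 1e-12))

        def learn(self, x, y):
            x, y = np.asarray(x, dtype=float), float(y)
            if len(self.centers) < self.budget:
                # Below the budget: simply add a new kernel for the new instance.
                self.centers.append(x)
                self.targets.append(y)
                return

            # Budget reached: evaluate the candidate options on the new instance.
            err_ignore = abs(self.predict(x) - y)            # option: ignore the instance

            acts = np.array([self._k(x, c) for c in self.centers])
            worst = int(np.argmin(acts))                     # "most ineffective" kernel (proxy)
            old_c, old_t = self.centers[worst], self.targets[worst]
            self.centers[worst], self.targets[worst] = x, y  # trial: replace that kernel
            err_replace = abs(self.predict(x) - y)
            self.centers[worst], self.targets[worst] = old_c, old_t

            near = int(np.argmax(acts))                      # nearest existing kernel
            old_near_t = self.targets[near]
            self.targets[near] = 0.5 * (old_near_t + y)      # trial: modify its target
            err_modify = abs(self.predict(x) - y)
            self.targets[near] = old_near_t

            # Keep whichever option yields the smallest error on the new instance.
            best = min((err_replace, 'replace'), (err_modify, 'modify'), (err_ignore, 'ignore'))[1]
            if best == 'replace':
                self.centers[worst], self.targets[worst] = x, y
            elif best == 'modify':
                self.targets[near] = 0.5 * (old_near_t + y)

In an embedded controller, learn would be called once per sampling period with the latest observation, so the memory footprint stays bounded by the budget while the model keeps adapting.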

The proposed method is compared with other, similar learning methods on a budget that are based on the kernel perceptron. Two example applications of the proposed method in power electronics are demonstrated. In these two examples, we show that the proposed system learns the properties of the controlled objects during service and realizes quick control.

Keywords

Incremental learning on a budget · Kernel method · Micro-converter · Photovoltaic · Shadow-flicker · Model-based control · DC-DC converter



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Koichiro Yamauchi¹
  • Yusuke Kondo¹
  • Akinari Maeda¹
  • Kiyotaka Nakano¹
  • Akihisa Kato¹

  1. Department of Information Science, Chubu University, Japan
