Lightweight Modal Regression for Stand-Alone Embedded Systems

  • Taiki Watanabe
  • Koichiro Yamauchi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11954)


Although the CPU power of recent embedded systems has increased, their storage space remains limited. To overcome this limitation, most embedded devices are connected to a cloud server so that heavy calculations can be outsourced. However, some applications must handle private data, and for such applications internet connections are undesirable for security reasons. Therefore, small devices that handle private data should be able to work without internet connections. This paper presents a limited modal regression model that restricts the number of internal units to a certain fixed number. Modal regression can be used for multivalued function approximation with limited sensory inputs. In this study, a kernel density estimator (KDE) with a fixed number of kernels, called the “limited KDE,” was constructed. We demonstrate how to implement the limited KDE and how to construct a lightweight algorithm for modal regression using a system-on-chip field-programmable gate array device.
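The two ingredients described in the abstract, a KDE with a fixed kernel budget and modal regression over the resulting density, can be sketched roughly as follows. This is an illustrative Python sketch, not the paper's SoC-FPGA implementation: the class name `LimitedKDE`, the merge-closest-kernels budget rule, and the mean-shift mode search are assumptions standing in for the authors' actual algorithm.

```python
import numpy as np

class LimitedKDE:
    """Gaussian KDE over (x, y) samples with a fixed kernel budget.

    When the budget is exceeded, the two closest kernels are merged into
    their weighted centroid, so memory stays bounded (hypothetical merge
    rule; the paper's limited KDE may use a different replacement policy).
    """

    def __init__(self, budget=20, bandwidth=0.3):
        self.budget = budget
        self.h = bandwidth
        self.centers = []   # each entry is np.array([x, y])
        self.weights = []   # effective sample count absorbed by each kernel

    def add(self, x, y):
        """Incrementally learn one (x, y) sample."""
        self.centers.append(np.array([x, y], dtype=float))
        self.weights.append(1.0)
        if len(self.centers) > self.budget:
            self._merge_closest()

    def _merge_closest(self):
        # Find the closest pair of kernel centers and replace them
        # with their weight-averaged centroid.
        c = np.array(self.centers)
        d = np.linalg.norm(c[:, None, :] - c[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        i, j = np.unravel_index(np.argmin(d), d.shape)
        wi, wj = self.weights[i], self.weights[j]
        merged = (wi * c[i] + wj * c[j]) / (wi + wj)
        for k in sorted((i, j), reverse=True):
            del self.centers[k]
            del self.weights[k]
        self.centers.append(merged)
        self.weights.append(wi + wj)

    def modes(self, x, iters=50):
        """Modal regression at input x: mean-shift over y at fixed x.

        Fixed points of the conditional density p(y | x) are the modal
        outputs, so a multivalued function yields several values per x.
        """
        c = np.array(self.centers)
        w = np.array(self.weights)
        found = []
        for y in c[:, 1]:               # start from every stored kernel
            for _ in range(iters):
                k = w * np.exp(-((x - c[:, 0]) ** 2 + (y - c[:, 1]) ** 2)
                               / (2 * self.h ** 2))
                y_new = (k @ c[:, 1]) / k.sum()
                if abs(y_new - y) < 1e-6:
                    break
                y = y_new
            if not any(abs(y - m) < 1e-2 for m in found):
                found.append(y)
        return sorted(found)
```

For a two-branch target (e.g. samples from both y = 0 and y = 2 over the same x range), `modes(x)` returns one value per branch, which is the behavior ordinary least-squares regression cannot provide for multivalued functions.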


Multivalued function · Regression · Modal regression · Kernel density estimator (KDE) · Incremental learning



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Chubu University, Kasugai, Japan
