Swap Kernel Regression

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 11727)

Abstract

Recent developments in artificial intelligence have increased the demand for high-performance computing devices. Edge devices, however, are severely restricted not only in computational power but also in memory capacity. This study proposes a method that enables both inference and learning on an edge device. The proposed method employs a kernel machine that operates in such restricted environments by cooperating with a secondary storage system. Kernel parameters that are not essential for computing the outputs for upcoming inputs are moved to secondary storage to free space in main memory, and essential kernel parameters held in secondary storage are loaded back into main memory when required. With this strategy, the system can perform recognition/regression tasks without reducing its generalization capability.
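
The swap strategy described in the abstract resembles demand paging between main memory and secondary storage. The following is a minimal sketch in Python; the class name, the least-recently-used eviction rule, and the dict-like on-disk store are illustrative assumptions, not the paper's actual criterion for deciding which kernels are non-essential.

    from collections import OrderedDict

    class SwapKernelStore:
        """Minimal sketch of the swap strategy described in the abstract.

        At most `capacity` kernel parameter sets stay resident in main
        memory; the rest are swapped out to `disk` (any dict-like secondary
        storage, e.g. a shelve mapping). The LRU eviction rule is an assumed
        stand-in for the paper's notion of "non-essential" kernels.
        """

        def __init__(self, capacity, disk):
            self.capacity = capacity
            self.disk = disk
            self.resident = OrderedDict()  # kernel parameters in main memory

        def fetch(self, key):
            if key not in self.resident:
                if len(self.resident) >= self.capacity:
                    # Swap out the least-recently-used kernel to make space.
                    victim, params = self.resident.popitem(last=False)
                    self.disk[victim] = params
                # Swap the required kernel back into main memory.
                self.resident[key] = self.disk.pop(key)
            self.resident.move_to_end(key)  # mark as recently used
            return self.resident[key]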

Keywords

  • Swap kernel regression
  • Regression
  • Kernel machine
  • Softmax function
  • General regression neural network
  • Secondary storage

Part of this research was supported by the Chubu University Grant A.


Notes

  1. The original LGRNN algorithm has four learning options (Yamauchi 2014), one of which is the projection operation. However, this operation is not useful for swap kernel regression, so only the replacement and ignore options are used here (a minimal sketch of these two options appears after these notes).

  2. Even in such situations, if the relevant instance distribution area is wide enough, the number of kernels belonging to the same cluster exceeds \(|S_t^1|\); hence, the swap occurs repeatedly (see Algorithm 2 and the second sketch after these notes).

  3. Although the VGG16 model supports the processing of full-color images, we provide the re-scaled gray-level images as input.

  4. Note that the recognition result for each new instance was tested before the swap kernel regression learned it incrementally. Therefore, if the swap kernel regression fails to recognize a new instance, the cumulative number of mistakes is incremented (see the third sketch after these notes).
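
Note 1 mentions the replacement and ignore options. The sketch below shows how such a budgeted learning step might look; the method names (`predict`, `add_kernel`, `replace_kernel`), the error tolerance, and the smallest-weight replacement rule are hypothetical stand-ins, since Yamauchi (2014) defines the actual criteria.

    import numpy as np

    def learn_with_budget(model, x, y, budget, err_tol=0.1):
        """Sketch of a budgeted learning step using only two LGRNN options.

        `model` is assumed to expose `centers`, `weights`, `predict(x)`,
        `add_kernel(x, y)`, and `replace_kernel(i, x, y)`; these names and
        the decision rules below are illustrative, not the paper's.
        """
        if len(model.centers) < budget:
            model.add_kernel(x, y)              # memory remains: just add
        elif abs(model.predict(x) - y) <= err_tol:
            pass                                # ignore: already well approximated
        else:
            victim = int(np.argmin(np.abs(model.weights)))
            model.replace_kernel(victim, x, y)  # replace the least useful kernel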
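Note 2 concerns the case where the cluster of kernels relevant to an input exceeds the in-memory budget \(|S_t^1|\), forcing repeated swaps while the output is evaluated. The sketch below assumes a store with the `fetch` interface sketched after the abstract and a GRNN-style (softmax-weighted) output; it is an illustration of the repeated-swap effect, not a reproduction of Algorithm 2.

    import numpy as np

    def gaussian_kernel(x, c, sigma=1.0):
        return np.exp(-np.sum((np.asarray(x) - np.asarray(c)) ** 2)
                      / (2.0 * sigma ** 2))

    def evaluate_cluster(x, cluster_keys, store, sigma=1.0):
        """GRNN-style output over one cluster of kernels.

        `store.fetch(key)` is assumed to return `(center, weight)` and to
        swap the kernel in from secondary storage if it is not resident.
        When len(cluster_keys) exceeds the in-memory budget, each newly
        loaded kernel displaces an earlier one, so the swap repeats.
        """
        num = den = 0.0
        for key in cluster_keys:
            center, weight = store.fetch(key)
            k = gaussian_kernel(x, center, sigma)
            num += k * weight               # softmax-like weighting of targets
            den += k
        return num / (den + 1e-12)          # normalized (Nadaraya-Watson) output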
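Note 4 describes a test-then-train (prequential) protocol for counting cumulative mistakes. A minimal sketch, with `predict` and `learn` as assumed method names:

    def cumulative_mistakes(model, stream):
        """Test each instance before learning it, as described in Note 4.

        `model.predict(x)` and `model.learn(x, y)` are assumed interfaces.
        Returns the running count of recognition mistakes per instance.
        """
        mistakes, history = 0, []
        for x, y in stream:
            if model.predict(x) != y:   # test first: mistake if recognition fails
                mistakes += 1
            model.learn(x, y)           # then learn the instance incrementally
            history.append(mistakes)
        return history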


Author information

Corresponding author

Correspondence to Koichiro Yamauchi.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Yamamoto, M., Yamauchi, K. (2019). Swap Kernel Regression. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation. ICANN 2019. Lecture Notes in Computer Science, vol 11727. Springer, Cham. https://doi.org/10.1007/978-3-030-30487-4_18

  • DOI: https://doi.org/10.1007/978-3-030-30487-4_18

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30486-7

  • Online ISBN: 978-3-030-30487-4

  • eBook Packages: Computer Science, Computer Science (R0)