Tuning Machine-Learning Algorithms for Battery-Operated Portable Devices

  • Ziheng Lin
  • Yan Gu
  • Samarjit Chakraborty
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6458)

Abstract

Machine learning algorithms in various forms are now increasingly being used on a variety of portable devices, ranging from cell phones to PDAs. They often form part of standard applications (e.g. grammar checking in email clients) that run on these devices and occupy a significant fraction of processor time and memory bandwidth. However, most of the research within the machine learning community has ignored issues like memory usage and power consumption of the processors running these algorithms. In this paper we investigate how machine-learned models can be developed in a power-aware manner for deployment on resource-constrained portable devices. We show that by tolerating a small loss in accuracy, it is possible to dramatically improve the energy consumption and data cache behavior of these algorithms. More specifically, we explore part-of-speech tagging in natural language processing, a typical sequential labeling problem, and show that a power-aware design can achieve up to a 50% reduction in power consumption while trading off only a minimal 3% decrease in tagging accuracy.
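
The sketch below (Python, not taken from the paper) is a minimal illustration of the accuracy-versus-footprint trade-off described above: a toy unigram part-of-speech tagger whose low-weight features are pruned so that a smaller model has to be kept in memory, at the cost of mis-tagging some words. The feature weights, pruning threshold, and example sentence are all hypothetical and chosen only to make the example self-contained.

    # Minimal sketch (not the authors' implementation): prune low-weight
    # features from a toy unigram POS tagger to shrink its memory footprint,
    # accepting some loss in tagging accuracy in return.

    # Toy feature weights of a "trained" tagger: (word, tag) -> score.
    weights = {
        ("the", "DT"): 4.2, ("dog", "NN"): 3.1, ("dog", "VB"): 0.2,
        ("runs", "VBZ"): 2.7, ("runs", "NNS"): 0.4, ("fast", "RB"): 1.9,
        ("fast", "JJ"): 1.1,
    }

    def prune(weights, threshold):
        """Drop features whose weight magnitude falls below the threshold."""
        return {feat: w for feat, w in weights.items() if abs(w) >= threshold}

    def tag(sentence, weights, default="NN"):
        """Greedy unigram tagging: pick the highest-weight tag for each word."""
        tags = []
        for word in sentence:
            best_tag, best_score = default, float("-inf")
            for (w, t), score in weights.items():
                if w == word and score > best_score:
                    best_tag, best_score = t, score
            tags.append(best_tag)
        return tags

    sentence = ["the", "dog", "runs", "fast"]
    pruned = prune(weights, threshold=2.0)
    print("full model  :", tag(sentence, weights), "-", len(weights), "features")
    print("pruned model:", tag(sentence, pruned), "-", len(pruned), "features")

With the hypothetical threshold of 2.0 the model shrinks from 7 to 3 features and "fast" falls back to the default tag, illustrating the trade-off. A smaller weight table also touches fewer cache lines per lookup, which is the kind of data-cache and energy improvement the abstract refers to; the paper itself studies a full sequential-labeling tagger rather than this unigram toy.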

Keywords

Low-power Machine Learned Models · Part-of-speech Tagging · Mobile Machine Learning Applications · Power-aware Design

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Ziheng Lin (1)
  • Yan Gu (2)
  • Samarjit Chakraborty (3)
  1. Department of Computer Science, National University of Singapore, Singapore
  2. Continental Automotive Singapore Pte Ltd, Singapore
  3. Institute for Real-time Computer Systems, TU Munich, Germany