Information Retrieval Technology

Volume 6458 of the series Lecture Notes in Computer Science, pp. 502–513

Tuning Machine-Learning Algorithms for Battery-Operated Portable Devices

  • Ziheng Lin, Department of Computer Science, National University of Singapore
  • Yan Gu, Continental Automotive Singapore Pte Ltd
  • Samarjit Chakraborty, Institute for Real-time Computer Systems, TU Munich




Machine learning algorithms in various forms are now increasingly being used on a variety of portable devices, ranging from cell phones to PDAs. They often form a part of standard applications (e.g. grammar-checking in email clients) that run on these devices and occupy a significant fraction of processor and memory bandwidth. However, most research within the machine learning community has ignored issues such as the memory usage and power consumption of processors running these algorithms. In this paper we investigate how machine-learned models can be developed in a power-aware manner for deployment on resource-constrained portable devices. We show that by tolerating a small loss in accuracy, it is possible to dramatically improve the energy consumption and data cache behavior of these algorithms. More specifically, we explore a typical sequential labeling problem, part-of-speech tagging in natural language processing, and show that a power-aware design can achieve up to a 50% reduction in power consumption in exchange for a minimal 3% decrease in tagging accuracy.
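The accuracy-versus-resource trade-off described above can be illustrated with a minimal sketch (not the authors' implementation): a most-frequent-tag part-of-speech tagger whose model is pruned of rare words to shrink its memory footprint, at the cost of a small drop in accuracy. The corpus format, the pruning threshold, and the fallback tag are all assumptions made for illustration only.

```python
# Illustrative sketch only -- not the method evaluated in the paper.
# A tiny most-frequent-tag POS tagger; pruning rare words shrinks the model
# (smaller working set, better cache behavior) while slightly reducing accuracy.
from collections import Counter, defaultdict

def train(tagged_sentences):
    """Count word -> tag frequencies from sentences of (word, tag) pairs."""
    counts = defaultdict(Counter)
    for sent in tagged_sentences:
        for word, tag in sent:
            counts[word][tag] += 1
    return counts

def prune(counts, min_count=2):
    """Drop words seen fewer than min_count times to reduce model size."""
    return {w: c for w, c in counts.items() if sum(c.values()) >= min_count}

def tag(model, sentence, default="NN"):
    """Assign each word its most frequent training tag; unseen words get the default."""
    return [(w, model[w].most_common(1)[0][0] if w in model else default)
            for w in sentence]

if __name__ == "__main__":
    corpus = [[("the", "DT"), ("dog", "NN"), ("runs", "VBZ")],
              [("the", "DT"), ("cat", "NN"), ("sleeps", "VBZ")]]
    full_model = train(corpus)
    small_model = prune(full_model, min_count=2)  # smaller, slightly less accurate
    print(tag(small_model, ["the", "dog", "barks"]))
```

In this toy setting, raising the pruning threshold directly reduces the number of entries the tagger must keep in memory, mirroring (in a much simplified form) the kind of accuracy-for-energy trade-off the paper quantifies.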


Keywords: Low-power Machine-Learned Models · Part-of-speech Tagging · Mobile Machine Learning Applications · Power-aware Design