
Sentinel: generating GUI tests for sensor leaks in Android and Android Wear apps


Abstract

Due to the widespread use of Android devices and apps, it is important to develop tools and techniques to improve app quality and performance. Our work focuses on a problem related to hardware sensors on Android devices: the failure to disable unneeded sensors, which leads to sensor leaks and thus battery drain. We propose the Sentinel testing tool to uncover such leaks. The tool performs static analysis of app code and produces a model which maps GUI events to callback methods that affect sensor behavior. Edges in the model are labeled with symbols representing the acquiring/releasing of sensors and the opening/closing of UI windows. The model is traversed to identify paths that are likely to exhibit sensor leaks during run-time execution, based on two context-free languages over the symbol alphabet. The reported paths are then used to generate test cases. The execution of each test case tracks the run-time behavior of sensors and reports observed leaks. This approach has been applied to both open-source and closed-source regular Android applications, as well as watch faces for Android Wear smartwatches. Our experimental results indicate that Sentinel effectively detects sensor leaks while focusing the testing efforts on a very small subset of possible GUI event sequences.
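The path-checking idea described in the abstract can be illustrated with a minimal sketch. A path through the model is a sequence of symbols for acquiring/releasing sensors and opening/closing windows; a path is flagged as a likely leak if a sensor acquired inside a window is still held when that window is closed. The symbol encoding and helper names below are illustrative assumptions, not Sentinel's actual representation or implementation.

```python
# Illustrative sketch (not Sentinel's actual algorithm or data format):
# a model path is a list of (operation, argument) symbols such as
# ("open", w), ("close", w), ("acquire", s), ("release", s).
# A sensor acquired inside a window but still held when that window
# closes is reported as a likely leak.

def leaked_sensors(path):
    """Return (window, sensor) pairs still held when the window closes."""
    window_stack = []  # stack of (window, set of sensors acquired in it)
    leaks = []
    for op, arg in path:
        if op == "open":
            window_stack.append((arg, set()))
        elif op == "acquire" and window_stack:
            window_stack[-1][1].add(arg)
        elif op == "release":
            for _, held in window_stack:
                held.discard(arg)
        elif op == "close" and window_stack:
            window, held = window_stack.pop()
            leaks.extend((window, s) for s in sorted(held))
    return leaks

# A leaking path: window w1 acquires the GPS sensor but never releases it.
print(leaked_sensors([("open", "w1"), ("acquire", "gps"), ("close", "w1")]))
# → [('w1', 'gps')]

# A clean path: the sensor is released before the window closes.
print(leaked_sensors([("open", "w1"), ("acquire", "gps"),
                      ("release", "gps"), ("close", "w1")]))
# → []
```

In the paper's formulation, this matching of acquire/release and open/close pairs is expressed with two context-free languages over the symbol alphabet; the stack in the sketch plays the role of the balanced-parentheses structure those languages capture.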



Notes

  1.

    The file format used by Android for distribution and installation of apps.

  2.

    In some cases (e.g., when the device is rotated) the current window is destroyed and then recreated with a different layout. Such cases are also represented as \(w_{i} \rightarrow w_{i}\) transitions.

  3.

    More generally, the top of the stack could be \(open(w_{j})\) for some menu or dialog \(w_{j}\) working on behalf of activity \(w_{i}\). This generalization is discussed in Section 3.6.

  4.

    https://github.com/fathominfo/fathom-watchfaces/issues/53


Acknowledgments

We thank the SQJ, AST and ESEC/FSE reviewers for their valuable feedback.

Funding information

This material is based upon work supported by the U.S. National Science Foundation under CCF-1319695 and CCF-1526459, and by a Google Faculty Research Award.

Author information

Correspondence to Hailong Zhang.


The two lead authors Haowei Wu and Hailong Zhang contributed equally to this work.


About this article


Cite this article

Wu, H., Zhang, H., Wang, Y. et al. Sentinel: generating GUI tests for sensor leaks in Android and Android Wear apps. Software Qual J (2019). https://doi.org/10.1007/s11219-019-09484-z


Keywords

  • Android
  • GUI
  • Android Wear
  • Smartwatch
  • Energy
  • Sensor
  • Static analysis
  • Testing