Exploring Machine Learning Object Classification for Interactive Proximity Surfaces

  • Andreas Braun
  • Michael Alekseew
  • Arjan Kuijper
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9749)


Capacitive proximity sensors are a variant of the sensing technology that drives most finger-controlled touch screens today, but they operate over a larger distance. Because they are not disturbed by non-conductive materials, they can track hands above arbitrary surfaces, enabling flexible interactive surfaces. Since their resolution is lower than that of many other sensing technologies, sophisticated data processing methods are required for object recognition and tracking. In this work, we explore machine learning methods for detecting and tracking hands above an interactive surface built from capacitive proximity sensors. We discuss suitable methods and present our implementation based on Random Decision Forests. The system has been evaluated on a prototype interactive surface, the CapTap. Using a Kinect-based hand tracking system, we collect training data and compare the output of the learning algorithm to ground-truth data.
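To make the approach concrete, the following is a minimal sketch (not the authors' implementation) of training a random forest to map low-resolution capacitive sensor readings to a hand position, with ground-truth positions standing in for what a Kinect-based tracker would supply. The sensor count, grid layout, and inverse-square signal model are illustrative assumptions.

```python
# Sketch: regress hand position from a 4x4 grid of capacitive readings.
# The sensor geometry and signal model below are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Sensor electrode positions on a 4x4 grid.
sensors = np.array([(x, y) for x in range(4) for y in range(4)], dtype=float)

def simulate_frame(hand_xy, height=1.0):
    # Capacitive response falls off with squared distance to the hand;
    # small Gaussian noise models sensor jitter.
    d2 = ((sensors - hand_xy) ** 2).sum(axis=1) + height ** 2
    return 1.0 / d2 + rng.normal(0.0, 0.005, len(sensors))

# Training set: random hand positions over the surface plus their sensor frames
# (the ground-truth positions play the role of the Kinect tracking data).
positions = rng.uniform(0.0, 3.0, size=(2000, 2))
frames = np.array([simulate_frame(p) for p in positions])

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(frames, positions)

# Predict the hand position for an unseen frame.
true_pos = np.array([1.5, 2.0])
pred = model.predict(simulate_frame(true_pos).reshape(1, -1))[0]
print(pred)
```

In the paper's setting the forest is trained on real sensor frames labeled by the Kinect tracker rather than a simulated signal model, but the pipeline shape is the same: per-frame feature vector in, hand estimate out.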


Keywords: Capacitive proximity sensing · Interactive surfaces · Machine learning



We would like to thank all volunteers who participated in our studies and provided valuable feedback for future iterations. This work was supported by the European Commission under the 7th Framework Programme (Grant Agreement No. 611421).



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Andreas Braun (1, 2)
  • Michael Alekseew (1)
  • Arjan Kuijper (1, 2)
  1. Fraunhofer Institute for Computer Graphics Research IGD, Darmstadt, Germany
  2. Technische Universität Darmstadt, Darmstadt, Germany
