Personal and Ubiquitous Computing, Volume 22, Issue 2, pp 365–377

Privacy preserving recognition of object-based activities using near-infrared reflective markers

  • Joseph Korpela
  • Takuya Maekawa
Original Article


This paper presents a low-cost, privacy-preserving method for recognizing object-based activities through the use of near-infrared (NIR) reflective markers. The NIR markers are thin pieces of durable reflective material that can be attached to objects, with their movement captured by a webcam modified to record NIR images: its standard IR-blocking filter is replaced with an IR-passing, visible-light-blocking filter, so the captured images preserve users' privacy during recognition. Using this equipment, we aim to accurately recognize several object-based activities by differentiating between markers based on their stripe or grid designs, and by capturing the usage of the objects from their distinctive movement patterns. We evaluate our method by identifying kitchen activities using markers attached to common kitchen utensils.
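The paper's own pipeline is not reproduced here, but its first step as described above, locating bright marker reflections in a frame where visible light has been filtered out, can be sketched roughly as follows. This is a minimal illustration only: the function name, threshold values, and synthetic frame are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from collections import deque

def detect_markers(frame, thresh=200, min_area=9):
    """Find bright blobs (candidate NIR marker reflections) in a
    grayscale frame. Returns (centroid_row, centroid_col, area) tuples.
    Threshold and minimum area are illustrative, not from the paper."""
    mask = frame >= thresh                       # bright pixels only
    visited = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # BFS over 4-connected bright pixels to grow one blob
                queue = deque([(r, c)])
                visited[r, c] = True
                pixels = []
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:      # reject sensor noise
                    ys, xs = zip(*pixels)
                    blobs.append((sum(ys) / len(ys),
                                  sum(xs) / len(xs),
                                  len(pixels)))
    return blobs

# Synthetic 64x64 "NIR frame": dark background, two bright markers.
frame = np.full((64, 64), 20, dtype=np.uint8)
frame[10:16, 10:16] = 255    # marker 1: 6x6 patch
frame[40:48, 30:38] = 240    # marker 2: 8x8 patch
markers = detect_markers(frame)
print(len(markers))          # -> 2
```

In practice one would use a library such as OpenCV (connected-components or contour detection) rather than a hand-rolled BFS; the per-blob centroids tracked across frames would then feed the movement-pattern features the abstract mentions.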


Keywords: Activity recognition · Near infrared · Visual markers · Kitchen activities



This study is partially supported by JST CREST.



Copyright information

© Springer-Verlag London Ltd. 2017

Authors and Affiliations

  1. Osaka University, Osaka, Japan