Abstract
This paper presents a low-cost, privacy-preserving method for recognizing object-based activities using near-infrared (NIR) reflective markers. The markers are thin pieces of durable reflective material that can be attached to objects, and their movement is captured by a webcam modified to record NIR images: the camera's standard IR-blocking filter is replaced with an IR-passing, visible-light-blocking filter, so that users' privacy is preserved during recognition. With this equipment, we recognize several object-based activities by distinguishing markers through their stripe or grid designs and by capturing object usage through each object's characteristic movement pattern. We evaluate the method by identifying kitchen activities using markers attached to common kitchen utensils.
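As a rough illustration of the capture side of such a pipeline, the sketch below (written against OpenCV's Python bindings) shows how the near-saturated reflections produced by the markers might be isolated in frames from the modified webcam. The threshold, minimum blob area, camera index, and the detect_markers helper are assumptions for illustration only and do not describe the authors' implementation; marker identification from the stripe/grid patterns and activity classification from movement would build on top of this step.

import cv2

def detect_markers(frame_gray, threshold=200, min_area=50):
    # Reflective tape shows up as near-saturated blobs in the NIR image.
    _, binary = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Return bounding boxes (x, y, w, h) of candidate marker regions.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

cap = cv2.VideoCapture(0)  # the modified NIR webcam; device index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detect_markers(gray):
        # The stripe/grid pattern inside each box could be sampled to identify
        # the marker, and box centers tracked over time to capture movement.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("NIR markers", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()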

Acknowledgements
This study was partially supported by JST CREST.
Cite this article
Korpela, J., Maekawa, T. Privacy preserving recognition of object-based activities using near-infrared reflective markers. Pers Ubiquit Comput 22, 365–377 (2018). https://doi.org/10.1007/s00779-017-1070-9