Recognizing Group Activities Using Wearable Sensors
Pervasive computing envisions implicit interaction between people and their intelligent environments rather than between individuals and their devices, which inevitably leads to groups of individuals interacting with the same intelligent environment. These environments must therefore be aware not only of individual users' contexts and activities, but also of the contexts and activities of groups of users. Here an application for in-network group activity recognition using only mobile devices and their sensors is presented. Different data abstraction levels for recognition were investigated in terms of recognition rates, power consumption, and wireless communication volume for the devices involved. The results indicate that using locally extracted features for global, multi-user activity recognition is advantageous (a 10% reduction in energy consumption with, in theory, no loss in recognition rates). Using locally classified single-user activities incurred a 47% loss in recognition capability, making it unattractive. Local clustering of sensor data shows potential for group activity recognition with room for improvement (a 40% reduction in energy consumed, though with a 20% loss in recognition ability).
Keywords: group activity recognition · context recognition · distributed systems · multi-user · wearable
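The local-feature abstraction level described above can be illustrated with a minimal sketch. All names, the window size, and the choice of mean and standard deviation as features are assumptions for illustration, not the paper's actual pipeline: each wearable reduces its raw accelerometer window to a few features locally, and only those features are transmitted to be concatenated into a global vector for multi-user classification.

```python
import statistics

def local_features(window):
    """Per-device feature extraction (assumed: mean and population
    standard deviation of one accelerometer axis over a window)."""
    return (statistics.mean(window), statistics.pstdev(window))

def group_feature_vector(windows_by_device):
    """Concatenate each device's locally extracted features into one
    global vector; raw samples never leave the devices."""
    vec = []
    for window in windows_by_device:
        vec.extend(local_features(window))
    return vec

# Two devices, 4-sample windows (illustrative values only)
windows = [[0.9, 1.1, 1.0, 1.0], [0.1, 0.3, 0.2, 0.2]]
vec = group_feature_vector(windows)
# vec holds a (mean, std) pair per device, ready for a centralized
# group-activity classifier.
```

Transmitting two floats per window instead of the full sample stream is what yields the communication and energy savings reported above, while the global classifier still sees information from every user.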