Nonverbal Social Sensing in Action: Unobtrusive Recording and Extracting of Nonverbal Behavior in Social Interactions Illustrated with a Research Example
Nonverbal behavior coding is typically conducted by hand. To remedy this time- and resource-intensive undertaking, we illustrate how nonverbal social sensing, defined as the automated recording and extraction of nonverbal behavior via ubiquitous social sensing platforms, can be achieved. More precisely, using an illustrative research example, we show which kinds of nonverbal cues can be extracted, how, and to what extent the automatically extracted cues are valid. In a job interview, the applicant’s vocal and visual nonverbal immediacy behavior was automatically sensed and extracted. Results show that the applicant’s nonverbal behavior can be validly extracted. Moreover, both visual and vocal applicant nonverbal behavior predicted the recruiter’s hiring decision, in line with previous findings on manually coded applicant nonverbal behavior. Finally, the applicant’s average turn duration, tempo variation, and gazing best predicted the recruiter’s hiring decision. Results and implications of such nonverbal social sensing for future research are discussed.
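To make the kind of cue extraction described above concrete, the sketch below shows how one of the named vocal cues, average speaking-turn duration, could be derived from a binary voice-activity sequence such as a speaker segmentation step might output. This is an illustrative assumption, not the study's actual pipeline; the function name and the frame rate are hypothetical.

```python
# Illustrative sketch (not the authors' pipeline): computing average
# speaking-turn duration from a 0/1 voice-activity sequence sampled at
# a fixed frame rate. Function name and frame rate are assumptions.

def average_turn_duration(voice_activity, frame_rate_hz=100):
    """Mean length, in seconds, of contiguous speech runs in a 0/1 sequence."""
    durations = []
    run = 0  # length of the current speech run, in frames
    for active in voice_activity:
        if active:
            run += 1
        elif run:
            durations.append(run / frame_rate_hz)  # turn just ended
            run = 0
    if run:  # close a turn that runs to the end of the recording
        durations.append(run / frame_rate_hz)
    return sum(durations) / len(durations) if durations else 0.0

# Two turns of 3 and 2 frames at 100 Hz: (0.03 + 0.02) / 2
print(average_turn_duration([1, 1, 1, 0, 0, 1, 1]))  # 0.025
```

Analogous run-length summaries could, under the same assumptions, yield other cues mentioned in the abstract, such as gaze duration from per-frame gaze-direction labels.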
Keywords: Ubiquitous social sensing platform · Automated extraction · Applicant nonverbal behavior · Hiring decision · Job interview
We thank Dr. Florent Monay (Idiap) for the design and implementation of the sensing platform; Dr. Jean-Marc Odobez (Idiap) for his contribution to the sensing platform and the nodding recognition method; and Prof. Tanzeem Choudhury (Cornell University) for her contribution to the design of the job performance part of the study. This research was funded by the Swiss National Science Foundation through the Sinergia SONVB (Sensing and Analyzing Nonverbal Organizational Behavior) project.
Conflict of interest
The authors declare that they have no conflict of interest.