Automatically Detected Nonverbal Behavior Predicts Creativity in Collaborating Dyads
In the current study we administered a creative task in which two people collaboratively generated novel strategies to conserve resources. During this task, the nonverbal behavior of 104 participants in 52 pairs was tracked and recorded using the Microsoft Kinect computer vision system. We created a measure of synchrony by correlating movements between the two dyad members, and showed that genuine synchrony occurred: correlations decreased as we increased the delay between the recorded movements of pair members. We also demonstrated a link between nonverbal synchrony and creativity, operationalized as the number of new, valid ideas produced; linear correlations showed a significant relationship between synchrony and creativity. Finally, models using synchrony scores as input predicted whether dyads were high or low in creativity with a success rate as high as 86.7% in the more exclusive subsets. We discuss implications for methodological approaches to measuring nonverbal behavior and synchrony, and suggest practical applications that can leverage the current findings.
Keywords: Nonverbal behavior · Synchrony · Gesture · Collaboration · Creativity · Kinect · Interpersonal communication · Contingency
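The synchrony measure described above (correlating the movements of the two dyad members, and checking that the correlation drops when one member's movement record is artificially delayed) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name, the synthetic data, and the specific lag value are assumptions for demonstration.

```python
import numpy as np

def lagged_correlation(a, b, lag):
    """Pearson correlation between movement series a and b,
    with b shifted forward by `lag` samples (lag=0 gives the
    ordinary zero-delay correlation)."""
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    return np.corrcoef(a, b)[0, 1]

# Synthetic example: partner B loosely mirrors partner A's movement.
rng = np.random.default_rng(0)
a = rng.normal(size=500).cumsum()           # partner A's movement signal
b = a + rng.normal(scale=2.0, size=500)     # partner B tracks A, with noise

sync_at_zero = lagged_correlation(a, b, 0)   # true, simultaneous coupling
sync_at_lag = lagged_correlation(a, b, 50)   # pseudo-pairing via delay
# Evidence of genuine synchrony: the zero-delay correlation
# exceeds the correlation computed with an artificial delay.
```

The delayed correlation serves as a pseudosynchrony baseline: if the zero-delay correlation were no higher than the delayed one, the apparent coupling could be an artifact of both partners simply moving in similar ways over the session.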
The work presented herein was funded in part by Konica Minolta as part of a Stanford Media-X grant, and we thank them for the valuable insights provided by their visiting researchers, in particular Dr. Haisong Gu. It was also funded in part by grant 108084-5031715-4 from the National Science Foundation. The authors thank lab manager Cody Karutz for coordinating the administrative aspects of running this study; Jimmy Lee, Pamela Martinez, Evan Shieh, Alex Zamoshchin, Angel Olvera, Christine Tataru, Mark Diaz, and Mark Peng for their help in coding and data analysis; and Dr. Laura Aymerich-Franch, Ketaki Shriram, Michelle Friend, Brian Perone, and Jakki Bailey for helpful comments on an earlier draft of this paper.