Towards Multi-modal Interaction with Interactive Paint
We present a multi-modal interactive paint application. Our work illustrates shortcomings in current multi-modal interaction and presents design strategies to address and alleviate these issues, focusing in particular on the input perspective in a regular desktop environment. A series of challenges is listed, and each is addressed with a corresponding strategy in our discussion of design practices for multi-modality. Based on the findings identified in this paper, we also identify areas to improve in future iterations of similar multi-modal interaction applications. These improvements should alleviate the shortcomings of our current design and provide further opportunities for research on multi-modal interaction.
Keywords: Multi-modal Interaction · Multi-touch · Modality
We acknowledge the support of the National Science Foundation under Grant Nos. I/UCRC IIP-1338922, AIR IIP-1237818, SBIR IIP-1330943, III-Large IIS-1213026, MRI CNS-1532061, OISE 1541472, MRI CNS-1429345, MRI CNS-0821345, MRI CNS-1126619, CREST HRD-0833093, I/UCRC IIP-0829576, MRI CNS-0959985, and RAPID CNS-1507611. We also thank Garrett Lemieux and Andrew Mitchell for their help with the first prototype.