User Defined Conceptual Modeling Gestures

  • Bige Tunçer
  • Sumbul Khan
Chapter
Part of the KAIST Research Series book series (KAISTRS)

Abstract

Gesture- and speech-based interaction offers designers a powerful technique for creating 3D CAD models. Previous studies on gesture-based modeling have employed author-defined gestures, which may not be user friendly. The aim of this study was to collect a data set of user-generated gestures and accompanying voice commands for 3D modeling, for form exploration in the conceptual architectural design phase. We conducted an experiment with 41 subjects to elicit their preferences in using gestures and speech for twelve 3D CAD modeling referents. In this paper we present the different types of gestures we found and report user preferences for gestures and speech. Findings from this study will inform the design of a speech- and gesture-based CAD modeling interface.

Keywords

Conceptual architectural design · Gesture-based modeling · Natural user interface · Gesture studies · Human–computer interaction

Acknowledgements

This research is supported by the SUTD-MIT International Design Centre (IDC) grant number IDG21500109, under the Sustainable Built Environment Grand Challenge and the Visualization and Prototyping Design Research Thrust.

Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. Architecture and Sustainable Design, Singapore University of Technology and Design, Singapore, Singapore
  2. SUTD-MIT International Design Centre, Singapore University of Technology and Design, Singapore, Singapore