
3D Gesture-Based View Manipulator for Large Scale Entity Model Review

  • Hye-Jin Park
  • Jiyoung Park
  • Myoung-Hee Kim
Part of the Communications in Computer and Information Science book series (CCIS, volume 323)

Abstract

Hand gesture-based human-computer interaction (HCI) is one of the most natural and intuitive ways for humans and machines to communicate, because it closely mimics how humans interact with each other. This intuitiveness and naturalness are especially valuable for exploring extensive, complex data and virtual environments. We developed a 3D gesture interface for manipulating the displayed view of a 3D entity model. For gesture recognition, we use the Kinect as a depth sensor to acquire depth image frames, track the positions of the user's skeleton joints in each frame, and detect preset gestures. With simple gestures, the user can pan, zoom, rotate, and reset the view and navigate freely inside the 3D entity model in virtual space. The proposed gesture interface is integrated with the stereoscopic 3D model viewer that we previously developed for 3D model review.
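
The paper does not publish an implementation, but the pipeline described above (depth frames, tracked skeleton joints, preset gestures, view updates) can be sketched roughly as follows. This is a minimal Python sketch under stated assumptions: the joint dictionaries, the gesture vocabulary (two hands for zoom/rotate, one hand for pan, both hands above the head to reset), and the ViewState/update_view helpers are illustrative placeholders, not the authors' actual gesture set or code.

    # Illustrative sketch only; not the method published in the paper.
    # Joint positions are assumed to come from a Kinect skeleton-tracking API
    # and to be passed in as dicts of joint name -> (x, y, z) in metres.

    import math

    class ViewState:
        """Minimal camera state for a 3D model viewer."""
        def __init__(self):
            self.pan = [0.0, 0.0]   # screen-space translation
            self.zoom = 1.0         # uniform scale factor
            self.yaw = 0.0          # rotation about the vertical axis (radians)

        def reset(self):
            self.__init__()

    def update_view(view, joints, prev_joints, dead_zone=0.02):
        """Map per-frame hand movement to pan / zoom / rotate / reset.

        Assumed gesture vocabulary (not the authors' design):
          - both hands above the head -> reset the view
          - both hands tracked        -> zoom (change in hand distance) and
                                         rotate (change in hand-pair angle)
          - right hand only           -> pan by the hand's displacement
        """
        hl, hr = joints.get('hand_left'), joints.get('hand_right')
        phl, phr = prev_joints.get('hand_left'), prev_joints.get('hand_right')
        head = joints.get('head')

        # Reset: both hands clearly above the head.
        if hl and hr and head and hl[1] > head[1] and hr[1] > head[1]:
            view.reset()
            return

        if hl and hr and phl and phr:
            # Zoom: ratio of current to previous distance between the hands.
            dist = math.dist(hl[:2], hr[:2])
            prev_dist = math.dist(phl[:2], phr[:2])
            if prev_dist > dead_zone:
                view.zoom *= dist / prev_dist
            # Rotate: change in the angle of the line joining the two hands.
            ang = math.atan2(hr[1] - hl[1], hr[0] - hl[0])
            prev_ang = math.atan2(phr[1] - phl[1], phr[0] - phl[0])
            view.yaw += ang - prev_ang
        elif hr and phr:
            # Pan: follow the right hand's displacement, ignoring small jitter.
            dx, dy = hr[0] - phr[0], hr[1] - phr[1]
            if abs(dx) > dead_zone or abs(dy) > dead_zone:
                view.pan[0] += dx
                view.pan[1] += dy

In a viewer loop, each new skeleton frame from the depth sensor would be converted into such a joint dictionary and passed to update_view together with the previous frame's joints before rendering.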

Keywords

3D gesture interface · view manipulator · battlefield visualization



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Hye-Jin Park (1)
  • Jiyoung Park (2)
  • Myoung-Hee Kim (1, 2)
  1. Dept. of Computer Science & Engineering, Ewha Womans University, Seodaemun-gu, Korea
  2. Center for Computer Graphics and Virtual Reality, Ewha Womans University, Seodaemun-gu, Korea
