Personal and Ubiquitous Computing, Volume 19, Issue 2, pp 415–424

Design of a motion-based gestural menu-selection interface for a self-portrait camera

  • Shaowei Chu
  • Jiro Tanaka
Original Article

Abstract

Self-portraits allow users to capture memories, create art, and advance their photography skills. However, most existing camera interfaces are limited: they do not support life-size previews, deviceless remote control, or real-time control over important camera functions. In this paper, we describe a new self-portrait camera system with a gesture interface. The system projects a life-size preview and uses motion-based gestures to select menu options that control camera functions, including the shutter trigger, aperture size, shutter speed, and color balance. We experimentally evaluated the gesture-recognition accuracy and compared the system's effectiveness with that of a hand-held remote control. The results suggest that the gesture-based interface is effective for controlling self-portrait camera options and improves the user experience when taking self-portraits. The gesture interface is expected to be useful in developing next-generation self-portrait cameras.
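As one way to picture how such a motion-based gestural menu might work, the sketch below uses OpenCV frame differencing over fixed on-screen menu regions, with a dwell rule to confirm a selection. This is an illustrative assumption, not the authors' implementation: the region layout, motion threshold, and dwell count are invented for demonstration, and a real system would forward a confirmed selection to the camera's control API rather than print it.

    # Illustrative sketch only (not the paper's pipeline): sustained motion
    # inside a fixed on-screen menu region is treated as a menu selection.
    import cv2

    # Hypothetical menu regions (x, y, w, h) mapped to camera functions.
    MENU_REGIONS = {
        "shutter": (20, 20, 120, 80),
        "aperture": (20, 120, 120, 80),
        "shutter_speed": (20, 220, 120, 80),
        "color_balance": (20, 320, 120, 80),
    }
    MOTION_PIXELS = 800   # assumed: motion pixels needed inside a region per frame
    DWELL_FRAMES = 15     # assumed: consecutive motion frames that confirm a selection

    def run_demo():
        cap = cv2.VideoCapture(0)
        ok, prev = cap.read()
        if not ok:
            return
        prev_gray = cv2.cvtColor(cv2.flip(prev, 1), cv2.COLOR_BGR2GRAY)
        dwell = {name: 0 for name in MENU_REGIONS}

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.flip(frame, 1)           # mirror, like a self-portrait preview
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev_gray)  # frame differencing as a simple motion cue
            _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            prev_gray = gray

            for name, (x, y, w, h) in MENU_REGIONS.items():
                roi = motion[y:y + h, x:x + w]
                if cv2.countNonZero(roi) > MOTION_PIXELS:
                    dwell[name] += 1
                else:
                    dwell[name] = 0
                if dwell[name] >= DWELL_FRAMES:
                    print("selected:", name)     # a real system would change the camera setting here
                    dwell[name] = 0
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

            cv2.imshow("preview", frame)
            if cv2.waitKey(1) & 0xFF == 27:      # Esc quits the demo
                break

        cap.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        run_demo()

The dwell rule stands in for the paper's gesture recognizer; its purpose here is only to show how continuous motion evidence can be turned into a discrete menu selection without touching the camera.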

Keywords

Digital camera · Gesture user interface · Motion gestures · Image processing · Human–computer interaction

Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  1. Department of Computer Science, University of Tsukuba, Tsukuba, Japan
