Evaluation of a Mixed Reality Head-Mounted Projection Display to Support Motion Capture Acting

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10714)

Abstract

Motion capture acting is a challenging task; it requires trained and experienced actors who rely heavily on their acting and imagination skills to deliver believable performances. This is especially the case when preparation times are short and the scenery must be imagined, as is common for shoots in the gaming industry. To support actors in such cases, we developed a mixed reality application that allows digital scenery to be shown and emotions to be triggered while performing.

In this paper, we tested our hypothesis that a mixed reality head-mounted projection display can support motion capture acting, with the help of experienced motion capture actors who performed short acting scenes typical of game productions. We evaluated our prototype with four motion capture actors and four motion capture experts. Both groups considered our application helpful, especially as a rehearsal tool for preparing performances before the motions are captured in a studio. Actors and experts indicated that our application could reduce the time needed to prepare performances and support the setup of physical acting scenery.


Author information

Correspondence to Daniel Kade.



Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Kade, D., Lindell, R., Ürey, H., Özcan, O. (2018). Evaluation of a Mixed Reality Head-Mounted Projection Display to Support Motion Capture Acting. In: Cheok, A., Inami, M., Romão, T. (eds) Advances in Computer Entertainment Technology. ACE 2017. Lecture Notes in Computer Science, vol 10714. Springer, Cham. https://doi.org/10.1007/978-3-319-76270-8_2

  • DOI: https://doi.org/10.1007/978-3-319-76270-8_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-76269-2

  • Online ISBN: 978-3-319-76270-8
