Virtual Realities pp 16-32

Part of the Lecture Notes in Computer Science book series (LNCS, volume 8844)

Four Metamorphosis States in a Distributed Virtual (TV) Studio: Human, Cyborg, Avatar, and Bot – Markerless Tracking and Feedback for Realtime Animation Control

  • Jens Herder
  • Jeff Daemen
  • Peter Haufs-Brusberg
  • Isis Abdel Aziz
Chapter

Abstract

The major challenge in virtual studio technology is the interaction between actors and virtual objects. Virtual studios differ from other virtual environments in that two concurrent views always exist: the view of the TV consumer and the view of the talent in front of the camera. This paper illustrates interaction and feedback in front of the camera and compares different markerless person-tracking systems used for realtime animation. Entertaining animations are required, but sensors usually provide only a limited number of parameters. Additional context-based information allows the generation of appealing animations, which may be partly prefabricated. As the main example, we use a distributed live production in a virtual studio with two locally separated markerless tracking systems. The production featured a fully tracked actor, a cyborg (half actor, half graphics), an avatar, and a bot. All participants could interact and throw a virtual disc. This setup is compared with and mapped onto Milgram's continuum, and the technical challenges are described in detail.

Keywords

Markerless tracking · Virtual studio · Avatars · Virtual characters · Interaction feedback

Supplementary material

Supplementary material 1 (mp4 140438 KB)

References

  1. Herder, J.: Interactive content creation with virtual set environments. J. 3D-Forum Soc. 15(4), 53–56 (2001)
  2. Gibbs, S., Arapis, C., Breiteneder, C., Lalioti, V., Mostafawy, S., Speier, J.: Virtual studios: an overview. IEEE Multimedia 5(1), 18–35 (1998)
  3. Bunsen, O.: Verteilte Virtuelle TV-Produktion im Gigabit-Testbed West [Distributed virtual TV production in the Gigabit Testbed West]. Final report, Laboratory for Mixed Realities, Institut an der Kunsthochschule für Medien Köln, GMD Forschungszentrum Informationstechnik GmbH, Institut für Medienkommunikation, February 2000
  4. Grau, O., Pullen, T., Thomas, G.: A combined studio production system for 3-D capturing of live action and immersive actor feedback. IEEE Trans. Circuits Syst. Video Technol. 14(3), 370–380 (2004)
  5. Smith, A.R., Blinn, J.F.: Blue screen matting. In: SIGGRAPH '96 Conference Proceedings, pp. 259–268 (1996)
  6. Corazza, S., Mündermann, L., Chaudhari, A., Demattio, T., Cobelli, C., Andriacchi, T.: A markerless motion capture system to study musculoskeletal biomechanics: visual hull and simulated annealing approach. Ann. Biomed. Eng. 34(6), 1019–1029 (2006)
  7. Vizrt: Kenziko and Mammoth Graphics at IBC 2012: Kinetrak (2012). http://www.vizrt.com/news/newsgrid/35347/Kenziko_and_Mammoth_Graphics_at_IBC_2012
  8. Mammoth Graphics and Kenziko Ltd.: Kinetrak (2014). http://www.kinetrak.tv
  9. Price, M., Thomas, G.A.: 3D virtual production and delivery using MPEG-4. In: International Broadcasting Convention (IBC). IEEE (2000)
  10. Gibbs, S., Baudisch, P.: Interaction in the virtual studio. In: SIGGRAPH Computer Graphics, vol. 30, pp. 29–32. ACM Press, New York, November 1996. ISSN 0097-8930
  11. Kim, N., Woo, W., Kim, G., Park, C.M.: 3-D virtual studio for natural inter-"acting". IEEE Trans. Syst. Man Cybern. Part A: Syst. Hum. 36(4), 758–773 (2006)
  12. Wöldecke, B., Marinos, D., Pogscheba, P., Geiger, C., Herder, J., Schwirten, T.: radarTHEREMIN – creating musical expressions in a virtual studio environment. In: Proceedings of ISVRI 2011 (International Symposium on VR Innovation), Singapore, pp. 345–346 (2011)
  13. Marinos, D., Geiger, C., Herder, J.: Large-area moderator tracking and demonstrational configuration of position based interactions for virtual studios. In: 10th European Interactive TV Conference, Berlin (2012)
  14. Herder, J., Wilke, M., Heimbach, J., Göbel, S., Marinos, D.: Simple actor tracking for virtual TV studios using a photonic mixing device. In: 12th International Conference on Human and Computer, Hamamatsu/Aizu-Wakamatsu/Düsseldorf, University of Aizu (2009)
  15. Flasko, M., Pogscheba, P., Herder, J., Vonolfen, W.: Heterogeneous binocular camera-tracking in a virtual studio. In: 8. Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Wedel (2011)
  16. Hough, G., Athwal, C., Williams, I.: Advanced occlusion handling for virtual studios. In: Lee, G., Howard, D., Kang, J.J., Ślęzak, D. (eds.) ICHIT 2012. LNCS, vol. 7425, pp. 287–294. Springer, Heidelberg (2012)
  17. Carranza, J., Theobalt, C., Magnor, M.A., Seidel, H.-P.: Free-viewpoint video of human actors. ACM Trans. Graph. 22(3), 569–577 (2003)
  18. Brooks, A., Czarowicz, A.: Markerless motion tracking: MS Kinect & Organic Motion OpenStage. In: 9th International Conference on Disability, Virtual Reality and Associated Technologies, vol. 9, pp. 435–437. ICDVRAT and The University of Reading (2012)
  19. Organic Motion Inc.: OpenStage 2.0 technical overview, June 2014. http://www.organicmotion.com/openstage-2-0-technical-overview/
  20. Livingston, M., Sebastian, J., Ai, Z., Decker, J.: Performance measurements for the Microsoft Kinect skeleton. In: Virtual Reality Short Papers and Posters (VRW), pp. 119–120. IEEE (2012)
  21. Microsoft: Kinect for Windows SDK documentation, July 2014
  22. Daemen, J., Haufs-Brusberg, P., Herder, J.: Markerless actor tracking for virtual (TV) studio applications. In: International Joint Conference on Awareness Science and Technology & Ubi-Media Computing, iCAST 2013 & UMEDIA 2013. IEEE (2013)
  23. Milgram, P., Takemura, H., Utsumi, A., Kishino, F.: Augmented reality: a class of displays on the reality-virtuality continuum. Proc. SPIE 2351, 282–292 (1995)
  24. Holz, T., Dragone, M., O'Hare, G.: Where robots and virtual agents meet. Int. J. Soc. Robot. 1(1), 83–93 (2009)
  25. Herder, J., Cohen, M.: Enhancing perspicuity of objects in virtual reality environments. In: CT '97 – Second International Cognitive Technology Conference, pp. 228–237. IEEE Press, August 1997. ISBN 0-8186-8084-9
  26. Vierjahn, T., Wöldecke, B., Geiger, C., Herder, J.: Improved direction signalization technique employing vibrotactile feedback. In: 11th Virtual Reality International Conference, VRIC 2009 (2009)
  27. Wöldecke, B., Vierjahn, T., Flasko, M., Herder, J., Geiger, C.: Steering actors through a virtual set employing vibro-tactile feedback. In: TEI 2009: Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, pp. 169–174. ACM, New York (2009)
  28. Ludwig, P., Büchel, J., Herder, J., Vonolfen, W.: InEarGuide – a navigation and interaction feedback system using in-ear headphones for virtual TV studio productions. In: 9. Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, Düsseldorf (2012)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Jens Herder (1)
  • Jeff Daemen (1)
  • Peter Haufs-Brusberg (1)
  • Isis Abdel Aziz (1)

  1. Department of Media, FH Düsseldorf, University of Applied Sciences, Düsseldorf, Germany