
TrackLine: Refining touch-to-track Interaction for Camera Motion Control on Mobile Devices

  • Axel Hoesl
  • Sarah Aragon Bartsch
  • Andreas Butz
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10514)

Abstract

Controlling a film camera to follow an actor or object in an aesthetically pleasing way is a highly complex task, which takes professionals years to master. It entails several sub-tasks, namely (1) selecting or identifying and (2) tracking the object of interest, (3) specifying its intended location in the frame (e.g., at 1/3 or 2/3 horizontally) and (4) timing all necessary camera motions such that they appear smooth in the resulting footage. Traditionally, camera operators controlled the camera directly or remotely and practiced their motions over several repeated takes until the result met their own quality criteria. Automated motion control systems today assist with the timing and tracking sub-tasks, but leave the other two to the camera operator, using input methods such as touch-to-track, which still present challenges in timing and coordination. We designed a refined input method called TrackLine, which decouples target selection from location selection and adds further automation while improving control. In a first user study controlling a virtual camera, we compared TrackLine to touch-to-track and traditional joystick control and found that participants achieved objectively more accurate results with less effort, which was also confirmed by their subjective ratings.
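The core idea described above, decoupling which object is tracked from where it should sit in the frame, can be illustrated with a small sketch. The following Python snippet is a minimal, hypothetical illustration of that idea only: it assumes a tracker that reports the followed object's normalized horizontal position and a camera that accepts incremental pan commands. Class names, gains, and the smoothing scheme are assumptions for illustration, not the implementation evaluated in the paper.

    # Sketch of a TrackLine-style controller: the operator places a target line at a
    # desired horizontal frame position (e.g., 1/3), a tracker reports the object's
    # current position, and the camera is panned so the object settles on that line.
    class TrackLineController:
        def __init__(self, target_x: float = 1 / 3, gain: float = 2.0, smoothing: float = 0.85):
            self.target_x = target_x      # desired horizontal position in the frame (0..1)
            self.gain = gain              # proportional gain mapping frame error to pan speed
            self.smoothing = smoothing    # exponential smoothing to keep the motion fluid
            self._pan_speed = 0.0

        def set_track_line(self, x: float) -> None:
            """Location selection: move the track line independently of the tracked object."""
            self.target_x = min(max(x, 0.0), 1.0)

        def update(self, tracked_x: float, dt: float) -> float:
            """Compute a smoothed pan increment from the tracker's current object position."""
            error = tracked_x - self.target_x              # distance of the object from the line
            raw_speed = self.gain * error                  # proportional correction
            self._pan_speed = (self.smoothing * self._pan_speed
                               + (1.0 - self.smoothing) * raw_speed)
            return self._pan_speed * dt                    # pan step for this frame

    # Usage: keep an object currently detected at x = 0.55 on the 1/3 line.
    controller = TrackLineController(target_x=1 / 3)
    pan_step = controller.update(tracked_x=0.55, dt=1 / 30)

In this sketch, target selection (what the tracker follows) and location selection (where the track line sits) are handled by separate calls, while the smoothing term stands in for the additional automation that keeps the resulting camera motion fluid.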

Keywords

Camera motion · Motion control · Image-based control · User interface · User-centered design

Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

  • Axel Hoesl (1)
  • Sarah Aragon Bartsch (1)
  • Andreas Butz (1)

  1. LMU Munich, Munich, Germany
