MediaSync, pp. 565–592

Camera Synchronization for Panoramic Videos

  • Vamsidhar R. Gaddam
  • Ragnar Langseth
  • Håkon K. Stensland
  • Carsten Griwodz
  • Michael Riegler
  • Tomas Kupka
  • Håvard Espeland
  • Dag Johansen
  • Håvard D. Johansen
  • Pål Halvorsen
Chapter

Abstract

Multi-camera systems are frequently used in applications such as panoramic video creation, free-viewpoint rendering, and 3D reconstruction. A critical aspect for visual quality in these systems is that the cameras are closely synchronized. In our research, we require high-definition panorama videos generated in real time from several cameras operating in parallel. This is an essential part of our sports analytics system, Bagadus, which has several synchronization requirements. The system is currently in use for soccer games at the Alfheim stadium for Tromsø IL and at the Ullevaal stadium for the Norwegian national soccer team. Each Bagadus installation combines the video from five 2K cameras into a single 50 fps cylindrical panorama video. Due to proper camera synchronization, the produced panoramas exhibit neither ghosting effects nor other visual inconsistencies at the seams. Our panorama videos are designed to support several members of the trainer team at the same time: using our system, they can pan, tilt, and zoom interactively and independently over the entire field, from an overview shot to close-ups of individual players at arbitrary locations. To create such panoramas, each camera covers one part of the field with small overlapping regions, and the individual frames are transformed and stitched together into a single view. We faced two main synchronization challenges in the panorama generation process. First, to stitch frames together without visual artifacts and inconsistencies due to motion, the shutters of the cameras had to be synchronized with sub-millisecond accuracy. Second, to avoid software readjustment of color and brightness around the seams between cameras, the exposure settings were synchronized as well. This chapter describes these synchronization mechanisms as designed, implemented, evaluated, and integrated in the Bagadus system.
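To illustrate why sub-millisecond shutter synchronization matters for seam quality, the following back-of-the-envelope sketch estimates the ghosting displacement, in pixels, of a fast-moving object crossing a seam between two cameras whose shutters fire a given offset apart. The numeric values (ball speed, pixel density of the panorama) are illustrative assumptions, not figures from the chapter:

```python
def seam_displacement_px(offset_s: float, speed_mps: float, px_per_m: float) -> float:
    """Pixel displacement of an object crossing a seam when two
    cameras expose offset_s seconds apart: distance travelled in the
    offset interval, scaled by the panorama's pixel density."""
    return offset_s * speed_mps * px_per_m

# Illustrative assumptions: a hard-kicked ball at ~30 m/s, and a
# panorama sampling the field at roughly 20 px per metre.
coarse = seam_displacement_px(0.010, 30.0, 20.0)   # 10 ms offset (half a 50 fps frame)
fine = seam_displacement_px(0.0005, 30.0, 20.0)    # 0.5 ms offset
print(round(coarse, 2), round(fine, 2))
```

Under these assumptions, a half-frame offset smears the ball by several pixels across the seam, while a sub-millisecond offset keeps the displacement well below one pixel, which is why the shutters must be aligned far more tightly than the frame period alone would suggest.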

Keywords

Camera array · Panorama video · Frame stitching · Shutter synchronization · Exposure synchronization

Acknowledgements

This work has been performed in the context of the iAD Centre for Research-based Innovation (project number 174867) and is also supported in part by the EONS project (project number 231687), both funded by the Research Council of Norway. Furthermore, numerous students and researchers have worked on Bagadus or post-Bagadus solutions. For the synchronization of cameras, the authors want to acknowledge, in alphabetical order: Alexander Eichhorn, Martin Stensgård, and Simen Sægrov.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Vamsidhar R. Gaddam (1)
  • Ragnar Langseth (2)
  • Håkon K. Stensland (1)
  • Carsten Griwodz (1)
  • Michael Riegler (1)
  • Tomas Kupka (2)
  • Håvard Espeland (2)
  • Dag Johansen (3)
  • Håvard D. Johansen (3)
  • Pål Halvorsen (1)
  1. Simula Research Laboratory, Fornebu, Norway
  2. ForzaSys AS, Fornebu, Norway
  3. UiT – The Arctic University of Norway, Tromsø, Norway