Feature-Based Synchronization of Video and Background Music

  • Jong-Chul Yoon
  • In-Kwon Lee
  • Hyun-Chul Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4153)


We synchronize background music with video by adjusting the timing of the music, an approach that minimizes damage to the music data. Starting from a MIDI file and video data, feature points are extracted from both media, paired, and then synchronized using dynamic programming to time-scale the music. We also introduce the music graph, a directed graph that encapsulates connections between many short music sequences. By traversing the music graph, we can generate large amounts of new background music, in which we expect to find a sequence that matches the video features better than the original music.
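The pairing step described above can be sketched as a dynamic-programming alignment of two monotone lists of feature timestamps. The following is a minimal illustration, not the paper's actual formulation: the cost function (absolute timing discrepancy) and the `skip_cost` penalty for unmatched features are assumptions for the sake of the example.

```python
def align_features(music_times, video_times, skip_cost=1.0):
    """Pair music and video feature times by dynamic programming.

    Hypothetical sketch: matched pairs pay their timing discrepancy,
    unmatched features pay a fixed skip penalty; the real cost model
    in the paper may differ.
    """
    m, n = len(music_times), len(video_times)
    INF = float("inf")
    # dp[i][j] = best cost aligning first i music and first j video features
    dp = [[INF] * (n + 1) for _ in range(m + 1)]
    back = [[None] * (n + 1) for _ in range(m + 1)]
    dp[0][0] = 0.0
    for i in range(m + 1):
        for j in range(n + 1):
            c = dp[i][j]
            if c == INF:
                continue
            if i < m and j < n:  # pair music feature i with video feature j
                cost = c + abs(music_times[i] - video_times[j])
                if cost < dp[i + 1][j + 1]:
                    dp[i + 1][j + 1] = cost
                    back[i + 1][j + 1] = (i, j, "pair")
            if i < m and c + skip_cost < dp[i + 1][j]:  # skip a music feature
                dp[i + 1][j] = c + skip_cost
                back[i + 1][j] = (i, j, "skip")
            if j < n and c + skip_cost < dp[i][j + 1]:  # skip a video feature
                dp[i][j + 1] = c + skip_cost
                back[i][j + 1] = (i, j, "skip")
    # Recover the matched pairs by walking the back-pointers.
    pairs, i, j = [], m, n
    while back[i][j] is not None:
        pi, pj, op = back[i][j]
        if op == "pair":
            pairs.append((music_times[pi], video_times[pj]))
        i, j = pi, pj
    pairs.reverse()
    return pairs, dp[m][n]
```

Each matched pair then defines a breakpoint for piecewise time-scaling of the MIDI timeline, so that every music feature lands on its corresponding video feature.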


Keywords: Feature Point · Video Clip · Camera Movement · Music Video · Feature Pair





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jong-Chul Yoon¹
  • In-Kwon Lee¹
  • Hyun-Chul Lee¹

  1. Dept. of Computer Science, Yonsei University, Korea
