Publication - Professor Dave Cliff

    Automated Composition of Picture-Synched Music Soundtracks for Movies

    Citation

    Dassani, V., Bird, J. & Cliff, D., 2019, ‘Automated Composition of Picture-Synched Music Soundtracks for Movies’. In: Proceedings of the 16th ACM SIGGRAPH European Conference on Visual Media Production. Association for Computing Machinery (ACM).

    Abstract

    We describe the implementation of, and early results from, a system that automatically composes picture-synched musical soundtracks for videos and movies. We use the phrase _picture-synched_ to mean that the structure of the automatically composed music is determined by visual events in the input movie, i.e. the final music is synchronised to visual events and features such as cut transitions or within-shot key-frame events. Our system combines automated video analysis and computer-generated music-composition techniques to create unique soundtracks in response to the video input, and can be thought of as an initial step in creating a computerised replacement for a human composer writing music to fit the picture-locked edit of a movie. Working only from the video information in the movie, key features are extracted from the input video using video-analysis techniques and then fed into a machine-learning-based music-generation tool, which composes a piece of music from scratch. The resulting soundtrack is tied to video features, such as scene-transition markers and scene-level energy values, and is unique to the input video. Although the system we describe here is only a preliminary proof-of-concept, user evaluations of its output have been positive.
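    The paper's actual pipeline is not reproduced here, but as a rough illustration of the kind of picture-synched feature extraction described in the abstract, the following minimal sketch detects cut transitions and per-scene energy values by frame differencing with OpenCV. All names, the threshold, and the feature definitions are illustrative assumptions, not the authors' implementation.

```python
import cv2  # assumed dependency: OpenCV for frame-level video analysis


def extract_picture_sync_features(video_path, cut_threshold=30.0):
    """Illustrative sketch: find cut transitions and scene-level energy
    via mean absolute frame differencing (threshold is an assumption)."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if FPS is unavailable
    cuts, scene_diffs, energies = [], [], []
    prev_gray, frame_idx = None, 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            diff = float(cv2.absdiff(gray, prev_gray).mean())
            if diff > cut_threshold:
                # Treat a large inter-frame change as a cut transition:
                # record its timestamp and close off the current scene.
                cuts.append(frame_idx / fps)
                energies.append(sum(scene_diffs) / max(len(scene_diffs), 1))
                scene_diffs = []
            else:
                scene_diffs.append(diff)
        prev_gray = gray
        frame_idx += 1

    cap.release()
    # Energy of the final scene after the last detected cut.
    energies.append(sum(scene_diffs) / max(len(scene_diffs), 1))
    return {"cut_times_sec": cuts, "scene_energy": energies}
```

    Features of this kind (cut timestamps and per-scene energy) could then parameterise a music generator, for example by forcing phrase boundaries at cut times and mapping scene energy to tempo or dynamics, which is the general shape of the pipeline the abstract describes.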

    Full details in the University publications repository