Ho Yin Au, Jie Chen, Junkun Jiang, Yike Guo. ReChoreoNet: Repertoire-based Dance Re-choreography with Music-conditioned Temporal and Style Clues[J]. Machine Intelligence Research, 2024, 21(4): 771-781. DOI: 10.1007/s11633-023-1478-9

ReChoreoNet: Repertoire-based Dance Re-choreography with Music-conditioned Temporal and Style Clues

Generating dance that temporally and aesthetically matches music is challenging in three respects. First, the generated motion should be beat-aligned with local musical features. Second, the global aesthetic styles of motion and music should match. Third, the generated motion should be diverse and non-self-repeating. To address these challenges, we propose ReChoreoNet, which re-choreographs high-quality dance motion for a given piece of music. A data-driven learning strategy efficiently correlates the temporal connections between music and motion in a progressively learned cross-modality embedding space. The beat-aligned content motion is then used as autoregressive context and as a control signal for a normalizing-flow model, which transfers the style of a prototype motion to the final generated dance. In addition, we present an aesthetically labelled music-dance repertoire (MDR) for both efficient learning of the cross-modality embedding and understanding of the aesthetic connections between music and motion. We demonstrate that our repertoire-based framework is robustly extensible in both content and style. Both quantitative and qualitative experiments validate the effectiveness of our proposed model.
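To make the conditioning idea in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of a conditional affine-coupling step, the standard building block of normalizing flows, in which the coupling network is conditioned on a beat-aligned content-motion embedding so that prototype style features can be encoded and decoded against new content. All class names, feature dimensions, and the overall layout are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """One coupling step: half of the style features are transformed with a
    scale/shift predicted from the other half plus the content-motion context."""
    def __init__(self, style_dim: int, context_dim: int, hidden: int = 256):
        super().__init__()
        self.half = style_dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + context_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2 * (style_dim - self.half)),
        )

    def forward(self, x, context):
        # Split style features; predict scale/shift from the untouched half + context.
        x_a, x_b = x[:, : self.half], x[:, self.half :]
        scale, shift = self.net(torch.cat([x_a, context], dim=-1)).chunk(2, dim=-1)
        scale = torch.tanh(scale)              # keep the Jacobian well behaved
        y_b = x_b * torch.exp(scale) + shift
        log_det = scale.sum(dim=-1)            # log-determinant for flow training
        return torch.cat([x_a, y_b], dim=-1), log_det

    def inverse(self, y, context):
        # Exact inverse of forward(), conditioned on the same (or new) context.
        y_a, y_b = y[:, : self.half], y[:, self.half :]
        scale, shift = self.net(torch.cat([y_a, context], dim=-1)).chunk(2, dim=-1)
        scale = torch.tanh(scale)
        x_b = (y_b - shift) * torch.exp(-scale)
        return torch.cat([y_a, x_b], dim=-1)

# Toy usage: 48-D per-frame style features conditioned on a 32-D content embedding.
flow = ConditionalAffineCoupling(style_dim=48, context_dim=32)
style = torch.randn(8, 48)        # prototype-motion style features (assumed shape)
context = torch.randn(8, 32)      # beat-aligned content-motion embedding (assumed shape)
z, log_det = flow(style, context)         # encode style into the latent space
reconstructed = flow.inverse(z, context)  # decode conditioned on content
```

In a full model, several such coupling steps would be stacked and trained by maximum likelihood, with the content context supplied autoregressively frame by frame; the sketch only illustrates the single conditioned coupling transform.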
