MOTION SYNTHESIS FOR SYNCHRONIZING WITH STREAMING MUSIC BY SEGMENT-BASED SEARCH ON METADATA MOTION GRAPHS
Jianfeng Xu, Koichi Takagi, Shigeyuki Sakazawa

Abstract
Music and dance are two major forms of entertainment in daily life, and the fact that people dance to music suggests the possibility of synchronizing human motion with music. In this paper, we present the first system to automatically synthesize human motion synchronized with streaming music using both rhythm and intensity features. In our system, a motion capture database is reorganized beforehand into a novel graph-based representation with metadata (called a metadata motion graph), which is specially designed for the streaming application. Each time a portion of music data arrives as a segment, our system searches for the best path for that segment on the metadata motion graph. This approach, whose effectiveness is demonstrated in a user study, composes motion segment by segment such that the motion (1) is synchronized with the music at the beat level within a sufficiently short period, (2) connects seamlessly with the previous segment, and (3) retains enough synchronization capacity for the remaining music, regardless of its length.
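To make the segment-based search concrete, the following is a minimal sketch, assuming a toy graph whose nodes are motion clips annotated with beat counts and an average intensity. The Node class, the intensity-mismatch cost, and the depth-first enumeration are illustrative assumptions, not the authors' actual metadata motion graph or search algorithm.

```python
# Hypothetical sketch of segment-by-segment path search on a motion graph.
# All names and features here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str           # motion clip identifier
    beats: int          # number of motion beats in the clip
    intensity: float    # average motion intensity of the clip
    edges: list = field(default_factory=list)  # seamlessly connectable successors

def best_path(start, segment_beats, segment_intensity):
    """Find a path from `start` whose total beats exactly cover the music
    segment, minimizing intensity mismatch (exhaustive DFS for brevity)."""
    best = (float("inf"), None)

    def dfs(node, beats_left, cost, path):
        nonlocal best
        if beats_left == 0:               # segment fully covered at beat level
            if cost < best[0]:
                best = (cost, path)
            return
        for nxt in node.edges:
            if nxt.beats <= beats_left:   # keep beat-level alignment exact
                dfs(nxt, beats_left - nxt.beats,
                    cost + abs(nxt.intensity - segment_intensity),
                    path + [nxt.name])

    dfs(start, segment_beats, 0.0, [])
    return best  # (cost, clip sequence), or (inf, None) if no path fits

# Toy graph: each node links to clips it can blend into without a visible seam.
a = Node("walk", 2, 0.3)
b = Node("spin", 4, 0.8)
c = Node("step", 2, 0.5)
a.edges = [b, c]
b.edges = [a, c]
c.edges = [a, b]

# An 8-beat streaming segment with fairly high intensity; the search starts
# at the node where the previous segment's path ended, so consecutive
# segments connect seamlessly. The start node itself contributes no beats.
print(best_path(a, 8, 0.7))
```

In this toy formulation, restricting successors to clips in `edges` stands in for seamless transitions, and requiring the candidate path's beats to exactly cover the segment stands in for beat-level synchronization; a practical system would additionally prune paths whose final node cannot reach enough of the graph to serve the remaining music.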