(paper presented at ICCC2016 in Paris)
Recent advances in deep learning have enabled the extraction of high-level features from raw sensor data, opening up new possibilities in many fields, including computer-generated choreography. The Lulu Art Group, in collaboration with Peltarion, has developed chor-rnn, a system for generating novel choreographic material in the nuanced choreographic language and style of an individual choreographer. It also shows promising results in producing higher-level compositional cohesion, rather than just generating sequences of movement. At the core of chor-rnn is a deep recurrent neural network trained on raw motion capture data that can generate new dance sequences for a solo dancer. Chor-rnn can be used for collaborative human-machine choreography or as a creative catalyst, serving as inspiration for a choreographer.
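The full paper describes the exact architecture; the sketch below only illustrates the general idea of a recurrent network trained to predict the next motion-capture frame and then fed its own output to generate movement. It assumes PyTorch, a Kinect-style 25-joint skeleton with 3D coordinates, and placeholder data, and is not the authors' actual chor-rnn implementation.

# Minimal sketch: an LSTM that learns to predict the next motion-capture frame.
# Assumptions (not from the paper text above): PyTorch, 25 joints x 3 coordinates,
# random tensors standing in for real motion capture recordings.
import torch
import torch.nn as nn

N_JOINTS = 25                  # assumed Kinect-style skeleton
FRAME_DIM = N_JOINTS * 3       # x, y, z per joint

class MotionRNN(nn.Module):
    def __init__(self, hidden_size=512, num_layers=3):
        super().__init__()
        self.lstm = nn.LSTM(FRAME_DIM, hidden_size, num_layers, batch_first=True)
        self.out = nn.Linear(hidden_size, FRAME_DIM)

    def forward(self, frames, state=None):
        # frames: (batch, time, FRAME_DIM); predict the next frame at each step
        h, state = self.lstm(frames, state)
        return self.out(h), state

model = MotionRNN()

# Training step (teacher forcing): predict frame t+1 from frames 0..t.
mocap = torch.randn(8, 100, FRAME_DIM)        # stand-in for real motion capture data
pred, _ = model(mocap[:, :-1])
loss = nn.functional.mse_loss(pred, mocap[:, 1:])
loss.backward()

# Generation: seed with one frame, then feed the model's own output back in.
frame, state = mocap[:1, :1], None
generated = []
for _ in range(200):
    frame, state = model(frame, state)
    generated.append(frame)
choreography = torch.cat(generated, dim=1)    # (1, 200, FRAME_DIM)
print(choreography.shape)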
Results:
After 10 minutes of training, the movements are more or less random:
After 6 hours of training, the RNN has learned how the joints are related and makes its first careful, somewhat wobbly attempts at dancing:
After 48 hours of training, it has become an accomplished dancer, making up the choreography as it goes:
Choreography: Louise Crnkovic-Friis
Neural networks: Luka Crnkovic-Friis
Full article text
Preprint PDF