An AI Based Choreography Project

Recently I participated in an experimental choreography project at Beijing Dance Academy.
It was a great pleasure working with Prof. Liu and Prof. Ren.

In this project, we used an optical camera rather than a commercial depth sensor (e.g. Kinect) for skeleton tracking, in order to avoid several deficiencies of such sensors: the limited sensing range (2 to 8 meters), unstable detection of targets behind transparent obstacles, and misidentification of certain gestures.

With the aid of OpenPose, it is possible to detect human bodies and extract accurate skeletons in real time. Our team did a great job building a robust gesture-detection framework and creating fantastic real-time interactive visual effects via Processing and Unity.
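To give a flavour of how gesture detection on top of OpenPose can work: when run with `--write_json`, OpenPose emits one JSON file per frame containing a `people` list, each person carrying a flat `pose_keypoints_2d` array of `(x, y, confidence)` triples. The sketch below is a minimal, hypothetical example (not the project's actual framework) that checks a toy "hands raised" gesture using the BODY_25 keypoint layout.

```python
# BODY_25 keypoint indices used by OpenPose's default model
NOSE, R_WRIST, L_WRIST = 0, 4, 7

def keypoint(kps, idx):
    """Return (x, y, confidence) for keypoint idx from the flat list."""
    return kps[3 * idx], kps[3 * idx + 1], kps[3 * idx + 2]

def hands_raised(frame_json, min_conf=0.3):
    """Toy gesture test: both wrists above the nose.

    Image coordinates grow downward, so "above" means a smaller y.
    frame_json is one parsed OpenPose --write_json output frame.
    """
    for person in frame_json.get("people", []):
        kps = person["pose_keypoints_2d"]
        nose = keypoint(kps, NOSE)
        rw = keypoint(kps, R_WRIST)
        lw = keypoint(kps, L_WRIST)
        if min(nose[2], rw[2], lw[2]) < min_conf:
            continue  # skip low-confidence detections
        if rw[1] < nose[1] and lw[1] < nose[1]:
            return True
    return False
```

A real framework would of course smooth keypoints over time and recognise far richer gestures, but the per-frame structure is the same.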

We set up a deep learning server near the stage to receive the camera feed; it analysed both the body and hand gestures of the dancer, then sent the results to the main control server for the final rendering.
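The hand-off between the two servers could be as simple as fire-and-forget datagrams: the analysis box streams per-frame results, and the renderer consumes whatever arrives. The post doesn't specify the actual transport or message format, so the following is a hypothetical sketch assuming JSON over UDP, with a made-up control-server address.

```python
import json
import socket

def pack_frame(frame_id, gesture, skeleton):
    """Serialise one frame of analysis results as a compact JSON message."""
    return json.dumps({
        "frame": frame_id,
        "gesture": gesture,    # e.g. "hands_raised" or None
        "skeleton": skeleton,  # flat [x, y, conf, ...] keypoint list
    }).encode("utf-8")

def send_frame(sock, addr, frame_id, gesture, skeleton):
    """Fire-and-forget UDP send; a dropped frame is harmless for live visuals."""
    sock.sendto(pack_frame(frame_id, gesture, skeleton), addr)

# Hypothetical address of the main control server:
CONTROL_ADDR = ("192.168.1.10", 9000)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_frame(sock, CONTROL_ADDR, 0, "hands_raised", [0.0] * 75)
```

UDP suits this kind of stage pipeline because a late skeleton frame is useless anyway; tools like Processing and Unity commonly receive such streams via plain sockets or OSC.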


This was the very first time (in China) that deep learning and real-time interactive effects played such important roles in a dance performance, and we were all proud of what we did.
(๑•ㅂ•́)و✧