A music generation system based on object tracking and a CGAN

The project was inspired by MidiNet: A Convolutional Generative Adversarial Network for Symbolic-domain Music Generation.

Building on that idea, we further explore the possibility that the rhythm of a moving object could control the style of the generated music.

Methodology

  1. We use OpenCV to track an object and compute its speed, moving direction, and size. The speed controls the tempo of the music, the direction controls its tone, and the size, as an approximation of the object's distance to the camera, controls its volume (see the tracking sketch after this list).
  2. We feed these movement features to the network as its condition to generate movement-controlled music (see the conditioning sketch after this list).
  3. We use two threads, one to analyze the video and one to generate the music, to achieve near real-time music generation (see the threading sketch after this list).
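
The following is a minimal sketch of step 1. It assumes a hypothetical `track_features()` helper and simple background-subtraction tracking, which may differ from the tracker actually used in the repository; the mapping of the three features to tempo, tone, and volume follows the description above.

```python
import math
import cv2

def track_features(video_source=0):
    """Yield (speed, direction, size) per frame for the largest moving object."""
    cap = cv2.VideoCapture(video_source)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    prev_center = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        largest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(largest)
        center = (x + w / 2, y + h / 2)

        if prev_center is not None:
            dx = center[0] - prev_center[0]
            dy = center[1] - prev_center[1]
            speed = math.hypot(dx, dy)       # pixels per frame -> tempo
            direction = math.atan2(dy, dx)   # radians -> tone
            size = w * h                     # bounding-box area -> volume
            yield speed, direction, size
        prev_center = center

    cap.release()
```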
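
The next sketch illustrates step 2 with a PyTorch-style conditional generator. The layer sizes, the 3-dimensional condition vector (speed, direction, size), and the output length are illustrative assumptions, not the repository's actual architecture.

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    def __init__(self, noise_dim=100, condition_dim=3, note_dim=128):
        super().__init__()
        # The movement features are concatenated with the noise vector,
        # so every generated bar is conditioned on the current motion.
        self.net = nn.Sequential(
            nn.Linear(noise_dim + condition_dim, 256),
            nn.ReLU(),
            nn.Linear(256, note_dim),
            nn.Tanh(),
        )

    def forward(self, noise, condition):
        return self.net(torch.cat([noise, condition], dim=1))

# Example: condition one sample on (speed, direction, size).
generator = ConditionalGenerator()
noise = torch.randn(1, 100)
condition = torch.tensor([[12.5, 0.8, 3400.0]])
bar = generator(noise, condition)
```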
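
Finally, a minimal sketch of step 3, reusing the `track_features()` and `generator` sketches above. The queue-based hand-off between the two threads is an illustrative design choice, not necessarily how the repository wires its threads together.

```python
import queue
import threading

import torch

features = queue.Queue(maxsize=1)   # hold only the freshest movement features

def analysis_thread():
    # Producer: track the video and publish (speed, direction, size).
    for speed, direction, size in track_features():
        try:
            features.get_nowait()    # drop a stale item so the music stays live
        except queue.Empty:
            pass
        features.put((speed, direction, size))

def generation_thread():
    # Consumer: turn the latest movement features into the next bar of music.
    while True:
        speed, direction, size = features.get()
        condition = torch.tensor([[speed, direction, size]])
        bar = generator(torch.randn(1, 100), condition)
        # ... render `bar` to MIDI/audio here ...

threading.Thread(target=analysis_thread).start()
threading.Thread(target=generation_thread).start()
```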
