We have developed a program in C++ using openFrameworks that communicates with Ableton and the Leap Motion, displays shapes with colors, movements and effects, and broadcasts music.
Step 1 - SETUP and SLEEP MODE
The basic shapes and music of each sequence, which we call a 'state of mind', are loaded. Shapes and their colors come from SVG files, and music comes in via MIDI. For each state of mind, we have defined specific actions and their results, as well as the physics of the movements. Once everything is set up, we project the visuals and broadcast the sound (a sketch of this loading step follows below). When there is no action from the user, the installation stays in sleep mode.
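To give a rough idea of how this loading could look in openFrameworks, here is a minimal sketch. The StateOfMind struct, the file names and the use of the ofxSVG and ofxMidi addons are illustrative assumptions, not the installation's exact code.

```cpp
// Minimal sketch of loading the "states of mind" in openFrameworks.
// Assumes the ofxSVG and ofxMidi addons; struct, file names and values are placeholders.
#include "ofMain.h"
#include "ofxSVG.h"
#include "ofxMidi.h"

struct StateOfMind {
    ofxSVG shapes;     // basic shapes and their colors, loaded from an SVG file
    int midiProgram;   // which MIDI program/clip to trigger in Ableton
    float damping;     // example physics parameter for the movements
};

class ofApp : public ofBaseApp {
public:
    std::vector<StateOfMind> states;
    std::size_t current = 0;
    ofxMidiOut midiOut;

    void setup() override {
        // One SVG per state of mind (file names are hypothetical).
        std::vector<std::string> files = {"calm.svg", "joy.svg", "anger.svg"};
        states.resize(files.size());
        for (std::size_t i = 0; i < files.size(); ++i) {
            states[i].shapes.load(files[i]);
            states[i].midiProgram = static_cast<int>(i);
            states[i].damping = 0.95f;
        }
        // Open the MIDI port that Ableton listens to.
        midiOut.openPort(0);
    }

    void draw() override {
        // Sleep mode: simply project the current state's shapes until a gesture arrives.
        states[current].shapes.draw();
    }
};
```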
Step 2 - INTERACTIONS
When a user places their hands over the Leap Motion, it wakes the installation up. Each gesture is mapped to a specific reaction, both on the shapes (colors, movement, ...) and on the music (filters, added notes or noise, ...). So when a gesture is detected, there is an immediate response!
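Here is a minimal sketch of this wake-up and gesture-to-reaction mapping, assuming the Leap Motion C++ SDK (v2) and the ofxMidi addon; the thresholds, MIDI channel/CC numbers and the chosen reactions are illustrative placeholders, not our exact mapping.

```cpp
// Rough sketch: hands over the sensor wake the installation, and gestures
// drive both a visual parameter and MIDI messages sent to Ableton.
#include "Leap.h"
#include "ofMain.h"
#include "ofxMidi.h"

class InteractionMapper {
public:
    Leap::Controller controller;
    ofxMidiOut midiOut;
    bool sleeping = true;
    float brightness = 255.f; // example visual parameter applied to the shapes when drawing

    void update() {
        Leap::Frame frame = controller.frame();

        // No hands over the sensor: stay in (or fall back to) sleep mode.
        if (frame.hands().isEmpty()) {
            sleeping = true;
            return;
        }
        sleeping = false; // a hand woke the installation up

        Leap::Hand hand = frame.hands().frontmost();

        // Map the palm height to a visual parameter (e.g. shape brightness)...
        float height = ofClamp(hand.palmPosition().y, 100.f, 400.f);
        brightness = ofMap(height, 100.f, 400.f, 0.f, 255.f);

        // ...and to a MIDI control change that Ableton can map to a filter cutoff.
        midiOut.sendControlChange(1, 74, static_cast<int>(ofMap(height, 100.f, 400.f, 0, 127)));

        // A pinch could add a note (received by Ableton as a MIDI note-on).
        if (hand.pinchStrength() > 0.8f) {
            midiOut.sendNoteOn(1, 60, 100);
        }
    }
};
```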