Waithera Rina Schr (dance), Opiyo Okach (dance, interaction design), KMRU (sound)
Shared control:
Control of the processes that contribute to the overall composition is shared between the dancers, the live media designer, the music, the software and the hardware (computer, camera, motion-tracking device). No single element has exclusive control over the outcome. In this sense, live interaction and collaboration between the performers and the technology are integrated into the creative process.
The base system consists of a performance space; one or more performers; a camera providing a live video feed; a Kinect sensor for motion tracking; a computer running Isadora and TouchDesigner for real-time media generation and manipulation; a rear-projection surface; a front-projection surface (the back wall); two video projectors; a theatre lighting system (three spotlights); and a multi-channel audio system.
Base spatial dispositif at AiR Studio Rote Fabrik 
System function: I used the live video feed from the camera and motion-tracking data from the Kinect to generate visual texture from the performers’ bodies, movement and gestures. Working with both a video camera and the Kinect sensor, I ran two parallel systems (video image and motion data) at the same time. I used TouchDesigner pixel-displacement and feedback networks to process the incoming live feed from the camera.
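The TouchDesigner network itself is not reproduced here, but the pixel-displacement treatment of the camera feed can be approximated with a short Python/OpenCV sketch: each frame is warped by a noise-driven displacement map. The displacement strength and the use of raw random noise are illustrative assumptions (the actual network uses smoother displacement textures), not values from the patch.

```python
# Illustrative sketch only: approximates pixel displacement of a live camera
# feed outside TouchDesigner, using OpenCV and NumPy.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # live camera feed (device 0 assumed)
displacement_strength = 15.0         # assumed value: how far pixels are pushed

ok, frame = cap.read()
if not ok:
    raise RuntimeError('no camera frame available')

h, w = frame.shape[:2]
grid_y, grid_x = np.mgrid[0:h, 0:w].astype(np.float32)

while ok:
    # Per-frame noise field standing in for the displacement texture
    noise = np.random.rand(h, w).astype(np.float32) * 2.0 - 1.0
    map_x = grid_x + noise * displacement_strength
    map_y = grid_y + noise * displacement_strength

    # Warp the frame: each output pixel is looked up at the displaced position
    warped = cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
    cv2.imshow('displaced feed', warped)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
    ok, frame = cap.read()

cap.release()
cv2.destroyAllWindows()
```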
With the Isadora OpenNI actor I pulled motion-tracking data from the Kinect and used it to create line-based visual textures from the performers’ movement with the Live Drawing actor. I also captured sound-frequency data and used it to modulate lighting intensity, sent to DMX via a Matrix Value Send user actor. I set up the video camera so that the lighting intensity determined how, and what, the camera could ‘see’ or not. For this session I tried out audio recordings of ‘Yoora’ and ‘Slowed Cities’ that the sound artist KMRU is making for the project. Ultimately the sound would be performed live in interaction with the dancers and the visual texture.
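Stripped of the Isadora actors (OpenNI, Live Drawing, Matrix Value Send), the sound-to-light chain amounts to measuring the energy of a frequency band and scaling it to a DMX channel value. The sketch below is a minimal Python approximation of that mapping only; the band limits, smoothing factor and normalisation are assumptions, and the actual DMX send is handled by Isadora.

```python
# Illustrative sketch of the audio -> light-intensity mapping, not the Isadora patch.
import numpy as np

SAMPLE_RATE = 44100          # assumed audio sample rate
LOW_BAND = (60.0, 250.0)     # assumed frequency band (Hz) driving the lights
SMOOTHING = 0.8              # assumed smoothing factor to avoid flicker

_smoothed = 0.0

def band_energy(samples: np.ndarray, band=LOW_BAND, rate=SAMPLE_RATE) -> float:
    """Return the normalised energy of one frequency band of an audio buffer."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum() + 1e-9
    return float(spectrum[mask].sum() / total)

def to_dmx_intensity(samples: np.ndarray) -> int:
    """Map band energy to a DMX channel value (0-255), smoothed over time."""
    global _smoothed
    energy = band_energy(samples)
    _smoothed = SMOOTHING * _smoothed + (1.0 - SMOOTHING) * energy
    return int(np.clip(_smoothed * 255.0, 0, 255))

if __name__ == '__main__':
    buf = np.random.randn(2048)              # stand-in for a live audio buffer
    print('DMX intensity:', to_dmx_intensity(buf))
```

The smoothing step reflects a practical concern in this kind of mapping: raw frame-by-frame band energy jumps too quickly to drive theatre dimmers gracefully.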
The system operates two levels of video feedback. The first comes from the video camera itself: it is positioned so that it films the performer(s) and any elements in the space, including the surfaces on which the visual texture is projected. The second consists of feedback network loops running inside TouchDesigner.
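The second, software-side level can be sketched outside TouchDesigner as a running frame buffer that is blended back into each new camera frame. This Python/OpenCV loop only approximates the TouchDesigner feedback network; the decay value is an assumed placeholder.

```python
# Illustrative sketch of a software video feedback loop, approximating the
# second feedback level (the TouchDesigner feedback network).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
decay = 0.92                 # assumed: how long trails persist in the buffer
accumulator = None           # the feedback buffer

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = frame.astype(np.float32) / 255.0
    if accumulator is None:
        accumulator = frame.copy()
    # Blend the previous output back into the new frame: this loop is what
    # produces the trailing, self-referential texture.
    accumulator = decay * accumulator + (1.0 - decay) * frame
    cv2.imshow('feedback', (accumulator * 255).astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The first, optical level has no software equivalent: it arises physically from the camera re-filming its own projected output in the space.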