Have a look at the trailer of übersetzen – vertimas, a piece for dancer, sound and projection:
Sound optimized for headphone playback
The dancer is captured by a Kinect sensor, originally developed for Microsoft’s Xbox 360.
This sensor provides a depth video stream, which is used in two ways.
With the help of the OpenNI framework and NITE by PrimeSense, which are integrated into Pure Data through my external pix_openni, a skeleton representation of the dancer with 15 joints is available for measuring movements.
These movements are translated into sounds.
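The actual mapping is done inside Pure Data, but the basic idea of turning joint movement into a sound parameter can be sketched outside of it. The sketch below is a hypothetical illustration, not the patch used in the piece: it estimates the speed of a single tracked joint from two successive 3-D positions and maps that speed onto a pitch range (the function names, ranges and constants are all assumptions).

```python
import math

def joint_speed(prev, curr, dt):
    """Speed of a joint in units per second, from two successive
    3-D positions (e.g. skeleton joint coordinates from pix_openni)."""
    return math.dist(prev, curr) / dt

def speed_to_pitch(speed, base_hz=220.0, span_hz=880.0, max_speed=2.0):
    """Map a joint speed linearly onto a frequency range.
    Speeds are clamped at max_speed so fast gestures don't overshoot."""
    s = min(speed, max_speed) / max_speed  # normalize to 0..1
    return base_hz + s * span_hz

# Example: a hand moving 0.3 m between two frames captured 1/30 s apart
speed = joint_speed((0.0, 1.0, 2.0), (0.0, 1.0, 2.3), 1.0 / 30.0)
pitch = speed_to_pitch(speed)
```

In a Pd patch the same logic would live in a few objects between the pix_openni joint outlets and an oscillator; the clamping step matters in practice, because tracking jitter otherwise produces sudden jumps in the control signal.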
For the projection onto the dancer’s body, the region where the dancer moves is extracted from the depth stream by thresholding, using an OpenGL shader on the GPU. The resulting mask is filled with color (or various other visual content) and projected back using the Extended View Toolkit by Marian Weger and Peter Venus.
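In the piece this thresholding runs per pixel in a GLSL shader on the GPU; the NumPy sketch below shows the same logic on the CPU, just to make the idea concrete. The depth band, resolution and fill color are illustrative assumptions, not values from the actual performance setup.

```python
import numpy as np

def dancer_mask(depth, near_mm=500, far_mm=2500):
    """Keep only pixels whose depth falls inside the band where the
    dancer moves; everything nearer or farther is masked out.
    (In the piece this comparison happens in a fragment shader.)"""
    return (depth > near_mm) & (depth < far_mm)

def fill_mask(mask, color=(255, 0, 0)):
    """Fill the masked region with flat color, black elsewhere --
    this image is what gets projected back onto the dancer."""
    out = np.zeros(mask.shape + (3,), dtype=np.uint8)
    out[mask] = color
    return out

# Toy 4x4 depth frame in millimeters: one pixel inside the 0.5-2.5 m band
depth = np.full((4, 4), 4000, dtype=np.uint16)
depth[1, 2] = 1500
projection = fill_mask(dancer_mask(depth))
```

The geometric correction that aligns this image with the dancer on stage is handled separately by the Extended View Toolkit.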
All software is developed with Pure Data and Gem on a Linux operating system, together with custom-developed externals and OpenGL shaders.
Thanks to Marian Weger for his support with the projection technology!
Check out his performance: Monster