documentation :: übersetzen – vertimas

Have a look at the trailer of übersetzen – vertimas, a piece for dancer, sound and projection:

Sound optimized for headphone playback

technical details

The dancer is captured by a Kinect[1] sensor, originally developed for Microsoft’s Xbox 360.
This device provides a depth video stream which is used in two ways.
With the help of the OpenNI[2] framework and NITE by PrimeSense, which are integrated into Pure Data through my external pix_openni[3], a skeleton representation of the dancer with 15 joints is available for measuring movements.
These movements are translated into sounds.
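As an illustration only, a movement-to-sound mapping of this kind could look like the following minimal Python sketch. In the piece itself this happens inside Pure Data via pix_openni; the joint positions, the `max_speed` constant and the amplitude target here are all assumptions:

```python
import math

def joint_speed(prev, curr, dt):
    """Euclidean speed of one joint (positions in metres, dt in seconds)."""
    return math.dist(prev, curr) / dt

def speed_to_amplitude(speed, max_speed=4.0):
    """Normalise a joint speed to a 0..1 amplitude control value."""
    return min(speed / max_speed, 1.0)

# Example: a hand joint moves 1 m in 0.5 s -> speed 2 m/s -> amplitude 0.5
prev_pos = (0.0, 1.0, 2.0)
curr_pos = (1.0, 1.0, 2.0)
amp = speed_to_amplitude(joint_speed(prev_pos, curr_pos, 0.5))
```

In a real patch the resulting control value would be smoothed and routed to a synthesis parameter rather than used raw.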

For the projection onto the dancer’s body, the region where the dancer moves is extracted by thresholding the depth image with an OpenGL shader on the GPU. The resulting mask is filled with color (or other visual content) and projected back onto the dancer using the Extended View Toolkit[4] by Marian Weger and Peter Venus.
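The thresholding step can be sketched in a few lines of Python operating on one row of depth values in millimetres. This is a hedged stand-in for illustration: the actual piece runs this as an OpenGL fragment shader on the GPU, and the near/far band values below are assumptions:

```python
def dancer_mask(depth_row, near=800, far=3000):
    """Return 1 where depth (mm) falls inside the near/far band, else 0.

    Pixels inside the band are assumed to belong to the dancer; everything
    closer or farther away is masked out.
    """
    return [1 if near <= d <= far else 0 for d in depth_row]

# One row of depth samples: too close, dancer, dancer, background wall
row = [500, 1500, 2500, 4000]
mask = dancer_mask(row)  # -> [0, 1, 1, 0]
```

The same comparison per fragment, done in a shader, yields the binary mask that is then filled with visual content before projection.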

All software is developed with Pure Data[5] and Gem on a Linux operating system, together with custom-developed externals and OpenGL shaders.

(C) 2012 Matthias Kronlachner

Thanks for projection technology support to Marian Weger!
Check out his performance: Monster

  1. Microsoft Kinect – www.xbox.com/kinect
  2. OpenNI framework and NITE – www.openni.org
  3. pix_openni
  4. Extended View Toolkit
  5. Pure Data

One thought on “documentation :: übersetzen – vertimas”

  1. Hi, I’m a music teacher and composer, I’m learning Pd right now, I use linux most of the time. I work also as a part-time composer for a contemporary dance company, so I think your work is very interesting! Congratulations for ubersetzen vertimas. I’m working on similar things right now, hope we could exchange ideas! my mail: pdro74@hotmail.com
