jon/bellona

simpleKinect Application

simpleKinect is an interface application for sending data from the Microsoft Kinect to any OSC-enabled application. It aims to improve upon similar software by offering more OpenNI features and more user control.

The interface was built with Processing, using the controlP5, oscP5, and simple-openni libraries. Because it is built on open-source tools, and because the project's purpose is to stimulate creativity, simpleKinect is free to use.





simpleKinect Features

  • Auto-calibration.
  • Specify the OSC output IP and port in real time.
  • Send the CoM (center of mass) coordinate of every user inside the space, regardless of skeleton calibration.
  • Send skeleton data (single user) on a joint-by-joint basis, as specified by the user.
  • Manually switch between users for skeleton tracking.
  • Individually select between three joint modes (world, screen, and body) for sending data.
  • Individually set the OSC output address for any joint.
  • Save/load application settings.
  • Send distances between joints, in millimeters. [on by default]
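Because all of this data leaves the application as ordinary OSC packets, any OSC-capable environment can consume it. As a rough sketch of what is on the wire, here is a pure-Python encoder/decoder for a float-only OSC message. The /joint/head address in the usage example is made up for illustration, since simpleKinect lets you set the output address per joint.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC spec requires."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def encode_osc(address: str, *values: float) -> bytes:
    """Build a binary OSC message whose arguments are all float32."""
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(("," + "f" * len(values)).encode("ascii"))
    for v in values:
        packet += struct.pack(">f", v)   # OSC floats are big-endian float32
    return packet

def decode_osc(packet: bytes):
    """Return (address, [float args]) from a float-only OSC message."""
    addr_end = packet.index(b"\x00")
    address = packet[:addr_end].decode("ascii")
    i = (addr_end // 4 + 1) * 4          # skip the address padding
    tag_end = packet.index(b"\x00", i)
    tags = packet[i:tag_end].decode("ascii")  # e.g. ",fff"
    i = (tag_end // 4 + 1) * 4           # skip the type-tag padding
    args = [struct.unpack(">f", packet[i + 4 * k: i + 4 * k + 4])[0]
            for k in range(len(tags) - 1)]
    return address, args

# Hypothetical joint message: a head position as three float coordinates.
packet = encode_osc("/joint/head", 0.5, 1.25, -2.0)
print(decode_osc(packet))
```

In practice you would hand such packets to a UDP socket on the IP and port configured in the interface; the sketch only shows the packet layout.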



simpleKinect FAQ page

Note: This app was exported as a Mac application. If you are a PC user interested in testing a beta PC version, let me know.

simpleKinect is by Jon Bellona



Projects using simpleKinect


Casting: Kinect, Kyma (2013)


Casting is a real-time composition for a single performer using the Microsoft Kinect and Kyma. The piece embodies both the programmatic and the magical use of the term. By giving form to gesture that conjures sound and visual elements, a body's movement becomes intertwined with the visceral. The performer's body 'throws' and controls sound, enabling the viewer to perceive sound as transfigured by motion. In this way, music becomes defined by the human mold of the performer and listener.


AV@AR (from Manfred Borsch) (2014)

The interactive and audiovisual installation av@ar puts the relationship between people and their medial reflection at the center of the experience. In this closed circuit installation, the control, dependencies, and aesthetic reference levels can be explored and controlled interactively with the entire body. A catalogue of emotions serves as audiovisual communication material and leads to a reflection about the systematic use of feelings. The dance with the medial mirror opens a space of experience in the tense atmosphere between the non-digital and the digital ego - the avatar.


Kinect-Via- Interface Series

Kinect-Via- is a Max/MSP interface series for composers wanting to route and map user-tracking data from the Xbox Kinect.




  • White paper (.pdf)
  • Kinect-Via-OSCeleton (.zip): for the OSCeleton application
  • Kinect-Via-Synapse (.zip): for the Synapse application
  • Kinect-Via-Processing (.zip): for Processing's simple-openni library
  • Kinect-Via-NIMate (.zip): for Delicode's NI mate application


NON-MAX/MSP USERS: The standalone application of Kinect-Via-Synapse may be downloaded here.




What is the Kinect-Via- interface series?

Kinect-Via- is a Max/MSP interface series for composers wanting to route and map user-tracking data from the Xbox Kinect. The interface series complements four different OpenNI applications: OSCeleton, Synapse, Processing's simple-openni library, and Delicode's NI mate. All of the Max/MSP interfaces communicate using OSC (Open Sound Control) messages and are performance-ready, meaning that all routing and system options may be changed in real time. The Kinect-Via- interfaces offer a tangible solution for anyone wishing to explore user tracking with the Kinect for creative applications. Please read the documentation in the .zip files to learn more. Note: the latest version has been tested with Max 5.1 and OS X 10.6.8.
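The routing these interfaces perform can be sketched outside Max/MSP as well: each incoming OSC address is matched against user-defined patterns and handed to a mapping function. The sketch below is only an illustration of that idea; the /skeleton/* addresses are invented for the example and are not the exact messages that OSCeleton, Synapse, simple-openni, or NI mate emit.

```python
from fnmatch import fnmatchcase

class OscRouter:
    """Match incoming OSC addresses against glob patterns and dispatch."""

    def __init__(self):
        self.routes = []                 # (pattern, handler) pairs, in order

    def map(self, pattern, handler):
        """Register a handler for every address matching the glob pattern."""
        self.routes.append((pattern, handler))

    def dispatch(self, address, args):
        """Call every handler whose pattern matches; return the hit count."""
        hits = 0
        for pattern, handler in self.routes:
            if fnmatchcase(address, pattern):
                handler(address, args)
                hits += 1
        return hits

# Usage with a hypothetical skeleton address scheme:
router = OscRouter()
router.map("/skeleton/*hand", lambda addr, vals: print(addr, "->", vals))
router.dispatch("/skeleton/lefthand", [0.1, 0.2, 0.3])
```

Changing a mapping while the patch runs then amounts to swapping entries in the route table, which is the kind of real-time rerouting the Max/MSP interfaces expose graphically.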

Kinect-Via- FAQ page

You can also Shout! if you have questions. jpbellona [ ] yahoo [ ] com

Kinect-Via- is by Jon Bellona




Projects using the Kinect-Via- interface series


Human Chimes: Kinect, Processing, Max/MSP/Jitter, Kyma and video projector (2011)

Human Chimes is an interactive public installation. Participating users become triggered sounds that interact with all of the other participants inside the space. By dynamically tracking users' locations in real time, the piece maps participants as sounds that pan around the space according to their positions. The Kinect mapping uses Kinect-Via-OSCeleton.


The Beat: Kinect, Max/MSP/Jitter, Ableton Live, Synthesizer (2012)


The Kinect user's hand and head movements are mapped to filters, and at times hand gestures actuate sound. The Kinect mapping uses Kinect-Via-Synapse. "The Beat" is a composition by Nathan Asman.


Juggling Music (from Arthur Wagenaar) (2012)

Playing music by juggling with glowballs! A demonstration of this new self-made musical instrument, controlled by juggling and also known (in Dutch) as 'De Kleurwerper'.


Life in 3D Installation (from Alex Galler) (2013)

Alex Galler and his team use Kinect-Via-Synapse to transform users' movements into pan and audio controls within a multi-channel environment.


Watch Alex explain how his team's system works.