In designing Kinectics I wanted to explore the Kinect sensor, reactivity, and interactivity. My goal was to create a way to interact with a physics-based particle system in Max 6, so I used joint tracking from a Kinect and parsed the resulting OSC messages with a program called “Synapse”. One at a time, users can kick and toss a particle system of red spheres.

The original patch that inspired me to build an immersive experience came from the Physics Patch A Day forum on the cycling74 website; it uses the mouse to bounce red particles around. To adapt it, I mapped the Synapse OSC data onto 7 different tracking points, which took the place of the original patch’s mouse tracker, and then made the tracking points invisible so that the viewer’s silhouette would blend into the wall projection. Once the system is activated, the particles attract themselves back to the viewer, so when motion hits a cluster they explode outwards and bounce around the room. I had to alter the physical attributes of the original system to make it conducive to human kinetic interaction, which took hours and hours of tweaking.

The effect is almost like a ball pit in a vortex. The system is unpredictable and chaotic, and every user’s experience depends entirely on how they choose to approach the piece. With the lights off, one can dance or move around (after calibrating the Synapse motion tracker) and interact with the projected red particles in entirely unique ways every time the system is engaged.
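The two pieces described above — routing Synapse’s per-joint OSC messages onto tracking points, and pulling the particles back toward the body — can be sketched outside of Max. This is a minimal illustration in Python, not the actual Max patch: the list of seven joint names, the attraction/damping constants, and the address pattern (Synapse publishes addresses like `/righthand_pos_body` carrying x, y, z floats) are my assumptions for the sketch.

```python
# The seven tracked joints standing in for the original patch's mouse
# tracker (joint names are illustrative, not taken from the patch).
TRACKED_JOINTS = ["head", "torso", "lefthand", "righthand",
                  "leftelbow", "rightelbow", "closesthand"]

def joint_from_address(address):
    """Extract the joint name from a Synapse-style OSC address,
    e.g. "/righthand_pos_body" -> "righthand".
    Returns None for addresses we do not track."""
    name = address.lstrip("/").split("_")[0]
    return name if name in TRACKED_JOINTS else None

def step_particle(pos, vel, target, attraction=0.5, damping=0.98, dt=1/30):
    """One integration step of a damped attraction: the particle is pulled
    back toward the nearest tracked joint, so a cluster knocked away by a
    kick drifts home again (a stand-in for the Max physics engine)."""
    ax = (target[0] - pos[0]) * attraction
    ay = (target[1] - pos[1]) * attraction
    vel = ((vel[0] + ax * dt) * damping, (vel[1] + ay * dt) * damping)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel
```

In use, each incoming OSC message would update one joint’s position, and every particle would be stepped toward its nearest joint each frame; the `attraction` and `damping` values are exactly the kind of physical attributes that took hours of tweaking in the real patch.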
Here is the first patch I put together using the Kinect and Synapse, which just lays colored circles over the joints:
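In spirit, that first patch reduces to drawing one colored circle per reported joint every frame. A rough sketch of that idea (plain Python rather than Max, with joint names and colors chosen purely for illustration):

```python
# Illustrative per-joint colors; the real patch's palette is not shown here.
JOINT_COLORS = {
    "head": (255, 0, 0),
    "torso": (0, 255, 0),
    "lefthand": (0, 0, 255),
    "righthand": (255, 255, 0),
}

def circles_for_frame(joint_positions, radius=20):
    """Turn a {joint: (x, y)} dict of tracked screen positions into a list
    of (x, y, radius, rgb) draw commands, skipping untracked joints."""
    return [(x, y, radius, JOINT_COLORS[j])
            for j, (x, y) in joint_positions.items()
            if j in JOINT_COLORS]
```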
The main issue I have with this project is that it lacks aesthetic complexity. I had a hard time figuring out how to customize the particles themselves and spent many hours troubleshooting this design flaw. The particles are currently red spheres that all share the same reflection point due to a default setting in Max that I cannot change. I want to overlay images or animations, and I couldn’t figure out how to do this strictly within Max. Another flaw with Max is that it cannot handle the visual data on the back end effectively or efficiently. As a result, I had to decrease the particle count in the patch from 400 to 60 particles, otherwise it would lag horribly.
My other criticism concerns Synapse. Its main limitation is calibration, which entails standing in front of the Kinect and making a goal-post gesture with one’s arms. This creates a hiccup in the way people can begin interacting with the system; ideally the Kinect’s viewer tracking would be recognized fluidly. The other limitation is the single connection to one viewer, and I want to have as many viewers interacting with this piece as possible. There are alternatives to this system: I could begin learning CV (computer vision) and Processing and develop custom systems integrated using Java or openFrameworks. In the future I need to study programming in further depth, and potentially Boot Camp my computer to run Windows so that I can run OpenKinect tracking programs.
I do like the performance aspect of the piece. It is quite beautiful to watch somebody else dancing around and playing with the particles, especially in the dark. There is only a silhouette of their movement, coupled with the overlay of balls clinging to the dimensions of the tracked body.
Double Taker – Golan Levin
Future Self – rAndom International: Wayne McGregor and Max Richter
Interactive Still Life – Scott Garner
Augmented Reality Sandbox – The UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences