Sarah B's profile

Interactive Snowfall

Concept and interaction:
I designed an interactive space using a camera placed in front of the user (a Kinect 360); the user’s gestures are detected and the visual response is projected on a screen. The user stands in the middle of a city on a snowy day. Snow keeps falling, and when the user enters the space, the snow that lands on their head, shoulders, and arms builds up until they move. When the user moves, the snow falls off their body naturally. The system also detects objects in front of the screen, not only human bodies (e.g. chairs, bags, etc.).
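The stick-until-moved behaviour boils down to one rule per snowflake: if it lands on an occupied silhouette pixel it freezes in place, and as soon as that pixel is empty again (because the person moved) it resumes falling. A minimal sketch of that rule in plain Java, using a synthetic boolean silhouette grid instead of real Kinect data (all names here are illustrative, not taken from the actual sketch):

```java
import java.util.List;

public class SnowRule {
    static class Flake {
        int x, y;        // grid position (y grows downward)
        boolean stuck;   // currently resting on the silhouette?
        Flake(int x, int y) { this.x = x; this.y = y; }
    }

    // One simulation step: flakes fall one cell at a time; they stick when
    // the cell below is inside the silhouette, and un-stick as soon as the
    // body moves away from under them.
    static void step(List<Flake> flakes, boolean[][] silhouette, int height) {
        for (Flake f : flakes) {
            if (f.stuck) {
                // The person moved: the cell under the flake is empty again.
                if (f.y + 1 < height && !silhouette[f.y + 1][f.x]) f.stuck = false;
                continue;
            }
            if (f.y + 1 >= height) continue;              // resting on the ground
            if (silhouette[f.y + 1][f.x]) f.stuck = true;  // lands on the body
            else f.y++;                                    // keeps falling
        }
    }
}
```

In the real installation the same check runs against the blob shapes extracted from the Kinect image, and PBox2D handles the physics of the flakes sliding off.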
 
Technical part:
The Kinect is located in front of the user. To detect the user’s gestures, I used a Kinect 360 together with Processing and a couple of libraries (PBox2D, SimpleOpenNI, and the v3ga blob detection library). I used Processing to create the falling-snow animation and the interaction between the user’s gestures and the snow.
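Under the hood, getting a silhouette for the snow to land on amounts to thresholding the Kinect depth map: pixels with a valid reading closer than some cut-off distance count as foreground, and the blob-detection library then groups those pixels into shapes. A minimal sketch of just the thresholding step in plain Java, run on a synthetic depth array rather than a live SimpleOpenNI depth map (the cut-off value and names are illustrative):

```java
public class DepthThreshold {
    // Depth values are in millimetres, the unit the Kinect reports;
    // 0 means "no reading". Anything with a valid reading closer than
    // maxDepth counts as foreground (a person, a chair, a bag, ...).
    static boolean[] silhouette(int[] depth, int maxDepth) {
        boolean[] mask = new boolean[depth.length];
        for (int i = 0; i < depth.length; i++) {
            mask[i] = depth[i] > 0 && depth[i] < maxDepth;
        }
        return mask;
    }
}
```

Because the mask is purely depth-based, anything placed in front of the screen becomes part of the silhouette, which is why the piece reacts to chairs and bags as well as people.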
Process:
Throughout this project I ran into a large number of errors, some related to connecting the Kinect to my laptop and others related to the code I was running. At one point I was going to give up on using the Kinect to detect motion, scrap my whole concept, and create one that could be built with only Blobscanner and Processing. However, I started building on other people’s projects, and one of the most useful I found was Amnon Owed’s Kinect Physics tutorial for Processing: http://www.creativeapplications.net/processing/kinect-physics-tutorial-for-processing/. I thought I had found a solution when I saw his tutorial; however, many of the methods in that code no longer work because the libraries have since been updated, and I couldn’t find much documentation on the changes.
Setup:
The piece can be projected on a screen in a public space where people can interact with the snow and have fun.
Reference:
http://www.creativeapplications.net/processing/kinect-physics-tutorial-for-processing/
 