This evening I made a huge step towards achieving the technical element of my research project. Up until now, using the Kinect to control anything other than tech demos was little more than theory. Tonight I hooked the Kinect up to Celestia and performed a few simple actions. The upshot is: it’s possible!
I used the excellent FAAST – the Flexible Action and Articulated Skeleton Toolkit – from the University of Southern California. It’s a relatively simple piece of middleware built on OpenNI and NITE. Development is currently closed, but the authors intend to make it open source once some new features have been added and some stability issues resolved. It’s good enough for me for now. It lets you prototype rapidly by emulating key presses and holds whenever it detects one of a number of stock gestures.
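To give a flavour of what that prototyping looks like, FAAST’s approach is to map a detected body gesture, plus a threshold, to an emulated keyboard event. The gesture names, thresholds, and syntax below are illustrative assumptions rather than copied from the toolkit’s documentation, but the general shape is: input gesture, threshold, output key action.

```
# Illustrative FAAST-style bindings (names and syntax are assumptions,
# not taken verbatim from the toolkit):
#   <input gesture>  <threshold>  <output event>
lean_forwards   15   key_hold w    # lean forward ~15 degrees: hold W (move forward)
lean_backwards  15   key_hold s    # lean back: hold S (move backward)
turn_left       20   key_hold a    # rotate torso left: hold A
turn_right      20   key_hold d    # rotate torso right: hold D
```

Because the output is just synthetic key events, any application that takes keyboard input – Celestia included – can be driven this way without modification.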
My final product will need to be refined well beyond this, but at least it gives me some confidence that I haven’t bitten off more than I can chew.
Here it is. Not bad for an hour’s work.