Posts Tagged ‘wpf’

This evening I released a short video of a WPF User Control I made. It's a little plug-and-play component for the Kinect SDK (beta 2… probably works in beta 1 too) that emulates the touch-and-hold style pointer system used in so many Kinect applications.

I will tidy this up in the coming days and release it to the community. It’s rather simple and I’ve seen various implementations of this elsewhere on the internet, but none of them have been released formally. If I can help someone out with this then great!

Aside, I was stunned to see this get 150 hits in about two minutes. I was equally stunned to see the counter immediately reset to zero and it hasn’t counted up since. Of course, I’m inclined to believe the more flattering score. Naturally :)

Another little piece of the jigsaw: controlling a WPF ScrollViewer (with added animated easing wizardry) using swipe gestures. Freakin’ yah!

Yeah, one of those.

The trouble with the standard WPF ScrollViewer is that it doesn't scroll very nicely. Firstly, it snaps to the extremities of its components (i.e. to the edges of images), making smooth scrolling impossible. Secondly, its scroll offsets aren't dependency properties, so .NET's reasonably powerful animation system can't drive them. I sorted this by making my own animation mediator for a ScrollViewer, enabling all that stuff I just mentioned. Here's a demo:
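
For the curious, here's the gist of the mediator trick as a minimal sketch (the class and property names are mine, not necessarily what I'll release): you register an animatable attached dependency property and forward every value it receives into ScrollToHorizontalOffset:

```csharp
using System.Windows;
using System.Windows.Controls;

// Sketch of an animation mediator: exposes an animatable attached property
// and pushes every animated value into the ScrollViewer's real offset.
public static class ScrollViewerAnimator
{
    public static readonly DependencyProperty HorizontalOffsetProperty =
        DependencyProperty.RegisterAttached(
            "HorizontalOffset", typeof(double), typeof(ScrollViewerAnimator),
            new PropertyMetadata(0.0, OnHorizontalOffsetChanged));

    public static double GetHorizontalOffset(DependencyObject obj)
    {
        return (double)obj.GetValue(HorizontalOffsetProperty);
    }

    public static void SetHorizontalOffset(DependencyObject obj, double value)
    {
        obj.SetValue(HorizontalOffsetProperty, value);
    }

    private static void OnHorizontalOffsetChanged(
        DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        // Each tick of the animation lands here...
        var viewer = d as ScrollViewer;
        if (viewer != null)
            viewer.ScrollToHorizontalOffset((double)e.NewValue); // ...and scrolls.
    }
}
```

Animating the vertical offset works the same way with a second attached property.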

The intention is to hook this up to the Kinect gesture recogniser so that natural interactions (i.e. swipe gestures) can have a ‘natural’ effect on a menu system. Because, really, if you perform a swipe action on an object you don’t expect it to move uniformly and snap to unnatural positions; instead you would expect it to have inertia, to decelerate, and to rest in a natural position (i.e. a function of how hard you ‘pushed’ it). This WPF extension achieves that.
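
Roughly how a recognised swipe might drive that mediator (a hand-wavy sketch building on the snippet above; SwipeScroller, the 0.5 velocity factor and the 800 ms duration are all made up for illustration):

```csharp
using System;
using System.Windows.Controls;
using System.Windows.Media.Animation;

public static class SwipeScroller
{
    // Translate a swipe into a decelerating glide: the swipe velocity sets
    // how far the list travels, and an ease-out gives the inertia feel.
    public static void Fling(ScrollViewer viewer, double swipeVelocity)
    {
        double from = viewer.HorizontalOffset;
        double to = from + swipeVelocity * 0.5; // harder 'push' = longer travel

        var glide = new DoubleAnimation
        {
            From = from,
            To = to,
            Duration = TimeSpan.FromMilliseconds(800),
            // Ease-out decelerates the scroll into a natural resting position.
            EasingFunction = new CubicEase { EasingMode = EasingMode.EaseOut }
        };

        viewer.BeginAnimation(ScrollViewerAnimator.HorizontalOffsetProperty, glide);
    }
}
```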

For the past few days I've been working on 2D gesture recording and recognition for the Kinect SDK, using a Dynamic Time Warping technique designed by YouTube user 'simboubou'.

This early demo shows how gestures can reliably be recorded and recognised using this simple harness application. The released version will let you define your own gestures and save them to file. I'm extremely positive about the prospects for this framework. I think that with some wise thinking and some clever community input this could become a viable gesture recognition solution for the Kinect SDK.
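
For anyone unfamiliar with DTW, the core is a small dynamic programme that finds the cheapest alignment between a live gesture and a recorded template, so it tolerates the same gesture being performed faster or slower. A bare-bones illustration (my own sketch, not simboubou's code):

```csharp
using System;
using System.Collections.Generic;
using System.Windows;

public static class Dtw
{
    // Returns the cumulative cost of the best alignment between two
    // 2D point sequences.
    public static double Distance(IList<Point> a, IList<Point> b)
    {
        var cost = new double[a.Count + 1, b.Count + 1];

        for (int i = 0; i <= a.Count; i++)
            for (int j = 0; j <= b.Count; j++)
                cost[i, j] = double.PositiveInfinity;
        cost[0, 0] = 0.0;

        for (int i = 1; i <= a.Count; i++)
        {
            for (int j = 1; j <= b.Count; j++)
            {
                double d = (a[i - 1] - b[j - 1]).Length; // Euclidean step cost
                // Cheapest way in: match, or skip a point in either sequence.
                cost[i, j] = d + Math.Min(cost[i - 1, j - 1],
                                 Math.Min(cost[i - 1, j], cost[i, j - 1]));
            }
        }
        return cost[a.Count, b.Count];
    }
}
```

A live gesture then 'matches' whichever stored template gives the smallest distance, provided that distance falls below some tuned threshold.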

Today I made a touch-and-hold style menu system for my Celestia project. It works by tracking the player's hands, noting where and for how long they 'hover', and firing an event once a set time target is reached.

The tricky part is making this modular so that many different menus can be created without reinventing stuff. Currently this only works for rectangular areas, but I will enhance it to cater for circles and irregular shapes soon.
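
The shape of the thing, very roughly (the names and the two-second-style dwell mechanics here are mine, not necessarily the final design): each menu region is fed the tracked hand position every frame and fires once the hand has dwelt inside it long enough:

```csharp
using System;
using System.Windows;

public class HoverButton
{
    public Rect Region { get; set; }        // rectangular hit area, for now
    public TimeSpan DwellTime { get; set; } // how long to hold before firing
    public event EventHandler Activated;

    private DateTime? _entered;

    // Call once per skeleton frame with the hand's screen position.
    public void Update(Point handPosition)
    {
        if (!Region.Contains(handPosition))
        {
            _entered = null; // hand left the region: reset the clock
            return;
        }

        if (_entered == null)
            _entered = DateTime.UtcNow; // hand just arrived

        if (DateTime.UtcNow - _entered.Value >= DwellTime)
        {
            _entered = null; // re-arm so it fires once per dwell
            var handler = Activated;
            if (handler != null)
                handler(this, EventArgs.Empty);
        }
    }
}
```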

An interesting thing I had to consider here is that the active area around a player must be scaled so that all menu items can be reached without stretching, yet kept small enough that items aren't triggered by accident. I think this will mainly be trial and error, and I may not have time to adjust the scaling based on how far from the camera the player stands.
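
One plausible way to do that scaling (a guess at an approach, not code from the project; the 2D points and 'reach' parameters are simplifications): express the hand relative to something body-fixed like the shoulder centre, divide by a per-axis reach, and let that reach be the trial-and-error knob:

```csharp
using System.Windows;

public static class HandMapper
{
    // Maps a hand position to screen coordinates relative to the shoulder
    // centre. reachX/reachY set how far the player must move to cover the
    // whole screen -- these are the values to tune by trial and error.
    public static Point ToScreen(Point hand, Point shoulderCentre,
                                 double reachX, double reachY,
                                 double screenWidth, double screenHeight)
    {
        double nx = Clamp((hand.X - shoulderCentre.X) / reachX, -1.0, 1.0);
        double ny = Clamp((hand.Y - shoulderCentre.Y) / reachY, -1.0, 1.0);

        // Skeleton Y points up but screen Y points down, hence the flip.
        return new Point((nx + 1.0) / 2.0 * screenWidth,
                         (1.0 - ny) / 2.0 * screenHeight);
    }

    private static double Clamp(double v, double min, double max)
    {
        return v < min ? min : (v > max ? max : v);
    }
}
```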

So now I have voice controls and touch menus from Microsoft's SDK and full-body gesture controls from NITE. I have to decide whether to re-create the NITE stuff in the Kinect SDK, wait for someone else to make a library of gesture controls in the SDK, or use an SDK->NITE binding (which doesn't even exist yet). For now, though, I think I'll go to bed.