
Name-Based Tangible Detection and Tracking on Tabletops

This is a completed project.

Description

This project extends Gesture Toolkit to support gestures that involve tangibles. Tangibles are first given names using a separate learning application. These names can then be used in any application built on Gesture Toolkit to define gestures (in the gesture definition language) that involve tangible detection, movement, rotation, or removal. For instance, the definition “Object Rotated: Book 50..100” fires a callback whenever the object named “Book” is rotated by between 50 and 100 degrees.
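A minimal sketch of that idea, using hypothetical method and class names rather than the toolkit's actual API: a rotation definition is parsed, and the callback fires only when the named tangible's rotation falls inside the declared range.

    using System;
    using System.Collections.Generic;

    class TangibleGestureDemo
    {
        // objectName -> (minDegrees, maxDegrees, callback)
        static readonly Dictionary<string, (double Min, double Max, Action<string, double> Fire)> Gestures =
            new Dictionary<string, (double Min, double Max, Action<string, double> Fire)>();

        // Parse a definition of the form "Object Rotated: <name> <min>..<max>".
        static void DefineRotationGesture(string definition, Action<string, double> callback)
        {
            var body = definition.Split(':')[1].Trim();                  // "Book 50..100"
            var parts = body.Split(' ');                                 // ["Book", "50..100"]
            var range = parts[1].Split(new[] { ".." }, StringSplitOptions.None);
            Gestures[parts[0]] = (double.Parse(range[0]), double.Parse(range[1]), callback);
        }

        // Called by the tracking layer whenever a named tangible's rotation changes.
        static void OnObjectRotated(string name, double degrees)
        {
            if (Gestures.TryGetValue(name, out var g) && degrees >= g.Min && degrees <= g.Max)
                g.Fire(name, degrees);
        }

        static void Main()
        {
            DefineRotationGesture("Object Rotated: Book 50..100",
                (name, deg) => Console.WriteLine($"{name} rotated {deg} degrees"));

            OnObjectRotated("Book", 30);   // outside 50..100: no callback
            OnObjectRotated("Book", 72);   // fires: "Book rotated 72 degrees"
        }
    }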

Tangible detection uses the feature-based SURF algorithm, which makes it well suited to feature-rich objects such as images and removes the need for tabletop-specific tags. It also supports accurate rotation detection over the full 360 degrees.
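Purely as an illustration of the rotation idea (this is not the project's actual implementation), an object's rotation can be estimated from the orientations of matched keypoints: each match contributes the angle difference between the live frame and the learned template, and a circular mean keeps the estimate correct across the 0/360-degree boundary.

    using System;

    static class RotationEstimateSketch
    {
        // templateAngles[i] and frameAngles[i] are the orientations (in degrees) of the
        // i-th matched keypoint pair, as reported by a feature detector such as SURF.
        static double EstimateRotation(double[] templateAngles, double[] frameAngles)
        {
            double sumSin = 0, sumCos = 0;
            for (int i = 0; i < templateAngles.Length; i++)
            {
                // Per-match rotation, converted to radians for the circular mean.
                double diff = (frameAngles[i] - templateAngles[i]) * Math.PI / 180.0;
                sumSin += Math.Sin(diff);
                sumCos += Math.Cos(diff);
            }
            // Circular mean handles wrap-around near 0/360 degrees correctly.
            double mean = Math.Atan2(sumSin, sumCos) * 180.0 / Math.PI;
            return (mean + 360.0) % 360.0;   // normalize to [0, 360)
        }

        static void Main()
        {
            // Three matches, each rotated by roughly 90 degrees (one wraps past 360).
            var inTemplate = new[] { 10.0, 200.0, 355.0 };
            var inFrame    = new[] { 100.0, 290.0, 85.0 };
            Console.WriteLine(EstimateRotation(inTemplate, inFrame));   // ~90
        }
    }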

The toolkit is designed so that it is easy to move between tabletop platforms. The learning application can be implemented against the API provided for it; all the developer has to supply is the raw image data. To use Gesture Toolkit on a new platform, a new provider must be implemented, and again all that tangible detection requires is that the raw image be supplied for every new frame.
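The sketch below illustrates that provider role with hypothetical type names (they are not the toolkit's actual interfaces): a platform-specific provider simply forwards each raw camera frame, and the tangible extension consumes the frames from there.

    using System;

    // Hypothetical contract. The only obligation on a platform-specific provider
    // (or on the learning application) is to hand over the raw camera image every
    // time a new frame arrives.
    public interface IRawImageProvider
    {
        // (pixels, width, height) for each new camera frame.
        event Action<byte[], int, int> FrameAvailable;
    }

    // Example provider for some new tabletop platform: it only forwards frames
    // from the platform SDK's own camera callback to the tangible extension.
    public class MyTabletopProvider : IRawImageProvider
    {
        public event Action<byte[], int, int> FrameAvailable;

        // Wire this up as the tabletop SDK's frame handler.
        public void OnCameraFrame(byte[] pixels, int width, int height)
        {
            FrameAvailable?.Invoke(pixels, width, height);
        }
    }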

The tangible extension makes it easy to use multiple objects in any application. Both the learning application and the extension are written in C#.

 

Demos & Software Components

To find the latest version, please go to: http://gesturetoolkit.codeplex.com/SourceControl/network/Forks/MahaS/TangibleExtension