Wednesday, May 9, 2012

Project KTracker kicks off

Watch a demo video
We have started to develop a high-quality three-dimensional motion tracking system for science education based on the Microsoft Kinect controller, which was released about 18 months ago. This development is part of the Mixed-Reality Labs project funded by the National Science Foundation.

KTracker will provide a versatile interface between the Kinect and many physics experiments commonly conducted in the classroom. It will also provide natural user interfaces for controlling data collection, analysis, and task management. For example, the data collector will automatically pause when the Kinect detects that the experimenter is adjusting the apparatus to create a new experimental condition (during which data collection should be suspended). Or the user can "wave" at the Kinect to instruct the software to invoke a procedure. This way, the user will not need to switch hands between the apparatus and the keyboard or mouse (a "hand-switching" routine that should feel familiar to the experimentalists reading this post). Because the Kinect sensor can recognize both the experimenter's gestures and the subject's motions, it is an ideal device for performance assessment based on motor skill analysis.
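KTracker's gesture recognizer has not been released yet, so the following Java sketch is only meant to illustrate the idea: a "wave" can be detected by watching the hand joint oscillate horizontally while it stays above the elbow. The class, record, and thresholds below are all hypothetical, not KTracker's code.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/**
 * Illustrative wave detector: flags a "wave" when the hand, held above
 * the elbow, reverses horizontal direction several times within a short
 * window. All names and thresholds here are hypothetical.
 */
public class WaveDetector {

    /** One skeleton sample: hand and elbow positions in meters. */
    public record Frame(double handX, double handY, double elbowY, long timeMillis) {}

    private static final long WINDOW_MILLIS = 1500; // look-back window
    private static final double MIN_SWING = 0.05;   // meters of horizontal travel per swing
    private static final int MIN_REVERSALS = 3;     // direction changes to count as a wave

    private final Deque<Frame> window = new ArrayDeque<>();

    /** Feed one skeleton frame; returns true when a wave gesture is recognized. */
    public boolean update(Frame f) {
        window.addLast(f);
        // Drop frames older than the look-back window.
        while (!window.isEmpty()
                && f.timeMillis() - window.peekFirst().timeMillis() > WINDOW_MILLIS) {
            window.removeFirst();
        }
        int reversals = 0;
        int lastDir = 0;
        Frame prev = null;
        for (Frame cur : window) {
            if (cur.handY() < cur.elbowY()) return false; // hand must stay above elbow
            if (prev == null) {
                prev = cur;
                continue;
            }
            double dx = cur.handX() - prev.handX();
            if (Math.abs(dx) > MIN_SWING) { // a swing completed; check its direction
                int dir = dx > 0 ? 1 : -1;
                if (lastDir != 0 && dir != lastDir) reversals++;
                lastDir = dir;
                prev = cur;
            }
        }
        return reversals >= MIN_REVERSALS;
    }
}
```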

KTracker is not a post-processing tool, and it is not based on video analysis. Thanks to the high-performance infrared depth camera built into the Kinect, KTracker can perform motion tracking and kinematic analysis in real time. This matters because it speeds up data analysis and makes laboratory experiments more interactive.
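To give a concrete sense of how little computation one depth frame needs, here is a minimal Java sketch that reduces a frame to a position and velocity measurement. It assumes the frame arrives as an array of per-pixel distances in millimeters (a common raw format for the Kinect) and isolates the object with a simple depth band; that segmentation strategy is our illustration, not necessarily KTracker's actual algorithm.

```java
/**
 * Minimal per-frame tracker sketch: finds the centroid of all pixels
 * whose depth falls inside a band around the expected distance of the
 * tracked object (e.g., a pendulum bob), then differentiates successive
 * centroids to estimate velocity. Depth-band segmentation is an
 * illustrative assumption, not necessarily KTracker's algorithm.
 */
public class DepthTracker {

    private final int width, height;
    private final int minDepthMm, maxDepthMm; // depth band that isolates the object
    private double lastX = Double.NaN, lastY = Double.NaN;
    private long lastTimeMillis;

    public DepthTracker(int width, int height, int minDepthMm, int maxDepthMm) {
        this.width = width;
        this.height = height;
        this.minDepthMm = minDepthMm;
        this.maxDepthMm = maxDepthMm;
    }

    /**
     * Process one depth frame (row-major, millimeters per pixel).
     * Returns {x, y, vx, vy} in pixel units, or null if the object was not found.
     */
    public double[] processFrame(short[] depthMm, long timeMillis) {
        long sumX = 0, sumY = 0;
        int count = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int d = depthMm[y * width + x] & 0xFFFF;
                if (d >= minDepthMm && d <= maxDepthMm) {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }
        }
        if (count == 0) return null;
        double cx = (double) sumX / count;
        double cy = (double) sumY / count;
        double vx = 0, vy = 0;
        if (!Double.isNaN(lastX)) {
            double dt = (timeMillis - lastTimeMillis) / 1000.0; // seconds between frames
            vx = (cx - lastX) / dt;
            vy = (cy - lastY) / dt;
        }
        lastX = cx;
        lastY = cy;
        lastTimeMillis = timeMillis;
        return new double[] { cx, cy, vx, vy };
    }
}
```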

KTracker will also integrate Box2D, a popular physics engine, to support simulation fitting. For example, the user can design a computer model of the pendulum shown in the above video and adjust its parameters until the simulated motion fits what the camera is showing, all in real time. Like the graph demonstrated in the above video, the entire Box2D simulation will be rendered in a translucent pane on top of the camera view, making it easy for the user to align the simulation with the experiment.
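For readers who want to experiment on their own, a rigid pendulum is easy to set up in Box2D: a static anchor, a dynamic bob, and a revolute joint between them. The sketch below uses JBox2D, the Java port of Box2D (the API differs slightly between versions); the class and parameters are illustrative, not KTracker's code.

```java
import org.jbox2d.collision.shapes.CircleShape;
import org.jbox2d.common.Vec2;
import org.jbox2d.dynamics.Body;
import org.jbox2d.dynamics.BodyDef;
import org.jbox2d.dynamics.BodyType;
import org.jbox2d.dynamics.FixtureDef;
import org.jbox2d.dynamics.World;
import org.jbox2d.dynamics.joints.RevoluteJointDef;

/** A Box2D pendulum whose length can be adjusted to fit observed motion. */
public class PendulumModel {

    private final World world;
    private final Body bob;

    public PendulumModel(float lengthMeters) {
        // Standard gravity pointing down (some JBox2D versions also take a doSleep flag).
        world = new World(new Vec2(0f, -9.81f));

        // Static anchor at the pivot point.
        BodyDef anchorDef = new BodyDef();
        anchorDef.position.set(0f, 2f);
        Body anchor = world.createBody(anchorDef);

        // Dynamic bob hanging one rod-length below the pivot.
        BodyDef bobDef = new BodyDef();
        bobDef.type = BodyType.DYNAMIC;
        bobDef.position.set(0f, 2f - lengthMeters);
        bob = world.createBody(bobDef);

        CircleShape shape = new CircleShape();
        shape.m_radius = 0.05f;
        FixtureDef fixture = new FixtureDef();
        fixture.shape = shape;
        fixture.density = 1f;
        bob.createFixture(fixture);

        // A revolute joint at the pivot turns the pair into a rigid pendulum.
        RevoluteJointDef jointDef = new RevoluteJointDef();
        jointDef.initialize(anchor, bob, anchor.getPosition());
        world.createJoint(jointDef);
    }

    /** Advance the simulation by one frame and return the bob position. */
    public Vec2 step(float dtSeconds) {
        world.step(dtSeconds, 8, 3); // 8 velocity and 3 position iterations
        return bob.getPosition();
    }
}
```

Fitting then amounts to adjusting lengthMeters (and, if needed, gravity or damping) until the simulated bob tracks the one the camera sees.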

KTracker will soon be available for download on our websites. We will keep you posted.

Thursday, May 3, 2012

Kinect-based motion tracking and analysis

Click here to watch a video.
Microsoft's Kinect controller is the first affordable 3D camera that can detect complex three-dimensional motions such as body language and gestures. It offers a compelling alternative to motion tracking approaches that, up to this point, have mostly relied on analyzing conventional RGB data from one or more video cameras.

Conventional motion tracking based on RGB data requires complicated algorithms to process large amounts of video, making real-time applications hard to build. The Kinect adds a depth camera that measures the distance between the subjects and the sensor by comparing the infrared beams it emits with the reflections it receives. This gives us a way to dynamically construct a 3D model of whatever is in front of the Kinect at about 10-30 frames per second, fast enough to build interactive applications (see the video linked under the above image). For as little as $100, we now have a revolutionary tool for tracking the 3D motion of almost anything.
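For the curious, the geometry behind that 3D reconstruction is just the pinhole camera model: each depth pixel back-projects to a point in space. Here is a minimal Java sketch, using nominal intrinsics often quoted for the Kinect's depth camera; real code would read calibrated values from the device or SDK.

```java
/**
 * Back-projects a depth pixel into a 3D point using the pinhole camera
 * model. The focal length and optical center below are nominal values
 * often quoted for the Kinect depth camera, not calibrated constants.
 */
public final class DepthToPoint {

    private static final double FX = 585.0, FY = 585.0; // focal lengths in pixels (nominal)
    private static final double CX = 320.0, CY = 240.0; // optical center of a 640x480 frame

    /** Returns {X, Y, Z} in meters for pixel (u, v) with depth in millimeters. */
    public static double[] toPoint(int u, int v, int depthMm) {
        double z = depthMm / 1000.0;  // millimeters to meters
        double x = (u - CX) * z / FX; // pinhole model: x = (u - cx) * z / fx
        double y = (v - CY) * z / FY;
        return new double[] { x, y, z };
    }

    private DepthToPoint() {}
}
```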

The demo video in this post shows an example of using the Kinect sensor to track and analyze the motion of a pendulum. The left part of the above image shows the trajectory and velocity vector overlaid on the RGB image of the pendulum, whereas the right part shows the slice of the depth data relevant to analyzing it.
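That "slice" can be reproduced with a simple band-pass on depth: keep the pixels near the pendulum's distance and zero out everything else. A minimal sketch follows; the band limits are illustrative.

```java
/**
 * Keeps only the depth pixels inside a band [minMm, maxMm] around the
 * object of interest, zeroing everything else. A simple way to produce
 * the kind of depth "slice" shown in the video.
 */
public class DepthSlice {
    public static short[] slice(short[] depthMm, int minMm, int maxMm) {
        short[] out = new short[depthMm.length];
        for (int i = 0; i < depthMm.length; i++) {
            int d = depthMm[i] & 0xFFFF; // treat raw value as unsigned millimeters
            out[i] = (d >= minMm && d <= maxMm) ? depthMm[i] : 0;
        }
        return out;
    }
}
```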

This work is funded by the National Science Foundation.