Thursday, May 3, 2012

Kinect-based motion tracking and analysis

Click here to watch a video.
Microsoft's Kinect controller offers the first affordable 3D camera that can be used to detect complex three-dimensional motions such as body language and gestures. It provides a compelling solution to motion tracking, which until now has typically been based on analyzing conventional RGB data from one or more video cameras.

Conventional motion tracking based on RGB data requires complicated algorithms to process a large amount of video data, making real-time applications hard to implement. The Kinect adds a depth camera that measures the distance between the subjects and the sensor by emitting infrared beams and analyzing their reflections. This gives us a way to dynamically construct a 3D model of whatever is in front of the Kinect at a rate of about 10-30 frames per second, fast enough to build interactive applications (see the video linked under the above image). For as little as $100, we now have a revolutionary tool for tracking the 3D motion of almost anything.
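For readers curious about the mechanics, here is a rough sketch in Python of how a depth frame might be turned into a 3D point cloud. This is not the Kinect SDK's actual API; the camera intrinsics and the synthetic frame are made-up stand-ins for real calibration data and sensor output.

```python
import numpy as np

# Assumed pinhole-camera intrinsics for a 640x480 depth frame
# (illustrative values, not real Kinect calibration data).
FX, FY = 580.0, 580.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def depth_to_points(depth_mm):
    """Back-project a depth frame (millimeters) into an (N, 3) point cloud in meters."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm / 1000.0                # mm -> m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    valid = depth_mm > 0                 # zero often means "no reading" on depth sensors
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)

# Synthetic frame: a flat "wall" 2 m away, with a patch of missing readings.
frame = np.full((480, 640), 2000, dtype=np.uint16)
frame[:10, :10] = 0
print(depth_to_points(frame).shape)      # (307100, 3)
```

Repeating this back-projection 10-30 times per second is what yields the dynamic 3D model described above.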

The demo video in this post shows an example of using the Kinect sensor to track and analyze the motion of a pendulum. The left part of the above image shows the trajectory and velocity vector overlaid on the RGB image of the pendulum, whereas the right part shows the slice of the depth data that is relevant to analyzing the pendulum.
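Once the depth data is in hand, the tracking itself can be quite simple. The sketch below is a hypothetical illustration rather than our actual code: it isolates the bob by keeping only the pixels inside an assumed depth window (the slice mentioned above), takes their centroid as the bob's position, and estimates velocity by a finite difference between consecutive frames.

```python
import numpy as np

def track_bob(depth_mm, near=800, far=1200):
    """Return the (row, col) centroid of pixels inside an assumed depth slice,
    which isolates the pendulum bob from foreground and background clutter."""
    mask = (depth_mm > near) & (depth_mm < far)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def velocity(p_prev, p_curr, dt):
    """Finite-difference velocity (pixels/second) between consecutive centroids."""
    return ((p_curr[0] - p_prev[0]) / dt, (p_curr[1] - p_prev[1]) / dt)

# With real hardware, these frames would come from the Kinect at ~30 fps.
frame1 = np.full((480, 640), 2000); frame1[200:210, 300:310] = 1000
frame2 = np.full((480, 640), 2000); frame2[205:215, 310:320] = 1000
p1, p2 = track_bob(frame1), track_bob(frame2)
print(velocity(p1, p2, dt=1 / 30))       # (150.0, 300.0)
```

Accumulating the centroids over time gives the trajectory overlay, and the velocity vectors can be drawn directly from the finite differences.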

The National Science Foundation provides funding for this work.

Wednesday, April 25, 2012

"Semi-digital" fabrication technologies

A street modeled using Energy3D.

Emerging digital fabrication technologies such as 3D printing could trigger a new industrial revolution, according to New Scientist. But while 3D printers are becoming more affordable, powerful, versatile, and fast, they will likely not be immediately available in the classroom.

Fabrication in schools is fundamentally important to engineering education. The lack of appropriate educational technology that supports students in transforming ideas into products could impede student learning and creativity. To meet schools' immediate needs and bridge the gap between now and the future, we have been developing a flagship app called Energy3D that provides a "semi-digital" solution for fabrication.

The current version of Energy3D focuses on designing, constructing, and testing model buildings. The program lets students conceive and design a building on the computer. It then converts the computer design into a sketch that can be printed on a conventional printer. Students can then cut out the pieces and assemble them into the building as designed. We call this technology "semi-digital" fabrication because, while the computer helps generate the sketch, students still cut and assemble manually.
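As a toy illustration of the unfolding idea (Energy3D's actual export is considerably more sophisticated), the Python sketch below lays out the four walls of a box-shaped building as flat rectangles that could be printed, cut out, and folded. All names and dimensions here are hypothetical.

```python
def unfold_box(width, depth, height):
    """Lay out the four walls of a box-shaped building as 2D rectangles
    (x, y, w, h), side by side, ready to print, cut out, and fold."""
    walls = [width, depth, width, depth]   # front, right, back, left
    layout, x = [], 0.0
    for w in walls:
        layout.append((x, 0.0, w, height))
        x += w
    return layout

for rect in unfold_box(width=4.0, depth=3.0, height=2.5):
    print(rect)   # (0.0, 0.0, 4.0, 2.5), (4.0, 0.0, 3.0, 2.5), ...
```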

There is a catch, however: this approach assumes that all the pieces are as thin as a sheet of paper. For education, this is perfectly fine, because it reduces the design and manufacturing complexity for young students, allowing them to address a tractable number of important questions related to math, architecture, engineering, and science.

We are going to the 2012 USA Science and Engineering Festival, held in Washington, DC, on April 28-29, to demonstrate this technology. If you happen to be there and are interested in seeing how it works, meet us at the Concord Consortium's Booth #2758 in Hall B.

Monday, March 26, 2012

Augmented reality thermal imaging

IR: Watch the YouTube video
Augmented reality (AR) presents a live view of the real world whose elements are augmented by computer-generated data such as sound or graphics. The technology promises to enhance the user's perception of reality. AR is considered an extension of virtual reality (VR), but unlike VR, which replaces the real world with a simulated one, AR bridges the real and simulated worlds and takes advantage of both.

Augmentation conventionally happens in real time and in semantic context with elements of the environment. With the help of AR technology, information about the user's surrounding real world becomes digitally manipulable: artificial information about the environment and its objects can be overlaid on the real world to achieve seamless effects and user experiences.

Our NSF-funded Mixed-Reality (MR) Labs Project has set out to explore how AR/MR technologies can support "augmented inquiry" to help students learn abstract concepts that cannot be directly seen or felt in purely hands-on lab activities.

AR: Watch the YouTube video
One of the first classes of prototypes we have built is what we call "Augmented Reality Thermal Imaging." Concepts related to heat and temperature are somewhat difficult for some students because thermal energy is invisible to the naked eye. Thermal energy can be visualized using infrared (IR) imaging, but we have developed AR technology that provides another means of "seeing" thermal energy and its flow.
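In both cases, the "seeing" boils down to false-color rendering: each temperature value is mapped to a color, as IR cameras do. The sketch below illustrates the idea with a simple blue-to-red ramp of our own invention; real IR palettes are more elaborate.

```python
import numpy as np

def false_color(temp, t_min=20.0, t_max=80.0):
    """Map a temperature field to an RGB image: cold -> blue, hot -> red.
    A simple linear ramp; real IR palettes are nonlinear and multi-hued."""
    t = np.clip((temp - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.empty(temp.shape + (3,))
    rgb[..., 0] = t           # red grows with temperature
    rgb[..., 1] = 0.0
    rgb[..., 2] = 1.0 - t     # blue fades with temperature
    return rgb

img = false_color(np.linspace(20, 80, 100).reshape(10, 10))
print(img[0, 0], img[-1, -1])   # cold pixel [0. 0. 1.], hot pixel [1. 0. 0.]
```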

The first image in this post shows an IR image of a poster board heated by a hair dryer. The second image shows a demo of AR thermal imaging: when a hair dryer blows hot air at a liquid crystal display (LCD), the AR system reacts as if the hot air could flow into the screen and leave a trace of heat on it, just like what we see in the IR image above. You may click the links below the images to watch the recorded videos.

The tricky part of MR Labs is that, in order to justify augmenting a physical activity with a computer simulation, the simulation must be a good approximation of what happens in the real world. We used our computational fluid dynamics (CFD) program, Energy2D, to accomplish this. Many more demos of MR Labs using Energy2D can be viewed at this website.
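Energy2D is a full-featured solver, but the heart of such a simulation can be conveyed in a few lines. The sketch below is a minimal explicit finite-difference step for the 2D heat equation, not Energy2D's actual algorithm; the parameters are arbitrary except that they satisfy the scheme's stability condition.

```python
import numpy as np

# Minimal explicit (FTCS) solver for the 2D heat equation dT/dt = alpha * laplacian(T).
# Stability for this scheme requires alpha * dt / dx**2 <= 0.25; here it is 0.2.
N, ALPHA, DX, DT = 64, 1.0, 1.0, 0.2

T = np.zeros((N, N))
T[28:36, 28:36] = 100.0                    # a hot spot, like the hair dryer's trace

for _ in range(500):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / DX**2
    T += ALPHA * DT * lap                  # np.roll gives periodic boundaries

print(round(T.max(), 2))                   # the peak decays as heat spreads
```

A real CFD code like Energy2D also handles convection, radiation, obstacles, and boundary conditions, but the time-stepped grid update is the same basic ingredient.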

Tuesday, March 6, 2012

Energy2D: Computational fluid dynamics at your fingertips

Online Energy2D simulations
Energy2D is our signature software for simulating heat transfer and fluid dynamics. Fifty online simulations are now available to the world through Energy2D's website. These simulations run speedily on most computers, bringing a vivid, colorful world of science to your computer screen and allowing you to experiment with them.

All these simulations can be downloaded for editing, provided that you have also installed the standalone Energy2D software on your computer (you don't need it to run the online simulations; you only need to install it to edit or create a simulation). The editing interface still has limited functionality, but we are hoping to make it ten times better in the future.

One of our next steps is to make a version that runs on Android, which will allow the simulations you have created to run on tablets and smartphones as well. Work is also underway to add other energy flows and transformations, enriching the natural phenomena Energy2D can simulate, and to integrate data from sensors to enable richer user interfaces.

The National Science Foundation provides the funding to make this possible.