Thursday, December 27, 2012

Engineering engineering research: Understanding the fabric of engineering design

A house designed using our Energy3D CAD software.
Perhaps the most important change in the Next Generation Science Standards to be released in March 2013 is the elevation of engineering design to the same level of importance as scientific inquiry (which was enshrined as a doctrine of science education in the 1996 science standards). But how much do we know about teaching engineering design in K-12 classrooms?

A house made using our Energy3D CAD software.
Surprisingly, our knowledge about students' learning and ideation in engineering design is dismally thin. The Committee on Standards for K-12 Engineering Education assembled by the National Research Council in 2010 found “very little research by cognitive scientists that could inform the development of standards for engineering education in K–12.” Most educational engineering projects have lacked the data collection and analysis needed to provide reliable evidence of learning. Many simply replicated the “engineering science” model from higher education, which focuses on learning basic science for engineering rather than learning engineering design. Little was learned from these projects about students’ acquisition of design skills and development of design thinking. In the absence of in-depth knowledge about students’ design learning, it is difficult to teach and assess engineering design.

In response to these problems, we have proposed a research initiative that will hopefully start to fill the gap. As in any scientific research, our approach is to first establish a theory of cognitive development for engineering design and then invent a variety of experimental techniques to verify research hypotheses based on the theory. This blog post introduces these ideas.

In order to study engineering design on a rigorous basis, we need a system that can automatically monitor student workflows and provide the fine-grained data we need to understand how students think and learn as they grow from novice designers into expert designers. This means we have no choice but to move the entire engineering design process onto the computer -- to be more exact, into computer-aided design (CAD) systems -- so that we can keep track of students' workflows and extract information for inferring their learning. While some educators may be uncomfortable with the virtualization of engineering design, this actually reflects contemporary engineering practice, which relies ubiquitously on CAD tools. If we have a CAD system, we can add data mining mechanisms to turn it into a powerful experimental system for investigating student learning. Fortunately, we have created our own CAD software, Energy3D, from scratch (see the above images), so we can do anything we want with it. Since most CAD tools are similar, the research results should be generalizable.

A cognitive theory of engineering design.
Next we need a cognitive theory of engineering design. Engineering design is interdisciplinary, dynamic, and complicated. It requires students to apply STEM knowledge to solve open-ended problems with a given set of criteria and constraints. It is such a complex process that I am almost certain no cognitive theory of it will be perfect. But without a cognitive theory our research would be aimless. So we must invent one.

Our cognitive theory assumes that engineering design is a process of “knitting” science and engineering. Inquiry and design are at the heart of science and engineering practices, respectively. In an engineering project, both types of practices are needed. All engineering systems are tested during the development phase. A substantial part of engineering is to find problems through tests in order to build robust products. The diagnosis of a problem is, as a matter of fact, a process of scientific inquiry into an engineered system. The results of this inquiry process provide explanations of the problem, as well as feedback to revise the design and improve the system. The modified system with new designs is then put through further tests. Testing a new design can lead to more questions worth investigating, starting a new cycle of inquiry. This process of interwoven inquiry and design repeats itself until the system is determined to be a mature product.

These elements in our cognitive theory all sound logical and necessary. Now the question is: if we agree on this theory, how are we going to make it happen in the classroom, and how are we going to measure its degree of success? Formative assessment seems to be the key. So the next thing we need to invent is a method of formative assessment. But what should we assess so as not to miss the whole picture of learning? This requires us to develop a deep understanding of the fabric of engineering design.

A time series model of design assessment.
Engineering design is a complex process that involves multiple types of science and engineering tasks and subprocesses that occur iteratively. Along with the calculable properties and attributes of the designed artifacts, the order, frequency, and duration with which learners handle these tasks provide invaluable insights into the fabric of engineering design. These data can be monitored and collected as time series. Formative assessment can then be viewed as the analysis of a set of time series, each representing an aspect of learning or performance. In other words, each time series logs a “fiber” of engineering design.

At first glance, the time series data may look stochastic, just like the Dow Jones index. But buried under the noisy data are students’ behavioral and cognitive patterns. Time series analysis, which has been widely used in signal processing and pattern recognition, will provide the analytic power to detect learner behaviors in the seemingly random data and then generate adaptive feedback to steer learning toward less arbitrary, more productive paths. For example, spectral or wavelet analysis can be used to calculate the frequency of using a design or test tool. Auto-correlation analysis can be used to find repeating patterns within a subprocess. Cross-correlation analysis can be used to examine whether an activity or intervention in one subprocess has resulted in changes in another. Cross-correlation thus provides a potentially useful tool for tracking a designer’s activity with regard to knowledge integration and systems thinking.
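To make this concrete, here is a minimal sketch (not part of Energy3D) of the kind of analysis described above, applied to hypothetical per-minute counts of "design" and "test" actions; the data and function names are made up for illustration.

```python
# Hedged sketch: auto- and cross-correlation of hypothetical action-count time series.
import numpy as np

def autocorrelation(x, max_lag):
    """Normalized autocorrelation of a series for lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return [np.dot(x[:len(x) - k], x[k:]) / denom for k in range(max_lag + 1)]

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation: does activity in x lead changes in y?"""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    denom = np.sqrt(np.dot(x, x) * np.dot(y, y))
    return [np.dot(x[:len(x) - k], y[k:]) / denom for k in range(max_lag + 1)]

# Hypothetical data: counts of design and test actions in each minute of a session.
design = [5, 8, 2, 0, 7, 9, 1, 0, 6, 8, 3, 0]
test   = [0, 1, 6, 7, 0, 1, 8, 6, 0, 2, 7, 8]

print(autocorrelation(design, 4))          # repeating pattern within one subprocess
print(cross_correlation(design, test, 4))  # whether design bursts precede test bursts
```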

In the next six months, we will undertake this ambitious research project and post our findings in this blog as we move forward. Stay tuned!

Wednesday, December 12, 2012

Detecting students' "brain waves" during engineering design using a CAD tool

Design a city block with Energy3D.
We have been in a school for the past two weeks working on a project that aims to understand how students learn engineering design. This has been a difficult research topic, as engineering design is an extremely complicated cognitive process that involves the application of science and mathematics -- two complicated subjects in their own right.


Two types of problems are commonly encountered in the classroom. The first is related to the "cookbook" approach, which confines students to step-by-step procedures to complete a "design" project. I put "design" in quotation marks because this kind of project often leads to identical or similar products from students, violating the first principle of design, which mandates alternatives and variety. However, if we make the design project completely open-ended, we run into the second type of problem: the arbitrariness and caprice of student designs often make it difficult for teachers and researchers to assess student thinking and learning reliably. As much as we want students to be creative and open-minded, we also want to ensure that they learn what is intended, and we must provide an objective way to evaluate their learning outcomes.


To tackle these issues, we are taking a computer science-based approach. Computer-aided design (CAD) tools offer an opportunity for us to move the entire process of engineering design to the computer (this is, after all, what CAD tools were designed for in industry in the first place). What we need to do in our research is to add a few more things to support data mining.

A sample design of the city block.
This blog post reports on a timeline tool we have developed to measure students' activity levels while they use a CAD tool (our Energy3D CAD software in this case) to solve a design challenge. This timeline tool is basically a logger that records the number of design actions the learner has taken, sampled at a given frequency (say, 2-4 times a minute) during a design session. These design actions are defined as the "atomic" actions stored in the Undo Manager of the CAD tool we are using. The timeline approximately describes the user's frequency of construction actions with the CAD tool. As human-computer interaction is ultimately driven by the brain, this kind of timeline data could be regarded as a reflection of the user's "brain wave."
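For readers who wonder what such a logger might look like, here is a minimal sketch of the sampling idea; the class and method names (e.g., action_count) are hypothetical placeholders, not the actual Energy3D API.

```python
# Hedged sketch: sample an undo history at a fixed interval and record how many
# new actions were taken since the last sample.
import time

class ActionTimeline:
    def __init__(self, undo_manager, interval_seconds=20):
        self.undo_manager = undo_manager   # assumed to expose action_count()
        self.interval = interval_seconds   # e.g., 20 s ~ 3 samples per minute
        self.samples = []                  # list of (timestamp, new_actions)
        self._last_count = 0

    def sample_once(self):
        count = self.undo_manager.action_count()
        self.samples.append((time.time(), count - self._last_count))
        self._last_count = count

    def run(self, duration_seconds):
        end = time.time() + duration_seconds
        while time.time() < end:
            self.sample_once()
            time.sleep(self.interval)
```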

There are four things that characterize such a timeline graph:

A sample timeline graph.
  • The height of a spike measures the action intensity at that moment, i.e., how many actions the user has taken since the last recording;
  • The density of spikes measures the continuity and persistence of actions over a time period;
  • A gap indicates an off-task time window: A short idling window may be an effect of instruction or discussion;
  • The trend of height and density may be related to loss of interest or improvement of proficiency in the CAD tool: If the intensity (the combination of height and density of spikes) drops consistently over time, the student's interest may be fading away; if the intensity increases consistently over time, the student might be improving on using the design tool to explore design options.
Timeline graphs from six students.
Of course, this kind of timeline data is not perfect. It certainly has many limitations in measuring learning. We are still in the process of analyzing these timeline data and juxtaposing them with other artifacts we have gathered from the students to provide a more comprehensive picture of design learning. But the timeline analysis represents a rudimentary step towards a more rigorous methodology for performance assessment of engineering design.

The above six "brain wave" graphs were collected from six students in a 90-minute class period. Hopefully, these data will lead to a way to identify novice designers' behaviors and patterns when they are solving a design challenge.

Friday, November 23, 2012

Revisiting Educational Parallel Computing

About four years ago, I dreamed about how multicore computing could push educational computing into a high-performance era. It turns out that progress in multicore computing has been slow. The computer I am using to write this blog post has four physical cores that support eight virtual cores, but I don't feel it is dramatically faster than my previous one, bought more than six years ago. Worse, it feels much slower than my new ThinkPad tablet, which is powered by a recent dual-core Intel Core i7 processor. The fact that a dual-core Intel CPU beats a quad-core Intel CPU suggests something must be wrong in the multicore business.

Before the real promise of multicore computing arrives in my computers, two other things have changed the landscape of personal parallel computing. The first is general-purpose computing on graphics processing units (GPGPU), which uses the hundreds of processors in a graphics card to perform calculations traditionally done on CPUs. OpenCL and CUDA are currently the two main frameworks that let developers write parallel code to leverage GPU power (or the power of hybrid CPU/GPU systems).

The second is cloud computing. Public clouds provide access to thousands of processors. IT companies have developed cutting-edge technologies that make modern search engines so fast. Can they be used to accelerate scientific simulations on your tablets? The Magellan Report on Cloud Computing for Science published by the U.S. Department of Energy last year provides some perspectives from the science community. Cloud gaming provides some complementary perspectives from the game industry. Putting these insights together, I think there is an opportunity here for educational technology developers who want to deliver killer animations for digital textbooks or online courses. After all, like games, the competition in the education media market will eventually be driven by the quality of animations. And when it comes to animations, high quality usually means realistic details and fast renderings.

GPGPU and cloud computing represent a departure from multicore computing toward many-core computing. Regardless of what the future of computing looks like, parallel computing is not optional -- it is inevitable. Educational technology can benefit from this paradigm shift if we take action now.

Tuesday, November 6, 2012

InfraMation Keynote Delivered

Orlando is the center of the thermal imaging universe on November 6-8, when it hosts the largest infrared imaging conference in the world: InfraMation. Invited by FLIR Systems, I gave a keynote speech on the educational applications of IR imaging in this morning's Opening Plenary, and I felt that it was very well received. The PEPSI joke about how to use an IR camera to produce a PEPSI logo (see the second image in this post) was a hit. Everyone laughed.

Here is the link to download my slides in PDF format (34MB). 

Once again, I was thrilled by the power of IR imaging and how this kind of technology can knock down the barriers between disciplines. Although we are an educational technology organization whose primary mission is to teach science, we have no reason to be overly modest, because the science we are seeing through our IR cameras is exactly the same science the industry folks are seeing through theirs. Our original discoveries, intended to teach students science concepts, were recognized by world leaders in IR imaging technologies such as Prof. Dr. Michael Vollmer from the University of Applied Sciences in Brandenburg, Germany, in their publication intended for researchers and professionals. With cutting-edge yet easy-to-use technologies like IR imaging, the line between research and education has never been so blurry. This ought to get science educators thinking about the possibilities opened up by new technologies. We keep hearing some educators push back by asserting that children are not scientists and cannot think or act like scientists. This kind of argument largely neglects the advancement of technology and throws away the opportunities it brings along. It is time for a change, or at least a try.

Wednesday, October 24, 2012

Molecular Workbench downloaded over one million times

I checked our Web log today and the statistics showed that the Molecular Workbench software (Java version) has been downloaded 1,014,439 times since 2005. This number doesn't include instances in which MW is embedded in other software or run as an applet. Nor does it include downloads by the 30+ employees of the Concord Consortium, who could otherwise inflate the data a bit.

While I can't say this number translates into a million people (though, on the other hand, teachers often have multiple students working together in front of one computer), it is still a significant number that represents a substantial international user base, indicating that the need for this kind of simulation is probably real.

Funders often scrutinize whether their investments will turn out to be worthwhile. The story of MW suggests a potential weakness in the typical cost-effectiveness analysis based on the initial investment: federal funding for a project may take a long time to pay off, and the impact tends to accelerate after a critical mass is reached. I bet that the two-million milestone will be reached much sooner than the seven years it took for the first million.

Saturday, October 20, 2012

Think Molecularly: Incredibly Simple Infrared Imaging Experiments Open a Door to Incredibly Deep Scientific Explorations

Figure 1
One of the most fascinating parts of science is the search for answers to strange phenomena. In the past nine months, I have posted more than fifty IR videos on my Infrared YouTube channel. These experiments are all very easy to do, but not all of them are easy to explain and some of them appear to be quite strange at first glance. In this article, I will try to explain one of those experiments, with one of my other skills -- molecular simulation.

Warming surprise

This incredibly simple IR experiment involves putting a piece of paper above a cup of (nearly) room-temperature water (Figure 1). I hear you saying: what's the big deal? You have probably done that several times in your life, for whatever reason.

If you happen to have an IR camera and you watch this process through it, you may be surprised. Many of you know that water in an open cup is slightly cooler (1-2°C lower) than room temperature because of evaporative cooling: the constant evaporation of water molecules from liquid water takes thermal energy away from the cup and keeps it a bit cooler than the room (which is why you feel cold when you step out of a swimming pool). You may think that the paper would also cool down when you put it on top of the water because, at room temperature, paper is a bit warmer than the water in the cup and, based on what your high school science teacher told you, heat should flow from the warmer paper to the cooler water, causing the temperature of the paper to drop a bit.

Figure 2 (Watch it in YouTube)
But the result was exactly the opposite -- the paper actually warmed up (Figure 2)! And the warming appeared to be pretty significant -- up to 2°C could be observed on a dry winter day. I don't know your reaction to this finding, but I was baffled when I saw it, because the effect at first appeared to be a violation of the Second Law of Thermodynamics (which, of course, is impossible)! In fact, the reason I did this experiment at the time was to figure out how sensitive my IR camera might be. My intention was to exploit the evaporative cooling of water to provide a small, stable thermal gradient. I was examining whether the IR camera could capture the weak heat transfer between the water and the paper.

Figure 3 (Watch it in YouTube)
I quickly figured out that the culprit responsible for this surprising warming phenomenon must be the water vapor, which we cannot see with the naked eye. But what we can't see doesn't mean it doesn't exist. When water molecules in the vapor encounter the surface molecules of the paper, they are captured (this is known as adsorption). When more and more water molecules are captured and condense onto the paper surface, they return to the liquid state and, according to the Law of Conservation of Energy, release the excess energy they carry, which causes the paper to warm up. In other words, the paper somehow recovers the energy that the water in the cup lost through evaporation. As you can see now, this is a pretty delicate thermodynamic cycle that connects two phase changes, evaporation and condensation, in two different places, along with their latent heats. The physicists among us would appreciate it if I said that this shows entropy at work: evaporation is an entropic effect caused by water molecules maximizing their entropy by leaving their more organized liquid state. The interaction between the vapor molecules and the paper molecules acts to reverse this process by returning the water molecules to the condensed liquid state, and a certain amount of net energy can be extracted from this (known as the enthalpy of vaporization). Of course, the adsorption process itself, caused by hydrogen bonding between water molecules and paper molecules, releases some heat that contributes to the warming effect as well.

Figure 4: Sensor results.
At this point, I hope you have been enticed enough to want to try this yourself. If you don't have an IR camera, you can use a temperature sensor or an IR thermometer as a substitute to observe this phenomenon (undoubtedly, nothing beats an IR camera in terms of seeing heat -- with a point thermometer you just need to be patient and willing to do more tedious work).

But wait, this is not the end of the story!

Dynamic equilibrium

If you keep observing the paper, you will see that this condensation warming effect will diminish in a few minutes (Figure 3). This trend is more clearly shown in Figure 4 in which the temperature of the paper was recorded for ten minutes using a fast-response surface temperature sensor. What the heck happened?

Figure 5 (Watch it in YouTube)
The answer to this question can be illustrated using a schematic molecular simulation (Figure 5) that I designed to explain the underlying molecular physics (in that simulation, water molecules are simplified as single round particles). After water molecules condense onto the paper surface, a thin layer of condensate forms. When it becomes thick enough, water molecules will evaporate from it, too, just as they do from the surface layer of water in the cup. When the rate of evaporation equals the rate of condensation, there is no more net warming: the condensation warming and evaporative cooling eventually reach a "break-even" point. Reaching this equilibrium state doesn't mean that condensation and evaporation on the surface of the paper stop. In fact, water molecules keep condensing onto the layer and evaporating from it. This is known as "dynamic equilibrium." If you move the paper, you will break this dynamic equilibrium. Figure 6 shows a pattern in which evaporative cooling and condensation warming occurred simultaneously on a single piece of paper after the paper had been shifted a bit. In Figure 6, evaporation dominated in the blue zone that was shifted out of the cup area, condensation dominated in the white zone that was shifted into the cup area, and the overlap zone in the middle remained close to the equilibrium state because it still remained above the cup -- so business as usual.
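For the quantitatively inclined, here is a toy rate model (assumed units and coefficients, not the actual molecular simulation) showing how the net warming decays as the condensate film approaches dynamic equilibrium.

```python
# Toy model: condensation onto the paper proceeds at a roughly constant rate set
# by the vapor, while evaporation from the growing film increases with the film's
# coverage until the two rates balance.
condensation_rate = 1.0    # arbitrary units of molecules per time step
evaporation_coeff = 0.05   # evaporation rate per unit of film coverage
film = 0.0                 # amount of condensate on the paper
dt = 1.0

for step in range(200):
    evaporation_rate = evaporation_coeff * film
    net = condensation_rate - evaporation_rate   # net condensation -> net warming
    film += net * dt
    if step % 40 == 0:
        print(f"step {step:3d}  film {film:7.2f}  net warming rate {net:5.2f}")

# The net rate decays toward zero: condensation and evaporation continue,
# but in dynamic equilibrium they no longer change the paper's temperature.
```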

Figure 6 (Watch it in YouTube)
As you can see, there is a lot of science in this "simple" experiment! But nothing we have done so far requires expensive materials or supplies. Everything needed for this experiment is probably within arm's reach if you are reading this article at home (and you happen to have a digital thermometer, or better, an IR camera, nearby). If you are an educator, this experiment should fascinate you because it makes a perfect inquiry activity for students. If you are a scientist, this experiment should fascinate you because what I have shown you is in fact an atomic layer deposition experiment that anyone can do -- a Fermi calculation suggests that the thickness of the layer is in the nanometer range (only a few hundred layers of water molecules or 1/10,000th of the diameter of your hair). What we are seeing is in fact a signal from the nanoscale world! Isn't that cool?
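If you want to redo the Fermi calculation yourself, here is a back-of-the-envelope version with assumed numbers for the paper's mass, heat capacity, and area; only the order of magnitude matters.

```python
# Rough estimate: how thick must the condensed water film be to warm a sheet
# of paper by ~2 degrees C?
paper_mass = 5.0             # g, roughly one sheet of printer paper
paper_specific_heat = 1.3    # J/(g*K), typical for paper
delta_T = 2.0                # K, observed warming
latent_heat = 2260.0         # J/g, heat released when water vapor condenses
paper_area = 600.0           # cm^2, roughly one side of the sheet
water_density = 1.0          # g/cm^3

heat_needed = paper_mass * paper_specific_heat * delta_T   # ~13 J
water_mass = heat_needed / latent_heat                     # ~6 mg of condensed water
thickness_cm = water_mass / (water_density * paper_area)

print(f"film thickness ~ {thickness_cm * 1e7:.0f} nm")          # on the order of 100 nm
print(f"~ {thickness_cm * 1e7 / 0.3:.0f} molecular layers")     # a water molecule is ~0.3 nm
```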

Figure 7 (Watch it in YouTube)
Does our story end now? Absolutely not. The new questions you can ask are practically endless if you keep "thinking molecularly." The following are six extended questions I have asked myself. You can try to explore all of these without leaving your kitchen.

When will the paper cool down?

Returning to the original purpose of my experiment (looking for cooling due to heat transfer), can we find a situation in which we will indeed see cooling instead of warming? Yes, if the water is cold enough (Figure 7). When the water is cold, the evaporation rate drops, so fewer water molecules hit the surface of the paper. The energy gained from the weaker condensation warming cannot compensate for the energy lost through heat transfer between the paper and the cold water. (By the way, I think the heat transfer in this case is predominantly radiative, because air doesn't conduct heat well and natural convection acts against heat transfer in this configuration.)

What if the paper has been atop the water for a long time?
Figure 8 (Watch it in YouTube)

If you leave the paper atop the cup of water (water slightly cooler than room temperature, not ice water) for a few hours and come back to examine it, you will probably be surprised again: the paper is now cooler than room temperature (Figure 8). I wouldn't be surprised if you are totally confused now: this warming and cooling business is indeed quite complicated -- even though everything we have done so far has been limited to manipulating paper and water. To keep the story short, I will tell you that this is because water molecules have traveled through the porous layer of the paper by capillary action and shown up on the other side (this molecular movement is often known as percolation). Their evaporation from the upper side of the paper cools it down. The building science folks among us can use this experiment to teach moisture transport through materials. Can the temperature of the upper side somehow be used to gauge the moisture vapor transmission rate (MVTR) of a porous material? If so, this may provide a way to automatically measure the MVTR of different materials. The American Society for Testing and Materials has already established a standard based on IR sensors. Perhaps this experiment can be related to that.

Different materials have different dew points?

Figure 9 (Watch it in YouTube)
Do water molecules condense on other materials such as plastic? We know plastic materials do not absorb water (which is why they are good vapor barriers). If plastic materials are not cold enough, water molecules do not condense on them. Figure 9 shows this difference using a piece of paper half-covered by a transparency film taped to the underside. Warming was only observed in the paper part, indicating that water molecules do not condense on the plastic film. This experiment raises an interesting question: the so-called dew point, the temperature below which the water vapor in the air at a constant barometric pressure condenses into liquid water, may not be an entirely reliable way to predict condensation. Condensation actually depends on the chemical properties of the material's surface. Hydrophobic (water-hating) materials like plastic tend to have a low dew point, whereas hydrophilic (water-loving) materials tend to have a high dew point. The porosity of the material should matter, too, because a more porous material provides a larger surface for interaction with water molecules -- paper happens to be such a material because of its fibrous texture.

Figure 10 (Watch it in YouTube)
Vapor pressure depression

What will happen if we add some salt (or baking soda or sugar) to the water? Figure 10 shows that the condensation warming effect becomes weaker. For our chemist friends, this is known as vapor pressure depression: the salt ions do not evaporate themselves, but their presence in the solution lowers the equilibrium vapor pressure of the water (Raoult's law), which slows down the evaporation of water molecules.
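For those who want numbers, here is a rough Raoult's-law sketch with assumed concentrations showing how dissolved ions lower the equilibrium vapor pressure of the water.

```python
# Rough illustration of vapor pressure depression; the salt concentration and
# temperature are assumptions for illustration only.
p_pure = 2.34          # kPa, vapor pressure of pure water near 20 C
moles_water = 55.5     # mol of water per liter
molality_salt = 1.0    # mol NaCl added per liter (hypothetical)
moles_solute = 2 * molality_salt   # NaCl dissociates into ~2 ions

x_water = moles_water / (moles_water + moles_solute)
p_solution = p_pure * x_water      # Raoult's law: p = x_water * p_pure

print(f"mole fraction of water: {x_water:.3f}")
print(f"vapor pressure drops from {p_pure:.2f} kPa to {p_solution:.2f} kPa")
```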

A vapor column?

What will happen if the paper approaches the water from a different angle, such as vertically? What does the distribution of water vapor above a cup of water look like? Does it look like the steam rising from a cup of coffee? Figure 11 can probably give you some clues.

What about alcohol?

Figure 11 (Watch it in YouTube)
So far we have used only water. What about other liquids? Alcohol is pretty volatile, so I tried some isopropyl alcohol (91%). Once again, I was baffled. Our experience with applying rubbing alcohol to our skin says that alcohol cools faster than water, so I expected that when the isopropanol molecules condense, they would release more heat. But this is not what Figure 12 suggests! Given that the enthalpies of vaporization of alcohol and water are 44 and 41 kJ/mol, respectively, the only sensible explanation may be that the warming effect is due not only to the condensation of the vapor molecules but also to the interaction between the vapor molecules and the cellulose molecules of the paper. If the interaction between an alcohol molecule and a cellulose molecule is weaker, then the adsorption rate will be slower and the adsorption of the alcohol molecule onto the paper surface will produce less heat. I don't know how to prove this now, but it could be a good topic of research.
Figure 12 (Watch it in YouTube)

Concluding remark

Even though this is a lengthy article (thanks for making it to the end), I am pretty sure that the scientific exploration does not stop here. There are other questions you can ask yourself. For my part, I have been intrigued by the fascinating thermodynamic cycles in a humble cup of water and have been wondering if they could be used to engineer something that harvests the latent heats of evaporation and condensation. In other words, could we turn a cup of water into a tiny power plant to, say, charge my cell phone? In theory, this is possible, as any temperature gradient, however weak, can be translated into electric current using a thermoelectric generator based on the Seebeck effect. The evaporation of water molecules from an open cup is a free gift of entropy from Mother Nature that ought to be harnessed someday.
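As a sanity check on the "tiny power plant" idea, here is a rough upper-bound estimate with assumed module parameters; the real output would be far lower, and far below what a phone charger needs, but it is not zero.

```python
# Rough Seebeck-effect estimate: open-circuit voltage V = S * dT, and the
# maximum power into a matched load is ~ V^2 / (4 R). All numbers are assumed.
seebeck = 0.05     # V/K, typical for a small commercial Bi2Te3 module (many couples)
delta_T = 1.5      # K, the weak gradient sustained by evaporative cooling
resistance = 2.0   # ohm, assumed internal resistance of the module

voltage = seebeck * delta_T
power = voltage ** 2 / (4 * resistance)

print(f"open-circuit voltage ~ {voltage * 1000:.0f} mV")
print(f"maximum power ~ {power * 1e6:.0f} microwatts")
# Even this optimistic bound is thousands of times below the watts a phone charger draws.
```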

Wednesday, October 17, 2012

Energy3D Version 1.0 released!

Looking for free tools to teach engineering design in K-12 classrooms? We are pleased to announce that Energy3D Version 1.0 is now available for free download at http://energy.concord.org/energy3d. Energy3D is a computer-aided design and fabrication tool for designing and making model buildings. With it, your students can easily conceptualize a dream house on the computer, print and assemble a real model, and take it home to show their parents!

Energy3D works just like Google's SketchUp: You can create a 3D structure by drag-and-drop -- no number crunching is required. But unlike SketchUp, it is tailor-made for building design, evaluation, and fabrication to support engineering design learning in K-12 schools. One of its great features is the "print-out" functionality, which allows students to print out the houses they designed using a regular printer and then cut out the 2D pieces for 3D assembly (see the second image in this blog post).

You can imagine how Energy3D may work for your students by looking at the houses designed by a class of high school students in the third image of this blog post. The tool is very easy to use and works well even for young kids. So if you are teaching in an elementary school, give it a try and tell us how it can be improved for younger students.


The development of Energy3D has been funded by the National Science Foundation under the Engineering Energy Efficiency Project. Dr. Saeid Nourian, a computer scientist with a Ph.D. from the University of Ottawa, has been the primary developer since joining the project in 2010. The software is based on Ardor3D, an open-source scene graph engine, and requires Java to be installed.


Sunday, September 23, 2012

A Visual Approach to Nanotechnology Education

A hypothetical nano sorting machine.
The International Journal of Engineering Education published our paper "A Visual Approach to Nanotechnology Education." The paper presents a systematic approach based on scientific visualization to teaching and learning concepts in nanoscience and nanotechnology. Five types of mathematical models are used to generate visual, interactive simulations that provide a powerful software environment for experiential learning through virtual experimentation. These five types, which are implemented in the Molecular Workbench software, are:
  • All-atom molecular dynamics
  • Coarse-grained molecular dynamics
  • Gay-Berne molecular dynamics
  • Soft-body biomolecular dynamics
  • Quantum dynamics (including real space and imaginary space)
The nanotechnology content areas covered by this approach are discussed. These areas include notoriously difficult subjects such as statistical mechanics and quantum mechanics.

A Gay-Berne model of molecular self-assembly.
A variety of instructional strategies for effective use of these simulations are discussed. These inquiry-based strategies cover use in lecture, student-centered exploration, and student model construction.

Preliminary results from a pilot study at the college level, conducted by Dr. Hee-Sun Lee at the Department of Physics, University of California, Santa Cruz, demonstrated the potential of this approach for improving nanotechnology learning.

Tuesday, August 21, 2012

Natural learning interfaces

Natural user interfaces (NUIs) are the third generation of user interfaces for computers, after command-line interfaces and graphical user interfaces. A NUI uses natural elements or natural interactions (such as voice or gestures) to control a computer program. Being natural means that the user interface is built upon something most people are already familiar with, so the learning curve can be significantly shortened. This ease of use allows computer scientists to build more complicated but richer user interfaces that simulate the ways people already interact with the real world.

Research on NUIs is currently one of the most active areas in computer science and engineering, and it is one of the most important directions of Microsoft Research. In line with this future, our NSF-funded Mixed-Reality Labs (MRL) project has proposed a novel concept called Natural Learning Interfaces (NLIs), which represents our latest ambition to realize the educational promise of cutting-edge technology. In the context of science education, an NLI provides a natural user interface for interacting with a scientific simulation on the computer. It maps a natural user action to the change of a variable in the simulation. For example, the user uses a hot or cold source to control a temperature variable in a thermal simulation, or exerts a force to control the pressure in a gas simulation. NLIs use sensors to acquire real-time data that are then used to drive the simulation in real time. In most cases, this involves a combination of multiple sensors (or multiple types of sensors) to feed more comprehensive data to the simulation and to enrich the user interface.
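To make the idea concrete, here is a minimal sketch of the sensor-to-simulation coupling; the sensor and simulation classes are hypothetical stand-ins, not the actual Mixed-Reality Labs code.

```python
# Hedged sketch: map a real-time sensor reading onto a simulation variable.
import random

class TemperatureSensor:
    def read_celsius(self):
        # stand-in for a real hardware read
        return 20.0 + random.uniform(-0.5, 0.5)

class ThermalSimulation:
    def set_boundary_temperature(self, celsius):
        print(f"simulation boundary set to {celsius:.2f} C")

def couple(sensor, simulation, gain=1.0, offset=0.0):
    """Feed one sensor reading into the simulation, optionally rescaled."""
    simulation.set_boundary_temperature(gain * sensor.read_celsius() + offset)

# In a real NLI this would run in a loop at the sensor's sampling rate,
# possibly fusing several sensors before driving the simulation.
couple(TemperatureSensor(), ThermalSimulation())
```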

I have recently invented a technology called the Frame, which may provide a rough idea of what NLIs may look like as an emerging learning technology for science education. The Frame technology is based on the fact that the frame of a computer screen is the natural boundary between the virtual world and the physical world and is, therefore, an intuitive user interface for certain human-computer interactions. Compared with other interfaces such as touch screens or motion trackers, the Frame allows users to interact with the computer from the edges of the screen.

Collaborating with Jennie Chiu's group at the University of Virginia (UVA), we have been working on a few Frame prototypes that will be field tested with several hundred Virginia students in the fall of 2012. These Frame prototypes will be manufactured using UVA's 3D printers. One of the prototypes shown in this blog post is a mixed-reality gas lab, which was designed for eighth graders to learn about the particulate nature of the temperature and pressure of a gas. With this prototype, students can push or pull a spring to exert a force on a virtual piston, or use a cup of hot water or ice water to adjust the temperature of the virtual molecules. The responsive simulation will immediately show the effect of those natural actions on the state of the virtual system. Besides the conventional gas law behavior, students may discover something interesting. For example, when they exert a large force, the gas molecules can be liquefied, simulating a gas liquefying under high pressure. When they apply a force rapidly, a high-density layer is created, simulating the initiation of a sound wave. I can imagine that science centers and museums may be very interested in using this Frame lab as a kiosk for visitors to explore gas molecules in a quick and fun way.

A mixed-reality gas lab (a Frame prototype)
As these actions can happen concurrently, two students can control the simulation using two different mechanisms: changing temperature or changing pressure. This makes it possible for us to design a student competition in which two students use these two different mechanisms to push the piston into each other's side as far as possible. To the best of our knowledge, this is the first collaborative learning activity of this kind mediated by a scientific simulation.

NLIs are not just the result of some programming fun; they are deeply rooted in cognitive science. Constructivism views learning as a process in which the learner actively constructs or builds new ideas or concepts based upon current and past knowledge or experience. In other words, learning involves constructing one's own knowledge from one's own experiences. NLIs are learning systems built on what learners already know or what feels natural to them. The key to an NLI is that it engineers natural interactions that connect prior experiences to what students are supposed to learn, thus building a bridge for stronger mental association and deeper conceptual understanding.

Friday, August 17, 2012

Energy2D to reach thousands of schools

Thermoregulation
Project Lead The Way (PLTW) is the leading provider of rigorous and innovative Science, Technology, Engineering, and Mathematics (STEM) education curricular programs used in middle and high schools across the US. The PLTW Pathway To Engineering (PTE) program includes a foundational course called Principles of Engineering (POE) designed for 10th and 11th grade students. The PLTW curriculum currently reaches 4,780 schools.

According to Bennett Brown, Associate Director of Curriculum and Instruction of PLTW, our Energy2D software will be adopted in the POE curriculum to support a variety of core engineering concepts including power, energy, heat transfer, controls, and environmental factors.
Solar heating cycles

Since the release of the first alpha version in 2011, Energy2D has already been used by thousands of users worldwide, but the collaboration with PLTW will be a big step forward for Energy2D to reach more students. The timing of this collaboration is particularly important to engineering tools such as Energy2D, as--for the first time--engineering has been officially written into the US K-12 Science Education Standards. Once the Standards roll out, thousands of teachers will be looking for leading-edge tools that can help them teach engineering. This will be a great opportunity for Energy2D.

Why is Energy2D so special that people want to use it? Our website provides many self-explanatory examples, but there is one hidden gem I want to emphasize here: its computational engine is based on algorithms I devised specifically for this simulator. Its heat solver is accurate enough that a simulation can keep the total energy of an isolated system constant to within 99.99% for as long as it runs, regardless of the complexity of the structures in the system! The fact that the sum of the energy in all 10,000 grid cells remains essentially constant after billions of individual calculation steps reflects a holy grail of science and engineering. If anything, engineering is about accuracy. A good engineering tool should help students develop a good engineering habit of mind, and accuracy should be a paramount part of it.
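To illustrate the kind of invariant I am talking about, here is a minimal sketch (not Energy2D's actual solver) of an explicit finite-difference heat solver on an insulated grid, with the total energy checked after many steps.

```python
# Hedged sketch: explicit 2D diffusion with no-flux boundaries conserves the
# sum of the temperature field (i.e., total thermal energy for uniform cells).
import numpy as np

n, alpha, steps = 100, 0.2, 10000     # grid size, diffusion number (alpha*dt/dx^2), steps
T = np.random.rand(n, n)              # arbitrary initial temperature field
total0 = T.sum()                      # total energy, assuming uniform cell heat capacity

for _ in range(steps):
    # insulated (no-flux) boundaries: pad by repeating edge values
    P = np.pad(T, 1, mode="edge")
    lap = P[:-2, 1:-1] + P[2:, 1:-1] + P[1:-1, :-2] + P[1:-1, 2:] - 4 * T
    T = T + alpha * lap               # explicit diffusion update

drift = abs(T.sum() - total0) / total0
print(f"relative energy drift after {steps} steps: {drift:.2e}")   # ~ machine precision
```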

Wednesday, August 8, 2012

The first Earth science simulation in Energy2D is here: Mantle convection!

It is my goal to make the Energy2D software a powerful simulation tool for a wide audience. Last week I added some engineering examples and blogged about them.

Last night I came up with an idea for simulating mantle convection, the slow creeping motion of Earth's rocky mantle caused by convection currents carrying heat from the interior of the Earth to the surface. It turned out that the idea worked.
 
This blog post demonstrates the first geoscience simulation created using Energy2D. The two screenshots show mantle convection at different times. The streamlines in the second image represent the convective currents. From the simulation, you can see the gradual cooling of the core due to mantle convection--this happens over a time frame of billions of years, but a computer simulation can show it in a few seconds. For simplicity, we don't distinguish between the inner core and the outer core in this model. Later, we can build a more complex model that includes these subtle details.

The simulation is available online at: http://energy.concord.org/energy2d/mantle.html. Take a look and stay tuned for more Earth science simulations--brought to you by Energy2D!

Friday, August 3, 2012

Energy2D V1.0 released!

The first stable version of Energy2D, an open-source and free heat transfer simulation tool made possible by funding from the National Science Foundation, is now available for download. The program can be installed as a desktop app and used to create high-quality simulations that can be deployed on the Internet as applets. It comes with about 40 templates to help you get started designing your own simulations. The Energy2D website provides plenty of examples that show how you can integrate your simulations into your own websites. The examples cover a wide range of topics in heat transfer, fluid dynamics, and thermal engineering. Thermal engineering is a major feature added recently and will be expanded in the future. The example to the right, "How solar cycles affect the duty cycle of a thermostat," showcases this new feature.

When you click the "Java Webstart Installer" on the website, the software will be automatically downloaded and installed on your desktop. The website's Download page has detailed information for how to publish your Energy2D simulations or integrate them with your web stuff.

If you have used the Energy2D app before, you will need to remove the previous installation in order to enjoy the convenience of the full OS integration that this version offers. For Windows users, go to "Control Panel > Java." For Mac users, go to the Java Preferences. In either case, you can find the previous installation under "Temporary Internet Files."

If you have just used the online applets on our website but haven't downloaded the app, there is nothing you need to remove. Although it is perfectly fine to use the online applets as they are, we think you should try the app--It will give you the full ability to create, design, and test.

Friday, July 27, 2012

Thermostats in Energy2D

A thermostat is a controller that maintains a system's temperature near a set point. The simplest thermostat does this by switching a heater or AC on and off to maintain the desired temperature (known as bang-bang control). I spent a couple of days adding thermostats to Energy2D and developing a simple GUI for setting them up.

In Energy2D, a thermostat is a connection between a power source and a thermometer. A thermometer can be linked to any number of power sources, but a power source can only be linked to one thermometer. In the property window of a thermometer, the user can select the power sources it will control.
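For the curious, here is a minimal sketch of the bang-bang logic described above (not the actual Energy2D code), with an assumed deadband and a toy room model.

```python
# Hedged sketch: bang-bang control switches the heater on below the set point
# minus a deadband and off above the set point plus a deadband.
def bang_bang(temperature, set_point, heater_on, deadband=0.5):
    """Return the new on/off state of the heater for one thermostat reading."""
    if temperature < set_point - deadband:
        return True                 # too cold: switch the heater on
    if temperature > set_point + deadband:
        return False                # too warm: switch it off
    return heater_on                # inside the deadband: keep the current state

# Toy usage: a room that loses heat to the outside and gains heat when the heater is on.
T, heater, set_point = 15.0, False, 20.0
for minute in range(120):
    heater = bang_bang(T, set_point, heater)
    T += (2.0 if heater else 0.0) - 0.1 * (T - 5.0)   # heating minus losses to a 5 C exterior
    print(f"minute {minute:3d}  T = {T:5.2f}  heater {'on' if heater else 'off'}")
```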

This Energy2D model demonstrates how a thermostat works. Turn on the temperature graph. Let the simulation run for a few cycles and then turn on the sunlight. Compare the behavior of the temperature graph. You can also try to move the temperature sensor around to examine how the on/off time of the thermostat depends on its location.

You should discover from this simulation that, when the sun shines on the house, it ends up using less energy to maintain the inside temperature because the time that the heater is on is shorter (see the differences of the two graphs in the first two images of this post). You should also find out why we should not put the sensor of a thermostat near a window.

The third image shows multiple thermostats at work to create different heating zones. This Energy2D simulation has four heaters in three rooms, each of which is controlled by a thermostat. 

From these demos of thermostats in Energy2D, you can see the richness of the software. I will add more useful features like this to make Energy2D even better. Stay tuned!

Monday, July 23, 2012

Two Interactive Features Added to Energy2D

Energy2D is our signature software for heat transfer and fluid dynamics simulations. Written in Java, it runs speedily either as a standalone app on your desktop or as an applet embedded in a browser. It is being actively developed to meet energy education's need for an interactive, constructive learning environment based on rigorous scientific principles. Energy2D is already a highly interactive system -- while a simulation is running, you can change anything its author has allowed to change. Recently, I added two new features to make it even more interactive. Both features apply to all existing Energy2D simulations I (or you) have created.

The first one is a "heat dropper," a mode in which the user can click or drag the mouse to add or remove heat from the location in the model that the mouse points to. If you have a touch screen, you can touch or swipe your finger across it and the heat dropper works as if your finger could give heat to the virtual space in the simulation. The first video in this blog post shows how it works.

The second one is a "field reader," a mode in which the user can move the mouse to read the value of a property distribution field at the location the mouse points to. Currently, the supported property fields include temperature, thermal energy, and fluid velocity (which will be zero in a solid). The second video shows how it works.

If a web page that embeds an Energy2D applet doesn't already provide a drop-down menu for switching to these modes, you can always access them through the View Options dialog window, which you can open by right-clicking on a spot in the simulation window that is not occupied by a model component (such as a polygon or a sensor).

Wednesday, July 18, 2012

Molecular Workbench used at University of Ottawa Medical School to teach molecular simulations

The Molecular Workbench software has been widely used in middle and high schools. It is less well known that many colleges and universities around the world use it in their classrooms as well.

Recently, the software was used in the Summer School in the Systems Biology of Neurodegenerative Disease offered by the Ottawa Institute of Systems Biology. Students in this Summer School learned the basics of molecular dynamics simulations using tools including our "intuitive" Molecular Workbench. They then applied their new knowledge either to model and simulate bilayer membranes made of various lipid species or to model a lipid using three different approaches.

For the Molecular Workbench, we have developed a set of unique simulation techniques that can render a dynamic, cartoon-like view of biomolecular processes that are usually too complicated to show in full detail (see the images to the right for cartoonized simulations of micelle formation in water and in oil, respectively). This capability turns what used to be static illustrations in a biology textbook into dynamic, interactive simulations and provides students with opportunities for exploration. This is why the coarse-grained modeling techniques developed for MW, based on soft-body dynamics and particle dynamics, look so promising for the current wave of digitization of chemistry and biology textbooks.

Thursday, July 5, 2012

A simple IR experiment to prove that the North Carolina Sea Level Rise Bill is just flat wrong

Last month, North Carolina's Senate passed a bill that would have required the state's Coastal Resources Commission to base predictions of future sea level rise along the state's coast on a steady, linear rate of increase. This has sparked controversies across the nation amid the record heat waves in many states.

If the lawmakers had done our very simple IR experiment on visualizing thermohaline circulation in a cup, published last year in the July issue of the Journal of Chemical Education (see the image to the left), they would have had a better understanding of the possibility of nonlinear acceleration of ice shelf melting: the less salty the seawater is, the faster the ice shelf above it melts; and the faster the ice melts, the less salty the seawater becomes. This creates a positive feedback loop that accelerates the melting process. If the speed of ice melting in a system as simple as a cup of saltwater is not as nice as the "steady, linear" rate some of the lawmakers would like to see, who can be sure that a system as complex as the Earth would follow a "steady, linear" trajectory of change?
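To see why a positive feedback loop defies a "steady, linear" extrapolation, here is a toy model with made-up coefficients (emphatically not a climate model): melting freshens the surface water, and fresher water melts the ice faster.

```python
# Toy feedback loop: cumulative melt grows faster than linearly because each
# unit of melt lowers the salinity, which raises the melt rate.
salinity = 35.0     # parts per thousand, initial surface salinity (assumed)
melted = 0.0        # cumulative melt, arbitrary units
base_rate = 1.0     # melt rate at the initial salinity
feedback = 0.02     # how much each unit of melt lowers the salinity (assumed)

for decade in range(6):
    print(f"year {10 * decade:2d}: cumulative melt {melted:7.1f}")
    for _ in range(10):
        rate = base_rate * (1 + (35.0 - salinity))   # fresher water -> faster melt
        melted += rate
        salinity -= feedback * rate
```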

If you care to try it, this experiment uses just a cup of tap water, a cup of salt water, and some ice cubes. The two cups are placed next to each other on a table for comparison. (a) An IR image taken right after an ice cube was added to a cup of freshwater (left) and a cup of saltwater (right). (b) An IR image taken after four minutes, showing a downwelling column in the freshwater. (c) An IR image taken after nine minutes, showing that the tabletop had cooled significantly near the freshwater cup. (d) An IR image taken after 16 minutes, showing that the bottom of the freshwater cup had become cooler than the top, whereas the bottom of the saltwater cup remained warmer than the top.

To see the entire process captured by an IR camera, watch the embedded YouTube videos in this blog post. Feel free to send these videos to your representatives if you happen to live in the coastal area of North Carolina, or send them to a science teacher in North Carolina in the hope that the bill will be revised in the future to consider the possibility of nonlinear acceleration.

Note that these videos do not represent any political view and should not be considered as supporting any agenda; my purpose is only to provide a humble scientific demonstration showing that things do not always change as smoothly as we might wish.

Monday, July 2, 2012

Investigating thermoimaging in augmented multisensory learning about heat transfer

Jesper Haglund from Linköping University presents a poster about our Sweden-US collaborative research on thermal visualization at the 2012 World Conference on Physics Education held in Istanbul, Turkey. Below is the abstract of the poster:

"Infrared (IR) thermal imaging is a powerful technology which holds the pedagogical potential of ‘making the invisible visible’, and is becoming increasingly affordable for use in educational contexts. Science education research has identified many challenges and misconceptions related to students’ learning of thermodynamics, including disambiguation of temperature and heat, and a common belief that our sense of touch is an infallible thermometer. The purpose of the present study was to explore how thermal imaging technology might influence students’ conceptual understanding of heat and temperature. This was carried out by investigating three different conditions with respect to students exploration of the thermal phenomena of different objects (e.g. wood, metal and wool), namely the effect of students’ use of real-time imaging generated from a FLIR i3 IR camera, students’ interpretation of static IR images, and students’ deployment of traditional thermometer apparatus. Eight 7th-grade students (12-13 years old) worked in pairs across the three experimental conditions, and were asked to predict, observe and explain (POE) the temperature of a sheet-metal knife and a piece of wood before, during and after placing them in contact with their thumbs. The participants had not been exposed to any formal teaching of thermodynamics and the ambition was to establish if they could discover and conceptualise the thermal interaction between their thumbs and the objects in terms of heat flow with minimal guidance from the researchers. The main finding was that a cognitive conflict was induced in all three conditions, as to the anomaly between perceived ‘hotness’ and measured temperature, with a particular emotional undertone in the real-time IR condition. However, none of the participants conceptualised the situation in terms of a heat flow. From the perspective of establishing a baseline of the understanding of thermal phenomena prior to teaching, extensive quantities, e.g. ‘heat’ or ‘energy’, were largely missing in the participants’ communication. In conclusion, although an unguided discovery or inquiry-based approach induced a cognitive conflict, it was not sufficient for adjusting the students’ conceptual ecologies with respect to the age group studied here. Future research will exploit the promise of the cognitive conflict observed in this study by developing a more guided approach to teaching thermal phenomena that also takes full advantage of the enhanced vision offered by the thermal camera technology."

If you happen to be at WCPE 2012, drop by his poster: Session - 1.04, Date & Time: 7/3/2012 / 13:00 - 14:00, Room: D406 (3rd Floor).

If you don't know what thermal visualization is, visit our InfraredTube website.

Saturday, June 30, 2012

Investigating the Kármán vortex street using Energy2D

Run this simulation.
The Kármán vortex street is a repeating pattern of swirling vortices caused by the unsteady separation of the flow of a fluid over a bluff body. It is named after the great scientist Theodore von Kármán, who co-founded NASA's JPL. This effect can be observed in nature, for example in a stream, but you need some luck, since it requires particular conditions that are not always present.

Now, with our online simulation program Energy2D, you can create and investigate the Kármán vortex street in your browser without depending on Mother Nature to give you a window of opportunity.

For example, you can test how big an obstacle should be in order to produce this effect. You will find that an obstacle must be large enough to create a steady vortex street. If the shape of the obstacle is not streamlined, what will you see?

If you stick a thermometer in a thermal vortex street, you should see the temperature swing fairly regularly between a high value and a low value (see the image to the right). This means the effect could be used to warm and cool an array of objects periodically. Could there be some engineering use for this?
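As a rough check on how regular those swings should be, here is a back-of-the-envelope estimate with assumed flow speed and obstacle size, using the well-known Strouhal relation for a bluff cylinder.

```python
# Rough estimate of the vortex shedding frequency: f = St * U / d, with a
# Strouhal number of about 0.2 over a wide range of Reynolds numbers.
St = 0.2     # dimensionless Strouhal number for a bluff cylinder
U = 0.05     # m/s, assumed flow speed in the simulation
d = 0.02     # m, assumed obstacle diameter

f = St * U / d
print(f"expected shedding frequency ~ {f:.2f} Hz, period ~ {1 / f:.1f} s")
# A thermometer placed downstream should oscillate at roughly this frequency.
```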

Friday, June 1, 2012

YouTube Physics features our infrared videos

AAPT's The Physics Teacher runs a column called YouTube Physics, edited by Diane Riendeau, an award-winning physics teacher. In May, the entire column featured five intriguing YouTube videos from our IR website and recommended instructional strategies for using them effectively in the classroom.

Diane recently wrote about the YouTube Physics Column: "Through the use of YouTube, we can show our students demos that we do not have the capability of doing in class. We can use these videos to inspire them and show them some of the cutting-edge discoveries in our field. We can also show them videos from around the world. Students need to realize that the physics community is global, not just national. They should learn to marvel in the discoveries made by physicists from all nations."

Her vision resonates with us, which is why we are publishing our IR videos on YouTube to allow students from all over the world to learn thermodynamics, heat transfer, chemistry, and other science subjects in everyday phenomena through IR vision. In the long run, we hope this effort will give birth to an "IRTube" that collects IR views of many scientific phenomena. With the introduction of thermal imaging technology into the classroom, we hope students will begin to upload their own IR videos to the IRTube. Darren Binnema, a student from The King's University College in Edmonton, Canada, has contributed the first IR video to the "IRTube." His IR video visualizes the heats of solution of NaOH and KCl (see the above image).

For more IR videos, please visit the IRTube website.

Wednesday, May 9, 2012

Project KTracker kicks off

Watch a demo video
We have started to develop a high-quality three-dimensional motion tracking system for science education based on the Microsoft Kinect controller, which was released about 18 months ago. This development is part of the Mixed-Reality Labs project funded by the National Science Foundation.

KTracker will provide a versatile interface between the Kinect and many physics experiments commonly conducted in the classroom. It will also provide natural user interfaces for students to control the software for data collection, analysis, and task management. For example, the data collector will automatically pause when the Kinect detects that the experimenter is adjusting the apparatus to create a new experimental condition (during which data collection should be suspended). Or the user can "wave" at the Kinect to instruct the software to invoke a procedure. In this way, the user will not need to switch hands between the apparatus and the keyboard or mouse of the computer (this "hand-switching" scene probably seems familiar to the experimentalists reading this post). The Kinect sensor has the capacity to recognize both the gestures of the experimenter and the motions of the subject, making it an ideal device for carrying out performance assessment based on motor skill analysis.

KTracker is not a post-processing tool, and it is not based on video analysis. Thanks to the high-performance infrared-based depth camera built into the Kinect, KTracker is capable of doing motion tracking and kinematic analysis in real time. This is very important, as it accelerates the data analysis process and enhances the interactivity of laboratory experiments.

KTracker will also integrate a popular physics engine, Box2D, to support simulation fitting. For example, the user can design a computer model of the pendulum shown in the above video and adjust its parameters so that its motion fits what the camera is showing -- all in real time. Like the graph demonstrated in the above video, the entire Box2D view will be placed in a translucent pane on top of the camera view, making it easy for the user to align the simulation view with the experiment view.
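In the simplest case, simulation fitting reduces to estimating a model parameter from the tracked motion; here is a hypothetical sketch (not the actual KTracker code) that recovers a pendulum's length from its measured period.

```python
# Hedged sketch: invert the small-angle pendulum formula T = 2*pi*sqrt(L/g)
# to recover the length L from a period measured by the tracker.
import math

def fit_pendulum_length(measured_period, g=9.81):
    """Return the pendulum length (m) consistent with the measured period (s)."""
    return g * (measured_period / (2 * math.pi)) ** 2

measured_period = 1.6   # s, e.g., from peak-to-peak times in the tracked motion (assumed)
print(f"fitted length ~ {fit_pendulum_length(measured_period):.2f} m")
```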

KTracker will soon be available for download on our websites. We will keep you posted.

Thursday, May 3, 2012

Kinect-based motion tracking and analysis

Click here to watch a video.
Microsoft's Kinect controller offers the first affordable 3D camera that can be used to detect complex three-dimensional motions such as body language, gestures, and so on. It provides a compelling solution to motion tracking, which -- up to this point -- has often been based on analyzing conventional RGB data from one or more video cameras.

Conventional motion tracking based on RGB data requires complicated algorithms to process a large amount of video data, making it harder to implement real-time applications. The Kinect adds a depth camera that detects the distance between the subject and the sensor by comparing the infrared pattern it emits with the reflection it receives. This gives us a way to dynamically construct a 3D model of what is in front of the Kinect at a rate of about 10-30 frames per second, fast enough to build interactive applications (see the video linked under the above image). For as little as $100, we now have a revolutionary tool for tracking the 3D motion of almost anything.

The demo video in this post shows an example of using the Kinect sensor to track and analyze the motion of a pendulum. The left part of the above image shows the overlay of trajectory and velocity vector to the RGB image of the pendulum, whereas the right part shows the slice of the depth data that is relevant to analyzing the pendulum.

The National Science Foundation provides funding for this work.

Wednesday, April 25, 2012

"Semi-digital" fabrication technologies

A street made by using Energy3D.

Emerging digital fabrication technologies such as 3D printing could trigger a new wave of industrial revolution, according to New Scientist. But while 3D printers are becoming more affordable, powerful, versatile, and speedy, they will likely not be immediately available in classrooms.


Fabrication in schools is fundamentally important to engineering education. The lack of appropriate educational technology that supports students in transforming ideas into products could impede student learning and creativity. To meet schools' immediate needs and fill the gap between now and the future, we have been developing a flagship app called Energy3D that provides a "semi-digital" solution for fabrication.

The current version of Energy3D focuses on designing, constructing, and testing model buildings. The program supports students in conceiving and designing a building on the computer. It then converts the computer design into a sketch on paper that can be printed out on a conventional printer. Students can then cut out the pieces from the sketch and assemble them into the building as designed. We call this technology "semi-digital" fabrication because, while the computer helps generate the sketch, students still need to cut and assemble the pieces manually.
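For the geometrically curious, here is a hypothetical sketch (not the actual Energy3D code) of the core step behind the print-out feature: projecting a planar 3D wall polygon onto its own plane so it can be drawn in 2D at scale.

```python
# Hedged sketch: map coplanar 3D vertices to 2D coordinates in the polygon's own plane.
import numpy as np

def flatten_polygon(vertices3d):
    """Return 2D coordinates of a planar 3D polygon in its own plane."""
    v = np.asarray(vertices3d, dtype=float)
    origin = v[0]
    u_axis = v[1] - origin
    u_axis /= np.linalg.norm(u_axis)
    normal = np.cross(u_axis, v[2] - origin)
    normal /= np.linalg.norm(normal)
    w_axis = np.cross(normal, u_axis)          # second in-plane axis
    rel = v - origin
    return np.stack([rel @ u_axis, rel @ w_axis], axis=1)

# A 3 m x 2 m wall standing upright in 3D becomes a 3 x 2 rectangle on paper.
wall = [(0, 0, 0), (3, 0, 0), (3, 0, 2), (0, 0, 2)]
print(flatten_polygon(wall))
```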

There is a catch, however: the approach assumes the pieces are all as thin as a piece of paper. But for education this is perfectly fine, because it reduces the design and manufacturing complexity for young students, allowing them to address a tractable number of important questions related to math, architecture, engineering, and science.

We are going to the 2012 USA Science and Engineering Festival, to be held in Washington, DC, on April 28-29, to demonstrate this technology. If you happen to be there and are interested in seeing how it works, meet us at the Concord Consortium's Booth #2758 in Hall B.