Showing posts with label Thermography. Show all posts

Wednesday, November 21, 2018

The SmartIR Gallery

Fig. 1: Thermal images in the Gallery
Gallery is a tool within the SmartIR app that lets users manage their thermal images, temperature graphs, and other data. Unlike the operating system's default gallery, SmartIR Gallery shows only content related to thermal imaging and provides image processing functions for analyzing thermal images and videos.

Fig. 2: Image processing in the Gallery
The gallery can be opened from the action bar of the SmartIR app. Once launched, it displays in a grid view all the files that the user has collected through the app and stored on the device (Figure 1). Tapping an image within the grid opens an enlarged view in a popup window (incidentally, the enlarged image in Figure 1 shows the liquid level of a propane tank, revealed by the evaporative cooling that occurs as liquid propane vaporizes to feed the gas line). Within the window, the user can swipe left or right, or tap the arrow buttons, to browse the images.
Fig. 3: Temperature graphs in the Gallery

Based on OpenCV, SmartIR Gallery provides a number of image analysis tools for the user to quickly process thermal images. For instance, the user can blur, sharpen, invert, posterize, and adjust the brightness and contrast of an image (Figure 2). In the left image of Figure 2, sharpening a thermal image makes the view of the house more pronounced than the default view rendered by the FLIR ONE thermal camera. Inverting the thermal image, as shown in the middle image of Figure 2, may be a quick way to show how a house heated in the winter might look in the opposite thermal condition, such as an air-conditioned house on a hot summer day. In the right image of Figure 2, posterizing the thermal image creates an artistic view. More OpenCV tools will be added in the future to extend SmartIR's thermal vision capabilities.
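Invert, posterize, and brightness/contrast are simple per-pixel transforms. As a rough illustration of what the Gallery does under the hood (SmartIR actually uses OpenCV; the helpers below are a simplified pure-Python sketch, not the app's code):

```python
def invert(v):
    """Invert an 8-bit pixel: hot areas render as cold and vice versa."""
    return 255 - v

def posterize(v, levels):
    """Quantize an 8-bit pixel to a small number of evenly spaced levels."""
    return (v * levels // 256) * (255 // (levels - 1))

def adjust(v, contrast=1.0, brightness=0):
    """Linear contrast/brightness, clamped to the valid [0, 255] range."""
    return max(0, min(255, int(contrast * v + brightness)))

def apply_op(image, fn, *args):
    """Apply a per-pixel function to a 2D image stored as a list of rows."""
    return [[fn(v, *args) for v in row] for row in image]
```

In the app itself the same operations run on OpenCV Mat objects (e.g., cv2.convertScaleAbs for brightness/contrast), and blur/sharpen additionally involve convolution kernels rather than point transforms.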

SmartIR Gallery also allows the user to filter the image files. For instance, the user can choose to show only the graphs, which are screenshot images taken from the temperature chart of SmartIR (Figure 3). Using the Share button, the user can easily send the displayed image to other apps such as an email app or a social network app.

Wednesday, September 19, 2018

A Small Step towards a Big Dream of Infrared Street View

The Infrared Street View is an award-winning project recognized by the U.S. Department of Energy and subsequently funded by the National Science Foundation. The idea is to create a thermal equivalent of Google's Street View that would serve as the starting point for developing a thermographic information system (i.e., the "Map of Temperature"). This is an ambitious goal that is normally attainable only through big investments from Wall Street or by big companies like Google. However, as a single developer without a lot of resources, I decided to give it a shot on my own. Being a non-Googler, I am counting on the citizen scientists out there to help me build the Infrared Street View. The first step is to create a free app so that they have a way to contribute.

My journey started in mid-July 2018. In two months I have learned how to develop a powerful app from scratch. Now, the Infrared Street View is coming into sight! This blog article shows some of the (imperfect but promising) results, as demonstrated in Figures 1 and 2.


Fig. 1: Panoramas in visible light and infrared light generated by SmartIR
This milestone is about developing the functionality in the SmartIR app for creating infrared panoramas, so that anyone who has a smartphone with an infrared camera attachment such as FLIR ONE can produce a panoramic image and contribute it to the Infrared Street View, much like what you can do with Google's Street View app. Although this sounds easy at first glance, it has turned out to be quite challenging, as we must work under the constraint of a very slow infrared thermal camera that can take fewer than ten pictures per second. As our app targets average people who may be interested in science and technology, we must provide an easy-to-use interface so that most people can do the job without being overwhelmed. Lastly, to create virtual reality in infrared light, we must overcome the challenge of stitching the imperfect thermal images together to produce a seamless panoramic picture. Although there are many image stitchers out there, none is guaranteed to be applicable to thermal images, as those stitchers may have been optimized for visible-light images only.
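The role of overlap in stitching can be illustrated with a toy aligner that slides one image across another and scores each candidate offset by the mean absolute pixel difference in the overlapping strip. This is only a minimal sketch of template matching under idealized conditions; real stitchers (e.g., OpenCV's cv2.Stitcher) use feature detection, warping, and blending instead.

```python
def best_offset(left, right, min_overlap=2):
    """Find the column offset of `right` relative to `left` that minimizes
    the mean absolute difference over the overlapping columns."""
    h, wl, wr = len(left), len(left[0]), len(right[0])
    best, best_cost = None, float("inf")
    for off in range(wl - min_overlap, -1, -1):
        overlap = min(wl - off, wr)
        cost = sum(abs(left[r][off + c] - right[r][c])
                   for r in range(h) for c in range(overlap)) / overlap
        if cost < best_cost:
            best, best_cost = off, cost
    return best

def stitch(left, right, off):
    """Concatenate the non-overlapping part of `right` onto `left`."""
    return [lrow + rrow[len(lrow) - off:] for lrow, rrow in zip(left, right)]
```

This also shows why thermal images are harder: their pixel statistics (smooth gradients, sensor noise, drifting gain) give much flatter cost surfaces than visible-light textures, so the minimum is less distinct.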

Fig. 2: Panoramas in visible light and infrared light (two coloring styles) generated by SmartIR
To support users to make 360° panoramas, SmartIR guides them to aim at the right angles so that the resulting image set can be used for stitching. These images should be evenly distributed in the azimuthal dimension and they should overlap considerably for the stitcher to have a clue about how to knit them together. SmartIR uses the on-board sensors of the smartphone to detect the orientation of the infrared camera. A number of circles are shown on the screen to indicate the orientations which the user should aim the cursor of the camera at. When the cursor is within the circle, an image is automatically taken and stored in a 360° scroller. By turning at a fixed position and aiming at the circles, the user can capture a series of images for stitching. The following YouTube videos show how this image collector works.
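The aiming logic can be sketched as follows: the app computes a ring of evenly spaced target azimuths and fires the camera whenever the compass heading is within a small tolerance of an uncaptured target. This is a simplified model of SmartIR's circle cursor; the target count and tolerance below are illustrative, not the app's actual values.

```python
def angular_distance(a, b):
    """Smallest absolute difference between two compass bearings in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def capture_targets(n):
    """n evenly spaced azimuths covering a full 360-degree turn."""
    return [i * 360.0 / n for i in range(n)]

def should_capture(azimuth, targets, captured, tolerance=5.0):
    """Return the index of an uncaptured target the camera is aimed at,
    or None if the cursor is not inside any remaining circle."""
    for i, t in enumerate(targets):
        if i not in captured and angular_distance(azimuth, t) <= tolerance:
            return i
    return None
```

On Android, the azimuth would come from SensorManager's rotation-matrix/orientation APIs; the wraparound handling in angular_distance is what keeps a heading of 358° matching the 0° target.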



Although this is a very primitive prototype, it nonetheless represents the first concrete step towards the realization of the Infrared Street View. Down the road, stitchers for infrared thermal images still need significant improvements to truly achieve seamless effects similar to those for visible light images. Tremendous challenges for weaving the Map of Temperature still lie ahead. I will keep folks posted as I inch towards the goal and I am quite optimistic that I can get somewhere, even though I am not a Googler.

Sunday, September 9, 2018

Creating Augmented Reality Experiences in the Thermal World with SmartIR

Location-based augmented reality (AR) games such as Pokemon Go have become popular in recent years. What can we learn from them in order to make SmartIR into a fun science app? In the past week, I have been experimenting with AR in SmartIR using the location and orientation sensors of the smartphone. This article shows the limited progress I have achieved thus far. Although there are still tons of challenges ahead, AR appears to be a promising direction to explore further in the world of infrared thermal imaging.

According to Wikipedia, augmented reality is an interactive experience of a real-world environment whereby the objects in the real world are "augmented" by computed information. Typically, AR is implemented with a mobile device that has a camera and a display. In a broad sense, an image from a FLIR ONE thermal camera using the so-called MSX technology is a form of AR by default, as it fuses a photo of the real world with false colors generated from the thermal radiation of real-world objects measured by the camera's microbolometer array. Similarly, the object-tracking work I have recently done for Project Snake Eyes can augment thermal images with information related to the recognized objects computed from their infrared radiation data, such as their silhouettes and their average temperatures.
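The silhouette-and-average-temperature augmentation boils down to thresholding the temperature field and summarizing the pixels that belong to the object. Project Snake Eyes does proper contour tracking with OpenCV; the toy version below just masks on a fixed threshold to show the idea.

```python
def segment_hot_object(temps, threshold):
    """Return (mask, mean_temp) for pixels warmer than `threshold` in a 2D
    temperature grid -- a stand-in for contour-based object tracking.
    `mask` marks the silhouette; `mean_temp` is the object's average."""
    mask = [[1 if t > threshold else 0 for t in row] for row in temps]
    hot = [t for row in temps for t in row if t > threshold]
    mean = sum(hot) / len(hot) if hot else None
    return mask, mean
```

In practice the threshold would be chosen adaptively (e.g., Otsu's method) and connected components separated, so that two warm objects in the same frame get separate silhouettes and averages.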

Fig. 1: Very simple AR demos in SmartIR

But these are not the AR applications that I want to talk about in this article. The AR experience I hope to create is more similar to games like Pokemon Go, which is based on information geotagged by users and discovered by others. The involvement of users in creating AR content is critical to our app, as it aims to promote location-based observation, exploration, and sharing using thermal imaging around the world and aggregate large-scale thermal data for citizen science applications. Figure 1 shows the augmentation of the thermal image view with geotagged information for a house. If you are wondering about the usefulness of this feature other than its coolness (the so-what question), you can imagine tagging the weak points of a thermal envelope of a building during a home energy assessment. The following YouTube videos show how geotagging works in the current version of SmartIR and how users can later discover those geotags.




At this point in the development, I envision SmartIR providing the following AR experiences. For users who have a thermal camera, a geotag can obviously guide them to find and observe something previously marked by others. What if you don't have a thermal camera but would still like to see how a place would look through the lens of one? In that case, a geotag allows you to see a plain thermal image or virtual reality (VR) view stored in the geotag, taken previously by someone else who had a thermal camera and overlaid in the direction of the current view on the screen of your phone. If VR is provided for the tagged site, the thermal image can also change when you turn your phone. Although nothing beats using a thermal camera to explore on your own, this is a "better than nothing" solution that mimics the experience of using one. In fact, this is the vision of our Infrared Street View project, which aims at providing a thermal view of our world. In addition to the Web-based approach to exploring the Infrared Street View, AR provides a location-based approach that may be more intuitive and exciting.
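Overlaying a geotag in the direction of the current view reduces to two computations: the compass bearing from the user's location to the tag, and where that bearing falls inside the camera's horizontal field of view. A hedged sketch (the field-of-view value and the linear screen mapping are illustrative assumptions, not SmartIR's actual numbers):

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from the user's GPS fix to a geotag."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def screen_offset(device_azimuth, tag_bearing, fov_deg, screen_width):
    """Horizontal pixel position of the tag marker, or None if the tag
    lies outside the camera's field of view."""
    d = (tag_bearing - device_azimuth + 540) % 360 - 180  # signed diff, [-180, 180)
    if abs(d) > fov_deg / 2:
        return None
    return int((d / fov_deg + 0.5) * screen_width)
```

As the user turns, device_azimuth changes and the marker slides across the screen, disappearing when the tag leaves the field of view, which is exactly the behavior shown in the videos.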

Thursday, August 30, 2018

Adding Instructional Support in SmartIR

Fig. 1
A goal of the SmartIR app is to provide basic instructional support directly in the app so that students, citizen scientists, professionals, and other users can learn what they can do with the incredible power of thermal vision beyond its conventional applications. This requires a lot of development work, such as inventing the artificial intelligence that can guide them through making their own scientific discoveries and engineering decisions. While that kind of instructional power poses an enormous challenge to us, adding some instructional materials in the app so that users can get started is a smaller step that we can take right now.

As of August 30, 2018, I have added 17 experiments in physical sciences that people can do with thermal vision in SmartIR. These experiments, ranging from heat transfer to physical chemistry, are based on my own work in the field of infrared imaging in the past eight years and are all very easy to do (to the point that I call them "kitchen science"). I now finally have a way to deliver these experiments through a powerful app. Figure 1 shows the list of these experiments in SmartIR. Users can click each card to open the corresponding instructional unit (which is a sequence of steps that guide users through the selected set of experiments).

Fig. 2
To do better than just putting some HTML pages into the app, I have also built critical features that allow users to switch back and forth between the thermal view and the document view (Figure 2). When users jump to the thermal view from a document, a thumbnail view of that document is shown on top of a floating button in the thermal view window (see the left image in Figure 2), allowing them to click it and go back to the document at any time. The thumbnail also serves to remind them which experiment they are supposed to conduct. When they go back to the document, a thumbnail view of the thermal camera is shown on top of a floating button in the document view window (see the right image in Figure 2), allowing them to click it and return to the thermal view at any time. This thumbnail view also connects to the image stream from the thermal camera, so users can see the current picture the camera is displaying without leaving the document.

These instructional features will be further enhanced in the future. For instance, users will be able to insert a thermal image into a container, or even create a slide show, in an HTML page to document a finding. Eventually, they will be able to use SmartIR to automatically generate a lab report.

These new features, along with what I have built in the past few weeks, mark the milestone of Version 0.0.2 of SmartIR (i.e., only 2% of the work has been done towards maturity). The following video offers a sneak peek of this humble version.


Thursday, August 2, 2018

Using SmartIR in Science Experiments

Fig. 1: The paper-on-cup experiment with SmartIR
SmartIR is a smartphone app that I am developing to support infrared (IR) thermal imaging applications, primarily in the field of science and engineering, based on the FLIR ONE SDK. The development officially kicked off in July 2018. By the end of the month, a rudimentary version, to which I assigned V 0.0.1 (representing approximately 1% of the work that needs to be done for a mature release), was completed.

Fig. 2: Time graphs of temperatures in SmartIR
Although a very early version, SmartIR V0.0.1 can already support some scientific exploration. In this article, I share the results from repeating the can't-be-simpler experiment that I did back in 2011 with a FLIR I5. This experiment needs only a cup of water, a piece of paper, and, of course, an IR camera (a FLIR ONE Pro Generation 3 in my case). When a piece of paper is placed on top of an open cup of tap water that has sat in the room for a few hours, the paper warms up -- instead of cooling down -- as water molecules adsorb onto its underside and more water molecules condense to form a layer of liquid water, as shown in Figure 1.

While the user can observe this effect with any thermal camera, it is sometimes useful to also record the change of temperatures over time. To do this, SmartIR allows the user to add any number of thermometers to the view (and move or delete them as needed) and show their temperature readings in a time graph on top of the thermal image view (sort of like the translucent sensor graph in my Energy2D computational fluid dynamics simulation program). Figure 2 shows the time graph of temperatures. To study the effect, I added three thermometers: one for measuring the ambient temperature (T3), one for measuring the temperature of the water (T2), and one for measuring the temperature of the paper (T1). Note that, before the paper was placed, T1 and T2 both measured the temperature of the water in the cup. As it was a pretty hot day, T3 registered higher than 35 °C. Due to the effect of evaporative cooling, T2 registered about 33 °C. When a piece of paper was put on top of the cup, T1 rose to nearly 37 °C in a few seconds!
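Behind the time graph, the app is essentially sampling each thermometer's pixel in every incoming thermal frame and appending the reading to a per-thermometer series. A minimal sketch of that bookkeeping (the frame layout and thermometer naming here are illustrative):

```python
def log_thermometers(frames, points):
    """Sample each thermometer point (row, col) in every thermal frame,
    returning one time series per thermometer -- the data behind the graph."""
    return {f"T{i + 1}": [frame[r][c] for frame in frames]
            for i, (r, c) in enumerate(points)}
```

A real implementation would average a small neighborhood around each point to suppress sensor noise, and timestamp each sample rather than relying on frame order.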

SmartIR is currently available only on Android. It hasn't been released on Google Play, as intense development is expected over the next six months. A public release may be available next year.

Thursday, August 17, 2017

National Science Foundation funds citizen science project to crowdsource an infrared street view

We are pleased to announce that the National Science Foundation has awarded us a two-year, $500,000 exploratory grant to develop, test, and evaluate a citizen science program that engages youth to investigate energy issues through scientific inquiry with innovative technology. The project will crowd-create the Infrared Street View, a citizen science program that aims to produce a thermal version of Google's Street View using an affordable infrared (IR) camera attached to a smartphone. In collaboration with high schools and out-of-school programs in Massachusetts, we will conduct pilot-tests with approximately 200 students in this exploratory phase. The project will develop SmartIR, a smartphone app that will guide users to collect IR images on both Android and iOS platforms for synthesizing a seamless street view. Figure 1 shows a prototype of the Infrared Street View and Figure 2 shows a little math behind the scenes.

Fig. 1: A hemispherical infrared street view (prototype)
In essence, an IR camera serves as a high-throughput data acquisition instrument that collects thousands of temperature data points each time a picture is taken. With this incredible tool, youth can collect massive geotagged thermal data that have considerable scientific and educational value for visualizing energy usage and improving energy efficiency at all levels. The Infrared Street View program will provide a Web-based platform for youth and anyone interested in energy efficiency to view and analyze the aggregated data to identify possible energy losses. By sharing their scientific findings with stakeholders, youth will make changes to the way energy is being used. 

We are fully aware of possible legal implications and complications of the proposed citizen science program. In Kyllo v. United States (2001), the Supreme Court ruled that the use of a thermal camera from a public vantage point to monitor the radiation of heat from a person's home was a "search" within the meaning of the Fourth Amendment, and thus required a warrant. The ruling seems to be limited to the use of thermal cameras by law enforcement, however. Back then, IR cameras were available to only a handful of professionals, but they now cost as little as $200 and are just a few clicks away on Amazon. The widespread use of smartphone-based IR cameras is making thermal images commonplace on the Internet, and it is probably an interesting question for legal scholars to study how civilian use of IR cameras should be regulated.

Fig. 2: Math behind the scenes.
Regardless, we will take the privacy issue very seriously and will take every precaution we can think of to avoid potential side effects resulting from this well-intentioned program. Fortunately, we have a lot of public support for conducting this research on large public buildings and possibly commercial buildings, where the privacy concerns are far smaller than for private residential buildings and the need to reduce energy waste and save taxpayer dollars is far more pressing. Hence, we will start with school, public, and commercial buildings in selected areas where performing thermal scans of the buildings and publishing their thermal images for educational and research purposes are permitted by school leaders, town officials, and property owners.

From a broader perspective, the Infrared Street View program could serve as a pilot test that may shed light on increasingly important issues related to citizen privacy in the era of the Internet of Things (IoT), which features the ubiquity of sensor data collection that could be viewed by many as invasive into their physical space (not just cyberspace). While no one can deny the tremendous potential of the technology in transforming the ways people learn, work, and live, careful research must be carried out to address legitimate concerns. This program could be one of those projects that provide a unique approach to meet those challenges from a citizen science point of view, which integrates many interesting scientific, technical, educational, and legal aspects. The lessons we can learn from conducting this work could be very useful to the citizen science community in the IoT era.

Wednesday, July 26, 2017

Thermal imaging as a universal indicator of chemical reactions: An example of acid-base titration

Fig. 1: NaOH-HCl titration
Funded by the National Science Foundation, we are exploring the feasibility of using thermal imaging as a universal indicator of chemical reactions. The central tenet is that, as all chemical reactions absorb or release thermal energy (endothermic or exothermic), we can infer certain information from the time evolution and spatial distribution of the temperature field.

To prove the concept, we first chose titration, a common laboratory method of quantitative chemical analysis that is used to determine the unknown concentration of an identified analyte, as a beginning example. A reagent, called the titrant, is prepared as a standard solution. A known concentration and volume of titrant reacts with a solution of analyte to determine its concentration.

The experiment we did today was an acid-base titration. An acid–base titration is the determination of the concentration of an acid or base by exactly neutralizing the acid or base with a base or acid of known concentration. Such a titration is typically done with a burette that drops titrant into an Erlenmeyer flask containing the analyte. A pH indicator is used to determine whether the equivalence point has been reached. The pH indicator usually depends on the analyte and the titrant. But a differential thermal analysis based on infrared imaging may provide a universal indicator as the technique depends only on the heat of reaction and thermal energy is universal.

Fig. 2: The dish-array titration revealed by FLIR ONE
Figures 1 and 2 in this article show the results of the NaOH+HCl titration, taken using a FLIR ONE thermal camera attached to my iPhone 6. A solution of 10% NaOH was prepared as the analyte of "unknown" concentration, and 1%, 3%, 5%, 7%, 10%, 12%, 15%, 18%, and 20% HCl were used as the titrant. The experiment was conducted with a 3×3 array of Petri dishes. Hence, we call this setup dish-array titration. Preliminary results of this first experiment appear encouraging, but we have to be cautious, as the dissolution of HCl after the acid-base neutralization completes can also release a significant amount of heat. How to separate the thermal signatures of reaction and dissolution requires some further thinking.
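The dish-array analysis amounts to averaging the temperature field over each dish's region of interest and comparing frames taken before and after the titrant is added. A simplified sketch, with rectangular ROIs standing in for the circular Petri dishes:

```python
def dish_means(temps, rois):
    """Mean temperature inside each dish's (row0, row1, col0, col1) region."""
    means = []
    for (r0, r1, c0, c1) in rois:
        vals = [temps[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        means.append(sum(vals) / len(vals))
    return means

def temperature_rises(before, after, rois):
    """Per-dish temperature rise between two frames. The largest rise marks
    the strongest exothermic response (neutralization plus dissolution)."""
    return [a - b for b, a in zip(dish_means(before, rois),
                                  dish_means(after, rois))]
```

Separating the two heat sources would require comparing against control dishes where HCl is added to water alone, so that the heat of dissolution can be subtracted from each rise.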

Wednesday, April 5, 2017

A demo of the Infrared Street View

An infrared street view
The award-winning Infrared Street View program is an ambitious project that aims to create something similar to Google's Street View, but in infrared light. The ultimate goal is to develop the world's first thermographic information system (TIS) that allows the positioning of thermal elements and the tracking of thermal processes on a massive scale. The applications include building energy efficiency, real estate inspection, and public security monitoring, to name a few.
An infrared image sphere


The Infrared Street View project is based on infrared cameras that work with now-ubiquitous smartphones. It takes advantage of the orientation and location sensors of smartphones to store the information necessary to knit an array of infrared thermal images taken at different angles and positions into a 3D image that, when rendered on a dome, creates an illusion of immersive 3D for the viewer.
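The knitting step can be approximated by projecting each image into an equirectangular panorama using the yaw and pitch recorded by the phone's orientation sensors. The linear mapping below ignores lens distortion and perspective, so it is only a first-order sketch of the real projection:

```python
def to_equirect(px, py, w, h, yaw, pitch, hfov, vfov, pano_w, pano_h):
    """Map pixel (px, py) of a w-by-h image taken at camera orientation
    (yaw, pitch) into equirectangular panorama coordinates.
    hfov/vfov are the camera's fields of view in degrees."""
    lon = (yaw + (px / w - 0.5) * hfov) % 360        # longitude, 0..360
    lat = pitch + (0.5 - py / h) * vfov              # latitude, -90..90
    return int(lon / 360 * pano_w), int((90 - lat) / 180 * pano_h)
```

Once every source pixel lands on the panorama grid, overlapping contributions can be blended; rendering that panorama onto a dome or sphere then yields the immersive view described above.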

The project was launched in 2016 and later joined by three brilliant computer science undergraduate students, Seth Kahn, Feiyu Lu, and Gabriel Terrell, from Tufts University, who developed a primitive system consisting of 1) an iOS frontend app to collect infrared image spheres, 2) a backend cloud app to process the images, and 3) a Web interface for users to view the stitched infrared images anchored at selected locations on a Google Maps application.

The following YouTube video demonstrates an early concept played out on an iPhone:



Thursday, October 6, 2016

Infrared Street View won Department of Energy's JUMP competition

Creating an infrared street view using SmartIR and FLIR ONE
Our Infrared Street View (ISV) program has won the JUMP Competition sponsored jointly by CLEAResult, the largest provider of energy efficiency programs and services in North America, and the National Renewable Energy Laboratory (NREL), a research division of the US Department of Energy (DOE). This JUMP Competition called for innovations in using smartphones' sensing capabilities to improve residential energy efficiency. Finalists were selected from a pool of submitted proposals and invited to make their pitches to the audience at the CLEAResult Energy Forum held in Austin, TX on October 4-6, 2016. Among all the good ideas, each competition has only one winner. This year, we just happened to be the one.

IR homework
We envision the Infrared Street View as an infrared (IR) counterpart of Google's Street View (I know, I know, this is probably too big to swallow for an organization that is a few garages small). Unlike Google's Street View in the range of visible light, the Infrared Street View will provide a gigantic database of thermal images in the range of invisible IR light emitted by molecular vibrations related to thermal energy. If you think about these images in a different way, they actually are a massive 3D web of temperature data points. What is the value of this big data? If the data are collected in the right way, they may represent the current state of the energy efficiency of our neighborhoods, towns, cities, and even states. In a sense, what we are talking about is in fact a thermographic information system (TIS).

We are not the only group that has realized this possibility (but we are likely the first to come up with the notion and name of TIS). A few startup companies in the Boston area worked on this frontier earlier this decade. But none of them has tapped into the potential of smartphone technologies. With a handful of drive-by trucks or fly-by drones carrying mounted infrared cameras, it would probably take these companies a century to complete this thermal survey for the entire country. Furthermore, the trucks can only take images from the front of a building and the drones can only take images from above, which means that their data are incomplete and cannot be used to create the thermal web that we are imagining. In some cases, unsolicited thermal scans of people's houses may even cause legal trouble, as thermal signatures may accidentally disclose sensitive information.

Our solution is based on FLIR ONE, a $200-ish thermal camera that can be plugged into a smartphone (iOS or Android). The low cost of FLIR ONE, for the first time in history, makes it possible for the public to participate in this thermal survey. But even with the relatively low price tag, it is simply unrealistic to expect that a lot of people will buy the camera and scan their own houses. So where can we find a lot of users who would volunteer to participate in this effort?

Let's look elsewhere. There are four million children entering the US education system each year. Every single one of them is required to spend a sizable chunk of their education learning thermal science concepts -- in a way that currently relies on formalism (the book shows you the text and math; you read the text and do the math). IR cameras, capable of visualizing otherwise invisible heat flow and distribution, are no doubt the best tools for teaching and learning thermal energy and heat transfer (except for those visually impaired -- my apologies). I think few science teachers would disagree with that. And starting this year, educational technology vendors like Vernier and Pasco are selling IR cameras to schools.

What if we teach students thermal science in the classroom with an IR camera and then ask them to inspect their own homes with the camera as a homework assignment? Finally, we ask them to obtain their parents' permission and contribute their IR images to the Infrared Street View project. If millions of students do this, we will have an ongoing crowdsourcing project that can engage and mobilize many generations of students to come.

Sensor-based artificial intelligence
We can't take students' IR images seriously, I hear you criticizing. True, students are not professionals and they make mistakes. But there is a way to teach them how to act and think like professionals, which is actually a goal of the Next Generation Science Standards that define the next two or three decades of US science education. Aside from a curriculum that teaches students how to use IR cameras (skills) and how to interpret IR images (concepts), we are also developing a powerful smartphone app called SmartIR. This app has many innovations but two of them may lead to true breakthroughs in the field of thermography.

Thermogram sphere
The first one is sensor-based intelligence. Modern smartphones have many built-in sensors, including the visible light cameras. These sensors and cameras are capable of collecting multiple types of data, and increasingly powerful computer vision libraries only enrich this capability. Machine learning can infer what students are trying to do by analyzing these data. Based on the analysis results, SmartIR can then automatically guide students in real time. This kind of artificial intelligence (AI) can help students avoid common mistakes in infrared thermography and accelerate their thermal survey, especially when they are scanning buildings independently (when there is no experienced instructor around to help them). For example, the SmartIR app can check whether the inspection is being done at night or during the day. If it is during the day (because the clock says so or the ambient light sensor says so), then SmartIR will suggest that students wait until nightfall, which eliminates the side effect of solar heating and widens the indoor-outdoor temperature difference. With an intelligent app like this, we may be able to increase the quality and reliability of the IR images that are fed to the Infrared Street View project.
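The day/night check described above is a simple rule over two signals: the clock and the ambient light sensor. A minimal sketch (the hour range and lux threshold are illustrative assumptions, not SmartIR's actual values):

```python
def should_defer_scan(hour, ambient_lux, lux_threshold=50):
    """Suggest waiting for nightfall if either the clock or the ambient
    light sensor indicates daytime, when solar heating would skew the
    thermal reading of a building envelope."""
    daytime_by_clock = 7 <= hour < 19     # illustrative daylight window
    daytime_by_light = ambient_lux > lux_threshold
    return daytime_by_clock or daytime_by_light
```

The two signals are combined with "or" deliberately: a covered light sensor should not override the clock, and a scan at dusk with strong residual light is still worth deferring.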
Virtual infrared reality (VIR) viewed with Google Cardboard

The second one is virtual infrared reality, or VIR for short, to accomplish true, immersive thermal vision. VIR is a technology that integrates infrared thermography with virtual reality (VR). Based on the orientation and GPS sensors of the phone, SmartIR can create what we call a thermogram sphere, knitting individual thermal images together to render a seamless IR view. A VIR can be uploaded to Google Maps so that the public can experience it using a VR viewer, such as Google's Cardboard viewer. We don't know whether VIR will do any better than 2D IR images in promoting the energy efficiency business, but it is reasonable to assume that many people would not mind seeing a cool (or hot) view like this while searching for their dream houses. For building science professionals, this may even have practical implications, because VIR provides a way to naturally organize the thermal images of a building into a more holistic view of what is going on thermally.

With these innovations, we may eventually be able to realize our vision of inventing a visual 3D web of thermal data, or the thermographic information system, that will provide a massive data set for governments and companies to assess the state of residential energy efficiency on an unprecedented scale and with incredible detail.

Saturday, September 17, 2016

National Science Foundation funds chemical imaging research based on infrared thermography

The National Science Foundation (NSF) has awarded Bowling Green State University (BGSU) and Concord Consortium (CC) an exploratory grant of $300K to investigate how chemical imaging based on infrared (IR) thermography can be used in chemistry labs to support undergraduate learning and teaching.

Chemists often rely on visually striking color changes shown by pH, redox, and other indicators to detect or track chemical changes. About six years ago, I realized that IR imaging may represent a novel class of universal indicators that, instead of using halochromic compounds, use false-color heat maps to visualize any chemical process that involves the absorption, release, or redistribution of thermal energy (see my original paper published in 2011). I felt that IR thermography could one day become a powerful imaging technique for studying chemistry and biology. As the technique doesn't involve the use of any chemical substance as a detector, it could be considered a "green" indicator.

Fig. 1: IR-based differential thermal analysis of freezing point depression
Although IR cameras are not new, inexpensive lightweight models have become available only recently. The release of two competitively priced IR cameras for smartphones in 2014 marked the beginning of an era of personal thermal vision. In January 2014, FLIR Systems unveiled the $349 FLIR ONE, the first thermal camera that attaches to an iPhone. Months later, the startup Seek Thermal released a $199 IR camera that has an even higher resolution and connects to most smartphones. The race was on to make better and cheaper cameras. In January 2015, FLIR announced the second-generation FLIR ONE, priced at $231 on Amazon. With an educational discount, the price of an IR camera is now comparable to what a single sensor may cost (e.g., Vernier sells an IR thermometer at $179). All these new cameras can take IR images just like conventional photos and record IR videos just like conventional videos. The manufacturers also provide application programming interfaces (APIs) for developers to blend thermal vision and computer vision in a smartphone to create interesting apps.

Fig. 2: IR-based differential thermal analysis of enzyme kinetics
Not surprisingly, many educators, including ourselves, have realized the value of IR cameras for teaching topics such as thermal radiation and heat transfer that are naturally supported by IR imaging. Applications in other fields such as chemistry, however, seem less obvious and remain underexplored, even though almost every chemical reaction or phase transition absorbs or releases heat. The NSF project will focus on showing how IR imaging can become an extraordinary tool for chemical education. The project aims to develop seven curriculum units based on the use of IR imaging to support, accelerate, and expand inquiry-based learning for a wide range of chemistry concepts. The units will employ the predict-observe-explain (POE) cycle to scaffold inquiry in laboratory activities based on IR imaging. To demonstrate the versatility and generality of this approach, the units will cover a range of topics, including thermodynamics, heat transfer, phase change, colligative properties (Figure 1), and enzyme kinetics (Figure 2).

The research will focus on finding robust evidence of learning attributable to IR imaging, with the goal of identifying underlying cognitive mechanisms and recommending effective strategies for using IR imaging in chemistry education. The study will be conducted with a diverse student population at BGSU, Boston College, Bradley University, Owens Community College, Parkland College, St. John Fisher College, and SUNY Geneseo.

Partial support for this work was provided by the National Science Foundation's Improving Undergraduate STEM Education (IUSE) program under Award No. 1626228. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Friday, September 16, 2016

Infrared Street View selected as a finalist in Department of Energy's JUMP competition

JUMP is an online crowdsourcing community hosted by five national laboratories of the US Department of Energy (DOE) and some of the top private companies in the buildings sector. The goal is to broaden the pool of people from whom DOE seeks ideas and to move these ideas to the marketplace faster.

In July, the National Renewable Energy Laboratory (NREL) and CLEAResult launched a Call for Innovation to leverage crowdsourcing to solicit new ideas for saving energy in homes based on smartphone technologies. Modern smartphones are packed with a variety of sensors capable of detecting all kinds of things about their surroundings. Smartphones can determine whether people are home, or close to home, which may be useful for managing their HVAC systems and controlling lighting and appliances. Smartphones can also gather and analyze data to inform homeowners and improve residential energy efficiency.

Infrared images of houses
We responded to the call with a proposal to develop a smartphone app that can be used to create an infrared version of Google's Street View, which we call Infrared Street View. NREL notified us this week that the proposal has been selected as a finalist in the competition and invited us to pitch the idea at the CLEAResult Energy Forum in Austin, TX next month.

The app will integrate smartphone-based infrared imaging (e.g., FLIR ONE) and Google Maps, along with the smartphone's built-in sensors such as the GPS sensor and the accelerometer, to create thermal views of streets at night in the winter, revealing possible thermal anomalies in neighborhoods and raising public awareness of energy efficiency. These infrared images may even have business value. For example, they may provide information about the condition of a building's windows that could be useful to companies marketing new windows.

The app will be based on the FLIR ONE SDK and the Google Maps API, backed by a program running in the cloud to collect, process, and serve data. The latest FLIR ONE model now costs $249 and works with common Android and iOS devices, making it feasible for us to implement this idea. A virtual reality mode will also be added to enhance the visual effect. So this could be an exciting IR+VR+AR (augmented reality) project.

You may be wondering who would be interested in using the app to create the infrared street views. After all, the success of the project depends on the participation of a large number of people. But we are not Google and we do not have the resources to hire a lot of people to do the job. Our plan is to work with schools. We have a current project in which we work with teachers to promote infrared imaging as a novel way to teach thermal energy and heat transfer in classrooms. This is an area in science education that every school covers. Many teachers -- after seeing an infrared camera in action -- are convinced that infrared imaging is the ultimate way to teach thermal science. If this project is used as a capstone activity in thermal science, it is possible that we can reach and motivate thousands of students who would help make this crowdsourcing project a success.

Those familiar with earlier efforts may consider this initiative a new round of advancing the idea. The main new elements are: 1) our plan is based on crowdsourcing with a potentially large number of students equipped with smartphone-based IR cameras, not a few drive-by trucks with cameras that homeowners know nothing about; 2) the privacy and legality concerns should be mitigated, as students scan only their own houses and neighborhoods with permission from their parents and neighbors, and publish their images in the Google Maps app only when permitted to do so; and, most importantly, 3) unlike previous projects that did not put people first, our project starts with the education of children and therefore has a better chance of convincing adults.

Friday, April 29, 2016

Personal thermal vision could turn millions of students into the cleantech workforce of today

So we have signed the Paris Agreement and cheered about it. Now what?

More than a year ago, I wrote a proposal to the National Science Foundation to test the feasibility of empowering students to help combat the energy issues of our nation. There are hundreds of millions of buildings in our country, and some of them are pretty big energy losers. The home energy industry currently employs perhaps 100,000 people at most. It would take them a few decades to weatherize and solarize all these residential and commercial buildings (let alone educate homeowners so that they would take such actions).

But there are millions of students in schools who are probably more likely to be concerned about the world that they are about to inherit. Why not ask them to help?

You probably know many projects on this very same mission. But I want to do something different. Enough messaging has been done. We don't need to hand out more brochures and flyers about the environmental issues we may be facing. It is time to call for action!

For a number of years, I have been working on infrared thermography and building energy simulation to knock down the technical barriers that these techniques may pose to children. With NSF awarding us a $1.2M grant last year and FLIR releasing a series of inexpensive thermal cameras, the time to bring these tools to schools at scale has finally arrived.

For more information, see our poster that will be presented at an NSF meeting next week. Note that this project has just begun, so we haven't had a chance to test the solarization part. But the results from the weatherization part, based on infrared thermography, have been extremely encouraging!

Saturday, March 5, 2016

Infrared imaging evidence of geothermal energy in a basement

Geothermal energy is the thermal energy generated or stored in the Earth. The ground maintains a nearly constant temperature six meters (20 feet) below the surface, roughly equal to the average annual air temperature at the location. In Boston, this is about 13 °C (55 °F).

You can feel the effect of geothermal energy in a basement, particularly on a hot summer day, when the basement can be significantly cooler. But IR imaging provides a unique visualization of this effect.

I happen to have a sub-basement that is partially buried in the ground. When I did an IR inspection of my basement on a cold night, trying to identify places where heat escapes, something I did not expect struck me: as I scanned the basement, the whole basement floor appeared to be 4-6 °F warmer than the walls. Both the floor and the walls of my basement are bare concrete -- there is no insulation, but the walls are partially or fully exposed to the outside air, which was about 24 °F at the time.

This temperature distribution pattern is opposite to the typical temperature gradient observed in a heated room where the top of a wall is usually a few degrees warmer than the bottom of a wall or the floor as hot air rises to warm up the upper part.

The only explanation for this warming of the basement floor is geothermal energy, captured here by the IR camera.

Friday, July 24, 2015

The National Science Foundation funds large-scale applications of infrared cameras in schools


We are pleased to announce that the National Science Foundation has awarded the Concord Consortium, Next Step Living, and Virtual High School a grant of $1.2M to put innovative technologies such as infrared cameras into the hands of thousands of secondary students. This education-industry collaboration will create a technology-enhanced learning pathway from school to home and then to cognate careers, thereby establishing a data-rich testbed for developing and evaluating strategies for translating innovative technology experiences into consistent science learning and career awareness in different settings. While there have been studies on connecting science to everyday life or situating learning in professional scenarios to increase the relevance or authenticity of learning, strategies for using industry-grade technologies to strengthen these connections have rarely been explored. In many cases, often due to a lack of experience, resources, and curricular support, industry technologies are simply used as showcases or demonstrations to give students a glimpse of how professionals use them to solve problems in the workplace.


Over the last few years, however, quite a number of industry technologies have become widely accessible to schools. For example, Autodesk has announced that its software products will be freely available to all students and teachers around the world. Another example is the infrared camera, which I have been experimenting with and blogging about since 2010. Thanks to continuous progress in electronics and optics, what used to be a very expensive scientific instrument now costs only a few hundred dollars, with the most affordable infrared camera falling below $200.

The funded project, called Next Step Learning, will be the largest-scale application of infrared cameras in secondary schools -- in terms of the number of students involved in the three-year project. We estimate that dozens of schools and thousands of students in Massachusetts will participate. These students will use infrared cameras provided by the project to thermally inspect their own homes. The images in this blog post are some of the curious images I took in my own house using the FLIR ONE camera attached to an iPhone.

In the broader context, the Next Generation Science Standards (NGSS) envision “three-dimensional learning,” in which the learning of disciplinary core ideas and crosscutting concepts is integrated with science and engineering practices. A goal of the NGSS is to make science education more closely resemble the way scientists and engineers actually think and work. To accomplish this goal, an abundance of opportunities for students to practice science and engineering by solving authentic real-world problems will need to be created and researched. If these learning opportunities are meaningfully connected to current industry practices through industry-grade technologies, they can also increase students’ awareness of cognate careers, help them construct professional identities, and prepare them with the knowledge and skills needed by employers, thereby attaining the goals of science education and workforce development simultaneously. The Next Step Learning project will explore, test, and evaluate this strategy.

Tuesday, May 12, 2015

SimBuilding on iPad

SimBuilding (alpha version) is a 3D simulation game that we are developing to provide a more accessible and fun way to teach building science. A good reason we are working on this game is that we want to teach building science concepts and practices to home energy professionals without having to invade someone's house or risk ruining it (well, we would have to create or maintain some awful cases for teaching purposes, but what sane property owner would allow us to do so?). We also believe that computer graphics can create some cool effects that demonstrate the ideas more clearly, providing experiences complementary to hands-on learning. The project is funded by the National Science Foundation to support technical education and workforce development.

SimBuilding is based on three.js, a powerful JavaScript graphics library that renders 3D scenes within the browser using WebGL. This allows it to run on a variety of devices, including the iPad (though not on smartphones, which have less horsepower). The photos in this blog post show how it looks on an iPad Mini, with multi-touch support for navigation and interaction.

In its current version, SimBuilding supports only virtual infrared thermography. The player walks around in a virtual house, challenged to correctly identify home energy problems using a virtual IR camera. The virtual IR camera shows false-color IR images of a large number of sites when the player inspects them, from which the player must diagnose the causes of any problems, such as missing insulation, thermal bridges, air leakage, or water damage. In addition to the IR camera, a set of diagnostic tools is also provided, such as a blower-door system that depressurizes a house to help identify infiltration. We will also provide links to our Energy2D simulations should players become interested in deepening their understanding of heat transfer concepts such as conduction, convection, and radiation.

SimBuilding is a collaborative project with New Mexico EnergySmart Academy at Santa Fe. A number of industry partners such as FLIR Systems and Building Science Corporation are also involved in this project. Our special thanks go to Jay Bowen of FLIR, who generously provided most of the IR images used to create the IR game scenes free of charge.

Thursday, January 9, 2014

The time of infrared imaging in classrooms has arrived

At the Consumer Electronics Show (CES) 2014, FLIR Systems debuted the FLIR ONE, the first thermal imager for smartphones, selling for $349. Compared with standalone IR cameras that often cost between $1,000 and $60,000, this is a huge leap toward the adoption of IR technology by millions.

With this price tag, the FLIR ONE finally brings the power of infrared imaging to science classrooms. Our unparalleled Infrared Tube is dedicated to IR imaging experiments for science and engineering education. The website publishes the experiments I have designed to showcase cool IR visualizations of natural phenomena. Each experiment comes with an illustration of the setup (so you can do it yourself) and a short IR video recorded from the experiment. Teachers and students may watch these YouTube videos to get an idea of what the unseen world of thermodynamics and heat transfer looks like through an IR camera -- before deciding to buy such a camera.

For example, this post features one of my IR videos that may give you some idea of why people in the North are spraying salt on the roads like crazy in this bone-chilling weather. The video demonstrates a phenomenon called freezing point depression, in which adding a solute to a solvent decreases the freezing point of the solvent. Spraying salt on the road melts the ice and prevents water from refreezing. Check out this video for an infrared view of this mechanism!
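The size of the effect follows the colligative relation ΔTf = i · Kf · m, where Kf is the solvent's cryoscopic constant, m the molality, and i the van 't Hoff factor. A quick back-of-the-envelope calculation (the function name is mine, and the ideal van 't Hoff factor slightly overstates what real brines achieve):

```python
def freezing_point_depression(kf, molality, van_t_hoff=1):
    """Colligative-property estimate: dTf = i * Kf * m,
    with Kf in K*kg/mol and molality in mol/kg."""
    return van_t_hoff * kf * molality

# Rough estimate for road brine: NaCl dissociates into two ions (i ~ 2);
# Kf of water is 1.86 K*kg/mol. For a 2 mol/kg salt solution:
delta_t = freezing_point_depression(1.86, 2.0, van_t_hoff=2)
print(delta_t)  # about 7.4 K below the normal freezing point
```

So even a moderately concentrated brine keeps water liquid several degrees below 0 °C, which is why salted patches stay melted in the IR video while the surroundings freeze.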

Thursday, July 28, 2011

The thermogenesis of a moth under an IR camera

Is a moth warm-blooded or cold-blooded? If you google this question, some sources will tell you it is cold-blooded. They are not completely right. This infrared study shows how a moth warms itself up before it can fly. So a moth is at least warm-blooded when it moves.

The moth (is this a winter moth, Operophtera brumata?) was kept in a glass jar. The first IR image shows that when it was idle, its body temperature was the same as the ambient temperature. This means that it does not lose energy to the environment -- a clever way to save energy and probably to protect itself from predators that hunt by detecting thermal radiation.
However, before making a move, it needs to warm up its flight muscles (in the thorax, near the head where the wings are attached) to above 30 degrees Celsius. In this observation, the warming process took 1-2 minutes, as shown by the sequence of IR images to the right. (Note: You may only observe this effect when the moth is energetic. A moth on the verge of death does not have enough energy to warm up.)

Note that we used automatic color remapping; i.e., the heat map is rescaled based on the lowest and highest temperatures detected in the view. As a result, while the moth warmed up and appeared more reddish in the IR view, the background -- in contrast -- became bluer. This does not mean that the temperature of the background decreased. Automatic remapping can create some confusion, but it is necessary in many cases, especially when you don't know what to expect: it maximizes contrast and therefore allows the observer to pick up subtle changes like this one.
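The effect is easy to reproduce numerically. Below is a minimal NumPy sketch of per-frame min-max rescaling (an illustration of the principle, not any camera's actual firmware): as the hottest pixel warms up, an unchanged background pixel maps to a lower normalized value, so it is drawn with a "cooler" color.

```python
import numpy as np

def auto_remap(temperatures):
    """Rescale a frame's temperatures to [0, 1] using that frame's own
    min and max, the way an auto-ranging IR camera recomputes its
    color palette for every frame."""
    t = np.asarray(temperatures, dtype=float)
    lo, hi = t.min(), t.max()
    if hi == lo:                      # flat scene: avoid division by zero
        return np.zeros_like(t)
    return (t - lo) / (hi - lo)

# A warming "moth" pixel against an unchanged 20 °C background pixel:
frame1 = auto_remap([18.0, 20.0, 25.0])  # background maps to ~0.29
frame2 = auto_remap([18.0, 20.0, 32.0])  # same 20 °C now maps to ~0.14
```

The background pixel's normalized value drops between the two frames even though its temperature never changed, which is exactly why the background looks bluer as the moth heats up.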

The last image shows that, once its temperature was high enough, the moth started to move. In this particular experiment, the moth responded slowly, probably because it was exhausted from struggling quite a bit in the jar before it was imaged.

What interests me in this experiment is thermogenesis: the process of heat production in organisms. What biochemical reactions are responsible for the thermogenesis in moths and bees? Can we learn from them to find a green way to heat our homes?