Thursday, October 6, 2016

Infrared Street View won Department of Energy's JUMP competition

Creating an infrared street view using SmartIR and FLIR ONE
Our Infrared Street View (ISV) program has won the JUMP Competition sponsored jointly by CLEAResult, the largest provider of energy efficiency programs and services in North America, and the National Renewable Energy Laboratory (NREL), a research laboratory of the US Department of Energy (DOE). This JUMP Competition called for innovations in using smartphones' sensing capabilities to improve residential energy efficiency. Finalists were selected from a pool of submitted proposals and invited to pitch their ideas to the audience at the CLEAResult Energy Forum held in Austin, TX on October 4-6, 2016. Each competition has only one winner among many good ideas; this year, ours happened to be it.

IR homework
We envision the Infrared Street View as an infrared (IR) counterpart of Google's Street View (I know, I know, this is probably too big a vision to swallow for an organization that is a few garages small). Unlike Google's Street View, which covers the range of visible light, the Infrared Street View will provide a gigantic database of thermal images in the range of invisible IR light emitted by thermally excited molecular vibrations. Viewed differently, these images form a massive 3D web of temperature data points. What is the value of this big data? If the data are collected in the right way, they may represent the current state of the energy efficiency of our neighborhoods, towns, cities, and even states. In a sense, what we are talking about is in fact a thermographic information system (TIS).
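To make the notion of a thermographic information system slightly more concrete, here is a minimal sketch in Python of what one georeferenced temperature sample might look like; the field names and units are illustrative assumptions, not part of any existing system:

```python
from dataclasses import dataclass

@dataclass
class ThermalSample:
    """One point in a hypothetical TIS: a temperature reading
    anchored in space and time."""
    latitude: float       # degrees
    longitude: float      # degrees
    elevation_m: float    # meters above sea level
    temperature_c: float  # apparent surface temperature in Celsius
    timestamp: str        # ISO 8601, e.g. "2016-10-06T21:30:00-05:00"

# A TIS would aggregate millions of such samples into a queryable 3D web.
sample = ThermalSample(42.305, -71.347, 55.0, 3.2, "2016-10-06T21:30:00-05:00")
print(sample.temperature_c)
```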

We are not the only group that has realized this possibility (though we are likely the first to come up with the notion and name of TIS). A few startup companies in the Boston area worked on this frontier earlier this decade, but none of them tapped into the potential of smartphone technologies. With a handful of drive-by trucks or fly-by drones carrying mounted infrared cameras, it would probably take these companies a century to complete a thermal survey of the entire country. Furthermore, trucks can only take images from the front of a building and drones can only take images from above, which means their data are incomplete and cannot be used to create the thermal web we are imagining. In some cases, unsolicited thermal scans of people's houses may even cause legal trouble, as thermal signatures may accidentally disclose sensitive information.

Our solution is based on FLIR ONE, a $200-ish thermal camera that can be plugged into a smartphone (iOS or Android). The low cost of FLIR ONE, for the first time in history, makes it possible for the public to participate in this thermal survey. But even with the relatively low price tag, it is simply unrealistic to expect that a lot of people will buy the camera and scan their own houses. So where can we find a lot of users who would volunteer to participate in this effort?

Let's look elsewhere. Four million children enter the US education system each year. Every single one of them is required to spend a sizable chunk of their education learning thermal science concepts -- in a way that currently relies on formalism (the book shows you the text and math; you read the text and do the math). IR cameras, capable of visualizing otherwise invisible heat flow and distribution, are no doubt the best tool for teaching and learning about thermal energy and heat transfer (except for the visually impaired -- my apologies). I think few science teachers would disagree. And starting this year, educational technology vendors such as Vernier and Pasco are selling IR cameras to schools.

What if we teach students thermal science in the classroom with an IR camera and then ask them to inspect their own homes with the camera as a homework assignment? Finally, we ask them to obtain their parents' permission and contribute their IR images to the Infrared Street View project. If millions of students do this, we will have an ongoing crowdsourcing project that can engage and mobilize many generations of students to come.

Sensor-based artificial intelligence
"We can't take students' IR images seriously," I hear you object. True, students are not professionals and they make mistakes. But there is a way to teach them to act and think like professionals, which is actually a goal of the Next Generation Science Standards that will define the next two or three decades of US science education. Aside from a curriculum that teaches students how to use IR cameras (skills) and how to interpret IR images (concepts), we are also developing a powerful smartphone app called SmartIR. This app has many innovations, but two of them may lead to true breakthroughs in the field of thermography.

Thermogram sphere
The first one is sensor-based intelligence. Modern smartphones have many built-in sensors, including visible light cameras, capable of collecting multiple types of data, and increasingly powerful computer vision libraries enrich this capability even further. Machine learning can infer what students are trying to do by analyzing these data, and based on the results, SmartIR can automatically guide students in real time. This kind of artificial intelligence (AI) can help students avoid common mistakes in infrared thermography and accelerate their thermal surveys, especially when they are scanning buildings independently, with no experienced instructor around to help them. For example, SmartIR can check whether an inspection is being done at night or during the day. If it is during the day (because the clock or the ambient light sensor says so), SmartIR will suggest that students wait until nightfall, which eliminates the side effect of solar heating and increases the indoor-outdoor temperature difference. With an intelligent app like this, we may be able to increase the quality and reliability of the IR images fed to the Infrared Street View project.
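As a toy illustration of this kind of rule-based guidance, the day-or-night check could look something like the following sketch; the thresholds, scan window, and function name are hypothetical assumptions, not SmartIR's actual logic:

```python
from datetime import datetime

# Illustrative thresholds -- real values would need calibration.
MAX_AMBIENT_LUX = 10.0           # roughly full darkness outdoors
NIGHT_START, NIGHT_END = 20, 5   # assumed scan window: 8 pm to 5 am

def ok_to_scan(now: datetime, ambient_lux: float) -> bool:
    """Return True if conditions look right for an outdoor thermal scan:
    nighttime by the clock AND dark by the ambient light sensor."""
    is_night = now.hour >= NIGHT_START or now.hour < NIGHT_END
    is_dark = ambient_lux <= MAX_AMBIENT_LUX
    return is_night and is_dark

# Example: 2 pm with bright ambient light -> advise the student to wait.
print(ok_to_scan(datetime(2016, 10, 6, 14, 0), 5000.0))  # False
```

A real app would combine many such checks (distance to the facade, camera steadiness, emissivity settings) into one guidance loop.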
Virtual infrared reality (VIR) viewed with Google Cardboard

The second one is virtual infrared reality, or VIR for short, which aims to accomplish true, immersive thermal vision. VIR integrates infrared thermography with virtual reality (VR). Based on the phone's orientation and GPS sensors, SmartIR can create what we call a thermogram sphere, knitting individual thermal images together into a seamless IR view. A VIR can be uploaded to Google Maps so that the public can experience it using a VR viewer, such as Google Cardboard. We don't know whether VIR will do any better than 2D IR images in promoting the energy efficiency business, but it is reasonable to assume that many people would not mind seeing a cool (or hot) view like this while searching for their dream houses. For building science professionals, VIR may have further implications, because it provides a way to naturally organize the thermal images of a building into a more holistic view of what is going on thermally.
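To give a flavor of the geometry involved, here is a sketch of how a phone's orientation readings might be mapped onto an equirectangular panorama when stitching a thermogram sphere; this is illustrative math under assumed conventions, not SmartIR's actual implementation:

```python
def sphere_to_equirect(azimuth_deg, pitch_deg, width, height):
    """Map a phone orientation (compass azimuth, pitch above horizon)
    to pixel coordinates on an equirectangular panorama -- the flat
    image that a 'thermogram sphere' is typically stored as."""
    # Azimuth 0..360 degrees spans the full image width.
    x = int((azimuth_deg % 360.0) / 360.0 * (width - 1))
    # Pitch +90 (zenith) maps to the top row, -90 (nadir) to the bottom.
    y = int((90.0 - pitch_deg) / 180.0 * (height - 1))
    return x, y

# Facing due east (azimuth 90), level with the horizon (pitch 0),
# on a 4096 x 2048 panorama:
print(sphere_to_equirect(90.0, 0.0, 4096, 2048))  # (1023, 1023)
```

Each incoming thermal frame would be pasted around the pixel its orientation maps to, with blending between overlapping frames.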

With these innovations, we may eventually be able to realize our vision of inventing a visual 3D web of thermal data, or the thermographic information system, that will provide a massive data set for governments and companies to assess the state of residential energy efficiency on an unprecedented scale and with incredible detail.

Sunday, October 2, 2016

Designing solar farms and solar canopies with Energy3D

Fig. 1: Single rack
Many solar facilities use racking systems to hold and move arrays of solar panels. Support for racks is now available in our Energy3D software. This new feature allows users to design many different kinds of solar farms, solar parks, and solar canopies, ranging from small scale (a few dozen panels) to large scale (a few thousand).

Fig. 2: Multiple racks
Mini solar stations often use a single rack to hold an array of solar panels (Figure 1). This may be the best option when we cannot install solar panels on the building's roof. You probably have seen this kind of setup at some nature centers where the buildings are often shadowed by surrounding trees.

If you have more space, you probably can install multiple racks (Figure 2), especially when you are considering using altazimuth dual-axis solar trackers to drive them. This configuration is also seen in some large photovoltaic power stations.

Fig. 3: Rack arrays
Larger solar farms typically use arrays of long racks (Figure 3). Each rack can be driven by a horizontal single-axis tracker. Using taller racks usually requires larger inter-rack spacing, which may be an advantage as it allows maintenance trucks to drive through. SunPower recently experimented with growing crops and raising animals in the inter-rack space of its Oasis 3.0 system. So arrays of taller racks may be desirable if you want to combine green energy with green agriculture.
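The spacing requirement follows from simple shadow geometry: a rack of height h casts a shadow of length h / tan(altitude), so rows must be spaced at least that far apart at the lowest sun altitude you design for. A minimal sketch, with an assumed design altitude:

```python
import math

def min_row_spacing(rack_height_m, sun_altitude_deg):
    """Minimum row-to-row spacing so one rack's shadow does not reach
    the next rack, at the lowest sun altitude in the design window
    (often the winter-solstice value at solar noon)."""
    return rack_height_m / math.tan(math.radians(sun_altitude_deg))

# A 2 m tall rack with a 24-degree design sun altitude:
print(round(min_row_spacing(2.0, 24.0), 1))  # 4.5 (meters)
```

Doubling the rack height doubles the required spacing, which is why taller racks open up drivable (or farmable) corridors between rows.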

Fig. 4: Solar canopy above a parking lot
If you raise the height of a rack, it becomes a so-called solar canopy that provides shade for human activities, much like the green canopies of trees. The most common type of solar canopy converts a parking lot into a power station and shelters cars from the sun in the summer (Figure 4).

Designing solar canopies for schools' parking lots may be a great engineering project for students to undertake, and this is being integrated into our Solarize Your School project. In fact, Figure 4 shows a design for a real project at Natick High School in Massachusetts. The hypothetical design has more than 1,500 solar panels (each measuring 0.99 × 1.96 m) and costs over a million dollars.
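For a rough sense of scale, the quoted cost can be sanity-checked with back-of-the-envelope arithmetic; the panel wattage and cost per watt below are assumptions for illustration, not figures from the actual Natick design:

```python
# Back-of-the-envelope sizing for a canopy like the one described above.
PANELS = 1500
PANEL_AREA_M2 = 0.99 * 1.96         # panel size given in the post
ASSUMED_WATTS_PER_PANEL = 300       # assumed rating for a panel this size
ASSUMED_COST_PER_WATT = 2.5         # assumed installed cost, USD

total_area_m2 = PANELS * PANEL_AREA_M2
capacity_kw = PANELS * ASSUMED_WATTS_PER_PANEL / 1000
cost_usd = capacity_kw * 1000 * ASSUMED_COST_PER_WATT

print(round(total_area_m2), capacity_kw, cost_usd)  # 2911 450.0 1125000.0
```

Under these assumptions the canopy comes to about 450 kW and $1.1 million, consistent with the "over a million dollars" figure.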

Saturday, September 17, 2016

National Science Foundation funds chemical imaging research based on infrared thermography

The National Science Foundation (NSF) has awarded Bowling Green State University (BGSU) and the Concord Consortium (CC) an exploratory grant of $300,000 to investigate how chemical imaging based on infrared (IR) thermography can be used in chemistry labs to support undergraduate learning and teaching.

Chemists often rely on visually striking color changes shown by pH, redox, and other indicators to detect or track chemical changes. About six years ago, I realized that IR imaging may represent a novel class of universal indicators that, instead of relying on halochromic compounds, use false-color heat maps to visualize any chemical process that involves the absorption, release, or redistribution of thermal energy (see my original paper published in 2011). I felt that IR thermography could one day become a powerful imaging technique for studying chemistry and biology. As the technique doesn't involve the use of any chemical substance as a detector, it could be considered a "green" indicator.

Fig. 1: IR-based differential thermal analysis of freezing point depression
Although IR cameras are not new, inexpensive lightweight models have become available only recently. The release of two competitively priced IR cameras for smartphones in 2014 marked the beginning of the era of personal thermal vision. In January 2014, FLIR Systems unveiled the $349 FLIR ONE, the first camera that can be attached to an iPhone. Months later, the startup Seek Thermal released a $199 IR camera that has an even higher resolution and can be connected to most smartphones. The race was on to make better and cheaper cameras. In January 2015, FLIR announced the second-generation FLIR ONE camera, priced at $231 on Amazon. With an educational discount, the price of an IR camera is now comparable to what a single sensor may cost (e.g., Vernier sells an IR thermometer at $179). All these new cameras can take IR images just like conventional photos and record IR videos just like conventional videos. The manufacturers also provide application programming interfaces (APIs) for developers to blend thermal vision and computer vision in a smartphone to create interesting apps.

Fig. 2: IR-based differential thermal analysis of enzyme kinetics
Not surprisingly, many educators, including ourselves, have realized the value of IR cameras for teaching topics such as thermal radiation and heat transfer that are naturally supported by IR imaging. Applications in other fields such as chemistry, however, seem less obvious and remain underexplored, even though almost every chemical reaction or phase transition absorbs or releases heat. The NSF project will focus on showing how IR imaging can become an extraordinary tool for chemical education. The project aims to develop seven curriculum units based on the use of IR imaging to support, accelerate, and expand inquiry-based learning for a wide range of chemistry concepts. The units will employ the predict-observe-explain (POE) cycle to scaffold inquiry in laboratory activities based on IR imaging. To demonstrate the versatility and generality of this approach, the units will cover a range of topics, such as thermodynamics, heat transfer, phase changes, colligative properties (Figure 1), and enzyme kinetics (Figure 2).
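As one illustration of how IR-derived temperature data could feed quantitative analysis, a freezing plateau in a cooling curve (the kind of feature behind the differential thermal analysis in Figure 1) might be located with a crude sliding-window check; this sketch is illustrative only and not part of the project's curriculum materials:

```python
def detect_plateau(temps_c, window=5, tol=0.05):
    """Return the index where temperature first stays nearly constant
    over `window` consecutive readings -- a crude signature of a phase
    change (e.g., freezing) in a cooling curve sampled from IR images.
    Returns None if no plateau is found."""
    for i in range(len(temps_c) - window + 1):
        seg = temps_c[i:i + window]
        if max(seg) - min(seg) <= tol:
            return i
    return None

# Synthetic cooling curve: a steady drop, then a freezing plateau at 0 C.
curve = [10.0, 8.0, 6.0, 4.0, 2.0, 0.01, 0.0, 0.0, 0.01, 0.0]
print(detect_plateau(curve))  # 5
```

Comparing plateau temperatures of a pure solvent and a solution is exactly how a freezing point depression shows up in such data.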

The research will focus on finding robust evidence of learning due to IR imaging, with the goal to identify underlying cognitive mechanisms and recommend effective strategies for using IR imaging in chemistry education. This study will be conducted for a diverse student population at BGSU, Boston College, Bradley University, Owens Community College, Parkland College, St. John Fisher College, and SUNY Geneseo.

Partial support for this work was provided by the National Science Foundation's Improving Undergraduate STEM Education (IUSE) program under Award No. 1626228. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Friday, September 16, 2016

Infrared Street View selected as a finalist in Department of Energy's JUMP competition

JUMP is an online crowdsourcing community hosted by five national laboratories of the US Department of Energy (DOE) and some of the top private companies in the buildings sector. The goal is to broaden the pool of people from whom DOE seeks ideas and to move these ideas to the marketplace faster.

In July, the National Renewable Energy Laboratory (NREL) and CLEAResult launched a Call for Innovation to leverage crowdsourcing to solicit new ideas for saving energy in homes based on smartphone technologies. Modern smartphones are packed with a variety of sensors capable of detecting all kinds of things about their surroundings. Smartphones can determine whether people are home, or close to home, which may be useful for managing their HVAC systems and controlling lighting and appliances. Smartphones can also gather and analyze data to inform homeowners and improve residential energy efficiency.

Infrared images of houses
We responded to the call with a proposal to develop a smartphone app that can be used to create an infrared version of Google's Street View, which we call Infrared Street View. NREL notified us this week that the proposal has been selected as a finalist in the competition and invited us to pitch the idea at the CLEAResult Energy Forum in Austin, TX next month.

The app will integrate smartphone-based infrared imaging (e.g., FLIR ONE) and Google Maps, along with the smartphone's built-in sensors such as GPS and the accelerometer, to create thermal views of streets at night in the winter, in order to reveal possible thermal anomalies in neighborhoods and raise awareness of energy efficiency. These infrared images may even have business value. For example, they may provide information about the condition of a building's windows that could be useful to companies marketing new windows.

The app will be based on the FLIR ONE SDK and the Google Maps API, backed by a program running in the cloud to collect, process, and serve data. The latest FLIR ONE model now costs $249 and works with common Android and iOS devices, making it feasible for us to implement this idea. A virtual reality mode will also be added to enhance the visual effect. So this could be an exciting IR+VR+AR (augmented reality) project.

You may be wondering who would be interested in using the app to create the infrared street views. After all, the success of the project depends on the participation of a large number of people. But we are not Google and we do not have the resources to hire a lot of people to do the job. Our plan is to work with schools. We have a current project in which we work with teachers to promote infrared imaging as a novel way to teach thermal energy and heat transfer in classrooms. This is an area in science education that every school covers. Many teachers -- after seeing an infrared camera in action -- are convinced that infrared imaging is the ultimate way to teach thermal science. If this project is used as a capstone activity in thermal science, it is possible that we can reach and motivate thousands of students who would help make this crowdsourcing project a success.

Those familiar with earlier efforts may consider this initiative a new round of advancing the idea. The main new things are: 1) our plan is based on crowdsourcing with a potentially large number of students equipped with smartphone-based IR cameras, not a few drive-by trucks with cameras that homeowners have no idea about; 2) the concerns about privacy and legality should be mitigated, as students only scan their own houses and neighborhoods with permission from their parents and neighbors, and only publish their images in the Google Maps app when permitted by them; and, most importantly, 3) unlike previous projects, which did not put people first, our project starts with the education of children and has a better chance to convince adults.