Thursday, August 30, 2018

Adding Instructional Support in SmartIR

Fig. 1
A goal of the SmartIR app is to provide basic instructional support directly in the app so that students, citizen scientists, professionals, and other users can learn what they can do with the incredible power of thermal vision beyond its conventional applications. This requires a lot of development work, such as inventing the artificial intelligence that can guide them through making their own scientific discoveries and engineering decisions. While that kind of instructional power poses an enormous challenge to us, adding some instructional materials to the app so that users can get started is a smaller step that we can take right now.

As of August 30, 2018, I have added 17 experiments in the physical sciences that people can do with thermal vision in SmartIR. These experiments, ranging from heat transfer to physical chemistry, are based on my own work in the field of infrared imaging over the past eight years and are all very easy to do (to the point that I call them "kitchen science"). I now finally have a way to deliver these experiments through a powerful app. Figure 1 shows the list of these experiments in SmartIR. Users can click each card to open the corresponding instructional unit, which is a sequence of steps that guides them through the selected experiment.
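As a rough illustration of how such a unit could be modeled in the app (the class and field names below are hypothetical, not SmartIR's actual code), each card can simply carry an ordered list of steps that the app walks through when the card is tapped:

```java
// Hypothetical sketch of the data behind one experiment card.
// Names (InstructionalUnit, Step) are illustrative, not SmartIR's actual API.
import java.util.Arrays;
import java.util.List;

public class InstructionalUnit {

    /** One step of an experiment: a short title plus the HTML page to show. */
    public static class Step {
        final String title;
        final String htmlAsset; // e.g. a page bundled with the app
        Step(String title, String htmlAsset) {
            this.title = title;
            this.htmlAsset = htmlAsset;
        }
    }

    final String experimentName;
    final List<Step> steps;
    private int current = 0;

    InstructionalUnit(String experimentName, Step... steps) {
        this.experimentName = experimentName;
        this.steps = Arrays.asList(steps);
    }

    /** Returns the next step, or null when the experiment is finished. */
    Step nextStep() {
        return current < steps.size() ? steps.get(current++) : null;
    }
}
```

Keeping the unit as plain data like this would make it easy to add new experiments without touching the rest of the app logic.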

Fig. 2
To do better than just putting some HTML pages into the app, I have also built critical features that allow users to switch back and forth between the thermal view and the document view (Figure 2). When users jump to the thermal view from a document, a thumbnail of that document is shown on top of a floating button in the thermal view window (see the left image in Figure 2), allowing users to click it and go back to the document at any time. The thumbnail also serves to remind them which experiment they are supposed to conduct. When they go back to the document, a thumbnail of the thermal camera view is shown on top of a floating button in the document view window (see the right image in Figure 2), allowing users to click it and return to the thermal view at any time. This thumbnail is also connected to the image stream from the thermal camera, so users can see the current picture the camera is displaying without leaving the document.
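A minimal sketch of how the live thumbnail could be kept in sync with the camera is shown below. It assumes frames arrive as Android Bitmaps from some camera callback; the class and method names are illustrative rather than SmartIR's or the FLIR ONE SDK's actual API:

```java
// Illustrative sketch (not SmartIR's or the FLIR ONE SDK's actual code):
// keep the floating button's thumbnail in sync with the live thermal stream
// while the document view is in the foreground.
import android.app.Activity;
import android.graphics.Bitmap;
import android.widget.ImageButton;

public class LiveThumbnailUpdater {

    private final Activity activity;
    private final ImageButton floatingButton; // the "back to thermal view" button
    private long lastUpdateMs = 0;

    public LiveThumbnailUpdater(Activity activity, ImageButton floatingButton) {
        this.activity = activity;
        this.floatingButton = floatingButton;
    }

    /** Call this from whatever delivers thermal frames (e.g. the camera callback). */
    public void onThermalFrame(Bitmap frame) {
        long now = System.currentTimeMillis();
        if (now - lastUpdateMs < 500) return; // ~2 fps is plenty for a thumbnail
        lastUpdateMs = now;
        final Bitmap thumb = Bitmap.createScaledBitmap(frame, 96, 128, true);
        activity.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                floatingButton.setImageBitmap(thumb); // live picture on the button
            }
        });
    }
}
```

Throttling the updates keeps the overlay cheap, so the document view stays responsive while the stream runs in the background.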

These instructional features will be further enhanced in the future. For instance, users will be able to insert a thermal image into a container in an HTML page, or even create a slide show there, to document a finding. Eventually, they will be able to use SmartIR to automatically generate a lab report.

These new features, along with what I have built in the past few weeks, mark the milestone of Version 0.0.2 of SmartIR (i.e., only 2% of the work has been done towards maturity). The following video offers a sneak peek of this humble version.


Wednesday, August 22, 2018

Project Snake Eyes: Automatic Feature Extraction Based on Thermal Vision

I am pleased to announce Project Snake Eyes. This ambitious project aims to combine image analysis and infrared imaging to create biomimetic thermal vision -- computer vision that simulates the ability of some animals, such as snakes, to see in darkness. One of the goals of Project Snake Eyes is to create probably the world's first robotic snake that can hunt through thermal sensing. Not only can Project Snake Eyes detect heat, it can also estimate the size and proximity of the heat source, giving the robotic snake the artificial intelligence to figure out whether or not it should strike the target. After two weeks of intense research and development, I have come up with original algorithms that allow robust automatic feature extraction from thermal images in real time through a FLIR ONE camera. This article reports my progress thus far. The algorithms will be published later in a journal in the field of image processing and computer vision.



Figure 1 shows the detection of windows at night from outside a house. Half of the window was open at the time of observation. The lower apparent temperature of the upper pane was partially due to the glass reflecting infrared light from cooler parts of the environment, such as the sky. The higher apparent temperature of the lower pane, which was the open part of the window, was due to the warmer interior of the house. As you can see, Project Snake Eyes approximately identified the shapes and sizes of the two panes, which stood out from the background because of their distinct temperatures. This ability will be used to develop advanced algorithms that automatically analyze the thermal signature of a building, paving the way to large-scale building thermal analyses through automation.
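The algorithms themselves will be described in the forthcoming paper, but the basic idea of isolating regions whose temperatures stand out from the background can be sketched with a simple threshold followed by connected-component labeling. The code below is only an illustrative baseline under that assumption, not the actual Snake Eyes algorithm:

```java
// Illustrative baseline, not the actual Snake Eyes algorithm: mark pixels whose
// temperature deviates from the scene median by more than a threshold, then
// group them into connected regions (4-connectivity) by flood filling.
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;

public class RegionFinder {

    /** Returns a label image: 0 = background, 1..n = detected regions. */
    public static int[][] label(float[][] tempC, float thresholdC) {
        int h = tempC.length, w = tempC[0].length;
        float median = median(tempC);
        int[][] labels = new int[h][w];
        int next = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (labels[y][x] != 0 || Math.abs(tempC[y][x] - median) < thresholdC) continue;
                next++; // start a new region and flood fill it
                Deque<int[]> stack = new ArrayDeque<>();
                stack.push(new int[]{x, y});
                labels[y][x] = next;
                while (!stack.isEmpty()) {
                    int[] p = stack.pop();
                    int[][] nbrs = {{p[0] + 1, p[1]}, {p[0] - 1, p[1]}, {p[0], p[1] + 1}, {p[0], p[1] - 1}};
                    for (int[] n : nbrs) {
                        if (n[0] < 0 || n[0] >= w || n[1] < 0 || n[1] >= h) continue;
                        if (labels[n[1]][n[0]] != 0) continue;
                        if (Math.abs(tempC[n[1]][n[0]] - median) < thresholdC) continue;
                        labels[n[1]][n[0]] = next;
                        stack.push(n);
                    }
                }
            }
        }
        return labels;
    }

    private static float median(float[][] a) {
        float[] flat = new float[a.length * a[0].length];
        int i = 0;
        for (float[] row : a) for (float v : row) flat[i++] = v;
        Arrays.sort(flat);
        return flat[flat.length / 2];
    }
}
```

Counting the pixels of each label gives an approximate size, which can be converted to a physical size when the distance and the camera's field of view are known.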

Figure 2 shows a more complex scenario using my computer desk as an example. As you can see, Project Snake Eyes automatically detected most of the hot spots (or cold spots, but I use hot spots to represent points of interest at either high or low temperature). Feature extraction such as blob detection through thermal vision could result in enhanced computer vision for technologies such as unmanned vehicles.
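Again as a rough illustration only (not the Snake Eyes algorithm), a pixel can be flagged as a point of interest when it is the hottest or coldest pixel within a small window and deviates from the window's mean by more than a margin:

```java
// Illustrative sketch, not the Snake Eyes algorithm: flag a pixel as a hot or
// cold spot if it is the local extremum within a window around it and differs
// from the window's mean temperature by more than a given margin.
public class SpotDetector {

    public static boolean isSpot(float[][] tempC, int x, int y, int radius, float minDeltaC) {
        int h = tempC.length, w = tempC[0].length;
        float v = tempC[y][x], min = v, max = v, sum = 0;
        int count = 0;
        for (int j = Math.max(0, y - radius); j <= Math.min(h - 1, y + radius); j++) {
            for (int i = Math.max(0, x - radius); i <= Math.min(w - 1, x + radius); i++) {
                float t = tempC[j][i];
                min = Math.min(min, t);
                max = Math.max(max, t);
                sum += t;
                count++;
            }
        }
        float mean = sum / count;
        boolean hotSpot = v >= max && v - mean > minDeltaC;
        boolean coldSpot = v <= min && mean - v > minDeltaC;
        return hotSpot || coldSpot;
    }
}
```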

Figure 3 shows object reconstruction based on residual heat, using my hand as an example. The residual heat that my hand left on the wall revealed its shape as six separate polygons (the largest one corresponds to the palm and the five smaller ones to the fingertips). Object reconstruction from residual heat could find applications in certain industrial monitoring and control scenarios.
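One simple way to turn a labeled region back into a polygon (again only a baseline sketch, not the actual reconstruction method) is to take the convex hull of the region's pixel coordinates, which works reasonably well for compact shapes such as a palm print or a fingertip:

```java
// Illustrative baseline: approximate a compact warm region by the convex hull
// of its pixel coordinates, using Andrew's monotone chain algorithm.
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class HullApproximator {

    /** points: pixel coordinates {x, y} of one labeled region; returns hull vertices in order. */
    public static List<int[]> convexHull(List<int[]> points) {
        List<int[]> pts = new ArrayList<>(points);
        if (pts.size() < 3) return pts;
        pts.sort(Comparator.<int[]>comparingInt(p -> p[0]).thenComparingInt(p -> p[1]));
        List<int[]> hull = new ArrayList<>();
        for (int pass = 0; pass < 2; pass++) { // pass 0: lower hull, pass 1: upper hull
            int start = hull.size();
            for (int[] p : pts) {
                while (hull.size() >= start + 2
                        && cross(hull.get(hull.size() - 2), hull.get(hull.size() - 1), p) <= 0) {
                    hull.remove(hull.size() - 1);
                }
                hull.add(p);
            }
            hull.remove(hull.size() - 1); // endpoint repeats as the start of the other half
            Collections.reverse(pts);
        }
        return hull;
    }

    private static long cross(int[] o, int[] a, int[] b) {
        return (long) (a[0] - o[0]) * (b[1] - o[1]) - (long) (a[1] - o[1]) * (b[0] - o[0]);
    }
}
```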

Project Snake Eyes aims to identify and track objects.
Figure 4 shows object tracking using a colleague as an example. As you can see, Project Snake Eyes basically captured his body shape. The video clip that follows shows how this works in real time. The algorithms still need optimization to reduce the lag, but you get the basic idea of how object tracking using Project Snake Eyes works. Object tracking will be used to realize animal and human tracking in dark conditions for many science and engineering applications. For instance, we are collaborating with biologists in Texas to develop a system that can track bats at night in order to monitor the health of their colonies.
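For readers curious about what a bare-bones tracker looks like, the sketch below (illustrative only, not the Snake Eyes code) keeps a single track alive by matching the blob centroids detected in the current frame against the position remembered from the previous frame:

```java
// Illustrative sketch, not the Snake Eyes code: single-object tracking by
// matching the nearest blob centroid from frame to frame, within a maximum
// allowed jump; returns -1 when the track is lost in the current frame.
public class CentroidTracker {

    private double lastX = Double.NaN, lastY = Double.NaN;

    /** centroidsX/centroidsY: blob centroids detected in the current frame. */
    public int update(double[] centroidsX, double[] centroidsY, double maxJump) {
        int best = -1;
        double bestDist = maxJump;
        for (int i = 0; i < centroidsX.length; i++) {
            double d = Double.isNaN(lastX)
                    ? 0 // no history yet: lock onto the first blob we see
                    : Math.hypot(centroidsX[i] - lastX, centroidsY[i] - lastY);
            if (d <= bestDist) {
                bestDist = d;
                best = i;
                if (Double.isNaN(lastX)) break;
            }
        }
        if (best >= 0) { // remember the matched position for the next frame
            lastX = centroidsX[best];
            lastY = centroidsY[best];
        }
        return best;
    }
}
```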

With all this excitement, I expect to carry on the research and development of Project Snake Eyes in the coming years. As a landmark step, we are working tirelessly towards unveiling the first prototype of the robotic snake with thermal vision, in collaboration with two robotics teams led by Prof. Yan Gu at the University of Massachusetts Lowell and Prof. Zhenghui Sha at the University of Arkansas, respectively.

Stay tuned!

Thursday, August 2, 2018

Using SmartIR in Science Experiments

Fig. 1: The paper-on-cup experiment with SmartIR
SmartIR is a smartphone app that I am developing, based on the FLIR ONE SDK, to support infrared (IR) thermal imaging applications, primarily in science and engineering. The development officially kicked off in July 2018. By the end of the month, I had completed a rudimentary version, to which I assigned V 0.0.1 (representing approximately 1% of the work that needs to be done for a mature release).

Fig. 2: Time graphs of temperatures in SmartIR
Although it is a very early version, SmartIR V0.0.1 can already support some scientific exploration. In this article, I share the results of repeating the can't-be-simpler experiment that I did back in 2011 with a FLIR I5. This experiment needs only a cup of water, a piece of paper, and, of course, an IR camera (a FLIR ONE Pro Generation 3 in my case). When a piece of paper is placed on top of an open cup of tap water that has sat in the room for a few hours, the paper warms up -- instead of cooling down -- as a result of the adsorption of water molecules onto the underside of the paper and the condensation of more water molecules to form a layer of liquid water, as shown in Figure 1.
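A back-of-the-envelope estimate suggests why only a trace of condensed water is needed to warm the paper noticeably. Taking the observed rise of roughly 4 K at face value and assuming typical values (latent heat of condensation near room temperature L ≈ 2440 J/g, office-paper grammage m_p ≈ 80 g/m², paper specific heat c_p ≈ 1.3 J/(g·K)), and ignoring heat losses to the air:

```latex
% Rough estimate with assumed typical values, ignoring heat losses:
\[
\Delta T \approx \frac{L\, m_w}{m_p\, c_p}
\quad\Rightarrow\quad
m_w \approx \frac{\Delta T\, m_p\, c_p}{L}
\approx \frac{(4~\mathrm{K})\,(80~\mathrm{g/m^2})\,(1.3~\mathrm{J\,g^{-1}\,K^{-1}})}{2440~\mathrm{J/g}}
\approx 0.17~\mathrm{g/m^2}.
\]
```

That corresponds to a water film well under a micrometer thick, so even a very small amount of condensation is enough to produce the effect.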

While the user can observe this effect with any thermal camera, it is sometimes useful to also record how the temperatures change over time. To do this, SmartIR allows the user to add any number of thermometers to the view (and move or delete them as needed) and show their readings in a time graph on top of the thermal image view (somewhat like the translucent sensor graph in my Energy2D computational fluid dynamics simulation program). Figure 2 shows the time graph of temperatures. To study the effect, I added three thermometers: one for measuring the ambient temperature (T3), one for measuring the temperature of the water (T2), and one for measuring the temperature of the paper (T1). Note that, before the paper was placed, T1 and T2 both measured the temperature of the water in the cup. As the day was pretty hot, T3 registered higher than 35 °C. Due to evaporative cooling, T2 registered about 33 °C. When a piece of paper was put on top of the cup, T1 rose to nearly 37 °C in a few seconds!
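A minimal sketch of how such a movable virtual thermometer could be implemented is shown below; it assumes the radiometric temperatures are available as a per-frame array, and the class name and fields are illustrative rather than SmartIR's actual code:

```java
// Illustrative sketch (not SmartIR's actual code): a movable virtual
// thermometer that samples the temperature at its pixel position on every
// frame and accumulates a time series for the overlay graph.
import java.util.ArrayList;
import java.util.List;

public class VirtualThermometer {

    public int x, y; // pixel position; can be changed when the user drags the thermometer
    private final List<Long> timesMs = new ArrayList<>();
    private final List<Float> readingsC = new ArrayList<>();

    public VirtualThermometer(int x, int y) {
        this.x = x;
        this.y = y;
    }

    /** Call once per frame with the radiometric temperature image in degrees C. */
    public void sample(float[][] tempC, long timestampMs) {
        timesMs.add(timestampMs);
        readingsC.add(tempC[y][x]);
    }

    public float latestReading() {
        return readingsC.isEmpty() ? Float.NaN : readingsC.get(readingsC.size() - 1);
    }

    public List<Float> history() {
        return readingsC;
    }

    public List<Long> timestamps() {
        return timesMs;
    }
}
```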

SmartIR is currently available only on Android. It has not been released on Google Play, as intense development is expected to be under way over the next six months. A public release may come next year.