Sunday, September 9, 2018

Creating Augmented Reality Experiences in the Thermal World with SmartIR

Location-based augmented reality (AR) games such as Pokémon Go have become popular in recent years. What can we learn from them in order to make SmartIR a fun science app? In the past week, I have been experimenting with AR in SmartIR using the location and orientation sensors of the smartphone. This article shows the limited progress I have made thus far. Although many challenges still lie ahead, AR appears to be a promising direction to explore further in the world of infrared thermal imaging.
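To give a concrete sense of what the location sensor contributes, the app needs to know in which compass direction a geotagged point lies relative to the phone's current GPS fix, so that it can be compared with the orientation sensor's azimuth. Below is a minimal sketch of that bearing computation in Python; the function name and standalone form are my own illustration, not SmartIR's actual code, which would use the platform's location APIs on the phone:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north (0-360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - \
        math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0
```

Comparing this bearing against the compass azimuth reported by the phone tells the app whether the tagged point is currently in front of the camera.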

According to Wikipedia, augmented reality is an interactive experience of a real-world environment in which the objects of the real world are "augmented" by computer-generated information. Typically, AR is implemented on a mobile device that has a camera and a display. In a broad sense, an image from a FLIR ONE thermal camera using the so-called MSX technology is AR by default, as it blends a photo of the real world with false colors generated from the thermal radiation of real-world objects measured by the camera's microbolometer array. Similarly, the object-tracking work I have recently done for Project Snake Eyes can augment thermal images with information computed from the recognized objects' infrared radiation data, such as their silhouettes and average temperatures.
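As an illustration of the kind of augmentation just described, once an object's silhouette has been segmented, summary statistics of the temperatures inside it can be overlaid on the image. A hedged sketch in Python with NumPy follows; the array layout and function name are assumptions for illustration, not SmartIR's actual implementation:

```python
import numpy as np

def region_stats(temps, mask):
    """Summarize temperatures inside a recognized object.

    temps: 2-D array of per-pixel temperatures (e.g. in Celsius)
    mask:  boolean array of the same shape, True inside the silhouette
    Returns (mean, min, max) of the masked region.
    """
    region = temps[mask]  # boolean indexing picks the object's pixels
    return float(region.mean()), float(region.min()), float(region.max())
```

The returned values could then be drawn next to the object's outline on the live thermal view.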

Fig. 1: Very simple AR demos in SmartIR

But these are not the AR applications that I want to talk about in this article. The AR experience I hope to create is more similar to games like Pokémon Go, which is based on information geotagged by users and discovered by others. The involvement of users in creating AR content is critical to our app, as it aims to promote location-based observation, exploration, and sharing using thermal imaging around the world and to aggregate large-scale thermal data for citizen science applications. Figure 1 shows the augmentation of the thermal image view with geotagged information for a house. If you are wondering about the usefulness of this feature beyond its coolness (the so-what question), imagine tagging the weak points of a building's thermal envelope during a home energy assessment. The following YouTube videos show how geotagging works in the current version of SmartIR and how users can later discover those geotags.
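Discovering geotags left by others essentially means filtering stored tags by distance from the user's current fix. The rough sketch below assumes each tag is a simple record with latitude and longitude; the data layout, function names, and 100-meter default radius are illustrative choices of mine, not SmartIR's actual design:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def nearby(tags, lat, lon, radius_m=100.0):
    """Return the geotags within radius_m of the user's fix, nearest first."""
    hits = [(haversine_m(lat, lon, t["lat"], t["lon"]), t) for t in tags]
    return [t for d, t in sorted(hits, key=lambda p: p[0]) if d <= radius_m]
```

In a real app the candidate tags would come from a server query bounded to the user's vicinity rather than a full scan.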

At this point in the development, I envision SmartIR providing the following AR experiences. For users who have a thermal camera, a geotag can obviously guide them to find and observe something previously marked by others. What if you don't have a thermal camera but would still like to see what a place would look like through the lens of one? In that case, a geotag lets you see a plain thermal image or virtual reality (VR) view stored in it, taken previously by someone else who had a thermal camera, overlaid on your phone's screen in the direction of the current view. If VR is provided for the tagged site, the thermal image can also change as you turn your phone. Although nothing beats using a thermal camera to explore on your own, this is a "better than nothing" solution that mimics the experience of using one. In fact, this is the vision of our Infrared Street View project, which aims to provide a thermal view of our world. In addition to the Web-based approach to exploring the Infrared Street View, AR provides a location-based approach that may be more intuitive and exciting.
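Overlaying a stored thermal view "in the direction of the current view" amounts to mapping the angular difference between the phone's compass azimuth and the bearing of the geotag onto a horizontal screen position. The simplified sketch below assumes a linear mapping across the camera's horizontal field of view and ignores device tilt; all names and the 60-degree default are illustrative assumptions, not SmartIR's actual code:

```python
def screen_x(azimuth_deg, tag_bearing_deg, screen_w_px, fov_deg=60.0):
    """Horizontal pixel position at which to draw a geotag overlay.

    azimuth_deg:     direction the camera faces (degrees from north)
    tag_bearing_deg: bearing from the phone to the geotag
    Returns None when the tag falls outside the field of view.
    """
    # Signed angular difference folded into (-180, 180]
    diff = (tag_bearing_deg - azimuth_deg + 180.0) % 360.0 - 180.0
    if abs(diff) > fov_deg / 2:
        return None
    # Center of screen corresponds to diff == 0
    return (diff / fov_deg + 0.5) * screen_w_px
```

As the user turns the phone, the azimuth changes and the overlay slides across the screen, disappearing once the tag leaves the field of view.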
