How hard are you concentrating as you read this article? There is one not-so-obvious way to find out: Read your hidden cognition levels using your personal heat signature, courtesy of smart thermal imaging tech. That is what a new research project by a group of international engineers from Australia, Germany, and Japan set out to do — with impressive results.
“We’ve explored a new way of estimating cognitive load, i.e. how much mental effort the user is putting into a given task,” Eduardo Velloso, a lecturer at the School of Computing and Information Systems at the University of Melbourne, told Digital Trends. “To do so, we use a thermal camera. Whereas in an image captured by a normal camera, each pixel corresponds to a color, in an image captured by a thermal camera, each pixel corresponds to a temperature value. In our system, we capture the user’s facial temperature signature with a thermal camera.”
The researchers’ software automatically analyzes how the temperature is distributed in a person’s face and provides an estimate of their cognitive load. This is ascertained by looking at the way that blood flows through the body in different cognitive and emotional states. “When we are scared, blood flows to our legs to help us run; and when we are embarrassed, blood flows to our face, making us blush,” Velloso continued. “In a similar way, when we encounter a difficult task, it causes a change in how the blood is distributed on our face, and therefore also on our facial temperature signature.”
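The article does not spell out the exact algorithm, but a rough sense of how such software can turn a facial temperature map into a load estimate is sketched below. This is a hypothetical simplification, not the researchers' actual pipeline: the forehead-minus-nose-tip temperature difference is a commonly used thermal indicator of mental effort, and the NumPy array representation, hard-coded pixel regions, and function names are all assumptions made for illustration.

```python
# Minimal illustrative sketch (not the authors' software): derive a crude
# cognitive-load proxy from one thermal frame. Assumes the frame is a 2D
# NumPy array of per-pixel temperatures in degrees Celsius and that the
# forehead and nose-tip regions have already been located (placeholder
# pixel boxes are hard-coded here).
import numpy as np

def region_mean(frame: np.ndarray, box: tuple[int, int, int, int]) -> float:
    """Mean temperature inside a (top, bottom, left, right) pixel box."""
    top, bottom, left, right = box
    return float(frame[top:bottom, left:right].mean())

def load_proxy(frame: np.ndarray,
               forehead_box: tuple[int, int, int, int],
               nose_box: tuple[int, int, int, int]) -> float:
    """Forehead-minus-nose temperature difference: nasal temperature tends
    to drop under mental effort, so a larger difference suggests higher
    load (a simplification of what full systems estimate)."""
    return region_mean(frame, forehead_box) - region_mean(frame, nose_box)

# Toy usage with a synthetic 120x160 thermal frame around 34 degrees C.
frame = np.full((120, 160), 34.0)
frame[10:40, 60:100] += 1.2   # slightly warmer forehead region
frame[70:90, 70:90] -= 0.8    # slightly cooler nose tip
print(f"load proxy: {load_proxy(frame, (10, 40, 60, 100), (70, 90, 70, 90)):.2f} C")
```

A real system would also need face detection and region tracking across frames, plus calibration against a per-person baseline, before a temperature difference like this could be read as cognitive load.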
Long-term, Velloso said the technology could be used in various domains, such as education. For instance, a webcam equipped with thermal imaging might monitor students as they study and help reveal when they are struggling or, conversely, finding the work too easy.
“In the future, we will also combine thermal imaging with other metrics,” Velloso continued. “We are currently incorporating eye tracking into the mix. One downside of our metric is that it gives us a hint of how hard the user is thinking, but it does not tell us much about what caused an increase in cognitive load. By also monitoring where the user is looking, we will have a complete picture of precisely where the user was looking when we saw an increase in cognitive load. So far, we have only explored this application in controlled experiments. The next steps will be to take our system into the wild and fine-tune our algorithms to be robust to other confounds, such as changes in environmental temperature and other emotions.”
A paper describing the work was recently published in the journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.