How Sensor Data Gets Me EMOTIONAL!

By Sean Montgomery on Sep 1, 2019.  

Signals are constantly running throughout our bodies. Electrical, chemical, and mechanical messages carry information about our physical, emotional, and psychological state from the decision-making centers in our brain to the sympathetic and parasympathetic nervous systems, which prepare the body for “fight or flight” versus “rest and digest”. Evolutionarily, these systems have been critical for avoiding getting eaten by tigers, finding food, and conserving energy, but in the 21st century, with the help of carefully engineered sensors like EmotiBit, it’s possible to derive a moment-by-moment portrait of our emotional state. How this information about internal context might help you become a faster learner, share deeper empathy, or enjoy lower stress and better health will be the subject of a future post (so stay tuned!), but here I want to talk a little bit about biometrics, how to derive them, and what’s possible with state-of-the-art science & technology. To explain how that works, let’s start with some examples.

Lions, spiders, and snakes, oh my! (Fear)

For you to perceive a snake camouflaged in the jungle, the visual system in your brain has to work overtime detecting lines, colors, patterns, and more. But once your visual system, combined with the memory and emotion centers in your brain, detects “SNAKE!”, the fear alarm bells start ringing and your brain cranks your body’s sympathetic “fight or flight” dial up to 11! Your heart rate increases along with your blood volume pulse (how much blood each heartbeat pushes out to the body’s extremities), and your respiration rate and depth increase. Your body temperature rises and the sweat glands in your skin activate, creating heightened and more frequent electro-dermal responses (stay tuned for a future post discussing electro-dermal activity in more detail). Evolutionarily, all these changes in the body’s physiology are getting you ready to run away, and by sensing these biometric signals you can detect that fear response without any verbal communication or physical movement.
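To make that concrete, here’s a minimal sketch in Python of how a “fight or flight” response might be flagged by watching heart rate and electro-dermal activity rise together above their recent baselines. The thresholds and window lengths are purely illustrative assumptions you’d tune for your own data, not validated values:

```python
import numpy as np

def detect_arousal(heart_rate, eda, fs=25, baseline_s=60,
                   hr_jump=10.0, eda_jump=0.05):
    """Flag samples where both heart rate (BPM) and EDA (microsiemens)
    rise well above their recent rolling baselines -- a crude proxy for
    a sympathetic "fight or flight" response. Thresholds are illustrative."""
    heart_rate = np.asarray(heart_rate, dtype=float)
    eda = np.asarray(eda, dtype=float)
    win = fs * baseline_s  # number of samples in the rolling baseline
    flags = np.zeros(len(heart_rate), dtype=bool)
    for i in range(win, len(heart_rate)):
        hr_base = np.median(heart_rate[i - win:i])
        eda_base = np.median(eda[i - win:i])
        flags[i] = (heart_rate[i] - hr_base > hr_jump) and \
                   (eda[i] - eda_base > eda_jump)
    return flags
```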

Relax… everything is going to be alright... (Calm)

Quite the opposite of a fear response is a feeling of calm and relaxation, which is accompanied by activation of the body’s parasympathetic “rest and digest” system. To save energy, the body lowers the heart rate and beat strength, and breathing becomes more regular. The increase in parasympathetic neurotransmitters released onto your heart muscle also creates measurable changes in heart rate variability (HRV) indicative of lower stress. Sweat responses and local body temperature also decrease, and viewed as a constellation, your biometric signals are softly whispering “I’m calm”.
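One widely used HRV metric is RMSSD, the root mean square of successive differences between inter-beat intervals, where higher values generally reflect greater parasympathetic activity. A minimal sketch (the example intervals below are invented for illustration):

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat
    intervals (in milliseconds) -- a standard HRV metric. Higher RMSSD
    generally reflects greater parasympathetic "rest and digest" activity."""
    ibi_ms = np.asarray(ibi_ms, dtype=float)
    diffs = np.diff(ibi_ms)
    return np.sqrt(np.mean(diffs ** 2))

# A calm recording tends to show larger beat-to-beat variation:
print(rmssd([850, 870, 845, 880, 860]))  # ~25.7 ms: relaxed-looking
print(rmssd([600, 602, 599, 601, 600]))  # ~2.1 ms: fast and metronomic
```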

Don’t hold your breath! (Stress)

Anxiety and stress are a part of modern life. That stress can be measured in a number of ways, but scientific research has shown that when we’re stressed we hold our breath… a LOT. The technical term is “apnea”, and it’s been shown that people regularly hold their breath while writing emails and engaging in social media. Measuring drops in respiration rate and depth, drops in blood oxygen (SpO2) levels, increased electro-dermal activity (EDA), and changes in heart rate variability together can help identify which daily activities are particularly stressful for you and perhaps even help break the cycle.
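As a rough illustration, here’s a sketch that screens a respiration signal for sustained drops in breathing amplitude. The window lengths and thresholds are hypothetical placeholders, not validated clinical criteria:

```python
import numpy as np

def find_breath_holds(resp, fs=25, min_hold_s=10, amp_frac=0.2):
    """Flag stretches where the respiration signal's local amplitude falls
    below a fraction of its typical amplitude for at least min_hold_s
    seconds -- a rough screen for breath-holding episodes."""
    resp = np.asarray(resp, dtype=float)
    win = fs                      # 1-second analysis windows
    n = len(resp) // win
    amps = np.array([np.ptp(resp[i * win:(i + 1) * win]) for i in range(n)])
    quiet = amps < amp_frac * np.median(amps)

    holds, start = [], None
    for i, q in enumerate(quiet):
        if q and start is None:
            start = i
        elif not q and start is not None:
            if i - start >= min_hold_s:
                holds.append((start, i))  # (start_s, end_s)
            start = None
    if start is not None and n - start >= min_hold_s:
        holds.append((start, n))
    return holds
```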

How biometric signatures can identify emotions

These examples show how each emotion triggers a constellation of changes in your body’s physiology, and each specific constellation of changes creates a signature for a different emotion. While engagement and anxiety may look similar on some biometrics, other biometrics can easily distinguish the two emotional states. The more biometrics you measure, the more detailed those signatures can be, which makes it possible to distinguish more subtle emotional differences, like separating fear from anxiety and anger from frustration.
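As a toy illustration (the numbers below are invented, not measured), two emotions can overlap on one biometric while separating cleanly on others:

```python
# Hypothetical signatures: engagement and anxiety look alike on heart
# rate alone, but EDA and HRV separate them.
engagement = {"heart_rate_bpm": 88, "eda_peaks_per_min": 2, "rmssd_ms": 45}
anxiety    = {"heart_rate_bpm": 90, "eda_peaks_per_min": 9, "rmssd_ms": 18}

for k in engagement:
    rel = abs(engagement[k] - anxiety[k]) / max(engagement[k], anxiety[k])
    verdict = "distinguishing" if rel > 0.3 else "ambiguous"
    print(f"{k}: {verdict} ({rel:.0%} apart)")
# heart_rate_bpm is ambiguous on its own; the other two do the separating.
```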

It’s worth pointing out that in creating these signatures, it’s possible to derive many relevant biometrics from a single high-value raw data stream, and multiple data streams can be combined to create still more biometrics. For example, from a single infrared PPG data stream it’s possible to extract heart rate, heart rate variability, respiration rate, respiration depth, and a host of other derivative metrics. Adding a red PPG data stream to the infrared acts as a multiplier, adding blood oxygen, more accurate blood volume pulse, and potentially hydration information.
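As a sketch of that idea, here’s roughly how heart rate and inter-beat intervals might be pulled out of a single PPG channel using simple peak detection with SciPy’s find_peaks. The parameters are illustrative, and real pipelines would add filtering and artifact rejection:

```python
import numpy as np
from scipy.signal import find_peaks

def ppg_heart_rate(ppg, fs=25):
    """Estimate heart rate and inter-beat intervals from one PPG channel
    by finding systolic peaks. Peak spacing gives inter-beat intervals;
    many other metrics (HRV, pulse amplitude) derive from the same peaks."""
    ppg = np.asarray(ppg, dtype=float)
    # Require peaks at least 0.4 s apart (< 150 BPM) and above the mean.
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), height=ppg.mean())
    ibi_s = np.diff(peaks) / fs       # inter-beat intervals in seconds
    hr_bpm = 60.0 / ibi_s.mean()      # average heart rate
    amplitudes = ppg[peaks]           # peak heights track blood volume pulse
    return hr_bpm, ibi_s, amplitudes
```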

I’ll discuss PPG in more detail in a forthcoming blog post where I’ll speculate on new discoveries that can potentially come out of combining machine learning with labeled 3-wavelength PPG data from the EmotiBit. But for this discussion, the relevant point is that from the 16 high-value raw signals that EmotiBit captures, it’s possible to derive dozens of biometrics, perhaps a hundred or more, that combine to create detailed biometric signatures reflective of moment-by-moment changes in emotional state.

Welcome to the machine learning

Dozens, possibly over a hundred biometrics?! Wow, cool! Now what? I hope there are more answers to that question than I can possibly imagine (and stay tuned for a speculative blog post on the topic). Maybe you want to make a shirt that flashes with your heartbeat or create an art installation that changes based on electro-dermal activity. In that case, it’s easy to pull out the few signals that you need and either directly attach an Adafruit FeatherWing to control LEDs or a motor (etc.), or receive the data in our cross-platform visualizer and use it to control video or audio or anything you want from your computer.
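For instance, assuming your setup forwards data from the visualizer as OSC messages over UDP, a few lines of Python using the python-osc package can react to a stream. The address pattern and port below are placeholders for whatever your configuration actually sends:

```python
# pip install python-osc
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_eda(address, *values):
    # React to electro-dermal activity however you like -- here, just print.
    print(f"{address}: {values}")

dispatcher = Dispatcher()
dispatcher.map("/EmotiBit/0/EDA", on_eda)  # hypothetical address pattern

server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()
```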

However, if you want to go deep and extract detailed changes in emotional state on a moment-by-moment basis, EmotiBit enables state-of-the-art research to be applied. By easily adding time-stamped labels to the data from within the visualizer, or by adding your own labels/signals in the source code, you can apply the latest machine learning algorithms to extract emotional state from the data. Current research suggests at least 6-10 emotions can be extracted with up to 80% reliability.
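As a minimal sketch of that workflow, assuming you’ve exported your derived biometrics to a table with one row per labeled time window (the file and column names below are hypothetical), a scikit-learn classifier takes only a few lines:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical export: one row per labeled time window of derived biometrics.
df = pd.read_csv("labeled_biometrics.csv")
features = df[["hr_bpm", "rmssd_ms", "eda_peaks_per_min", "resp_rate", "spo2"]]
labels = df["emotion"]  # e.g. "calm", "fear", "stress"

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.0%}")
```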

Our belief is that as more people have access to high-quality labeled biometric data and work collaboratively (e.g. with our open-source Python libraries) to develop machine learning algorithms and data science tools, we’ll push the envelope to truly understand how our emotions change on a moment-by-moment basis throughout our daily activities.