Lab Talk

In Search of Emotion in the EEG

Defining emotion is still a challenge. However, the circuitry of emotion and its manifestation in the EEG are being explored to understand it better.

Think back to a great adventure. It might be standing atop a mountain of newly powdered snow with nothing but a board or two between you and the elements. It might be catching the crest of a wave on boat or on body, wondering if your vessel or the wave will win. It might be something less physical: walking down the aisle towards the person who is going to become your partner in life. Waiting for that person at the end of that same aisle.

What did you experience? Perhaps your palms were sweaty, breath hot and quick, heart beating frantically. Perhaps the whole world stood so still that you were afraid to breathe lest you hurry the moment. You may have felt faint, pumped with adrenaline, or any physical experience in between. Your particular response depends on both the moment and on you, but in these moments our bodies invariably respond.

So do our emotions.

Defining emotion

Though it may not be the first thing you think of when faced with a moment of great adventure, defining our emotions can be a more challenging prospect than defining cognition. The very relationship between the body’s response to stimuli and emotions is an issue of great debate. Which comes first? Are the two merely correlated or is the relationship causal? What other pathways, senses, or activities are involved in which ways? Space for philosophical pondering is endless. Luckily, the realm of emotion has moved beyond philosophy.

First it came to the psychologists. Around 1885, William James and Carl Lange independently developed the idea that physiological arousal gives rise to our emotions. The James-Lange theory has since become a cornerstone of emotional psychology, but it isn't the only one. The Cannon-Bard theory posits that emotion and physiological responses occur simultaneously: physiological responses, Cannon and Bard reasoned, can exist without an emotional response and thus cannot be the causal factor. In 1962, Stanley Schachter and Jerome Singer asked how two people can respond to the same stimulus with different emotions. Their Two-Factor Theory of Emotion accounts for this by bringing cognition into play: emotion, they posit, is a response to two factors, arousal and cognitive labeling.

The anatomy of emotion

Though these theories serve to enhance our understanding of how emotions might work, they are all lacking in real evidence. It wasn't until neurologists hit the scene that emotional pathways could be studied in a physical sense. In 1937, a Cornell University-based neuroanatomist named James Papez sought to identify the brain anatomy responsible for emotion. We have been following in his footsteps ever since, albeit with bigger budgets, better tools, and atop the shoulders of ever-increasing shared knowledge.

Beyond the nuts-and-bolts definition lies our ability to measure emotion. Linking emotions with their physical centers is a way of quantifying something that is often considered qualitative.

A better understanding of the pathways that contribute to our emotional lives can help us to understand each other, identify potential problems, and help people overcome emotionally based problems. If we know that a particular pathway or pattern of the brain is responsible for empathy, we can more easily identify people who are lacking in their ability to empathize. We can streamline treatment and validate that our methods are working based on physical changes in the brain instead of patient confirmation. A psychopath's word is not as reliable a source as direct evidence of electricity-laden neurons shifting their firing patterns.

It is now well established that the amygdala, a deep structure within the limbic system, plays a prominent role in emotion and emotional learning. Those with damage to the amygdala have trouble ascribing emotional content to stimuli. More significantly, evidence is now mounting that establishing emotional patterns entails plasticity in both the cortex and the amygdala. Can emotion then be read in the cortex?

Emotion in EEG

A number of groups (seemingly concentrated in Taiwan) have made progress identifying emotion in spatio-temporal connectivity patterns measured with EEG. In 2014, You-Yun Lee and Shulan Hsieh from National Cheng Kung University in Taiwan recorded EEG from forty people while eliciting emotional states with film clips. The states were deemed positive, negative, or neutral based on the film content. The researchers measured correlation, coherence, and phase synchronization of EEG spectral bands to estimate functional brain connectivity and to test whether it varied between emotional states. It did.

Theta and alpha bands showed statistically significant differences in functional connectivity across several areas of the brain. For instance, theta-band correlations were significantly higher in posterior regions during negative states relative to neutral, and higher in frontal regions during positive states relative to neutral. Using various connectivity metrics, the authors were able to classify the patterns by emotional state with accuracies ranging from 53% to 69%, substantially above the chance level of 33% for three emotional states. Others, however, such as Yuan-Pin Lin et al. from National Taiwan University, have reported better results, with over 90% accuracy in classifying four music-induced emotional states from EEG.
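To make one of these connectivity metrics concrete, here is a minimal sketch of phase synchronization measured by the phase-locking value (PLV), computed with NumPy and SciPy on synthetic data. The band edges, sampling rate, and signals are invented for illustration; this is not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(x, y, low, high, fs):
    """Phase-locking value between two channels within a frequency band.

    Returns a value in [0, 1]: 1 means the band-limited phases of the two
    channels stay perfectly locked; values near 0 mean no consistent
    phase relationship.
    """
    phase_x = np.angle(hilbert(bandpass(x, low, high, fs)))
    phase_y = np.angle(hilbert(bandpass(y, low, high, fs)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic demo: two "channels" sharing a 6 Hz (theta-band) rhythm plus noise.
fs = 256                              # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)          # 10 seconds of data
rng = np.random.default_rng(0)
theta = np.sin(2 * np.pi * 6 * t)     # shared theta oscillation
ch1 = theta + 0.5 * rng.standard_normal(t.size)
ch2 = theta + 0.5 * rng.standard_normal(t.size)

print(plv(ch1, ch2, 4, 8, fs))        # close to 1: strong theta synchronization
```

In a study like the one above, metrics of this kind would be computed for each channel pair and frequency band, and the resulting feature vectors fed to a classifier trained against the labeled emotional states.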

See related post The Blue Frog in the EEG

From detection to application

Researchers from Nanyang Technological University in Singapore are taking this understanding of emotions and their affiliated brain signals one step further, seeking applications. They have developed a music site that uses EEG to recognize six distinct emotional patterns and plays music suited to helping users deal with their feelings. They have also developed a technology that translates EEG-detected emotions into a visual representation of the feelings, allowing human-computer interfaces to communicate emotion.

See related post Neurofeedback and Anxiety

The use of EEG to measure emotions is only beginning to be explored. The range of emotions or emotional states classified by different theories and systems is wide and varied, as are the methods used to induce them and the connectivity metrics and classification approaches used to predict them. However, the results are directionally promising and the potential applications many.

3 thoughts on “In Search of Emotion in the EEG”

  1. Isn’t this old news? Lots of similar findings were reported in the late 20th century. Jaak Panksepp in particular developed sophisticated hypotheses about the neural (especially brainstem) and EEG correlates of emotions. Walt Disney’s film ‘Fantasia’ provided a marvellous early example of how to correlate music with emotion-laden perceptions. And of course it has long been known that cows respond to Mozart by increasing their milk yields!

  2. Interested readers might like to explore:
    Onton J & Makeig S. Broadband high-frequency EEG dynamics during emotion imagination. Frontiers in Human Neuroscience, 2009.

    There, the temporal cortical sites generating an EEG marker of emotional valence (degree of ‘good’ or ‘bad’) were reported, a finding supported more recently by some fMRI studies.

  3. A mapping between anatomical structure and emotions is a critical component of any attempt to utilize EEG as a tool for reading our emotions. The anatomical structures that ‘implement’ our emotions tend to be buried deep within our brains – generally subcortical. Further, there is a necessary and typical interaction with cortical structures, rendering EEG-detectable generators difficult to find utilizing standard EEG equipment. Typically, emotions are registered utilizing a combined physiological and cognitive approach. I have spent hundreds of hours connecting students up with GSR, BVP, EMG, EEG, and respiratory belts whilst they were watching grass grow or A Clockwork Orange. Utilizing EEG alone, relying on subtle changes in the power spectrum of EEG-detectable bands, is not terribly reliable, and impractical for day-to-day emotion acquisition schemes.

    Lastly, there is the chicken-and-egg issue that must be addressed. On an individual basis, self-reporting an emotion is required in order to classify the EEG signals. In many instances, a person will experience a mixture of emotions at any particular time, although they are required to state which specific emotion they were feeling, given some stimulus. The blending of emotions presents significant issues that must be addressed before we can claim AI has captured the emotional state of an individual. Further, we can be led along a garden path quite readily: if the system believes we are happy with an input, and proceeds accordingly, we very well might alter our emotional state accordingly. In so doing, we will have altered our emotional state such that we may not actually remember what the initial state was. Emotions are flexible and fleeting states, save very extreme emotions such as rage or fear – those with a significant ANS component. The more subtle, cognition-based emotions are subject to change without notice!

    Please do not get me wrong – I am a total AI person, working for 25+ years in the computational cognitive science domain. I am looking for real, solid results – not hype and what-ifs. So please, let us make it so, everyone – this can be done!