
New study shows how the brain reacts emotionally to the real world

Scientists show how brain activity alone can predict a person's response to an emotionally charged image, including whether the reaction is positive, negative or neutral and how strong it is

Closeup images of the face of a snarling dog and of a baby: how and to what degree we respond emotionally to the real world is handled by a region at the back of the brain called the occipital temporal cortex. These images were among 1,620 shown to volunteers in a study of activity in this region in response to viewing emotionally charged images. The results were used to predict the emotional responses of other participants based on brain activity alone. (Adobe stock image)

In a new study, researchers were able to predict a person’s response to emotionally charged scenes using brain imaging and computer modeling alone — gauging not only whether the person’s reaction was positive, negative or neutral, but also how strong the reaction was.

The study helps neuroscientists understand how the brain represents complex emotional natural stimuli, according to senior author Sonia Bishop, adjunct associate professor of neuroscience at the University of California, Berkeley, and the newly appointed chair of psychology at Trinity College Dublin.

The simple tasks used in the research will also make it easier to study autism spectrum disorder, where researchers seek to understand how individuals differ in processing everyday emotional stimuli.

The study by neuroscientists at UC Berkeley, Trinity College Dublin and Google was published July 9 in the journal Nature Communications.

“It is hugely important for all species to be able to recognize and respond appropriately to emotionally salient stimuli, whether that means not eating rotten food, running from a bear, approaching an attractive person in a bar or comforting a tearful child,” said Bishop, who is also a member of UC Berkeley’s Helen Wills Neuroscience Institute. “How the brain enables us to respond in a nuanced way to emotionally charged situations and stimuli has long been of interest, but little is known about how the brain stores schemas or neural representations to support the nuanced behavioral choices we make in response to emotional natural stimuli.”

In addition, few studies have looked beyond a simple binary reaction — approach or avoid, fight or flight — when humans clearly have a more nuanced response.

“Neuroscience studies of motivated behavior often focus on simple approach or avoidance behaviors, such as lever-pressing for food or changing locations to avoid a shock,” she said. “However, when faced with natural emotional stimuli, humans don’t simply choose between ‘approach’ or ‘avoid’. Rather, they select from a complex range of suitable responses. So, for example, our avoid response to a large bear — leave the area ASAP — is different to our avoid response to a weak, diseased animal — don’t get too close. Similarly, our approach response to the positive stimuli of a potential mate differs from our approach reaction to a cute baby.”

A man and woman hugging: one of the images shown to study volunteers to evoke a strong emotional response. Using such images with functional magnetic resonance imaging of the brain, researchers hope to study emotional responses in those with autism or other neurological or psychiatric conditions. (Adobe stock image)

In the new study, led by former UC Berkeley doctoral student Samy Abdel-Ghaffar, who is now at Google, human volunteers were shown a variety of natural images chosen to evoke an emotional response, such as a baby's face, a snarling dog or a person vomiting. The participants' brain activity was measured in 3D with functional magnetic resonance imaging (fMRI); they were also asked to rate each image as positive, negative or neutral and to report the degree of emotional arousal it evoked.

Analysis of brain-wide activity showed that regions of the occipital temporal cortex, located in the back of the brain, are tuned to represent both the type of stimulus — single human, couple, crowd, reptile, mammal, food, object, building, landscape — and the emotional characteristics of the stimulus. For example, positive high-arousal faces were represented in slightly different regions than negative high-arousal faces or neutral low-arousal faces.

“Our research reveals that the occipital temporal cortex is tuned not only to different categories of stimuli; it also breaks down these categories based on their emotional characteristics in a way that is well suited to guide selection between alternate behaviors,” Bishop said.

Abdel-Ghaffar then used machine learning, a type of artificial intelligence, to test whether the responses of a second group of volunteers to the same images could be predicted based solely on the stable tuning patterns in the occipital temporal cortex. He found that they could. In fact, analyzing brain activity was a better predictor of participants' reactions than a machine learning assessment of the emotional aspects of the actual images.
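The study's actual modeling pipeline is not described in detail here, but the basic idea of predicting one group's ratings from another group's brain activity can be sketched roughly as follows. This minimal example assumes made-up stand-in arrays for occipital temporal cortex responses and valence/arousal ratings (none of the variable names, voxel counts or settings come from the paper, apart from the 1,620 images); it simply fits a regularized linear decoder on one group and evaluates it on the other.

    # Illustrative sketch only, not the authors' pipeline. Assumes the two
    # groups' voxel responses have already been aligned to a common space.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    n_images, n_voxels = 1620, 500  # 1,620 images; the voxel count is made up

    # Stand-in data: occipital temporal cortex (OTC) activity per image,
    # plus each group's valence and arousal ratings for the same images.
    otc_group1 = rng.standard_normal((n_images, n_voxels))
    otc_group2 = rng.standard_normal((n_images, n_voxels))
    ratings_group1 = rng.standard_normal((n_images, 2))  # columns: valence, arousal
    ratings_group2 = rng.standard_normal((n_images, 2))

    # Fit a regularized linear decoder on one group's brain activity and ratings...
    decoder = Ridge(alpha=10.0)
    decoder.fit(otc_group1, ratings_group1)

    # ...then ask how well it predicts the other group's ratings from brain activity alone.
    predicted = decoder.predict(otc_group2)
    print("valence R^2:", r2_score(ratings_group2[:, 0], predicted[:, 0]))
    print("arousal R^2:", r2_score(ratings_group2[:, 1], predicted[:, 1]))

With random stand-in data the scores are meaningless; in a real analysis the responses would come from measured fMRI data and the decoder would be cross-validated.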

“This suggests that the brain chooses which information is important or not important to represent and holds stable representations of sub-categories of animate and inanimate stimuli that integrate affective information and are optimally organized to support the selection of behaviors to different types of emotional natural stimuli,” Bishop said.

She noted also that “the paradigm used does not involve a complex task, making this approach suitable in the future, for example, to further understanding of how individuals with a range of neurological and psychiatric conditions differ in processing emotional natural stimuli.”

The research was funded by the National Institutes of Health. Other co-authors of the paper are UC Berkeley neuroscience professor Jack Gallant and three former members of UC Berkeley's Helen Wills Neuroscience Institute: University of Texas at Austin professor Alexander Huth, University of Nevada, Reno professor Mark Lescroart and data scientist Dustin Stansbury.
