How the Brain Charts Emotion in a Map-like Way

AI neural network helps reveal human cognitive mechanisms for organizing knowledge about a range of feelings


It is well established in psychology that humans conceptualize emotions by features known as valence (the degree of pleasantness or unpleasantness) and arousal (the intensity of bodily reactions, such as rapid breathing or a racing heart).

If you think of “pleasantness” as longitude and “bodily reaction” as latitude, you can imagine a “mental map,” with nodes that “chart” knowledge of emotion.
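This two-coordinate picture can be made concrete in a few lines of code. The coordinates below are invented for illustration (they are not values from the study); the point is only that emotions with similar pleasantness and bodily reaction land near each other on the map.

```python
import math

# Hypothetical (valence, arousal) coordinates on a 2D "mental map".
# Values are illustrative only, not taken from the study's data.
emotion_map = {
    "anger":      (-0.7, 0.8),
    "fear":       (-0.8, 0.9),
    "happiness":  (0.8, 0.5),
    "excitement": (0.7, 0.9),
    "calm":       (0.6, -0.7),
}

def distance(a, b):
    """Euclidean distance between two emotions on the map."""
    (x1, y1), (x2, y2) = emotion_map[a], emotion_map[b]
    return math.hypot(x1 - x2, y1 - y2)

# Emotions with similar valence and arousal sit close together.
print(distance("anger", "fear"))        # small
print(distance("anger", "happiness"))   # large
```

On these made-up coordinates, anger sits much closer to fear than to happiness.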

The neural mechanisms giving rise to this configuration, however, have remained unclear.

Now, a new study reveals that hippocampal-prefrontal circuits — neural structures implicated in forming other types of cognitive maps — could support the mental mapping of emotion.

Nature Communications published the research by neuroscientists at Emory University. The results showed how the hippocampus represents emotion concepts in a structured hierarchy of “nodes” of pleasantness and bodily reaction, while the ventromedial prefrontal cortex (vmPFC) more accurately tracks relationships between these different nodes, or how they are distributed on the mental map.

The study showed how the hippocampus represents emotion concepts while the vmPFC tracks the distribution of these concepts on a mental map. (ECCO lab)

Pinpointing the neural mechanisms that produce such map-like representations may ultimately help in the treatment of some mental illnesses, says Philip Kragel, senior author of the research and Emory assistant professor of psychology.

“Research has shown that individuals with depression and anxiety represent emotions in a more compressed, less differentiated way,” he explains. “And that people who represent emotion with more granularity and differentiation tend to have better health outcomes.”

The current paper combined human brain imaging data, pattern recognition and simulations using AI neural networks.

“People’s emotional experiences are subjective,” says Yumeng Ma, first author of the paper and an Emory PhD student in Laney Graduate School. “We’re using technology to understand the mechanisms underlying emotions in an objective, scientific way.”

Co-authors Philip Kragel, assistant professor of psychology, and Yumeng Ma, a PhD student in Kragel's Emotion Cognition and Computation Lab. (Photos by Carol Clark)

“Emotions are central to human experience; they are not simply reactions to things,” Kragel says. “They are important to our success and to our well-being. They help us to communicate better, learn from our experiences, and empathize with others.”

And yet, he adds, emotions have been notoriously difficult to study scientifically.

Kragel is a leader in developing computational methods to study the nature of emotions. His Emotion Cognition and Computation Lab (ECCO Lab) works at the intersection of psychology, cognitive neuroscience and machine learning.

AI neural networks, modeled on the human brain, are one tool used by the lab.

Like the human brain, an artificial neural network must boil down complex data into its essence, a process known as “embedding,” so that vast amounts of knowledge may be stored in an organized and efficient manner.
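The embedding step can be sketched with a toy example. Everything below is simulated (random rating vectors, a principal-components projection built from NumPy's SVD); it only illustrates how a high-dimensional description gets compressed into a few coordinates.

```python
import numpy as np

# Toy example: each row is an "event" rated on several emotion features.
# The data are random; real studies use far richer ratings and samples.
rng = np.random.default_rng(0)
events = rng.normal(size=(50, 10))   # 50 events x 10 rating features

# "Embedding": compress each event to 2 dimensions via PCA (using SVD).
centered = events - events.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
embedding = centered @ vt[:2].T      # 50 events x 2 coordinates

print(embedding.shape)               # each event is now just 2 numbers
```

The compressed coordinates keep the directions along which events differ most, which is one simple way an efficient code can preserve the structure of the original experiences.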

“For the current paper, we wanted to probe how the human brain compresses emotion experiences,” Kragel says. “How do we embed these very complicated events? What are the relevant neural signals?”

The researchers developed predictive models to analyze a dataset of ratings of various emotions by participants as they watched film clips. They found that these self-report measures could be decoded from brain imaging patterns of viewers watching these clips. (ECCO Lab)

The researchers began by tapping the multimodal dataset Emo-FilM (Emotion Research Using Films and fMRI), a component of OpenNeuro, a free and open platform for validating and sharing neuroscience data.

The Emo-FilM dataset includes ratings of various emotions by participants as they watch short, emotionally evocative film clips. These human ratings of emotion experience and the corresponding brain activity scans can be examined in relation to one another to reduce the gap between theory in psychology and empirical neuroscience. The dataset is tuned to understand underlying emotion processes rather than individual differences.

The researchers developed predictive models to analyze this dataset and found, as expected, that self-report measures of emotional experience could be decoded from fMRI patterns of hippocampal-prefrontal activity.
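The decoding idea can be sketched as a linear model: given a pattern of brain activity, predict the reported rating. The data below are simulated, and the model is plain ridge regression, a simpler stand-in for the predictive models the study actually used.

```python
import numpy as np

# Simulated decoding: predict a self-reported emotion rating from
# voxel activity patterns using ridge regression. All data are fake.
rng = np.random.default_rng(1)
n_timepoints, n_voxels = 200, 100
X = rng.normal(size=(n_timepoints, n_voxels))            # brain patterns
true_w = rng.normal(size=n_voxels)                       # hidden mapping
y = X @ true_w + rng.normal(scale=0.1, size=n_timepoints)  # ratings

# Closed-form ridge solution: w = (X'X + alpha*I)^-1 X'y
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(n_voxels), X.T @ y)

# If ratings are decodable, predictions correlate with reports.
pred = X @ w
r = np.corrcoef(pred, y)[0, 1]
print(round(r, 2))
```

When the activity patterns genuinely carry rating information, as they do in this simulation by construction, the predicted and reported ratings correlate strongly; that correspondence is what "decoding self-report from fMRI patterns" means in practice.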

The hippocampus is a seahorse-shaped structure in the temporal lobe that helps organize experiences into memories by linking information from across the brain. The ventromedial prefrontal cortex, or vmPFC, is a brain region in the frontal lobe involved in weighing information about goals, social cues and outcomes, helping people make decisions and evaluate risk and reward.

Analyzing the outputs of predictive models revealed these brain systems contained information consistent with a map-like representation.

“For example,” Ma explains, “occurrences of anger and fear are often closer together compared to those of happiness and excitement.”

The researchers tested the model’s ability to predict both emotion categories and the relations between them. The results showed more information about emotion categories in the hippocampus and more relational information in the vmPFC.
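One standard way to quantify relational information is to compare the pairwise dissimilarities among a region's category patterns against a model of those dissimilarities (representational similarity analysis). The sketch below uses simulated patterns and a noisy model matrix; the paper's exact analyses may differ in detail.

```python
import numpy as np

# Illustrative representational similarity analysis (RSA) on fake data:
# does a region's pattern geometry match a model of emotion relations?
rng = np.random.default_rng(2)
n_categories, n_voxels = 5, 60

# Hypothetical activity pattern per emotion category in some region.
patterns = rng.normal(size=(n_categories, n_voxels))

def rdm(p):
    """Pairwise dissimilarity (1 - correlation) between category patterns."""
    return 1 - np.corrcoef(p)

# A model RDM would come from, e.g., map distances between categories;
# here we reuse a noisy copy of the neural patterns for illustration.
model = rdm(patterns + rng.normal(scale=0.5, size=patterns.shape))

# Correlate the off-diagonal entries of the two matrices.
iu = np.triu_indices(n_categories, k=1)
r = np.corrcoef(rdm(patterns)[iu], model[iu])[0, 1]
print(r > 0)   # shared relational structure -> positive correlation
```

A region whose pattern geometry matches the model's distances carries relational information about the categories, over and above simply distinguishing the categories themselves.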

An illustration of the environment in which artificial agents were allowed to "walk" and make their own predictions about what they would experience depending on where they moved along the graph. (ECCO Lab)

They further probed their framework using an artificial neural network known as the Tolman-Eichenbaum Machine, or TEM, which serves as a computational model of relational memory in the brain.

The researchers created an artificial environment, represented as an abstract graph, based on emotion category ratings from the film-viewing data. TEM artificial agents, or virtual robots, were exposed to this environment so they could learn how emotion concepts relate to one another.

After this training, trajectories of the artificial agents were plotted as they “walked” through the environment and made their own predictions about what they would experience if they stayed put or moved up, down, to the right or to the left along the graph.
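A drastically simplified stand-in for this procedure: an agent takes a random walk on a small, made-up emotion graph, tallies the transitions it observes, and then predicts its most likely next observation at each node. This is not the TEM itself, just an illustration of learning relational structure by walking a graph.

```python
import random

# Toy "environment": emotions arranged on a small graph (made-up layout).
graph = {
    "fear":    ["anger", "sadness"],
    "anger":   ["fear", "sadness"],
    "sadness": ["fear", "anger", "calm"],
    "calm":    ["sadness", "joy"],
    "joy":     ["calm"],
}

random.seed(0)
counts = {node: {} for node in graph}
node = "fear"
for _ in range(5000):                    # random walk through the graph
    nxt = random.choice(graph[node])
    counts[node][nxt] = counts[node].get(nxt, 0) + 1
    node = nxt

# After training, the agent predicts its most likely next observation
# from each node -- i.e., it has learned the graph's local structure.
predict = {n: max(c, key=c.get) for n, c in counts.items() if c}
print(predict["joy"])                    # joy's only neighbor is calm
```

The learned transition counts encode which emotions are reachable from which, so the agent can anticipate what it will experience before it moves, which is the essence of what a relational-memory model like the TEM does at much greater sophistication.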

“The main takeaway,” Ma says, “is we found that the hierarchy of emotion categories is represented more broadly — for example, this is good, that is bad — in the anterior part of the hippocampus. And in the posterior region, the representations are more granular, finer-grained concepts.”

The results also showed that the vmPFC appears to track long-term transitions for broad, rather than finer-grained, emotion concepts.

The findings offer a neurocomputational explanation of how humans organize abstract emotion knowledge in a generalized, normative way.

The researchers hope to build on their findings by studying how this mental map may differ among those with mental health issues and across different cultures.

They also want to explore how this mental map for emotions develops over time.

“These are open questions,” Kragel says. “Are you born with the ability to form broad categories of emotion, such as good or bad, and then you gradually learn where to add more nuanced nodes on the graph? Or maybe you’re born with the ability to learn general relational structures. Do the emotions come first? Or is it the other way around?”

Story by Carol Clark.

To learn more about Emory, please visit:
Emory News Center
Emory University