Opinion

Semantic Space Theory

Within affective science, the central line of inquiry, animated by basic emotion theory and constructivist accounts, has been the search for one-to-one mappings between six emotions and their subjective experiences, prototypical expressions, and underlying brain states. We offer an alternative perspective: semantic space theory. This computational approach uses wide-ranging naturalistic stimuli and open-ended statistical techniques to capture systematic variation in emotion-related behaviors. Upwards of 25 distinct varieties of emotional experience have distinct profiles of associated antecedents and expressions. These emotions are high-dimensional, categorical, and often blended. This approach also reveals that specific emotions, more than valence, organize emotional experience, expression, and neural processing. Overall, moving beyond traditional models to study broader semantic spaces of emotion can enrich our understanding of human experience.

Highlights

In the past decade, scientists have systematically moved beyond the six traditionally studied emotions to document a broader array of emotion-related experiences and expressive behaviors.

Semantic space approaches to emotion organize the study of emotion-related behavior by establishing the number of distinct patterned responses that occur systematically within an emotion-related modality, how these behaviors are most precisely conceptualized, and whether these behaviors are discrete or exist along a continuum.
Where these theories diverge is in the extent to which emotions are biologically prepared, how they should be conceptualized, and how they organize behavior (Table 1). Within basic emotion theory (BET), it is assumed that there are only a few emotions, that they are separated by clear boundaries, and that categories such as anger or awe capture universals in emotional experience, recognition, and brain representation. Constructivist accounts, by contrast, posit that core affect, or valence and arousal, are primary in emotional experience, and that people place different interpretations upon core affect, giving rise to significant individual and cultural variation in experience and the meaning of expressive behavior.
Table 1. Past Theoretical Claims Regarding the Biological Preparedness, Conceptualization, and Structure of Emotional Behavior

Claims regarding biological preparedness
- Basic emotion theory: Emotional feelings associated with specific cognitive appraisals and behaviors are biologically prepared and modified by experience. Emotional states intervene between appraisal and response.
- Appraisal theories: Certain appraisals (e.g., certainty, pleasantness, or goal conduciveness) are biologically prepared and modified by experience. Patterns in emotion-related response can be explained by mappings from appraisal to behavior.
- Constructivism: Certain valence/arousal responses are biologically prepared. Specific emotions involve valence and arousal but are artifacts of language (i.e., infants and nonhuman animals do not have emotions).
- Refs: [12,13,119]

Claims regarding how emotions should be conceptualized
- Basic emotion theory: Patterns in emotion-related behavior are best conceptualized in terms of specific emotions such as awe and fear.
- Appraisal theories: Emotion-related behaviors are best explained in terms of particular cognitive appraisals (e.g., certainty), not specific emotions.
- Constructivism: Emotions are best conceptualized in terms of valence, arousal, and language-based conceptual knowledge.
- Refs: [12,33,120]

Claims regarding the structure of emotion-related behaviors
- Basic emotion theory: Traditional BET reduces emotions to six or seven discrete clusters of states. Revised BET admits of complex (>25 kinds), blended emotions.
- Appraisal theories: Emotions reduce to a specific set of appraisal dimensions, usually <10 (in a few cases, many more), and may or may not fall into discrete clusters.
- Constructivism: Emotion-related behaviors are fundamentally low-dimensional and lack any inherent categorical structure.
- Refs: [2,11,12,33,60,120,121]
Efforts to adjudicate between these theories have centered upon tests for one-to-one mappings between six kinds of emotion – anger, disgust, fear, happiness, sadness, and surprise – and subjective experience, prototypical expression, and underlying body and brain states [1–7]. In
BET, evidence confirming these one-to-one mappings reveals the nature of emotion [8]; in
constructivist accounts, evidence disconfirming these one-to-one mappings reveals that emo-
tions are not natural kinds [6,7,9]. This narrow empirical focus limits the inferences to be drawn
regarding the broader structure of emotion [10,11]. As a result, entrenched disagreements
persist over the nature of emotion based on summaries of the same data (Table 1) [6,11–16].
Here we offer a different approach – semantic space theory. Our approach formalizes the study of
emotion in the investigation of representational state spaces capturing systematic variation in
emotion-related response (including experience and expression, as well as associated physiology,
cognition, and motivation). We integrate computational studies of emotional experience, facial–
bodily expression, and vocalization to visualize what one might think of as an emerging taxonomy
of emotion. Next, we discuss how the brain represents these experiences in distinct configurations
of activity across the default mode network and subcortical areas. Building upon these advances,
we synthesize literatures on nonhuman emotion-like behavior and nervous system response,
highlighting emerging evidence that emotional behaviors differentiated within a fine-grained
taxonomy have animal homologies and evolved neural mechanisms. The implication of these
developments is clear: moving beyond traditional models to a broad taxonomy of emotion
(Figure 1) will provide for a richer, more comprehensive science of emotion [17].
Answers to this first question are shaped by knowledge related to a second: what are the
emotions? Out of the rich array of emotions we experience, how many are distinct? Out of the
thousands of facial–bodily and vocal signals that people are anatomically capable of producing
[21–24], how many have distinct meanings? To answer these questions is to map the meanings
of emotional experiences and expressions within a semantic space [25–28].
Semantic spaces of emotion are defined by three properties (Figure 1A). The first is their dimensionality: how many different kinds of emotion are distinguished within the space?
Figure 1. Semantic Spaces of Experience and Expression. (A) The semantic space framework. A semantic space is described by (i) its dimensionality, or the
number of distinct meanings of experiences or expressions within the space; (ii) the conceptualization of these meanings in terms of mental states, intentions, or
appraisals; and (iii) the distribution of experiences or expressions within the space, capturing clusters or blends of states. (B) Semantic space of facial–bodily and vocal
expression. A total of 3523 expressions are lettered, positioned, and colored according to the distinct emotions that people reliably attribute to them (28 in facial
expression [42] and 24 in vocal expression [25]). Within the space are gradients in expression between emotions traditionally thought of as discrete, such as fear and
surprise. To explore these expressions, see the interactive maps (face: https://s3-us-west-1.amazonaws.com/face28/map.html, voice: https://s3-us-west-1.amazonaws.
com/vocs/map.html). (C) Semantic space of emotion evoked by 2185 brief videos. At least 27 distinct affective states are reliably captured in reports of emotional
experience evoked by video, best conceptualized in terms of emotion concepts such as fear [26]. Again, gradients bridge emotion concepts traditionally thought of as
discrete, such as fear and surprise. Interactive map: https://s3-us-west-1.amazonaws.com/emogifs/map.html. (D) Semantic space of emotional experience evoked by
1841 music samples in multiple cultures [36]. Music samples are positioned and colored according to 13 emotions with which they are reliably associated in both the USA
and China. Within the space, we find gradients among these states. The similarities in affective response across cultures were most reliably revealed in the use of specific emo-
tion concepts (e.g., desire and fear). Interactive map: https://s3.amazonaws.com/musicemo/map.html. (E) Semantic space of emotion conveyed by prosody in 2519 lexically
identical speech samples. Across the USA and India, at least 12 kinds of emotion are preserved in the recognition of mental states from speech prosody, most reliably revealed
in the use of emotion concepts [28]. Interactive map: https://s3-us-west-1.amazonaws.com/venec/map.html. (F) Emotional expression in Ancient American art [58]. Ancient
American sculpture was found to portray at least five distinct kinds of facial expression that accord, in terms of the emotions they communicate to westerners, with western
expectations for the emotions that might unfold in the eight contexts portrayed. Colors of individual faces (letters) are weighted averages of colors assigned to each kind of
perceived facial expression. Eight example sculptures are shown. (To explore all 63 sculptures, see online map: https://s3.amazonaws.com/precolumbian/map.html.) Credit,
from top left down: (i) Metropolitan Museum of Art 2005.91.12, gift of the Andrall and Joanne Pearson Collection, 2005; (ii) Princeton University Art Museum 2003-26, gift of G.
G. Griffin; (iii) Metropolitan Museum of Art 1979.206.578, Michael C. Rockefeller Memorial Collection, Bequest of Nelson A. Rockefeller, 1979; (iv) Kerr Portfolio 342, Jaina
Figure, photo by J. Kerr; (v) Kimbell Art Museum, Fort Worth, Texas, AP 1971.07, Presentation of Captives to a Maya Ruler (detail); and (vi) Photograph: Museum of Fine
Arts, Boston 1983.288, gift of L.T. Clay.
The second is the distribution of states within the space: are there discrete boundaries between emotion categories,
or is there overlap [26,29]? The third is the conceptualization of emotion: what concepts most
precisely capture people’s implicit or explicit differentiation of subjective experiences and expres-
sive behaviors [30,31]? Do experiences and expressions correspond to specific emotions
(e.g., interest, sadness, and amusement) or broader affect and appraisal evaluations such as
valence and arousal [2,29,32] or certainty [33], as posited in appraisal and constructivist theories?
Capturing semantic spaces of emotional response requires new kinds of data and statistical
approaches. The prevalent focus on a limited number of emotions and prototypical stimuli [4,6]
captures, as we detail in the following text, approximately 30% of the information conveyed in self-
report and expressive behavior [34]. Characterizing the meaning of self-report and expression re-
quires vast arrays of evocative stimuli and expressions [11] and participants’ responses in terms
of widely varying emotion terms and questions that capture appraisal processes [27] or nonverbal
behaviors [35]. It requires moving beyond univariate measures [8], recognition accuracy [4], and
factor analysis [2,33], approaches that presuppose universal one-to-one mappings between
emotion-related behaviors and discrete labels (e.g., anger) or position along a few broad dimen-
sions of response (for limitations of factor analysis, see Video S1 in supplemental information
online).
Semantic spaces embody a broader goal: to separate signal from noise. To carry signal about
emotion, all instances of a particular behavior (e.g., a smile) need not map to an identical emo-
tional state, as long as they carry informational value regarding emotional experience. Indeed,
we have identified facial expressions used in everyday life in multiple ways, such as sentimental
expressions of musical performers that resemble expressions of pain [35].
To represent such meanings precisely is to project them onto dimensions that capture the
systematic variance in emotion-related behavior. Identifying these dimensions, whether they are
few and broad or numerous and nuanced, requires multidimensional reliability analysis
approaches – such as principal preserved components analysis – which satisfy the mathematical
objective of finding preserved dimensions across individuals or groups [28,36,37]. In contrast to
recognition accuracy and factor analysis, such approaches neither assume one-to-one mappings
between experiences and expressions, nor ignore dimensions of meaning that are nuanced
yet reliable.
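
To make this objective concrete, the following Python sketch (illustrative only, with simulated data; not the code used in the studies cited) implements one simple version of a preserved-components analysis: it assumes judgments have been aggregated into items x attributes matrices from two independent rater groups and retains the attribute weightings whose item-level scores covary across the groups.

    import numpy as np

    def preserved_components(X, Y, n_components=10):
        """X, Y: (n_items, n_attributes) mean judgments from two rater groups."""
        Xc = X - X.mean(axis=0)            # center each attribute
        Yc = Y - Y.mean(axis=0)
        cross = Xc.T @ Yc                   # cross-covariance between the two groups
        sym = (cross + cross.T) / 2.0       # symmetrize so eigenvectors are real
        eigvals, eigvecs = np.linalg.eigh(sym)
        order = np.argsort(eigvals)[::-1]   # largest preserved covariance first
        return eigvals[order][:n_components], eigvecs[:, order[:n_components]]

    # Toy usage: 200 hypothetical stimuli rated on 30 attributes by two groups,
    # generated from 5 shared latent dimensions plus group-specific noise.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 5))
    mixing = rng.normal(size=(5, 30))
    X = latent @ mixing + rng.normal(size=(200, 30))
    Y = latent @ mixing + rng.normal(size=(200, 30))
    vals, vecs = preserved_components(X, Y)
    print(vals)  # preserved covariance drops sharply after roughly 5 dimensions

Dimensions reflecting only one group's idiosyncrasies contribute little shared covariance and fall to the bottom of the ordering, which is the sense in which such methods neither presuppose a fixed number of dimensions nor discard nuanced but reliable ones.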
Semantic spaces are further characterized by the distribution of states along their dimensions.
How sharp are the boundaries between different categories of emotion? What is lost by sorting
emotions into discrete classes? Answers to these questions inform whether emotions should be understood as discrete affect programs [38] or as continuous processes [39], in which experiences are blended and transition readily from one to another.
Finally, how should we talk about, or conceptualize, the dimensions of a semantic space?
Answers to this question are central to emotion theory, but formal answers require statistical
modeling [40]. Analogously, the dimensions of perceived color can be conceptualized as red,
green, and blue spectral channels [41]. As we will see, emotion has more dimensions than
color, calling for more complex statistical models and larger-scale data. However, as with
color, if the dimensions of a semantic space that explains emotion-related behavior are best
conceptualized using specific categories (blue and awe), it is apt to refer to these dimensions
as emotions.
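
Formally, the conceptualization question can be posed as model comparison. The toy Python example below (simulated data; an assumption for illustration, not an analysis from the cited studies) compares how well a design matrix of specific-emotion ratings versus one of valence/arousal ratings accounts for a behavioral response, penalizing extra parameters with an information criterion in the spirit of [40].

    import numpy as np

    def aic(y, X):
        """AIC of an ordinary least-squares fit of y on X (intercept included)."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        rss = np.sum((y - X1 @ beta) ** 2)
        n, k = len(y), X1.shape[1]
        return n * np.log(rss / n) + 2 * k

    rng = np.random.default_rng(1)
    n = 1000
    categories = rng.normal(size=(n, 25))        # ratings on 25 specific emotion terms
    valence_arousal = rng.normal(size=(n, 2))    # broad affective ratings
    # Simulated behavior driven by the specific categories
    response = categories @ rng.normal(size=25) + rng.normal(size=n)

    print(aic(response, categories), aic(response, valence_arousal))  # lower AIC is preferred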
First, emotion inhabits a high-dimensional space. People reliably distinguish at least 27 distinct
subjective experiences associated with video [26], 24 distinct emotions in nonverbal vocalizations
[25,28], and 28 distinct emotions in the face and body (Figure 1B,C) [42]. These findings were
observed using both traditional rating methods and open-ended free response. The specific
numbers here matter less than the more general point that emotion is at least four times more
complex than that represented in studies of six emotions. This finding, replicated across
response systems of emotion, is not anticipated by BET, and stands in contrast to assumptions
of low dimensionality – that emotion is largely reducible to valence and arousal – found in
constructivist accounts [12].
Which are more primary in emotional experience and recognition of emotion in the face and voice:
specific emotions, as predicted by BET, or valence and arousal, as predicted by constructivism?
Crosscultural studies reveal that specific emotions are more primary [25,26,28,36,42], in several
ways. First, attributions of feelings such as amusement or embarrassment to oneself or another
person are better preserved across cultures than valence and arousal attributions [28,36].
Second, valence and arousal attributions can be explained as culture-specific valuations of
specific emotions. For instance, across the USA and India, people align more closely in evalua-
tions of anger in vocal expressions than in evaluations of their valence. Yet, we can predict
valence evaluations in one culture from emotion judgments in the other by taking into account
how vocalizations perceived (in both cultures) as angry are considered more negative in the
USA than in India [28] (a slight oversimplification; the predictions involve multivariate patterns of
judgments). Similar findings emerge in the study of subjective experiences evoked by music
across the USA and China [36]. The processes underlying subjective experience and emotion
recognition seem to be grounded in the states we designate with specific emotion categories
(sympathy and awe) in the same sense that color perception is grounded in three color channels.
From these specific states, people infer valence, arousal, and eliciting appraisals in a more
culture-specific manner [28,36] (just as we deem colors warm or cold [43]).
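
The logic of this cross-cultural prediction can be illustrated with a short sketch (simulated data and a simplified structure; not the published analysis): valence judgments collected in one culture are predicted from the multivariate pattern of specific-emotion judgments collected in the other, using a cross-validated linear model.

    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n_stimuli, n_emotions = 500, 12
    # Hypothetical mean specific-emotion judgments of each vocalization in culture A
    emotion_judgments_a = rng.normal(size=(n_stimuli, n_emotions))
    # Valence judgments in culture B, simulated as a culture-specific weighting of categories
    culture_b_weights = rng.normal(size=n_emotions)
    valence_b = emotion_judgments_a @ culture_b_weights + rng.normal(scale=0.5, size=n_stimuli)

    model = RidgeCV(alphas=np.logspace(-3, 3, 13))
    scores = cross_val_score(model, emotion_judgments_a, valence_b, cv=5, scoring="r2")
    print(scores.mean())  # variance in one culture's valence explained by the other's categories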
Finally, categories of emotion that have been treated as discrete [4] (e.g., anger and disgust) are
bridged by gradients of blended experiences and expressions (Figure 1B–E) [25,26,28,36,42]. For
instance, pure expressions of fear, surprise, and awe are bridged by gradients of composite
facial–bodily and vocal displays that reliably transmit intermediate meanings [25,42]. Although
there may be modal emotion-related responses [44], much of human emotional life is more complex.
Is there convergence across these studies of experience and expression? Might certain emotions –
a taxonomy of states – emerge as crossmodal response patterns? In Figure 1B–E, we map the
distinct mental states conveyed by upwards of 6000 distinct facial–bodily and vocal expressions
(Figure 1B,E) and evoked by music and video (Figure 1C,D). This synthesis finds upwards of 18
emotions that can both be reliably distinguished in facial–bodily and vocal expression
(Figure 1B) and evoked by distinct videos (Figure 1C) or music samples (Figure 1D): amusement,
anger, anxiety, awe, confusion, contentment, desire, disgust, elation, embarrassment, fear,
interest, love, pain, relief, sadness, surprise, and triumph. Another 12 emotional states have
documented associations with distinct antecedents and expressions only in certain modalities –
for instance, shame in facial–bodily expression [42] and the dreamy sensation conveyed by
some music [36] – either because other signals await discovery, or because the different modalities
are nonredundant.
A caveat to the findings shown in Figure 1B,C is that they are based on behavioral responses by
American English speakers. However, other studies are finding that people in different cultures
attribute similar mental states to a wide range of expressions [4,28,36,45–55], while culture-
specific accents and display rules account for a consequential, but typically smaller, amount of
systematic variance in emotion attribution (25–30%) [35,50,53,55–57]. For example, responses
to Western and Chinese music in the USA and China (Figure 1D) occupy 13 preserved
dimensions, or kinds of emotional experience [36]. Likewise, speech prosody recognition in the
USA and India (Figure 1E) occupies 12 shared dimensions of emotion [28]. With statistical model-
ing, broad arrays of stimuli, and fine-grained behavioral measures, we uncover a wider range of
crosscultural parallels in emotional behavior than previously documented.
There are lingering questions regarding the recognition of emotion in cultures with limited western
contact, given the methodological complexities of studying such cultures [6,15,34]. By turning to
a broader array of emotions, new kinds of stimuli, and open-ended methods of emotion labeling,
a recent study introduced a new approach to examining emotional behavior in cultures isolated
from the west. This study, a computational analysis of facial expressions portrayed in Ancient
American sculpture, rules out western contact and circumvents biases and nonequivalences
across languages in survey-based methods [58]. Facial expressions in 63 sculptures from the
Ancient Americas were found to accord with contemporary western expectations in terms of
their portrayal in specific social contexts. Ancient American sculptures tend to portray at least five
facial expressions in contexts predicted by westerners, including pain in torture, determination/strain
in heavy lifting, anger in combat, elation in social touch, and sadness in defeat (Figure 1F) –
supporting the universality of these facial expressions.
The insights revealed in Figure 1 bring into focus how much information is captured by traditional
models of emotion – the Basic 6 and valence and arousal – and how much is overlooked. With pre-
dictive (crossvalidated) models, the Basic 6 and valence and arousal are each found to capture
around 30% of the systematic variance in judgments of a wide range of emotion categories (an
upper bound, given that an even broader range of stimuli and responses may expose other dimen-
sions of variance). Thus, studies relying on these traditional models capture only about 30% of the
systematic variance in any response modality, and likely underestimate the diagnostic value of self-
report and expression in terms of how they map onto patterns of physiological and neural
response, predict subsequent behaviors, and influence the behavior of others (Figure 2) [4,6,7,34,59].
Figure 2. What Traditional Models Capture. Venn diagrams represent the proportion of the reliable variance in emotional
behavior captured by the Basic 6 and valence/arousal. By mapping reported emotional experiences and facial expressions into a
high-dimensional space, we can largely predict how they are recognized in terms of the Basic 6 and valence/arousal. However,
the Basic 6 capture only a fraction of the information reliably conveyed by facial expression, vocal expression, and self-reports of
emotional experience in response to video – 28%, 30.8%, and 30.2%, respectively. Valence/arousal capture only 28.5%, 21.3%,
and 29.1%, respectively. Altogether, traditional models based on the Basic 6 and valence and arousal largely fail to capture the
rich and variegated space of the meanings that emotional expressions convey and that emerge in subjective responses to
evocative stimuli.
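
The proportions shown in Figure 2 can be understood as cross-validated variance-explained estimates. The sketch below (simulated data; an assumed simplification, not the published procedure) maps Basic-6 judgments to a fuller high-dimensional judgment profile and scores the held-out predictions.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(3)
    n_stimuli = 2000
    full_profile = rng.normal(size=(n_stimuli, 28))   # judgments on 28 emotion categories
    # Noisy Basic-6 judgments, here simulated as a subset of the full profile
    basic6 = full_profile[:, :6] + rng.normal(size=(n_stimuli, 6))

    held_out_pred = cross_val_predict(LinearRegression(), basic6, full_profile, cv=10)
    # Fraction of overall variance in the full profile captured by the Basic 6
    print(r2_score(full_profile, held_out_pred, multioutput="variance_weighted"))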
A more recent study [66] has overcome these limitations by collecting whole-brain fMRI
responses to over 2000 diverse emotionally evocative videos and using well-validated statistical
modeling approaches [63,73–76] to differentiate neural representations of a wide range of emo-
tions, broad affective features such as valence and arousal, and semantic and visual features.
Dozens of emotions evoked by video could accurately be differentiated from patterns of brain
activity. Such differentiation was not observed in simple one-to-one mappings between particular
emotions and brain regions (e.g., fear and amygdala) [9] but in complex configurations across
multiple brain networks (Figure 3) that are consistent across subjects (suggesting that they are
not representations of learned concepts, which would recruit arbitrary and variable patterns of ac-
tivity [77]). Emotion-related representations were distributed across transmodal brain regions
near the hubs of the default mode network (DMN), such as the prefrontal cortex and angular
gyrus. These findings build on well-replicated observations that the DMN is differentially active
during experiences of emotion [78,79], zeroing in on what organizes these patterns of DMN
activity. Namely, activity across the DMN corresponded to self-reported experiences such as
anxiety, disgust and entrancement rather than broad dimensions such as valence and arousal
(or the semantic contents of stimuli, such as animals or landscapes) [63]. Indeed, specific emo-
tions explained greater variability in brain activity than affective dimensions in every cortical and
subcortical region of the brain, even the amygdala and brainstem (using crossvalidated predictive
models). This study suggests that specific emotions are primary in the representation of emotion
throughout the brain [25,26,28,36,42,58].
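
Schematically, this kind of decoding analysis takes the following form (simulated data and assumed shapes; not the published pipeline): a regularized linear model maps multivoxel response patterns to per-video emotion-category scores, and predictions for held-out videos are scored per category.

    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(4)
    n_videos, n_voxels, n_emotions = 2000, 500, 34
    weights = rng.normal(size=(n_voxels, n_emotions))
    brain = rng.normal(size=(n_videos, n_voxels))          # voxel pattern per video
    emotion_scores = brain @ weights + rng.normal(scale=5.0, size=(n_videos, n_emotions))

    decoder = RidgeCV(alphas=np.logspace(0, 4, 9))
    fold_corrs = []
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(brain):
        decoder.fit(brain[train], emotion_scores[train])
        pred = decoder.predict(brain[test])
        # correlation between predicted and reported scores, per emotion, on held-out videos
        fold_corrs.append([np.corrcoef(pred[:, j], emotion_scores[test, j])[0, 1]
                           for j in range(n_emotions)])
    print(np.mean(fold_corrs, axis=0))  # decoding accuracy for each of the 34 categories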
Experiences of specific emotions, then, are found to involve multiple interacting systems –
situated near DMN and subcortical regions – that enter distinct states in response to perceived
threats and opportunities [9,62,80–82].
Figure 3. Emerging Insights into the Brain Representation of Emotion. From Kamitani and colleagues (2020) [66]. Cortical surface maps of decoding accuracies
for specific emotions (five subjects averaged) in 360 brain regions from the Human Connectome Project [154] and ten subcortical regions. Each subject was scanned by
functional magnetic resonance imaging while passively viewing 2181 emotionally evocative videos over the course of seven or more sessions. Decoding models were
trained to predict 34 emotion categories associated with each video. Abbreviations: ACC, anterior cingulate cortex; DLPFC/DMPFC/VMPFC, dorsolateral/dorsomedial/
ventromedial prefrontal cortex; IPL, inferior parietal lobule; MTC, medial temporal cortex; OFC, orbitofrontal cortex; PC, precuneus; STS, superior temporal sulcus; TE,
temporal area; TPJ, temporoparietal junction; VC, visual cortex.
Given that these systems encode emotional experience (or affordances) during passive viewing of videos, these findings support the view that specific
emotions proactively recruit psychological faculties likely to support adaptive behavioral
responses, regardless of our intentions and beyond our immediate control [83,84]. Note also
that the distribution of emotion-related brain activity reflected the blends we previously observed
in the experiences of emotions such as anxiety and fear, suggesting that such blended states are
actually represented in intermediate patterns of brain activity. The neural underpinnings of emo-
tion inhabit a high-dimensional, complex semantic space. The complexity of this space is a
lower bound: future studies using active and personal elicitors (e.g., gaming and social interaction) will likely uncover further brain representations of emotion and appraisal.
Our emergent taxonomy of states points to more precise comparisons between human emotion
and mammalian behavior. In Table 2, we synthesize observations of mammalian behavior
and neurophysiology with parallels to 12 emotions that have emerged as distinct in subjective
experience, expression, and human brain activity.
Table 2. Organizing Evidence for Dissociable Evolutionary and Physiological Underpinnings of Emotion-Related Behaviors within a Nuanced Semantic Space (system: examples; refs)

- Amusement, play: Play face in nonhuman mammals; laughter and play in mammals; brain stimulation and mirthful laughter. Refs [86,90–92,99,103,105]
- Anger, aggression: Growling/snarl homologies in mammals; hypothalamic aggression mechanisms. Refs [122–126]
- Anxiety, tension: Nonhuman displacement behaviors such as self-grooming, and their relief via consolation in chimps and reduction by antianxiety drugs in macaques. Refs [95–98]
- Disgust, aversion: Sour/bitter facial response in primates; facial expression and neural correlates in mice; insula in disgust recognition/experience; gastric/immune system disgust response. Refs [65,69–72,127–130]
- Ecstasy, pleasure: Hedonic response in newborns/primates; facial expression and neural correlates in mice. Refs [127–129]
- Fear, alarm: Alarm calls in nonhuman animals; amygdala response to scream-like sounds; amygdala response to alarm faces; primate amygdala and alarm signals; facial expression and neural correlates in mice; joint brain mechanisms of fleeing/freezing in mouse brain stem. Refs [81,129,131–135]
- Love, bonding: Filial touch in animals; oxytocin in human bonding; oxytocin in rat/vole/tamarin bonding; oxytocin-moderated pupil dilation and affiliation. Refs [108–114,136,137]
- Pain (physical/empathic): Pain grimace in nonhuman animals; ACC and pain recognition/experience; facial expression and neural correlates in mice; dissociable psychophysiological response. Refs [65,67,68,129,138–140]
- Pride, status: Erect ape posture and bipedal swagger. Refs [85,141,142]
- Sadness, loss: Cry face and whimper in chimpanzees; midbrain responses to infant cries. Refs [143–145]
- Shame, submission: Submission displays of postural constriction and shrinkage in mammals and even nonmammals. Refs [85,146–149]
- Sympathy, consolation: Animal consolation behaviors; animal care response to distress cries; ACC oxytocin role in consolation behaviors; ACC/empathy network response and altruism. Refs [87–89,150–153]

Abbreviations: ACC, anterior cingulate cortex.

For example, that people link amusement to both open-mouth smiles and laughter [25,42,86,99–103] informs hypotheses regarding the role of animal homologies of the play face [86,90,91,100] and laugh-like utterances [92,93,99], organizing social functional accounts that aim to explain why these expressions often occur concurrently [86,104] and why their neural correlates overlap with those of play behaviors in humans and animals [92,103,105]. Similarly, that love is linked to both tactile and vocal signals [106–113] informs hypotheses regarding the animal homologies of filial touch [108,110,111,113,114] and nurturant prosody [109,112,115,116], and helps explain their overlapping endocrinological underpinnings, such as why both of these behaviors covary with the release of oxytocin [109,111,112]. Such findings support the role of evolutionary pressures in organizing the neurophysiological underpinnings of emotion-related behavior within a high-dimensional semantic space.

Outstanding Questions

How is emotional behavior universal, and how is it shaped by culture? Emotion recognition studies, constrained by language and the focus on the Basic 6, have led to divergent positions regarding the degree of cultural specificity or universality of emotion-related behavior. Studies of actual behavior are scant. However, newly available methods now enable broader observational studies of how people behave across cultures and naturalistic contexts, particularly with machine learning. By documenting the real-life situations in which people in different cultures produce the wide range of emotional behaviors revealed in the study of semantic spaces of emotion, can we determine what aspects of emotional behavior are universal and which are shaped by cultural processes?

What does nonverbal expression really indicate about emotional experience? Suppose a film arouses laughs, cries, screams, or grimaces. What, if anything, can we infer about how it makes people feel? Answers to this question are supplied by everyday intuition, but remain elusive to science. Owing to the narrow focus on the Basic 6 and the face, there is a widely recognized need for broader evidence that can capture the more complex ways in which people actually move their faces, bodies, and voices when they feel emotions. What will broader studies guided by semantic space approaches reveal about how people express subjective feeling in patterns of behavior?

What is the nature of the neurocomputational processes that generate richly varying emotional responses? Recent studies establish that the brain represents a wide range of specific emotional experiences (Figure 3). However, questions remain about the processing stages that give rise to these representations. How are perceptual inputs converted into high-level, system-wide representations of emotion? Which brain regions contribute to the conversion of sensory input into emotion-related appraisals and emotional experience, and which guide subsequent behavior?

Concluding Remarks

Guided by a semantic space approach to subjective life, and new computational and open-ended methods, this integration of studies across methods of emotion elicitation and response
modalities yields three important conclusions for the future study of emotion. We find that emotion
is high dimensional, involving upwards of 25 distinct kinds of emotions, each with its own patterned profile of associated responses. We reveal how specific categories of emotion, more so than valence and arousal, organize the representation of emotion in experience, expression,
and neural processing. Other findings suggest that boundaries between emotion categories are
not discrete, and that much of emotional response is systematically blended.
In more specific terms, across subjective experience, facial–bodily expression, vocal bursts,
prosody, and brain patterning, we find convergent evidence for a rich semantic space of emo-
tion. These results point to at least eight distinct, more negatively valenced states found across all
modalities (anger, anxiety, confusion, disgust, embarrassment, fear, pain, and sadness); nine
distinct positively valenced states (amusement, awe, contentment, desire, elation, interest,
love, relief, and triumph); and surprise. Further emotional experiences and expressions may be
modality specific, including pride and shame in the face/body [42,45] and the distinctive feeling
of dreaminess or reverie evoked by certain pieces of music [36].
The field of affective science has long been anchored to the study of six emotions [11]. In opening
up inquiry to a richer space of emotion, one gains purchase in answering old questions in the field,
as suggested in our syntheses of findings related to emotion-related neural response and
mammalian behavior (see Outstanding Questions). Such findings support the organization of
emotion-related physiological responses within a high-dimensional semantic space.
Supplemental Information
Supplemental information associated with this article can be found online at https://doi.org/10.1016/j.tics.2020.11.004.
References
1. Colibazzi, T. et al. (2010) Neural systems subserving valence and arousal during the experience of induced emotions. Emotion 10, 377–389
2. Russell, J.A. (2003) Core affect and the psychological construction of emotion. Psychol. Rev. 110, 145–172
3. Watson, D. and Tellegen, A. (1985) Toward a consensual structure of mood. Psychol. Bull. 98, 219–235
4. Elfenbein, H.A. and Ambady, N. (2002) On the universality and cultural specificity of emotion recognition: a meta-analysis. Psychol. Bull. 128, 203–235
5. Mollahosseini, A. et al. (2019) AffectNet: a database for facial expression, valence, and arousal computing in the wild. IEEE Trans. Affect. Comput. 10, 18–31
6. Barrett, L.F. et al. (2019) Emotional expressions reconsidered: challenges to inferring emotion from human facial movements. Psychol. Sci. Public Interest 20, 1–68
7. Lindquist, K.A. et al. (2012) The brain basis of emotion: a meta-analytic review. Behav. Brain Sci. 35, 121–143
8. Lench, H.C. et al. (2011) Discrete emotions predict changes in cognition, judgment, experience, behavior, and physiology: a meta-analysis of experimental emotion elicitations. Psychol. Bull. 137, 834–855
9. Scarantino, A. (2012) Functional specialization does not require a one-to-one mapping between brain regions and emotions. Behav. Brain Sci. 35, 161–162
10. Scarantino, A. (2012) How to define emotions scientifically. Emot. Rev. 4, 358–368
11. Cowen, A. et al. (2019) Mapping the passions: toward a high-dimensional taxonomy of emotional experience and expression. Psychol. Sci. Public Interest 20, 69–90
12. Barrett, L.F. (2017) Categories and their role in the science of emotion. Psychol. Inq. 28, 20–26
13. Keltner, D. et al. (2019) Emotional expression: advances in basic emotion theory. J. Nonverbal Behav. 43, 133–160
14. Fridlund, A.J. (2017) Evolution of facial musculature. In The Science of Facial Expression (Russell, J.A. and Fernandez-Dols, J.M., eds), pp. 77–92, Oxford Scholarship Online
15. Keltner, D. et al. (2019) What basic emotion theory really says for the twenty-first century study of emotion. J. Nonverbal Behav. 43, 195–201
16. Scarantino, A. (2014) Basic emotions, psychological construction and the problem of variability. In The Psychological Construction of Emotion (Barrett, L.F. and Russell, J.A., eds), pp. 334–376
17. Schirmer, A. and Adolphs, R. (2017) Emotion perception from face, voice, and touch: comparisons and convergence. Trends Cogn. Sci. 21, 216–228
18. James, W. (1884) II.—What is an emotion? Mind os-IX, 188–205
19. Keltner, D. and Lerner, J.S. (2010) Emotion. In Handbook of Social Psychology (Fiske, S.T. et al., eds), Wiley Online Library
20. Levenson, R.W. (1999) The intrapersonal functions of emotion. Cogn. Emot. 13, 481–504
21. Titze, I.R. and Martin, D.W. (1998) Principles of voice production. J. Acoust. Soc. Am. 104
22. Godinho, R.M. et al. (2018) Supraorbital morphology and social dynamics in human evolution. Nat. Ecol. Evol. 2, 956–961
23. Schmidt, K.L. and Cohn, J.F. (2001) Human facial expressions as adaptations: evolutionary questions in facial expression research. Am. J. Phys. Anthropol. 116, 3–24
24. Srinivasan, R. and Martinez, A.M. (2018) Cross-cultural and cultural-specific production and perception of facial expressions of emotion in the wild. IEEE Trans. Affect. Comput. Published online December 18, 2018. https://doi.org/10.1109/TAFFC.2018.2887267
25. Cowen, A.S. et al. (2018) Mapping 24 emotions conveyed by brief human vocalization. Am. Psychol. 22, 274–276
26. Cowen, A.S. and Keltner, D. (2017) Self-report captures 27 distinct categories of emotion bridged by continuous gradients. Proc. Natl. Acad. Sci. U. S. A. 114, E7900–E7909
27. Cowen, A.S. and Keltner, D. (2018) Clarifying the conceptualization, dimensionality, and structure of emotion: response to Barrett and colleagues. Trends Cogn. Sci. 22, 274–276
28. Cowen, A.S. et al. (2019) The primacy of categories in the recognition of 12 emotions in speech prosody across two cultures. Nat. Hum. Behav. 3, 369–382
29. Barrett, L.F. (2006) Are emotions natural kinds? Perspect. Psychol. Sci. 1, 28–58
30. Shaver, P. et al. (1987) Emotion knowledge: further exploration of a prototype approach. J. Pers. Soc. Psychol. 52, 1061–1086
31. Scherer, K.R. and Wallbott, H.G. (1994) Evidence for universality and cultural variation of differential emotion response patterning. J. Pers. Soc. Psychol. 66, 310–328
32. Barrett, L.F. (2006) Valence is a basic building block of emotional life. J. Res. Pers. 40, 35–55
33. Smith, C.A. and Ellsworth, P.C. (1985) Patterns of cognitive appraisal in emotion. J. Pers. Soc. Psychol. 48, 813–838
34. Cowen, A.S. et al. (2019) Mapping the passions: moving from impoverished models to a high dimensional taxonomy of emotion. Psychol. Sci. Public Interest 20, 69–90
35. Cowen, A.S. et al. Sixteen facial expressions occur in similar contexts worldwide. Nature (in press). https://dx.doi.org/10.1038/s41586-020-3037-7
36. Cowen, A.S. et al. (2020) What music makes us feel: uncovering 13 kinds of emotion evoked by music across cultures. Proc. Natl. Acad. Sci. U. S. A. 117, 1924–1934
37. Demszky, D. et al. (2020) GoEmotions: a dataset of fine-grained emotions. arXiv Published online June 3, 2020. http://arxiv.org/abs/2005.00547
38. Scherer, K.R. and Ellgring, H. (2007) Multimodal expression of emotion: affect programs or componential appraisal patterns? Emotion 7, 158–171
39. Ellsworth, P.C. (2013) Appraisal theory: old and new questions. Emot. Rev. 5, 125–131
40. Konishi, S. and Kitagawa, G. (2008) Information Criteria and Statistical Modeling, Springer
41. Williamson, S.J. and Cummins, H.Z. (1983) Light and Color in Nature and Art, Wiley and Sons
42. Cowen, A.S. and Keltner, D. (2019) What the face displays: mapping 28 emotions conveyed by naturalistic expression. Am. Psychol. 75, 349–364
43. Oyama, T. et al. (1962) Affective dimensions of colors. Jpn. Psychol. Res. 4, 78–91
44. Scherer, K.R. (1994) Toward a concept of "modal emotions". In The Nature of Emotion: Fundamental Questions (Fox, A.S. et al., eds), pp. 25–31, Oxford University Press
45. Tracy, J.L. and Matsumoto, D. (2008) The spontaneous expression of pride and shame: evidence for biologically innate nonverbal displays. Proc. Natl. Acad. Sci. U. S. A. 105, 11655–11660
46. Tracy, J.L. and Robins, R.W. (2008) The nonverbal expression of pride: evidence for cross-cultural recognition. J. Pers. Soc. Psychol. 94, 516–530
47. Hejmadi, A. et al. (2000) Exploring Hindu Indian emotion expressions: evidence for accurate recognition by Americans and Indians. Psychol. Sci. 11, 183–186
48. Hertenstein, M.J. et al. (2006) Touch communicates distinct emotions. Emotion 6, 528–533
49. Cordaro, D.T. et al. (2020) The recognition of 18 facial-bodily expressions across nine cultures. Emotion 20, 1292–1300
50. Cordaro, D.T. et al. (2018) Universals and cultural variations in 22 emotional expressions across five cultures. Emotion 18, 75–93
51. Cordaro, D.T. et al. (2016) The voice conveys emotion in ten globalized cultures and one remote village in Bhutan. Emotion 16, 117–128
52. Laukka, P. et al. (2013) Cross-cultural decoding of positive and negative non-linguistic emotion vocalizations. Front. Psychol. 4, 353
53. Matsumoto, D. and Hwang, H.S. (2010) Culture, emotion, and expression. In Cross-Cultural Psychology: Contemporary Themes and Perspectives (Keith, K.D., ed.), Wiley–Blackwell
54. Scherer, K.R. et al. (2001) Emotion inferences from vocal expression correlate across languages and cultures. J. Cross-Cult. Psychol. 32, 76–92
55. van Hemert, D.A. et al. (2007) Emotion and culture: a meta-analysis. Cogn. Emot. 21, 913–943
56. Elfenbein, H.A. (2013) Nonverbal dialects and accents in facial expressions of emotion. Emot. Rev. 5, 90–96
57. Laukka, P. et al. (2014) Evidence for cultural dialects in vocal emotion expression: acoustic classification within and across five nations. Emotion 14, 445–449
58. Cowen, A.S. and Keltner, D. (2020) Universal emotional expressions uncovered in art of the ancient Americas: a computational approach. Sci. Adv. 6, eabb1005
59. Durán, J.I. et al. (2017) Coherence between emotions and facial expressions. In The Science of Facial Expression (Fernandez-Dols, J.-M. and Russell, J.A., eds), pp. 107–129, Oxford University Press
60. Skerry, A.E. and Saxe, R. (2015) Neural representations of emotion are organized around abstract event features. Curr. Biol. 25, 1945–1954
61. Koide-Majima, N. et al. (2018) Distinct dimensions of emotion in the human brain and their representation on the cortical surface. NeuroImage Published online November 15, 2020. https://doi.org/10.1016/j.neuroimage.2020.117258
62. Saarimäki, H. et al. (2018) Distributed affective space represents multiple emotion categories across the human brain. Soc. Cogn. Affect. Neurosci. 13, 471–482
63. Huth, A.G. et al. (2012) A continuous semantic space describes the representation of thousands of object and action categories across the human brain. Neuron 76, 1210–1224
64. Kragel, P.A. et al. (2019) Emotion schemas are embedded in the human visual system. Sci. Adv. 5, eaaw4358
65. Shenhav, A. and Mendes, W.B. (2014) Aiming for the stomach and hitting the heart: dissociable triggers and sources for disgust reactions. Emotion 14, 301–309
66. Horikawa, T. et al. (2019) The neural representation of emotion is high-dimensional, categorical, and distributed across transmodal brain regions. iScience Published online May 22, 2020. https://doi.org/10.1016/j.isci.2020.101060
67. Singer, T. and Lamm, C. (2009) The social neuroscience of empathy. Ann. N. Y. Acad. Sci. 1156, 81–96
68. Carrillo, M. et al. (2019) Emotional mirror neurons in the rat's anterior cingulate cortex. Curr. Biol. 29, 1301–1312.e6
69. Caruana, F. et al. (2011) Emotional and social behaviors elicited by electrical stimulation of the insula in the macaque monkey. Curr. Biol. 21, 195–199
70. Wicker, B. et al. (2003) Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron 40, 655–664
71. Calder, A.J. et al. (2016) Impaired recognition and experience of disgust following brain injury. In Facial Expression Recognition: Selected Works of Andy Young, pp. 195–198
72. Kreibig, S.D. (2010) Autonomic nervous system activity in emotion: a review. Biol. Psychol. 84, 394–421
73. Mesgarani, N. et al. (2014) Phonetic feature encoding in human superior temporal gyrus. Science 343, 1006–1010
74. Huth, A.G. et al. (2016) Natural speech reveals the semantic maps that tile human cerebral cortex. Nature 532, 453–458
75. Ito, T. et al. (2020) Discovering the computational relevance of brain network organization. Trends Cogn. Sci. 24, 25–38
76. Nishimoto, S. et al. (2011) Reconstructing visual experiences from brain activity evoked by natural movies. Curr. Biol. 21, 1641–1646
77. Barrett, L.F. (2017) The theory of constructed emotion: an active inference account of interoception and categorization. Soc. Cogn. Affect. Neurosci. 12, 1–23
78. Satpute, A.B. and Lindquist, K.A. (2019) The default mode network's role in discrete emotion. Trends Cogn. Sci. 23, 851–864
79. Margulies, D.S. et al. (2016) Situating the default-mode network along a principal gradient of macroscale cortical organization. Proc. Natl. Acad. Sci. U. S. A. 113, 12574–12579
80. Cowen, A.S. (2019) Neurobiological explanation for diverse responses associated with a single emotion. Sci. eLetter
81. Seo, C. et al. (2019) Intense threat switches dorsal raphe serotonin neurons to a paradoxical operational mode. Science 363, 539–542
82. Kataoka, N. et al. (2020) A central master driver of psychosocial stress responses in the rat. Science 367, 1105–1112
83. Bargh, J.A. (1994) The four horsemen of automaticity: awareness, efficiency, intention, and control in social cognition. In Handbook of Social Cognition: Basic Processes; Applications (Wyer, R.S. Jr. and Srull, T.K., eds), pp. 1–40, Erlbaum
84. Ochsner, K.N. and Gross, J.J. (2005) The cognitive control of emotion. Trends Cogn. Sci. 9, 242–249
85. de Waal, F. (2019) Mama's Last Hug: Animal Emotions and What They Tell Us About Ourselves, W. W. Norton
86. Parr, L.A. et al. (2005) Influence of social context on the use of blended and graded facial displays in chimpanzees. Int. J. Primatol. 26, 73–103
87. Burkett, J.P. et al. (2016) Oxytocin-dependent consolation behavior in rodents. Science 351, 375–378
88. Webb, C.E. et al. (2017) Long-term consistency in chimpanzee consolation behaviour reflects empathetic personalities. Nat. Commun. 8, 292
89. Romero, T. et al. (2010) Consolation as possible expression of sympathetic concern among chimpanzees. Proc. Natl. Acad. Sci. U. S. A. 107, 12110–12115
90. Llamazares-Martín, C. et al. (2017) Relaxed open mouth reciprocity favours playful contacts in South American sea lions (Otaria flavescens). Behav. Process. 140, 87–95
91. Waller, B.M. and Cherry, L. (2012) Facilitating play through communication: significance of teeth exposure in the gorilla play face. Am. J. Primatol. 74, 157–164
92. Ishiyama, S. and Brecht, M. (2016) Neural correlates of ticklishness in the rat somatosensory cortex. Science 354, 757–760
93. Davila Ross, M. et al. (2009) Reconstructing the evolution of laughter in great apes and humans. Curr. Biol. 19, 1106–1111
94. Coleman, K. and Pierre, P.J. (2014) Assessing anxiety in nonhuman primates. ILAR J. 55, 333–346
95. Schino, G. et al. (1991) Measuring anxiety in nonhuman primates: effect of lorazepam on macaque scratching. Pharmacol. Biochem. Behav. 38, 889–891
96. Schino, G. et al. (1996) Primate displacement activities as an ethopharmacological model of anxiety. Anxiety 2, 186–191
97. Latzman, R.D. et al. (2016) Displacement behaviors in chimpanzees (Pan troglodytes): a neurogenomics investigation of the RDoC Negative Valence Systems domain. Psychophysiology 53, 355–363
98. Fraser, O.N. et al. (2008) Stress reduction through consolation in chimpanzees. Proc. Natl. Acad. Sci. U. S. A. 105, 8557–8562
99. Davila-Ross, M. et al. (2011) Aping expressions? Chimpanzees produce distinct laugh types when responding to laughter of others. Emotion 11, 1013–1020
100. Palagi, E. et al. (2016) Rough-and-tumble play as a window on animal communication. Biol. Rev. 91, 311–327
101. Martin, J. et al. (2017) Smiles as multipurpose social signals. Trends Cogn. Sci. 21, 864–877
102. Anikin, A. and Lima, C.F. (2018) Perceptual and acoustic differences between authentic and acted nonverbal emotional vocalizations. Q. J. Exp. Psychol. 71, 622–641
103. Caruana, F. et al. (2015) Mirth and laughter elicited by electrical stimulation of the human anterior cingulate cortex. Cortex 71, 323–331
104. Crockford, C. and Boesch, C. (2005) Call combinations in wild chimpanzees. Behaviour 142, 397–421
105. Yamao, Y. et al. (2015) Neural correlates of mirth and laughter: a direct electrical cortical stimulation study. Cortex 66, 134–140
106. Hertenstein, M.J. et al. (2009) The communication of emotion via touch. Emotion 9, 566–573
107. Gonzaga, G.C. et al. (2001) Love and the commitment problem in romantic relations and friendship. J. Pers. Soc. Psychol. 81, 247–262
108. Dunbar, R.I.M. (2010) The social role of touch in humans and primates: behavioural function and neurobiological mechanisms. Neurosci. Biobehav. Rev. 34, 260–268
109. Feldman, R. et al. (2007) Evidence for a neuroendocrinological foundation of human affiliation: plasma oxytocin levels across pregnancy and the postpartum period predict mother-infant bonding. Psychol. Sci. 18, 965–970
110. Feldman, R. et al. (2010) Natural variations in maternal and paternal care are associated with systematic changes in oxytocin following parent-infant contact. Psychoneuroendocrinology 35, 1133–1141
111. Kojima, S. et al. (2012) Maternal contact differentially modulates central and peripheral oxytocin in rat pups during a brief regime of mother-pup interaction that induces a filial huddling preference. J. Neuroendocrinol. 24, 831–840
112. Seltzer, L.J. et al. (2010) Social vocalizations can release oxytocin in humans. Proc. R. Soc. B Biol. Sci. 277, 2661–2666
113. Wilson, S.P. (2017) Modelling the emergence of rodent filial huddling from physiological huddling. R. Soc. Open Sci. Published online November 22, 2017. https://doi.org/10.1098/rsos.170885
114. Snowdon, C.T. et al. (2010) Variation in oxytocin is related to variation in affiliative behavior in monogamous, pairbonded tamarins. Horm. Behav. 58, 614–618
115. Broesch, T.L. and Bryant, G.A. (2015) Prosody in infant-directed speech is similar across western and traditional cultures. J. Cogn. Dev. 16, 31–43
116. Bryant, G.A. and Barrett, H.C. (2007) Recognizing intentions in infant-directed speech: evidence for universals. Psychol. Sci. 18, 746–751
117. Keltner, D. and Haidt, J. (1999) Social functions of emotions at four levels of analysis. Cogn. Emot. 13, 505–521
118. Moors, A. et al. (2013) Appraisal theories of emotion: state of the art and future development. Emot. Rev. 5, 119–124
119. Gentsch, K. et al. (2015) Appraisals generate specific configurations of facial muscle movements in a gambling task: evidence for the component process model of emotion. PLoS One 10
120. Ekman, P. (1992) An argument for basic emotions. Cogn. Emot. 6, 169–200
121. Roseman, I.J. (1991) Appraisal determinants of discrete emotions. Cogn. Emot. 5, 161–200
122. Tsai, C.G. et al. (2010) Aggressiveness of the growl-like timbre: acoustic characteristics, musical implications, and biomechanical mechanisms. Music. Percept. 27, 209–221
123. Faragó, T. et al. (2010) "The bone is mine": affective and referential aspects of dog growls. Anim. Behav. 79, 917–925
124. Lin, D. et al. (2011) Functional identification of an aggression locus in the mouse hypothalamus. Nature 470, 221–227
125. Falkner, A.L. et al. (2016) Hypothalamic control of male aggression-seeking behavior. Nat. Neurosci. 19, 596–604
126. Todd, W.D. et al. (2018) A hypothalamic circuit for the circadian control of aggression. Nat. Neurosci. 21, 717–724
127. Ueno, A. et al. (2004) Facial responses to four basic tastes in newborn rhesus macaques (Macaca mulatta) and chimpanzees (Pan troglodytes). Behav. Brain Res. 154, 261–271
128. Steiner, J.E. et al. (2001) Comparative expression of hedonic impact: affective reactions to taste by human infants and other primates. Neurosci. Biobehav. Rev. 25, 53–74
129. Dolensek, N. et al. (2020) Facial expressions of emotion states and their neuronal correlates in mice. Science 368, 89–94
130. Schaller, M. et al. (2010) Mere visual perception of other people's disease symptoms facilitates a more aggressive immune response. Psychol. Sci. 21, 649–652
131. Fallow, P.M. et al. (2011) Sound familiar? Acoustic similarity provokes responses to unfamiliar heterospecific alarm calls. Behav. Ecol. 22, 401–410
132. Zuberbühler, K. (2009) Survivor signals: the biology and psychology of animal alarm calling. Adv. Study Behav. 40, 277–322
133. Arnal, L.H. et al. (2015) Human screams occupy a privileged niche in the communication soundscape. Curr. Biol. 25, 2051–2056
134. Méndez-Bértolo, C. et al. (2016) A fast pathway for fear in human amygdala. Nat. Neurosci. 19, 1041–1049
135. Kuraoka, K. and Nakamura, K. (2007) Responses of single neurons in monkey amygdala to facial and vocal emotions. J. Neurophysiol. 97, 1379–1387
136. Leknes, S. et al. (2013) Oxytocin enhances pupil dilation and sensitivity to "hidden" emotional expressions. Soc. Cogn. Affect. Neurosci. 8, 741–749
137. Kret, M.E. and De Dreu, C.K.W. (2017) Pupil-mimicry conditions trust in partners: moderation by oxytocin and group membership. Proc. R. Soc. B Biol. Sci. 284, 20162554
138. Langford, D.J. et al. (2010) Coding of facial expressions of pain in the laboratory mouse. Nat. Methods 7, 447–449
139. Descovich, K.A. et al. (2017) Facial expression: an underutilized tool for the assessment of welfare in mammals. Altex 34, 409–429
140. Zaki, J. et al. (2016) The anatomy of suffering: understanding the relationship between nociceptive and empathic pain. Trends Cogn. Sci. 20, 249–259
141. Weisfeld, G.E. and Beresford, J.M. (1982) Erectness of posture as an indicator of dominance or success in humans. Motiv. Emot. 6, 113–131
142. Shimizu, D. (2015) Skeletal and dental morphology. In Mahale Chimpanzees: 50 Years of Research (Nakamura, M. et al., eds), pp. 612–624, Cambridge University Press
143. Snyder, D.S. et al. (1984) Peer separation in infant chimpanzees, a pilot study. Primates 25, 78–88
144. Parsons, C.E. et al. (2014) Ready for action: a role for the human midbrain in responding to infant vocalizations. Soc. Cogn. Affect. Neurosci. 9, 977–984
145. Witteman, J. et al. (2019) Towards a neural model of infant cry perception. Neurosci. Biobehav. Rev. 99, 23–32
146. Weisfeld, G.E. and Dillon, L.M. (2012) Applying the dominance hierarchy model to pride and shame, and related behaviors. J. Evol. Psychol. 10, 15–41
147. Issa, F.A. and Edwards, D.H. (2006) Ritualized submission and the reduction of aggression in an invertebrate. Curr. Biol. 16, 2217–2221
148. van Hooff, J.A.R.A.M. (1970) A component analysis of the structure of the social behaviour of a semi-captive chimpanzee group. Experientia 26, 549–550
149. Lindegaard, M.R. et al. (2017) Consolation in the aftermath of robberies resembles post-aggression consolation in chimpanzees. PLoS One 12, e0177725
150. Lingle, S. and Riede, T. (2014) Deer mothers are sensitive to infant distress vocalizations of diverse mammalian species. Am. Nat. 184, 510–522
151. Gluth, S. and Fontanesi, L. (2016) Wiring the altruistic brain. Science 351, 1028–1029
152. FeldmanHall, O. et al. (2015) Empathic concern drives costly altruism. Neuroimage 105, 347–356
153. Zahn, R. et al. (2009) Subgenual cingulate activity reflects individual differences in empathic concern. Neurosci. Lett. 457, 107–110
154. Glasser, M.F. et al. (2016) A multi-modal parcellation of human cerebral cortex. Nature 536, 171–178