<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2012.00228</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Task-dependent neural bases of perceiving emotionally expressive targets</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Zaki</surname> <given-names>Jamil</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Weber</surname> <given-names>Jochen</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Ochsner</surname> <given-names>Kevin</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, Harvard University</institution> <country>Cambridge, MA, USA</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Psychology, Columbia University</institution> <country>New York, NY, USA</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Leonhard Schilbach, Max-Planck-Institute for Neurological Research, Germany</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Bhismadev Chakrabarti, University of Reading, UK; Bojana Kuzmanovic, Research Center Juelich, Germany; Susanne Quadflieg, New York University Abu Dhabi, United Arab Emirates</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Jamil Zaki, Department of Psychology, Harvard University, Cambridge, MA, USA. e-mail: <email>zaki&#x00040;wjh.harvard.edu</email></p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>02</day>
<month>08</month>
<year>2012</year>
</pub-date>
<pub-date pub-type="collection">
<year>2012</year>
</pub-date>
<volume>6</volume>
<elocation-id>228</elocation-id>
<history>
<date date-type="received">
<day>07</day>
<month>03</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>17</day>
<month>07</month>
<year>2012</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2012 Zaki, Weber and Ochsner.</copyright-statement>
<copyright-year>2012</copyright-year>
<license license-type="open-access" xlink:href="http://www.frontiersin.org/licenseagreement"><p>This is an open-access article distributed under the terms of the <uri xlink:href="http://creativecommons.org/licenses/by/3.0/">Creative Commons Attribution License</uri>, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.</p>
</license>
</permissions>
<abstract><p>Social cognition is fundamentally interpersonal: individuals&#x00027; behavior and dispositions critically affect their interaction partners&#x00027; information processing. However, cognitive neuroscience studies, partially because of methodological constraints, have remained largely &#x0201C;perceiver-centric&#x0201D;: focusing on the abilities, motivations, and goals of social perceivers while largely ignoring interpersonal effects. Here, we address this knowledge gap by examining the neural bases of perceiving emotionally expressive and inexpressive social &#x0201C;targets.&#x0201D; Sixteen perceivers were scanned using fMRI while they watched targets discussing emotional autobiographical events. Perceivers continuously rated each target&#x00027;s emotional state or eye-gaze direction. The effects of targets&#x00027; emotional expressivity on perceivers&#x00027; brain activity depended on task set: when perceivers explicitly attended to targets&#x00027; emotions, expressivity predicted activity in neural structures&#x02014;including medial prefrontal and posterior cingulate cortex&#x02014;associated with drawing inferences about mental states. When perceivers instead attended to targets&#x00027; eye-gaze, target expressivity predicted activity in regions&#x02014;including somatosensory cortex, fusiform gyrus, and motor cortex&#x02014;associated with monitoring sensorimotor states and biological motion. These findings suggest that expressive targets affect information processing in a manner that depends on perceivers&#x00027; goals. More broadly, these data provide an early step toward understanding the neural bases of interpersonal social cognition.</p></abstract>
<kwd-group>
<kwd>emotional expressivity</kwd>
<kwd>empathy</kwd>
<kwd>fMRI</kwd>
<kwd>medial prefrontal cortex</kwd>
<kwd>social cognition</kwd>
</kwd-group>
<counts>
<fig-count count="2"/>
<table-count count="3"/>
<equation-count count="0"/>
<ref-count count="66"/>
<page-count count="11"/>
<word-count count="7853"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction" id="s1">
<title>Introduction</title>
<p>Social life requires constant attention to and understanding of others&#x00027; thoughts and feelings; as such, it is unsurprising that research has increasingly focused on the neural bases of these abilities (Decety, <xref ref-type="bibr" rid="B7">2011</xref>; Zaki and Ochsner, <xref ref-type="bibr" rid="B63">2012</xref>). The vast majority of this work has centered around the cognitive and neural processes engaged by perceivers (individuals focusing on another person&#x00027;s internal states) when they encounter social targets (individuals who are the focus of perceivers&#x00027; attention). However, social cognition is fundamentally interpersonal, and social cognitive outcomes (such as interpersonal accuracy and rapport) depend just as deeply on targets&#x00027; behaviors as they do on perceivers&#x00027; skills and motives (Zaki and Ochsner, <xref ref-type="bibr" rid="B62">2011</xref>).</p>
<p>For example, targets vary in their levels of emotional expressivity (i.e., the extent to which their behavior reflects their internal states). Expressivity can be measured either as a trait (e.g., through self-report questionnaires; see Gross and John, <xref ref-type="bibr" rid="B16">1997</xref>) or as a state (e.g., by coding single episodes of behaviors such as emotional facial expressions; see Gross and Levenson, <xref ref-type="bibr" rid="B18">1993</xref>). Trait and state measures of expressivity are moderately correlated, such that individuals who report themselves to be expressive also produce more clear and intense non-verbal emotional cues in experimental contexts (Gross and John, <xref ref-type="bibr" rid="B16">1997</xref>; Gross et al., <xref ref-type="bibr" rid="B17">2000</xref>; Zaki et al., <xref ref-type="bibr" rid="B60">2009</xref>). Perhaps more importantly, expressivity measured as either a trait or a state predicts social outcomes. For example, targets high in trait expressivity are interpersonally &#x0201C;readable,&#x0201D; in that perceivers can accurately assess those targets&#x00027; internal states (Snodgrass et al., <xref ref-type="bibr" rid="B45">1998</xref>; Zaki et al., <xref ref-type="bibr" rid="B59">2008</xref>; Zaki and Ochsner, <xref ref-type="bibr" rid="B62">2011</xref>). State expressivity similarly predicts interpersonal accuracy (Zaki et al., <xref ref-type="bibr" rid="B60">2009</xref>) and rapport (Butler et al., <xref ref-type="bibr" rid="B4">2003</xref>).</p>
<p>How do targets&#x00027; expressive traits and states exert their effects on interpersonal outcomes? Intuitively, we might expect that target attributes &#x0201C;get into the heads&#x0201D; of perceivers and affect their processing of social information. However, such an effect could reflect multiple mechanisms, because perceivers&#x00027; responses to social cues depend heavily on the goals and cognitive resources they have on hand.</p>
<p>When given unconstrained cognitive resources (Gilbert et al., <xref ref-type="bibr" rid="B13">1989</xref>; Epley and Waytz, <xref ref-type="bibr" rid="B10">2009</xref>) and motivation to understand targets (Kunda, <xref ref-type="bibr" rid="B21">1990</xref>), perceivers tend to draw explicit inferences about internal states based on targets&#x00027; behavior and the context in which that behavior is embedded. Such &#x0201C;top down&#x0201D; social information processing is reliably accompanied by activity in a system of brain regions including the medial prefrontal cortex (MPFC), posterior cingulate cortex (PCC), precuneus, and temporoparietal junction (Fletcher et al., <xref ref-type="bibr" rid="B11">1995</xref>; Gallagher et al., <xref ref-type="bibr" rid="B12">2000</xref>; Mitchell et al., <xref ref-type="bibr" rid="B28">2002</xref>; Ochsner et al., <xref ref-type="bibr" rid="B30">2004</xref>; Saxe and Powell, <xref ref-type="bibr" rid="B37">2006</xref>). Critically, inferential processing in this system is dependent on attention to targets&#x00027; states (de Lange et al., <xref ref-type="bibr" rid="B8">2008</xref>; Spunt et al., <xref ref-type="bibr" rid="B47">2010</xref>; Spunt and Lieberman, <xref ref-type="bibr" rid="B50">in press</xref>).</p>
<p>However, perceivers do not always devote their full attention to understanding targets&#x00027; thoughts and feelings; they are often distracted, otherwise occupied, or unmotivated to do so. Although this prevents perceivers from engaging in &#x0201C;top down&#x0201D; inferences, it nonetheless leaves room for a number of &#x0201C;bottom up&#x0201D; information processing mechanisms that draw on a system of brain regions almost wholly distinct from those accompanying explicit social inference (Whalen et al., <xref ref-type="bibr" rid="B55">1998</xref>). For example, perceivers detect faces in their environment&#x02014;a process drawing on the fusiform face area (FFA; see Kanwisher et al., <xref ref-type="bibr" rid="B19">1997</xref>)&#x02014;and vicariously share social targets&#x00027; sensorimotor or visceral states&#x02014;a process drawing on motor and somatosensory cortex (Rizzolatti and Craighero, <xref ref-type="bibr" rid="B34">2004</xref>; Keysers et al., <xref ref-type="bibr" rid="B20">2010</xref>)&#x02014;even in the absence of explicit attention to targets&#x00027; states (Vuilleumier et al., <xref ref-type="bibr" rid="B53">2001</xref>; Winston et al., <xref ref-type="bibr" rid="B58">2003</xref>; Chong et al., <xref ref-type="bibr" rid="B6">2008</xref>; Spunt and Lieberman, <xref ref-type="bibr" rid="B50">in press</xref>).</p>
<p>Differences between the characteristics and neural underpinnings of top down and bottom up social processing suggest that target expressivity might affect perceivers&#x00027; information processing, but in a manner that critically depends on task set. Specifically, when perceivers are directly attending to targets&#x00027; internal states (e.g., emotions), expressive targets might provide a stronger &#x0201C;signal&#x0201D; on which to base top down social inferences, and increase perceivers&#x00027; brain activity in regions associated with such inferences. By contrast, when perceivers are not explicitly attending to targets&#x00027; states, expressive targets could nonetheless produce more salient social cues (e.g., more intense emotional facial expressions), which perceivers could evaluate using bottom up processes instantiated in a separate set of neural structures associated with perceiving faces or sensorimotor states.</p>
<p>The current study sought to test these possibilities. We presented perceivers with videos of social targets who varied in their levels of emotional expressivity, as assessed both through trait measures and through state ratings of their expressivity on a video-by-video basis. As such, trait and state expressivity provided &#x0201C;naturalistic&#x0201D; variance in the intensity of social cues produced spontaneously by social targets experiencing real emotions, as opposed to pictures of posed expressions whose intensity is manipulated by experimenters (Zaki and Ochsner, <xref ref-type="bibr" rid="B61">2009</xref>). Perceivers viewed these targets in one of two conditions: (1) while explicitly attending to targets&#x00027; emotions, and (2) while attending to eye-gaze, a lower-level feature of target behavior that is uncorrelated with the affect experienced or expressed by targets. This allowed us to directly test the prediction that target expressivity would modulate perceiver brain activity in a task-dependent manner.</p>
<p>More broadly, this study took an explicitly interpersonal tack toward the neural bases of social cognition. In part because of the highly intrapersonal nature of scanner environments, extant neuroimaging research has been almost entirely &#x0201C;perceiver-centric&#x0201D;: focusing on perceivers&#x00027; skills, task sets, and motivations as determinants of judgment and predictors of neural activity. However, both intuition and behavioral research clearly support a more nuanced view of social information processing, in which perceivers&#x00027; abilities and motivations interact with targets&#x00027; behaviors and dispositions to produce interpersonal outcomes (Zayas et al., <xref ref-type="bibr" rid="B65">2002</xref>; Zaki et al., <xref ref-type="bibr" rid="B59">2008</xref>; Zaki and Ochsner, <xref ref-type="bibr" rid="B62">2011</xref>). By directly examining such interactions at the level of the brain, the current study sought to provide early steps toward more deeply characterizing these &#x0201C;interactionist&#x0201D; (Mischel and Shoda, <xref ref-type="bibr" rid="B26">1995</xref>) features of social cognition.</p>
</sec>
<sec sec-type="methods" id="s2">
<title>Methods</title>
<sec>
<title>Stimuli</title>
<p>More detailed descriptions of the methods used here are available elsewhere (Zaki et al., <xref ref-type="bibr" rid="B59">2008</xref>, <xref ref-type="bibr" rid="B60">2009</xref>). In a stimulus collection phase of the study, targets (<italic>N</italic> &#x0003D; 14, 7 female, mean age &#x0003D; 26.5) were videotaped while talking about affective autobiographical memories (e.g., proposing marriage or the death of a loved one). Eighteen videos from 11 social targets were chosen for the final stimulus set, on the basis of their self-rated emotional intensity, and in order to balance the number of videos of each valence and target gender. The mean video length was 125 s (range: 72&#x02013;177 s).</p>
<p>We examined target expressivity in two ways. First, trait expressivity was assessed through targets&#x00027; responses to the Berkeley Expressivity Questionnaire (BEQ; see Gross and John, <xref ref-type="bibr" rid="B16">1997</xref>; Gross et al., <xref ref-type="bibr" rid="B17">2000</xref>). This measure captures targets&#x00027; self-concept of how expressive they are (sample item: &#x0201C;when I&#x00027;m happy, my feelings show&#x0201D;), and produced significant variance in our sample (mean BEQ score &#x0003D; 4.90, range &#x0003D; 3.69&#x02013;6.47, SD &#x0003D; 1.02). Second, to code &#x0201C;state&#x0201D; expressivity in each video, we used a behavioral coding system developed by Gross and Levenson (<xref ref-type="bibr" rid="B18">1993</xref>), which uses rules developed by Ekman and Friesen (<xref ref-type="bibr" rid="B9">1975/2003</xref>) to assess facial signs of emotion. We focused on the coding system&#x00027;s &#x0201C;affective intensity&#x0201D; category because it provides a single global measure of the strength of targets&#x00027; non-verbal emotional displays (see Zaki et al., <xref ref-type="bibr" rid="B60">2009</xref> for more details). Two independent coders trained in the use of this system rated the average emotional intensity of each video, producing reliable ratings (Cronbach&#x00027;s alpha: 0.85; mean intensity score &#x0003D; 2.21, range &#x0003D; 1.17&#x02013;4.02, SD &#x0003D; 0.61). As discussed elsewhere (Zaki et al., <xref ref-type="bibr" rid="B60">2009</xref>) and found by others (Gross and John, <xref ref-type="bibr" rid="B16">1997</xref>), targets&#x00027; self-perceived trait expressivity as measured by the BEQ was correlated with the intensity of their non-verbal expressive behavior on a video-by-video basis, as assessed by independent raters (<italic>r</italic> &#x0003D; 0.28, <italic>p</italic> &#x0003C; 0.005).</p>
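The reliability and correlation statistics above are straightforward to compute. The following is a minimal sketch (not the authors' code) using hypothetical coder ratings; `cronbach_alpha` is an illustrative helper, and with only two raters the same formula reduces to the Spearman-Brown-adjusted inter-rater correlation:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_videos x n_coders) rating matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                         # number of coders
    item_vars = ratings.var(axis=0, ddof=1)      # each coder's variance
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1.0) * (1.0 - item_vars.sum() / total_var)

# Hypothetical affective-intensity ratings from two coders for six videos
coder_a = [1.5, 2.0, 3.5, 2.5, 4.0, 1.0]
coder_b = [1.7, 2.2, 3.9, 2.4, 3.6, 1.2]
alpha = cronbach_alpha(np.column_stack([coder_a, coder_b]))

# A Pearson correlation of the same form would relate trait BEQ scores
# to per-video intensity; shown here on the hypothetical coder data
r = np.corrcoef(coder_a, coder_b)[0, 1]
```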
</sec>
<sec>
<title>Protocol</title>
<p>Perceivers (<italic>n</italic> &#x0003D; 16, 11 female, mean age &#x0003D; 19.10, SD &#x0003D; 1.72) were scanned using fMRI while they watched all 18 target videos. While watching six of these videos, perceivers continuously inferred how positive or negative they believed targets felt at each moment; this will be referred to as the <italic>emotion rating</italic> condition. Under this condition, videos appeared in the center of a black screen; a cue orienting perceivers toward their task (e.g., &#x0201C;how good or bad was this person feeling?&#x0201D;) was presented above the video, and a nine-point rating scale (anchored at 1 &#x0003D; &#x0201C;very negative&#x0201D; and 9 &#x0003D; &#x0201C;very positive&#x0201D;) was presented below the video. Perceivers were instructed to change their rating whenever they believed the target&#x00027;s emotional state changed in a perceptible way. At the beginning of each video, the number 5 was presented in bold. Whenever perceivers pressed the left arrow key, the bolded number shifted to the left (i.e., 5 was unbolded and 4 was bolded). When perceivers pressed the right arrow key, the bolded number shifted to the right. In this way, perceivers could monitor their ratings in the scanner.</p>
<p>While watching six other videos, perceivers were instructed to continuously rate how far to the left or right the targets&#x00027; eye-gaze was directed; this will be referred to as the <italic>eye-gaze rating</italic> condition. The protocol for this condition was identical to the emotion rating condition, except that the task cue (&#x0201C;where is this person&#x00027;s eye gaze directed?&#x0201D;) and Likert scale (1 &#x0003D; &#x0201C;far left,&#x0201D; 9 &#x0003D; &#x0201C;far right&#x0201D;) oriented perceivers toward the target&#x00027;s eye gaze. This task allowed us to examine brain activity evoked by perceivers&#x00027; attending to targets, but not explicitly focusing on targets&#x00027; internal states<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref>.</p>
<p>Perceivers viewed videos under emotion rating and eye-gaze rating in a pseudorandomized order, designed to ensure that (1) equal numbers of positive and negative videos were viewed by each perceiver under the eye-gaze and emotion rating conditions, (2) equal numbers of videos featuring male and female targets were viewed by each perceiver under the eye-gaze and emotion rating conditions, (3) no more than two consecutive videos were viewed under the same task (eye-gaze or emotion rating), and (4) a roughly equal number of perceivers viewed each video under each task condition (e.g., a given video would be viewed by eight perceivers under the eye-gaze condition, and by eight perceivers under the emotion rating condition). Finally, six additional videos were viewed under another condition not discussed here (see Zaki et al., <xref ref-type="bibr" rid="B60a">2012</xref>, for details about this condition).</p>
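Ordering constraints of this kind are commonly satisfied by rejection sampling: shuffle until a candidate order passes all checks. The sketch below is illustrative only (not the study's actual randomization script) and implements just constraint (3) for the two tasks discussed here; the real design additionally balanced valence, target gender, and a third condition:

```python
import random

def order_ok(tasks, max_run=2):
    """True if no more than max_run consecutive videos share a task."""
    run = 1
    for prev, cur in zip(tasks, tasks[1:]):
        run = run + 1 if cur == prev else 1
        if run > max_run:
            return False
    return True

def pseudorandom_order(n_per_task=6, seed=0):
    """Rejection-sample a task order satisfying the run-length constraint."""
    rng = random.Random(seed)
    tasks = ["emotion"] * n_per_task + ["eye-gaze"] * n_per_task
    while not order_ok(tasks):   # keep shuffling until constraint (3) holds
        rng.shuffle(tasks)
    return tasks
```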
</sec>
<sec>
<title>Imaging data acquisition</title>
<p>Images were acquired using a 1.5 Tesla GE Twin Speed MRI scanner equipped to acquire gradient-echo, echoplanar T2<sup>&#x0002A;</sup>-weighted images (EPI) with blood oxygenation level dependent (BOLD) contrast. Each volume comprised 26 axial slices of 4.5 mm thickness and a 3.5 &#x000D7; 3.5 mm in-plane resolution, aligned along the AC-PC axis. Volumes were acquired continuously every 2 s. Three functional runs were acquired from each subject. Because stimulus videos varied in length and were randomized across runs, the length of each run varied across subjects (range &#x0003D; 345&#x02013;406 TRs). Each run began with five &#x0201C;dummy&#x0201D; volumes, which were discarded from further analyses. At the end of the scanning session, a T1-weighted structural image was acquired for each subject.</p>
</sec>
<sec>
<title>Neuroimaging analyses</title>
<p>Images were preprocessed and analyzed using SPM2 (Wellcome Department of Imaging Neuroscience, London, UK) and custom code in Matlab 7.1 (The MathWorks, Natick, MA). All functional volumes from each run were realigned to the first volume of that run, spatially normalized to the standard MNI-152 template, and smoothed using a Gaussian kernel with a full width at half maximum (FWHM) of 6 mm. All volumes from each run were then centered at a mean intensity value of 100, trimmed to remove volumes with intensity levels more than three standard deviations from the run mean, and detrended by removing the line of best fit. After this processing, all three runs were concatenated into one consecutive timeseries for the regression analysis.</p>
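The centering, trimming, and detrending steps were run in SPM2/Matlab; purely as an illustration of the same operations, they might be sketched in Python as follows (`clean_run` and its inputs are hypothetical):

```python
import numpy as np

def clean_run(volume_means, center=100.0, z_cut=3.0):
    """Center a run's per-volume mean intensities at `center`, drop volumes
    more than z_cut SDs from the run mean, and remove the line of best fit."""
    x = np.asarray(volume_means, dtype=float)
    x = x - x.mean() + center                        # center the run at 100
    keep = np.abs(x - x.mean()) <= z_cut * x.std(ddof=1)
    t = np.arange(x.size)[keep]                      # retained volume indices
    y = x[keep]
    slope, intercept = np.polyfit(t, y, 1)           # linear trend
    return y - (slope * t + intercept) + center, keep
```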
<p>After preprocessing, we employed three analytic approaches using the general linear model. Across all three approaches, videos were modeled as blocks, in which the onset and duration of each video was convolved with a hemodynamic response function. Our first analytic approach employed main effect contrasts to compare brain activity during the <italic>emotion rating</italic> and <italic>eye-gaze rating</italic> conditions; this served primarily as a manipulation check, ensuring that attention to targets&#x00027; emotion or to eye gaze preferentially engaged regions involved in making attributions about mental states and assessing low-level features of dynamic social stimuli (e.g., biological motion), respectively.</p>
<p>The second analytic approach directly addressed our primary hypotheses. Here, we used parametric analyses to isolate neural structures in perceivers in which activity varied as a function of targets&#x00027; trait and state expressivity. In separate analyses, (1) targets&#x00027; BEQ scores and (2) the intensity of emotional cues in each video were used as parametric modulators, providing regression weights for each video block. Using this method, we searched for clusters of activity that tracked&#x02014;within perceivers&#x02014;the expressivity of the targets they were watching; that is, regions that were more engaged when perceivers viewed a relatively expressive target, and less engaged when they viewed a relatively inexpressive target. These analyses were performed separately for the <italic>emotion rating</italic> and <italic>eye-gaze rating</italic> conditions.</p>
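A parametric modulation analysis of this kind adds, alongside each condition's boxcar regressor, a second regressor whose block heights equal the mean-centered expressivity score of the video being shown, both convolved with a hemodynamic response function. The sketch below illustrates the idea with a canonical double-gamma HRF; it is a simplified stand-in for SPM2's design-matrix machinery, and all names and parameters are illustrative:

```python
import numpy as np
from scipy.stats import gamma

def double_gamma_hrf(tr=2.0, duration=32.0):
    """Canonical double-gamma HRF sampled at the TR, normalized to sum 1."""
    t = np.arange(0.0, duration, tr)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return h / h.sum()

def build_design(n_scans, onsets, durations, modulator, tr=2.0):
    """Boxcar regressor over video blocks plus a parametric modulator whose
    block height equals each video's (mean-centered) expressivity score."""
    box = np.zeros(n_scans)
    mod = np.zeros(n_scans)
    weights = np.asarray(modulator, dtype=float)
    weights = weights - weights.mean()     # center so the modulator captures
                                           # variance beyond the mean response
    for onset, dur, w in zip(onsets, durations, weights):
        i, j = int(onset // tr), int((onset + dur) // tr)
        box[i:j] = 1.0
        mod[i:j] = w
    h = double_gamma_hrf(tr)
    return np.column_stack([np.convolve(box, h)[:n_scans],
                            np.convolve(mod, h)[:n_scans],
                            np.ones(n_scans)])   # intercept
```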
<p>Finally, to more directly assess the task dependency of expressivity-related effects, we included two analyses aimed at isolating differences and similarities across eye-gaze and emotion monitoring. To examine differences across tasks, we computed a direct, whole-brain analysis contrasting BOLD signal related to target expressivity (assessed at both state and trait levels) during emotion rating vs. eye-gaze rating, and vice versa. This allowed us to directly assess an expressivity-by-task interaction in predicting perceivers&#x00027; brain activity. To examine similarities across tasks, we computed a conjunction of the maps reflecting expressivity-related activity in the eye-gaze rating and emotion rating conditions, using the minimum statistic approach (Nichols et al., <xref ref-type="bibr" rid="B29">2005</xref>). This analysis identifies clusters that were significantly engaged at our threshold in both conditions, not just one. Both of these analyses were performed separately for state and trait expressivity.</p>
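The minimum statistic conjunction declares a voxel significant only if it passes threshold in every contributing map, which is equivalent to thresholding the voxelwise minimum of the statistic maps. A one-function sketch (illustrative, with toy one-dimensional "maps"):

```python
import numpy as np

def conjunction_mask(map_a, map_b, t_crit):
    """Minimum-statistic conjunction: keep voxels whose statistic exceeds
    the threshold in *both* maps."""
    return np.minimum(map_a, map_b) > t_crit

# Toy example: only the first voxel survives in both maps
both = conjunction_mask(np.array([3.5, 1.0, 4.0]),
                        np.array([3.2, 5.0, 0.5]), t_crit=2.9)
```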
<p>All analyses were thresholded at <italic>p</italic> &#x0003C; 0.005, with an extent threshold of <italic>k</italic> &#x0003D; 30. This cluster size was selected to correspond with a corrected threshold of <italic>p</italic> &#x0003C; 0.05, based on Monte Carlo simulations implemented in Matlab (Slotnick et al., <xref ref-type="bibr" rid="B44">2003</xref>).</p>
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec>
<title>Behavioral data</title>
<p>To assess participants&#x00027; engagement during the session, we measured response rates: the number of times perceivers changed their ratings per minute in each condition. Individuals made significantly more ratings during the eye-gaze rating condition (mean &#x0003D; 14.11 ratings/minute) than during the emotion rating condition (mean &#x0003D; 9.83 ratings/minute), <italic>t</italic>(15) &#x0003D; 3.17, <italic>p</italic> &#x0003C; 0.01. Across both conditions, participants on average made at least one rating every 6.1 s, suggesting that they were engaged in both tasks.</p>
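A paired t-test of this form compares each perceiver's rating rate across the two conditions, using the within-subject differences. A quick sketch with hypothetical per-subject values (the study's real data are not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical ratings-per-minute for 16 perceivers in each condition
gaze = np.array([14.1, 16.2, 12.0, 15.5, 13.8, 14.9, 11.7, 15.0,
                 13.2, 16.8, 12.5, 14.4, 15.9, 13.1, 14.7, 12.9])
emotion = gaze - np.array([4.1, 3.2, 5.0, 4.6, 3.8, 4.9, 2.7, 5.5,
                           3.9, 4.2, 4.8, 3.6, 5.1, 4.0, 3.3, 4.4])

# Paired t-test across the 16 perceivers (df = 15)
t_stat, p_val = stats.ttest_rel(gaze, emotion)
```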
</sec>
<sec>
<title>Neuroimaging data</title>
<sec>
<title>Manipulation checks: neural bases of emotion rating vs. eye-gaze rating</title>
<p>We first explored neural activity distinctly engaged when perceivers explicitly attended to targets&#x00027; internal states (<italic>emotion rating</italic>) and when they attended to lower-level features of target behavior (<italic>eye-gaze rating</italic>). As predicted, emotion rating&#x02014;when compared to eye-gaze monitoring&#x02014;engaged brain regions classically associated with drawing inferences about mental states, including large clusters in MPFC, PCC, and precuneus (see Figure <xref ref-type="fig" rid="F1">1</xref> and Table <xref ref-type="table" rid="T1">1</xref>), as well as a number of clusters in left ventral and dorsal prefrontal cortex potentially related to the cognitive components necessary for making high-level emotional appraisals (Mitchell, <xref ref-type="bibr" rid="B27">2009</xref>).</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>Clusters more engaged during <italic>emotion rating</italic> than during <italic>eye-gaze rating</italic> (in orange); clusters more engaged during <italic>eye-gaze rating</italic> than during <italic>emotion rating</italic> (in blue).</bold> STS, superior temporal sulcus; FFA, fusiform face area; MPFC, medial prefrontal cortex; PCC, posterior cingulate cortex. All clusters exceed a significance threshold of <italic>p</italic> &#x0003C; 0.005, uncorrected, with an extent threshold of at least 30 voxels, corresponding to a corrected threshold of <italic>p</italic> &#x0003C; 0.05, as computed using Monte Carlo simulations.</p></caption>
<graphic xlink:href="fnhum-06-00228-g0001.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p><bold>Main effects of condition</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Region</bold></th>
<th align="center" colspan="3"><bold>Coordinates</bold></th>
<th align="left"><bold>T-score</bold></th>
<th align="left"><bold>Volume (vox)</bold></th>
</tr>
<tr>
<th/>
<th align="left"><bold><italic>x</italic></bold></th>
<th align="left"><bold><italic>y</italic></bold></th>
<th align="left"><bold><italic>z</italic></bold></th>
<th/>
<th/>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="6"><bold>EMOTION RATING &#x0003E; EYE-GAZE MONITORING</bold></td>
</tr>
<tr>
<td align="left">ACC/MPFC</td>
<td align="right">&#x02212;2</td>
<td align="right">24</td>
<td align="right">42</td>
<td align="left">6.2</td>
<td align="left">1255</td>
</tr>
<tr>
<td align="left">ACC/MPFC</td>
<td align="right">&#x02212;6</td>
<td align="right">18</td>
<td align="right">12</td>
<td align="left">5.25</td>
<td align="left">485</td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">&#x02212;8</td>
<td align="right">42</td>
<td align="right">28</td>
<td align="left">4.24</td>
<td align="left">148</td>
</tr>
<tr>
<td align="left">Middle Frontal Gyrus</td>
<td align="right">&#x02212;26</td>
<td align="right">44</td>
<td align="right">34</td>
<td align="left">3.94</td>
<td align="left">147</td>
</tr>
<tr>
<td align="left">Middle Frontal Gyrus</td>
<td align="right">&#x02212;46</td>
<td align="right">8</td>
<td align="right">46</td>
<td align="left">4.63</td>
<td align="left">80</td>
</tr>
<tr>
<td align="left">Middle Frontal Gyrus</td>
<td align="right">&#x02212;34</td>
<td align="right">26</td>
<td align="right">46</td>
<td align="left">4.14</td>
<td align="left">122</td>
</tr>
<tr>
<td align="left">Inferior Frontal Gyrus</td>
<td align="right">&#x02212;46</td>
<td align="right">40</td>
<td align="right">&#x02212;6</td>
<td align="left">5.57</td>
<td align="left">64</td>
</tr>
<tr>
<td align="left">Inferior Frontal Gyrus</td>
<td align="right">&#x02212;44</td>
<td align="right">24</td>
<td align="right">&#x02212;6</td>
<td align="left">4.18</td>
<td align="left">72</td>
</tr>
<tr>
<td align="left">Dorsolateral Prefrontal Cortex</td>
<td align="right">&#x02212;46</td>
<td align="right">26</td>
<td align="right">26</td>
<td align="left">4.48</td>
<td align="left">45</td>
</tr>
<tr>
<td align="left">Frontal Operculum</td>
<td align="right">&#x02212;56</td>
<td align="right">14</td>
<td align="right">10</td>
<td align="left">5.18</td>
<td align="left">232</td>
</tr>
<tr>
<td align="left">Caudate</td>
<td align="right">12</td>
<td align="right">8</td>
<td align="right">10</td>
<td align="left">4.1</td>
<td align="left">58</td>
</tr>
<tr>
<td align="left">Precuneus/PCC</td>
<td align="right">0</td>
<td align="right">&#x02212;22</td>
<td align="right">40</td>
<td align="left">4.85</td>
<td align="left">161</td>
</tr>
<tr>
<td align="left">Precuneus/PCC</td>
<td align="right">&#x02212;2</td>
<td align="right">&#x02212;64</td>
<td align="right">40</td>
<td align="left">3.66</td>
<td align="left">175</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">24</td>
<td align="right">&#x02212;76</td>
<td align="right">&#x02212;10</td>
<td align="left">3.83</td>
<td align="left">197</td>
</tr>
<tr>
<td align="left">Striate Visual Cortex</td>
<td align="right">&#x02212;16</td>
<td align="right">&#x02212;70</td>
<td align="right">&#x02212;10</td>
<td align="left">5.47</td>
<td align="left">355</td>
</tr>
<tr>
<td align="left">Cuneus</td>
<td align="right">2</td>
<td align="right">&#x02212;84</td>
<td align="right">22</td>
<td align="left">3.9</td>
<td align="left">116</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>EYE-GAZE MONITORING &#x0003E; EMOTION RATING</bold></td>
</tr>
<tr>
<td align="left">Premotor Cortex</td>
<td align="right">&#x02212;26</td>
<td align="right">&#x02212;6</td>
<td align="right">46</td>
<td align="left">6.15</td>
<td align="left">577</td>
</tr>
<tr>
<td align="left">Premotor Cortex</td>
<td align="right">&#x02212;58</td>
<td align="right">2</td>
<td align="right">36</td>
<td align="left">3.81</td>
<td align="left">25</td>
</tr>
<tr>
<td align="left">Premotor Cortex</td>
<td align="right">54</td>
<td align="right">0</td>
<td align="right">36</td>
<td align="left">5.65</td>
<td align="left">1316</td>
</tr>
<tr>
<td align="left">Supplementary Motor Area</td>
<td align="right">8</td>
<td align="right">&#x02212;4</td>
<td align="right">62</td>
<td align="left">3.37</td>
<td align="left">37</td>
</tr>
<tr>
<td align="left">SII</td>
<td align="right">64</td>
<td align="right">&#x02212;24</td>
<td align="right">24</td>
<td align="left">5.65</td>
<td align="left">363</td>
</tr>
<tr>
<td align="left">Superior Parietal Lobe</td>
<td align="right">20</td>
<td align="right">&#x02212;62</td>
<td align="right">56</td>
<td align="left">5.59</td>
<td align="left">770</td>
</tr>
<tr>
<td align="left">Intraparietal Sulcus</td>
<td align="right">&#x02212;32</td>
<td align="right">&#x02212;42</td>
<td align="right">48</td>
<td align="left">5.48</td>
<td align="left">1219</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus/STS</td>
<td align="right">54</td>
<td align="right">&#x02212;58</td>
<td align="right">&#x02212;10</td>
<td align="left">5.10</td>
<td align="left">729</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">&#x02212;44</td>
<td align="right">&#x02212;48</td>
<td align="right">&#x02212;14</td>
<td align="left">4.67</td>
<td align="left">144</td>
</tr>
<tr>
<td align="left">Extrastriate Visual Cortex</td>
<td align="right">&#x02212;42</td>
<td align="right">&#x02212;80</td>
<td align="right">&#x02212;6</td>
<td align="left">6.67</td>
<td align="left">661</td>
</tr>
<tr>
<td align="left">STS</td>
<td align="right">&#x02212;51</td>
<td align="right">&#x02212;52</td>
<td align="right">10</td>
<td align="left">4.26</td>
<td align="left">54</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Note: Coordinates are in MNI space. ACC, Anterior Cingulate Cortex; MPFC, Medial Prefrontal Cortex; PCC, Posterior Cingulate Cortex; SII, Secondary Somatosensory Cortex.</italic></p>
</table-wrap-foot>
</table-wrap>
<p>The opposite comparison revealed that monitoring and rating targets&#x00027; eye-gaze, as opposed to their emotional states, recruited a network of brain regions involved in monitoring motor intentions, somatosensory states, and biological motion, including bilateral premotor cortex, pre- and post-central gyrus, superior temporal sulcus, and SII, as well as bilateral inferotemporal cortex extending into the fusiform gyrus (see Figure <xref ref-type="fig" rid="F1">1</xref> and Table <xref ref-type="table" rid="T1">1</xref>).</p>
</sec>
<sec>
<title>Expressivity during emotion rating</title>
<p>When perceivers were tasked with explicitly rating affective states, both targets&#x00027; trait and video-by-video expressive behaviors were associated with increased activity in brain regions involved in mental state inference, including dorsal and rostral MPFC, PCC, and lateral temporal cortex (see Figure <xref ref-type="fig" rid="F2">2A</xref> and Table <xref ref-type="table" rid="T2">2</xref>).</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>(A)</bold> Clusters whose activity tracked with targets&#x00027; trait or state expressivity during <italic>emotion rating</italic>. <bold>(B)</bold> Clusters whose activity tracked with targets&#x00027; trait or state expressivity during <italic>eye-gaze rating</italic>. FFA, fusiform face area; MPFC, medial prefrontal cortex; PCC, posterior cingulate cortex.</p></caption>
<graphic xlink:href="fnhum-06-00228-g0002.tif"/>
</fig>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p><bold>Modulation of brain activity by target expressivity</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Region</bold></th>
<th align="center" colspan="3"><bold>Coordinates</bold></th>
<th align="left"><bold>T-score</bold></th>
<th align="left"><bold>Volume (vox)</bold></th>
</tr>
<tr>
<th/>
<th align="left"><bold><italic>x</italic></bold></th>
<th align="left"><bold><italic>y</italic></bold></th>
<th align="left"><bold><italic>z</italic></bold></th>
<th/>
<th/>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="6"><bold>DURING EMOTION RATING (TRAIT EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">0</td>
<td align="right">60</td>
<td align="right">28</td>
<td align="left">4.34</td>
<td align="left">560</td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">&#x02212;10</td>
<td align="right">38</td>
<td align="right">62</td>
<td align="left">4.91</td>
<td align="left">118</td>
</tr>
<tr>
<td align="left">PCC/Precuneus</td>
<td align="right">&#x02212;4</td>
<td align="right">&#x02212;59</td>
<td align="right">28</td>
<td align="left">4.41</td>
<td align="left">179</td>
</tr>
<tr>
<td align="left">Superior Frontal Gyrus</td>
<td align="right">&#x02212;38</td>
<td align="right">10</td>
<td align="right">44</td>
<td align="left">4.12</td>
<td align="left">94</td>
</tr>
<tr>
<td align="left">Middle Temporal Gyrus</td>
<td align="right">68</td>
<td align="right">&#x02212;24</td>
<td align="right">&#x02212;18</td>
<td align="left">4.82</td>
<td align="left">177</td>
</tr>
<tr>
<td align="left">Middle Temporal Gyrus</td>
<td align="right">&#x02212;60</td>
<td align="right">&#x02212;34</td>
<td align="right">&#x02212;22</td>
<td align="left">3.59</td>
<td align="left">106</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>DURING EMOTION RATING (STATE EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">18</td>
<td align="right">57</td>
<td align="right">28</td>
<td align="left">4.73</td>
<td align="left">541</td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">4</td>
<td align="right">50</td>
<td align="right">0</td>
<td align="left">4.58</td>
<td align="left">48</td>
</tr>
<tr>
<td align="left">MPFC/ACC</td>
<td align="right">2</td>
<td align="right">36</td>
<td align="right">42</td>
<td align="left">4.13</td>
<td align="left">31</td>
</tr>
<tr>
<td align="left">PCC</td>
<td align="right">&#x02212;2</td>
<td align="right">&#x02212;32</td>
<td align="right">40</td>
<td align="left">4.14</td>
<td align="left">296</td>
</tr>
<tr>
<td align="left">Inferior Temporal Gyrus</td>
<td align="right">54</td>
<td align="right">&#x02212;30</td>
<td align="right">&#x02212;26</td>
<td align="left">4.57</td>
<td align="left">141</td>
</tr>
<tr>
<td align="left">Posterior Parietal Lobe</td>
<td align="right">&#x02212;48</td>
<td align="right">&#x02212;62</td>
<td align="right">38</td>
<td align="left">4.41</td>
<td align="left">74</td>
</tr>
<tr>
<td align="left">Posterior Parietal Lobe</td>
<td align="right">34</td>
<td align="right">&#x02212;78</td>
<td align="right">52</td>
<td align="left">4.28</td>
<td align="left">67</td>
</tr>
<tr>
<td align="left">Superior Frontal Gyrus</td>
<td align="right">42</td>
<td align="right">16</td>
<td align="right">44</td>
<td align="left">5.54</td>
<td align="left">217</td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">&#x02212;42</td>
<td align="right">&#x02212;18</td>
<td align="right">36</td>
<td align="left">5.31</td>
<td align="left">106</td>
</tr>
<tr>
<td align="left">Inferior Temporal Gyrus</td>
<td align="right">&#x02212;70</td>
<td align="right">&#x02212;24</td>
<td align="right">&#x02212;18</td>
<td align="left">4.78</td>
<td align="left">60</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>DURING EYE-GAZE RATING (TRAIT EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">Premotor Cortex</td>
<td align="right">18</td>
<td align="right">&#x02212;8</td>
<td align="right">72</td>
<td align="left">4.02</td>
<td align="left">81</td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">&#x02212;19</td>
<td align="right">&#x02212;28</td>
<td align="right">68</td>
<td align="left">4.95</td>
<td align="left">302</td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">&#x02212;36</td>
<td align="right">&#x02212;20</td>
<td align="right">36</td>
<td align="left">4.70</td>
<td align="left">31</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">&#x02212;49</td>
<td align="right">&#x02212;42</td>
<td align="right">&#x02212;22</td>
<td align="left">4.98</td>
<td align="left">50</td>
</tr>
<tr>
<td align="left">Middle Frontal Gyrus</td>
<td align="right">50</td>
<td align="right">30</td>
<td align="right">32</td>
<td align="left">4.99</td>
<td align="left">57</td>
</tr>
<tr>
<td align="left">Extrastriate Visual Cortex</td>
<td align="right">&#x02212;14</td>
<td align="right">&#x02212;82</td>
<td align="right">30</td>
<td align="left">3.68</td>
<td align="left">71</td>
</tr>
<tr>
<td align="left">Posterior Occipital Lobe</td>
<td align="right">&#x02212;24</td>
<td align="right">&#x02212;100</td>
<td align="right">&#x02212;10</td>
<td align="left">4.34</td>
<td align="left">103</td>
</tr>
<tr>
<td align="left">Angular Gyrus</td>
<td align="right">&#x02212;52</td>
<td align="right">&#x02212;68</td>
<td align="right">44</td>
<td align="left">4.08</td>
<td align="left">50</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>DURING EYE-GAZE RATING (STATE EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">Premotor Cortex</td>
<td align="right">20</td>
<td align="right">6</td>
<td align="right">64</td>
<td align="left">4.07</td>
<td align="left">59</td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">18</td>
<td align="right">&#x02212;8</td>
<td align="right">72</td>
<td align="left">3.83</td>
<td align="left">45</td>
</tr>
<tr>
<td align="left">Pre/Postcentral Gyrus</td>
<td align="right">&#x02212;20</td>
<td align="right">&#x02212;32</td>
<td align="right">78</td>
<td align="left">4.30</td>
<td align="left">160</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">&#x02212;52</td>
<td align="right">&#x02212;40</td>
<td align="right">&#x02212;24</td>
<td align="left">3.84</td>
<td align="left">55</td>
</tr>
<tr>
<td align="left">Inferior Frontal Gyrus</td>
<td align="right">&#x02212;24</td>
<td align="right">20</td>
<td align="right">&#x02212;32</td>
<td align="left">3.90</td>
<td align="left">43</td>
</tr>
<tr>
<td align="left">Caudate</td>
<td align="right">&#x02212;14</td>
<td align="right">0</td>
<td align="right">&#x02212;8</td>
<td align="left">4.08</td>
<td align="left">32</td>
</tr>
<tr>
<td align="left">Posterior Parietal Lobe</td>
<td align="right">&#x02212;50</td>
<td align="right">&#x02212;60</td>
<td align="right">42</td>
<td align="left">3.72</td>
<td align="left">31</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Note: Coordinates are in MNI space. ACC, Anterior Cingulate Cortex; MPFC, Medial Prefrontal Cortex; PCC, Posterior Cingulate Cortex.</italic></p>
</table-wrap-foot>
</table-wrap>
</sec>
<sec>
<title>Eye-gaze rating</title>
<p>When perceivers were instructed to monitor and rate eye-gaze direction&#x02014;a more &#x0201C;low level&#x0201D; feature of target behavior&#x02014;targets&#x00027; trait and state expressivity tracked parametrically with activity in a set of brain regions involved in monitoring sensorimotor states and perceiving faces, including pre- and post-central gyri and left inferotemporal cortex spanning the fusiform gyrus (see Figure <xref ref-type="fig" rid="F2">2B</xref> and Table <xref ref-type="table" rid="T2">2</xref>).</p>
</sec>
<sec>
<title>Direct comparisons across conditions</title>
<p>To compare expressivity-related activity across the eye-gaze and emotion rating conditions, we computed a contrast isolating brain activity that was more responsive to target trait and state expressivity in the emotion rating condition, as compared to the eye-gaze rating condition, and vice versa. Broadly, the results of this analysis were consistent with the single-condition analyses. Critically, MPFC and several temporal lobe clusters originally identified as tracking expressivity during emotion rating were also significantly more responsive to target expressivity during emotion rating, as compared to eye-gaze rating, regardless of whether expressivity was operationalized as a state or trait. The reverse analysis&#x02014;isolating brain regions that respond to target expressivity more during eye-gaze rating than emotion rating&#x02014;similarly identified regions found in the single-condition analysis, including the precentral gyrus and extrastriate visual cortex (Table <xref ref-type="table" rid="T3">3</xref>).</p>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p><bold>Direct comparisons of expressivity related effects across conditions</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Region</bold></th>
<th align="center" colspan="3"><bold>Coordinates</bold></th>
<th align="left"><bold>T-score</bold></th>
<th align="left"><bold>Volume (vox)</bold></th>
</tr>
<tr>
<th/>
<th align="left"><bold><italic>x</italic></bold></th>
<th align="left"><bold><italic>y</italic></bold></th>
<th align="left"><bold><italic>z</italic></bold></th>
<th/>
<th/>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="6"><bold>EMOTION RATING &#x0003E; EYE GAZE RATING (TRAIT EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">&#x02212;2</td>
<td align="right">60</td>
<td align="right">30</td>
<td align="left">4.43</td>
<td align="left">409</td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">0</td>
<td align="right">36</td>
<td align="right">48</td>
<td align="left">3.49</td>
<td align="left">39</td>
</tr>
<tr>
<td align="left">Superior Frontal Gyrus</td>
<td align="right">&#x02212;40</td>
<td align="right">18</td>
<td align="right">46</td>
<td align="left">3.91</td>
<td align="left">139</td>
</tr>
<tr>
<td align="left">Superior Temporal Gyrus</td>
<td align="right">&#x02212;60</td>
<td align="right">&#x02212;38</td>
<td align="right">16</td>
<td align="left">4.71</td>
<td align="left">118</td>
</tr>
<tr>
<td align="left">MTG/ATL</td>
<td align="right">&#x02212;62</td>
<td align="right">&#x02212;14</td>
<td align="right">&#x02212;16</td>
<td align="left">4.06</td>
<td align="left">154</td>
</tr>
<tr>
<td align="left">Precentral/Postcentral Gyri</td>
<td align="right">48</td>
<td align="right">&#x02212;16</td>
<td align="right">36</td>
<td align="left">4.38</td>
<td align="left">62</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>EMOTION RATING &#x0003E; EYE GAZE RATING (STATE EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">MPFC</td>
<td align="right">&#x02212;6</td>
<td align="right">58</td>
<td align="right">30</td>
<td align="left">4.23</td>
<td align="left">229</td>
</tr>
<tr>
<td align="left">Middle Frontal Gyrus</td>
<td align="right">&#x02212;42</td>
<td align="right">10</td>
<td align="right">44</td>
<td align="left">4.80</td>
<td align="left">171</td>
</tr>
<tr>
<td align="left">Anterior Temporal Lobe</td>
<td align="right">58</td>
<td align="right">0</td>
<td align="right">&#x02212;36</td>
<td align="left">4.00</td>
<td align="left">41</td>
</tr>
<tr>
<td align="left">Middle Temporal Gyrus</td>
<td align="right">52</td>
<td align="right">&#x02212;4</td>
<td align="right">&#x02212;12</td>
<td align="left">3.74</td>
<td align="left">44</td>
</tr>
<tr>
<td align="left">Inferior Temporal Gyrus</td>
<td align="right">&#x02212;52</td>
<td align="right">&#x02212;24</td>
<td align="right">&#x02212;26</td>
<td align="left">3.89</td>
<td align="left">75</td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">44</td>
<td align="right">&#x02212;18</td>
<td align="right">36</td>
<td align="left">3.86</td>
<td align="left">55</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>EYE GAZE RATING &#x0003E; EMOTION RATING (TRAIT EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">Precentral Gyrus</td>
<td align="right">28</td>
<td align="right">&#x02212;22</td>
<td align="right">64</td>
<td align="left">6.17</td>
<td align="left">36</td>
</tr>
<tr>
<td align="left">Ventral Striatum</td>
<td align="right">4</td>
<td align="right">2</td>
<td align="right">&#x02212;2</td>
<td align="left">4.56</td>
<td align="left">117</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">36</td>
<td align="right">&#x02212;78</td>
<td align="right">&#x02212;2</td>
<td align="left">3.80</td>
<td align="left">30</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>EYE GAZE RATING &#x0003E; EMOTION RATING (STATE EXPRESSIVITY)</bold></td>
</tr>
<tr>
<td align="left">Cerebellum</td>
<td align="right">&#x02212;2</td>
<td align="right">&#x02212;54</td>
<td align="right">&#x02212;42</td>
<td align="left">4.78</td>
<td align="left">148</td>
</tr>
<tr>
<td align="left">Fusiform Gyrus</td>
<td align="right">36</td>
<td align="right">&#x02212;76</td>
<td align="right">2</td>
<td align="left">4.60</td>
<td align="left">123</td>
</tr>
<tr>
<td align="left">Medial Occipital Lobe</td>
<td align="right">16</td>
<td align="right">&#x02212;88</td>
<td align="right">26</td>
<td align="left">3.65</td>
<td align="left">37</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Note: Coordinates are in MNI space. MPFC, Medial Prefrontal Cortex; MTG, Middle Temporal Gyrus; ATL, Anterior Temporal Lobe.</italic></p>
</table-wrap-foot>
</table-wrap>
<p>That said, this direct contrast did not entirely reproduce the findings of our single-condition analyses. Specifically, whereas activity in the PCC tracked expressivity during emotion rating, but not eye-gaze rating, this region was not significantly <italic>more</italic> responsive to expressivity under one condition than the other. Similarly, whereas the fusiform gyrus (corresponding to the so-called &#x0201C;face area&#x0201D;) was responsive to target expressivity under the eye-gaze rating condition, but not the emotion rating condition, this region was not significantly more responsive to target expressivity during eye-gaze rating than emotion rating in a direct comparison.</p>
<p>Finally, to isolate any regions whose activity commonly tracked expressivity across both tasks, we computed a conjunction analysis between the two activation maps from our original parametric analysis (corresponding to expressivity-related activity under each condition), separately for trait and state expressivity. This analysis revealed very little common activation across tasks. In fact, only one cluster survived either conjunction: during both eye-gaze and emotion rating, targets&#x00027; trait expressivity predicted activity in the postcentral gyrus (xyz coordinates: &#x02212;24, &#x02212;40, 60, <italic>t</italic> &#x0003D; 3.52, <italic>k</italic> &#x0003D; 41 voxels).</p>
</sec>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>Perceivers do not employ social cognitive processes in a vacuum. On the contrary, social cognition is deeply interpersonal, and social psychologists have long studied the way that people&#x00027;s traits and states affect the cognitions, affect, and physiology of their interaction partners (Snodgrass et al., <xref ref-type="bibr" rid="B45">1998</xref>; Butler et al., <xref ref-type="bibr" rid="B4">2003</xref>). However, methodological constraints have often prevented neuroimaging researchers from studying the way that one person&#x00027;s traits or behaviors &#x0201C;get into perceivers&#x00027; heads&#x0201D; and influence the cognitive and neural processes they engage (although newer methods are increasingly circumventing these issues; see, for example, Wilms et al., <xref ref-type="bibr" rid="B57">2010</xref>). Further, little work has examined how the intensity of social stimuli (including social targets&#x00027; expressivity) interacts with perceivers&#x00027; goals to affect information processing.</p>
<p>The current study addressed both of these gaps in knowledge. Perceivers watching videos of naturally expressive, as opposed to inexpressive, social targets demonstrated increased engagement of several brain regions, regardless of whether expressivity was measured as a trait (through self-report questionnaires) or as a state (through coding of targets&#x00027; video-by-video emotional behavior). However, the patterns of neural activity associated with target expressivity depended on perceivers&#x00027; information processing goals. If perceivers were actively evaluating targets&#x00027; emotions&#x02014;a task that recruits areas involved in drawing top-down inferences about internal states, such as the MPFC and PCC&#x02014;then expressivity modulated activity in these areas. If, instead, perceivers were attending to targets&#x00027; dynamic shifts in eye-gaze, then target expressivity correlated with activity in a wholly separate set of brain regions, including areas associated with processing faces and biological movement, as well as cortical regions involved in simulating targets&#x00027; sensorimotor states.</p>
<p>The positive relationship between target expressivity and perceivers&#x00027; engagement of key neural systems associated with social cognition suggests that more expressive targets somehow &#x0201C;amplify&#x0201D; processing related to decoding others&#x00027; internal states. This amplification could reflect at least two separable effects. First, expressive targets could produce clearer (i.e., more &#x0201C;readable&#x0201D;) social and affective signals, which in turn allow perceivers to mentalize more effectively. Second, expressive targets may produce the types of salient signals (e.g., intense facial expressions) that spontaneously draw perceivers&#x00027; attention, and thus cause those perceivers to engage more deeply in subsequent mentalizing and processing of sensorimotor social cues. Further research should examine the extent to which expressivity-driven amplification reflects each or both of these effects.</p>
<sec>
<title>Implications and future directions</title>
<sec>
<title>Expressivity as a window into social cognitive &#x0201C;processing streams&#x0201D;</title>
<p>Perhaps the most striking finding of the current study is that perceivers&#x00027; task set strongly determined the neural correlates of target expressivity, and that expressivity effects recapitulated the main effect differences between top-down and bottom-up social information processing. When perceivers attended to targets&#x00027; affect, they preferentially drew on brain regions involved in drawing explicit inferences about targets, whereas attention to target eye gaze engaged regions involved in more automatically processing faces, biological motion, and sensorimotor cues.</p>
<p>Critically, this dissociation was broadly paralleled by the effects of target expressivity, which drove activity in regions associated with explicit mental state attribution or bottom-up processing of social stimuli when perceivers attended to targets&#x00027; emotions or eye gaze, respectively. A direct comparison across tasks revealed that activity in some of these key regions was significantly more related to target expressivity under bottom-up or top-down social cognitive processing goals. MPFC and several lateral temporal regions were more strongly engaged by target expressivity during emotion rating, as compared to eye gaze rating, whereas the precentral gyrus and extrastriate visual cortex demonstrated the opposite pattern. Other regions&#x02014;such as the PCC and fusiform gyrus (adjacent to the so-called &#x0201C;face area&#x0201D;)&#x02014;tracked expressivity in only one of these conditions, but did not significantly differentiate between conditions. These regions may be somewhat engaged across both conditions, but fail to meet a significance threshold under one condition. Consistent with this idea, a conjunction analysis revealed that almost no clusters of brain activity significantly tracked target expressivity across both conditions. Together, these data suggest that the effects of target expressivity on perceivers&#x00027; brain activity strongly&#x02014;but not entirely&#x02014;depend on perceivers&#x00027; information processing goals.</p>
<p>This finding lends converging support to the idea of separable social cognitive &#x0201C;processing streams&#x0201D; (Zaki and Ochsner, <xref ref-type="bibr" rid="B63">2012</xref>; Zaki, under revision). The first, centered in midline and lateral temporal cortex, is likely involved in perceivers&#x00027; ability to simulate targets&#x00027; experiences (Buckner and Carroll, <xref ref-type="bibr" rid="B3">2007</xref>; Spreng et al., <xref ref-type="bibr" rid="B46">2009</xref>), and likely requires perceivers to explicitly attend to targets (de Lange et al., <xref ref-type="bibr" rid="B8">2008</xref>; Spunt and Lieberman, <xref ref-type="bibr" rid="B50">in press</xref>). The second, distributed among regions involved in processing low-level social visual cues (e.g., faces and biological movement) and engaging somatosensory states expressed by targets, is engaged in a task-independent fashion (Chong et al., <xref ref-type="bibr" rid="B6">2008</xref>), and deployed whenever the environment contains relevant social cues (Spunt and Lieberman, <xref ref-type="bibr" rid="B50">in press</xref>). In fact, this second processing stream is sometimes most engaged when perceivers do <italic>not</italic> explicitly attend to targets&#x00027; internal states (Lieberman et al., <xref ref-type="bibr" rid="B23">2007</xref>). The dissociation between these social cognitive processing streams has now been established across a number of studies (Brass et al., <xref ref-type="bibr" rid="B2">2007</xref>; Gobbini et al., <xref ref-type="bibr" rid="B14">2007</xref>; Wheatley et al., <xref ref-type="bibr" rid="B56">2007</xref>; Spunt and Lieberman, <xref ref-type="bibr" rid="B50">in press</xref>), and meta-analyses (Van Overwalle, <xref ref-type="bibr" rid="B51">2009</xref>; Van Overwalle and Baetens, <xref ref-type="bibr" rid="B52">2009</xref>). 
Here, we extend this finding by demonstrating that not only are top-down and bottom-up processing streams dissociable, but that identical variance in the intensity of social cues (here instantiated through target expressivity) will affect one of these processing streams or the other, independently, as a function of perceivers&#x00027; current goals and cognitive resources.</p>
<p>The relationship between target expressivity and perceiver goals in predicting brain activity further bolsters an &#x0201C;interactionist&#x0201D; (Mischel and Shoda, <xref ref-type="bibr" rid="B26">1995</xref>) model of social cognition as a fundamentally interpersonal phenomenon: depending on the states and traits of not one person, but of both targets and perceivers. This framework has been used to fruitfully capture variance in social judgments and behaviors (Snodgrass et al., <xref ref-type="bibr" rid="B45">1998</xref>; Zayas et al., <xref ref-type="bibr" rid="B65">2002</xref>; Zaki et al., <xref ref-type="bibr" rid="B59">2008</xref>, <xref ref-type="bibr" rid="B60">2009</xref>; Zaki and Ochsner, <xref ref-type="bibr" rid="B62">2011</xref>). Here we extend this approach to modeling brain activity. Importantly, the paradigm used here was not &#x0201C;interactive,&#x0201D; in that it did not include online interactions between&#x02014;or record brain activity from&#x02014;both targets and perceivers (Schilbach et al., <xref ref-type="bibr" rid="B40">2006</xref>, <xref ref-type="bibr" rid="B38">2011</xref>; Schippers and Keysers, <xref ref-type="bibr" rid="B41">2011</xref>). However, interactionist models of social cognition like the one supported here dovetail nicely with interactive paradigms to support more holistic models of social cognition and interaction (Zaki and Ochsner, <xref ref-type="bibr" rid="B61">2009</xref>; Schilbach et al., <xref ref-type="bibr" rid="B39">2012</xref>).</p>
</sec>
<sec>
<title>Stimulus intensity and naturalistic social cues</title>
<p>Although prior work has almost never focused on the neural bases of processing information about expressive vs. inexpressive social targets, a few prior studies have examined the effects of affective stimulus intensity on brain activity, in the domains of odor (Small et al., <xref ref-type="bibr" rid="B44a">2003</xref>), words (Cunningham et al., <xref ref-type="bibr" rid="B6a">2007</xref>), and faces (Winston et al., <xref ref-type="bibr" rid="B58">2003</xref>). In all of these cases, stimulus intensity predicted amygdala activity, whereas in the current study it did not. One possibility is that our design&#x02014;which employed a relatively small number of stimuli and a parametric analysis&#x02014;may have been underpowered to detect effects in the amygdala. A second possibility is that a lack of amygdala activity in our task could reflect differences between the types of cues employed in previous studies of emotion perception and more &#x0201C;naturalistic&#x0201D; cues produced by real social targets (Zaki and Ochsner, <xref ref-type="bibr" rid="B61">2009</xref>). Even during the most intense emotional experiences (e.g., after winning an Olympic gold medal) targets typically produce complex, nuanced facial expressions that differ fundamentally from the posed, canonical displays often used in research (Russell et al., <xref ref-type="bibr" rid="B36">2003</xref>). Thus, while the amygdala is clearly important to forming fast and computationally efficient evaluations of many affective stimuli, its role in reacting to and interpreting the more subtle cues produced by social targets in many other situations may be more limited.</p>
<p>More broadly, our data connect with the literature on processing affective cues under different levels of attention. Specifically, prior work has demonstrated that affective stimuli engage several neural structures&#x02014;including the amygdala and sensorimotor cortex&#x02014;when perceivers do not attend to target affect (Spunt and Lieberman, <xref ref-type="bibr" rid="B49">2012</xref>; Whalen et al., <xref ref-type="bibr" rid="B55">1998</xref>; Winston et al., <xref ref-type="bibr" rid="B58">2003</xref>), attend to low-level target features including eye gaze (Adams and Franklin, <xref ref-type="bibr" rid="B1">2009</xref>), or draw inferences about targets based on non-verbal cues (Kuzmanovic et al., <xref ref-type="bibr" rid="B22">2011</xref>). Although researchers have debated the extent to which neural responses to affective cues are truly automatic (Pessoa et al., <xref ref-type="bibr" rid="B32">2002</xref>; Pessoa, <xref ref-type="bibr" rid="B31">2005</xref>), the modulation of affect-related neural processing by, for instance, top-down vs. bottom-up processing goals is rapidly becoming an established feature of the neuroscientific literature. Here, we extend this insight to demonstrate that naturally occurring variance in target expressivity modulates neural activity in a manner broadly consistent with such task dependency.</p>
</sec>
<sec>
<title>Target expressivity as a buffer against social cognitive dysfunction</title>
<p>One especially interesting application of the current approach concerns illnesses that involve social cognitive and behavioral dysfunction. Such difficulties characterize a raft of psychiatric disorders, including schizophrenia, borderline personality disorder, and social phobia. In almost all cases, social deficits in these conditions are studied using standardized social stimuli and paradigms. However, such deficits could critically depend not only on the cognitive or affective characteristics of affected perceivers, but also on the dispositions and behaviors of the targets they encounter. Consider a condition heavily associated with social cognitive dysfunction: Autism Spectrum Disorders (ASD). Individuals with ASD perform poorly on social cognitive tasks such as mental state inference (Roeyers et al., <xref ref-type="bibr" rid="B35">2001</xref>), a deficit that has been tied to attenuated activation of several brain regions including the MPFC and FFA (Schultz et al., <xref ref-type="bibr" rid="B42">2000</xref>, <xref ref-type="bibr" rid="B43">2003</xref>; Wang et al., <xref ref-type="bibr" rid="B54">2007</xref>). However, perceivers with ASD perform as well as control participants on a social inference task when social cues are presented in a clear and structured manner (Ponnet et al., <xref ref-type="bibr" rid="B33">2007</xref>). One intriguing possibility is that expressive targets provide exactly these types of clear social cues, and that perceivers with ASD may demonstrate more normative behavior and patterns of brain activity when observing expressive targets (Zaki and Ochsner, <xref ref-type="bibr" rid="B62">2011</xref>). Such a finding would have implications for potential interventions focused on teaching caretakers and peers of individuals with ASD to structure their social cues in a manner that supports social cognitive processing and performance in those individuals. Such an approach could expand ASD interventions to encompass both perceivers&#x00027; and targets&#x00027; roles in producing accurate and adaptive social cognition.</p>
</sec>
</sec>
</sec>
<sec sec-type="conclusions" id="s5">
<title>Conclusions</title>
<p>The current study demonstrates that the neural bases of social inference are modulated by interpersonal factors. Social targets&#x00027; trait expressivity affected perceivers&#x00027; deployment of social cognitive processing, but in ways that depended on the task perceivers were performing. These data provide an early step toward using neuroimaging to unpack the fundamentally interpersonal processes underlying social cognition.</p>
<sec>
<title>Conflict of interest statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</sec>
</body>
<back>
<ack>
<p>This work was supported by Autism Speaks Grant 4787 (to Jamil Zaki) and NIDA Grant 1R01DA022541-01 (to Kevin Ochsner).</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adams</surname> <given-names>R. B.</given-names></name> <name><surname>Franklin</surname> <given-names>R. G.</given-names></name></person-group> (<year>2009</year>). <article-title>Influence of emotional expression on the processing of gaze direction</article-title>. <source>Motiv. Emot</source>. <volume>33</volume>, <fpage>106</fpage>&#x02013;<lpage>112</lpage>.</citation>
</ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brass</surname> <given-names>M.</given-names></name> <name><surname>Schmitt</surname> <given-names>R. M.</given-names></name> <name><surname>Spengler</surname> <given-names>S.</given-names></name> <name><surname>Gergely</surname> <given-names>G.</given-names></name></person-group> (<year>2007</year>). <article-title>Investigating action understanding: inferential processes versus action simulation</article-title>. <source>Curr. Biol</source>. <volume>17</volume>, <fpage>2117</fpage>&#x02013;<lpage>2121</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2007.11.057</pub-id><pub-id pub-id-type="pmid">18083518</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Buckner</surname> <given-names>R. L.</given-names></name> <name><surname>Carroll</surname> <given-names>D. C.</given-names></name></person-group> (<year>2007</year>). <article-title>Self-projection and the brain</article-title>. <source>Trends Cogn. Sci</source>. <volume>11</volume>, <fpage>49</fpage>&#x02013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2006.11.004</pub-id><pub-id pub-id-type="pmid">17188554</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Butler</surname> <given-names>E. A.</given-names></name> <name><surname>Egloff</surname> <given-names>B.</given-names></name> <name><surname>Wilhelm</surname> <given-names>F. H.</given-names></name> <name><surname>Smith</surname> <given-names>N. C.</given-names></name> <name><surname>Erickson</surname> <given-names>E. A.</given-names></name> <name><surname>Gross</surname> <given-names>J. J.</given-names></name></person-group> (<year>2003</year>). <article-title>The social consequences of expressive suppression</article-title>. <source>Emotion</source> <volume>3</volume>, <fpage>48</fpage>&#x02013;<lpage>67</lpage>. <pub-id pub-id-type="pmid">12899316</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Calder</surname> <given-names>A. J.</given-names></name> <name><surname>Lawrence</surname> <given-names>A. D.</given-names></name> <name><surname>Keane</surname> <given-names>J.</given-names></name> <name><surname>Scott</surname> <given-names>S. K.</given-names></name> <name><surname>Owen</surname> <given-names>A. M.</given-names></name> <name><surname>Christoffels</surname> <given-names>I.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name></person-group> (<year>2002</year>). <article-title>Reading the mind from eye gaze</article-title>. <source>Neuropsychologia</source> <volume>40</volume>, <fpage>1129</fpage>&#x02013;<lpage>1138</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(02)00008-8</pub-id><pub-id pub-id-type="pmid">11931917</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chong</surname> <given-names>T. T.</given-names></name> <name><surname>Williams</surname> <given-names>M. A.</given-names></name> <name><surname>Cunnington</surname> <given-names>R.</given-names></name> <name><surname>Mattingley</surname> <given-names>J. B.</given-names></name></person-group> (<year>2008</year>). <article-title>Selective attention modulates inferior frontal gyrus activity during action observation</article-title>. <source>Neuroimage</source> <volume>40</volume>, <fpage>298</fpage>&#x02013;<lpage>307</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.11.030</pub-id><pub-id pub-id-type="pmid">18178107</pub-id></citation>
</ref>
<ref id="B6a">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cunningham</surname> <given-names>W. A.</given-names></name> <name><surname>Zelazo</surname> <given-names>P.</given-names></name> <name><surname>Packer</surname> <given-names>D.</given-names></name> <name><surname>van Bavel</surname> <given-names>J.</given-names></name></person-group> (<year>2007</year>). <article-title>The iterative reprocessing model: a multilevel framework for attitudes and evaluations</article-title>. <source>Soc. Cogn</source>. <volume>25</volume>, <fpage>736</fpage>&#x02013;<lpage>760</lpage>.</citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Decety</surname> <given-names>J.</given-names></name></person-group> (<year>2011</year>). <article-title>Dissecting the neural mechanisms mediating empathy</article-title>. <source>Emot. Rev</source>. <volume>3</volume>, <fpage>92</fpage>&#x02013;<lpage>108</lpage>.</citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Lange</surname> <given-names>F. P.</given-names></name> <name><surname>Spronk</surname> <given-names>M.</given-names></name> <name><surname>Willems</surname> <given-names>R. M.</given-names></name> <name><surname>Toni</surname> <given-names>I.</given-names></name> <name><surname>Bekkering</surname> <given-names>H.</given-names></name></person-group> (<year>2008</year>). <article-title>Complementary systems for understanding action intentions</article-title>. <source>Curr. Biol</source>. <volume>18</volume>, <fpage>454</fpage>&#x02013;<lpage>457</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2008.02.057</pub-id><pub-id pub-id-type="pmid">18356050</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name> <name><surname>Friesen</surname> <given-names>W.</given-names></name></person-group> (<year>1975/2003</year>). <source>Unmasking the Face</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>Malor Books</publisher-name>.</citation>
</ref>
<ref id="B10">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Epley</surname> <given-names>N.</given-names></name> <name><surname>Waytz</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>Mind perception</article-title>, in <source>The Handbook of Social Psychology</source>, <edition>5th Edn</edition>. eds <person-group person-group-type="editor"><name><surname>Fiske</surname> <given-names>S.</given-names></name> <name><surname>Gilbert</surname> <given-names>D.</given-names></name> <name><surname>Lindzey</surname> <given-names>G.</given-names></name></person-group> (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>Wiley</publisher-name>), <fpage>498</fpage>&#x02013;<lpage>541</lpage>.</citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fletcher</surname> <given-names>P. C.</given-names></name> <name><surname>Happe</surname> <given-names>F.</given-names></name> <name><surname>Frith</surname> <given-names>U.</given-names></name> <name><surname>Baker</surname> <given-names>S. C.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name> <name><surname>Frackowiak</surname> <given-names>R. S.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>1995</year>). <article-title>Other minds in the brain: a functional imaging study of &#x0201C;theory of mind&#x0201D; in story comprehension</article-title>. <source>Cognition</source> <volume>57</volume>, <fpage>109</fpage>&#x02013;<lpage>128</lpage>. <pub-id pub-id-type="doi">10.1016/0010-0277(95)00692-R</pub-id><pub-id pub-id-type="pmid">8556839</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gallagher</surname> <given-names>H. L.</given-names></name> <name><surname>Happe</surname> <given-names>F.</given-names></name> <name><surname>Brunswick</surname> <given-names>N.</given-names></name> <name><surname>Fletcher</surname> <given-names>P. C.</given-names></name> <name><surname>Frith</surname> <given-names>U.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>2000</year>). <article-title>Reading the mind in cartoons and stories: an fMRI study of &#x02018;theory of mind&#x02019; in verbal and nonverbal tasks</article-title>. <source>Neuropsychologia</source> <volume>38</volume>, <fpage>11</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(99)00053-6</pub-id><pub-id pub-id-type="pmid">10617288</pub-id></citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gilbert</surname> <given-names>D.</given-names></name> <name><surname>Pelham</surname> <given-names>B.</given-names></name> <name><surname>Krull</surname> <given-names>D.</given-names></name></person-group> (<year>1989</year>). <article-title>On cognitive busyness: when person perceivers meet persons perceived</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>54</volume>, <fpage>733</fpage>&#x02013;<lpage>740</lpage>.</citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gobbini</surname> <given-names>M. I.</given-names></name> <name><surname>Koralek</surname> <given-names>A. C.</given-names></name> <name><surname>Bryan</surname> <given-names>R. E.</given-names></name> <name><surname>Montgomery</surname> <given-names>K. J.</given-names></name> <name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<year>2007</year>). <article-title>Two takes on the social brain: a comparison of theory of mind tasks</article-title>. <source>J. Cogn. Neurosci</source>. <volume>19</volume>, <fpage>1803</fpage>&#x02013;<lpage>1814</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.2007.19.11.1803</pub-id><pub-id pub-id-type="pmid">17958483</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gross</surname> <given-names>J.</given-names></name> <name><surname>John</surname> <given-names>O. P.</given-names></name></person-group> (<year>1997</year>). <article-title>Revealing feelings: facets of emotional expressivity in self-reports, peer ratings, and behavior</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>72</volume>, <fpage>435</fpage>&#x02013;<lpage>448</lpage>. <pub-id pub-id-type="pmid">9107009</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gross</surname> <given-names>J.</given-names></name> <name><surname>John</surname> <given-names>O.</given-names></name> <name><surname>Richards</surname> <given-names>J.</given-names></name></person-group> (<year>2000</year>). <article-title>The dissociation of emotion expression from emotion experience: a personality perspective</article-title>. <source>Pers. Soc. Psychol. Bull</source>. <volume>26</volume>, <fpage>712</fpage>&#x02013;<lpage>726</lpage>.</citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gross</surname> <given-names>J.</given-names></name> <name><surname>Levenson</surname> <given-names>R.</given-names></name></person-group> (<year>1993</year>). <article-title>Emotional suppression: physiology, self-report, and expressive behavior</article-title>. <source>J. Pers. Soc. Psychol</source>. <volume>64</volume>, <fpage>970</fpage>&#x02013;<lpage>986</lpage>. <pub-id pub-id-type="pmid">8326473</pub-id></citation>
</ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>McDermott</surname> <given-names>J.</given-names></name> <name><surname>Chun</surname> <given-names>M. M.</given-names></name></person-group> (<year>1997</year>). <article-title>The fusiform face area: a module in human extrastriate cortex specialized for face perception</article-title>. <source>J. Neurosci</source>. <volume>17</volume>, <fpage>4302</fpage>&#x02013;<lpage>4311</lpage>. <pub-id pub-id-type="pmid">9151747</pub-id></citation>
</ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keysers</surname> <given-names>C.</given-names></name> <name><surname>Kaas</surname> <given-names>J. H.</given-names></name> <name><surname>Gazzola</surname> <given-names>V.</given-names></name></person-group> (<year>2010</year>). <article-title>Somatosensation in social perception</article-title>. <source>Nat. Rev. Neurosci</source>. <volume>11</volume>, <fpage>417</fpage>&#x02013;<lpage>428</lpage>. <pub-id pub-id-type="doi">10.1038/nrn2833</pub-id><pub-id pub-id-type="pmid">20445542</pub-id></citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kunda</surname> <given-names>Z.</given-names></name></person-group> (<year>1990</year>). <article-title>The case for motivated reasoning</article-title>. <source>Psychol. Bull</source>. <volume>108</volume>, <fpage>480</fpage>&#x02013;<lpage>498</lpage>. <pub-id pub-id-type="pmid">2270237</pub-id></citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kuzmanovic</surname> <given-names>B.</given-names></name> <name><surname>Bente</surname> <given-names>G.</given-names></name> <name><surname>von Cramon</surname> <given-names>D. Y.</given-names></name> <name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Tittgemeyer</surname> <given-names>M.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name></person-group> (<year>2011</year>). <article-title>Imaging first impressions: distinct neural processing of verbal and nonverbal social information</article-title>. <source>Neuroimage</source> <volume>60</volume>, <fpage>179</fpage>&#x02013;<lpage>188</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2011.12.046</pub-id><pub-id pub-id-type="pmid">22227133</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lieberman</surname> <given-names>M. D.</given-names></name> <name><surname>Eisenberger</surname> <given-names>N. I.</given-names></name> <name><surname>Crockett</surname> <given-names>M. J.</given-names></name> <name><surname>Tom</surname> <given-names>S. M.</given-names></name> <name><surname>Pfeifer</surname> <given-names>J. H.</given-names></name> <name><surname>Way</surname> <given-names>B. M.</given-names></name></person-group> (<year>2007</year>). <article-title>Putting feelings into words: affect labeling disrupts amygdala activity in response to affective stimuli</article-title>. <source>Psychol. Sci</source>. <volume>18</volume>, <fpage>421</fpage>&#x02013;<lpage>428</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2007.01916.x</pub-id><pub-id pub-id-type="pmid">17576282</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macrae</surname> <given-names>C. N.</given-names></name> <name><surname>Hood</surname> <given-names>B. M.</given-names></name> <name><surname>Milne</surname> <given-names>A. B.</given-names></name> <name><surname>Rowe</surname> <given-names>A. C.</given-names></name> <name><surname>Mason</surname> <given-names>M. F.</given-names></name></person-group> (<year>2002</year>). <article-title>Are you looking at me? Eye gaze and person perception</article-title>. <source>Psychol. Sci</source>. <volume>13</volume>, <fpage>460</fpage>&#x02013;<lpage>464</lpage>. <pub-id pub-id-type="pmid">12219814</pub-id></citation>
</ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mason</surname> <given-names>M. F.</given-names></name> <name><surname>Tatkow</surname> <given-names>E. P.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2005</year>). <article-title>The look of love: gaze shifts and person perception</article-title>. <source>Psychol. Sci</source>. <volume>16</volume>, <fpage>236</fpage>&#x02013;<lpage>239</lpage>. <pub-id pub-id-type="doi">10.1111/j.0956-7976.2005.00809.x</pub-id><pub-id pub-id-type="pmid">15733205</pub-id></citation>
</ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mischel</surname> <given-names>W.</given-names></name> <name><surname>Shoda</surname> <given-names>Y.</given-names></name></person-group> (<year>1995</year>). <article-title>A cognitive-affective system theory of personality: reconceptualizing situations, dispositions, dynamics, and invariance in personality structure</article-title>. <source>Psychol. Rev</source>. <volume>102</volume>, <fpage>246</fpage>&#x02013;<lpage>268</lpage>. <pub-id pub-id-type="pmid">7740090</pub-id></citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mitchell</surname> <given-names>J. P.</given-names></name></person-group> (<year>2009</year>). <article-title>Inferences about mental states</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci</source>. <volume>364</volume>, <fpage>1309</fpage>&#x02013;<lpage>1316</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2008.0318</pub-id><pub-id pub-id-type="pmid">19528012</pub-id></citation>
</ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mitchell</surname> <given-names>J. P.</given-names></name> <name><surname>Heatherton</surname> <given-names>T. F.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2002</year>). <article-title>Distinct neural systems subserve person and object knowledge</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A</source>. <volume>99</volume>, <fpage>15238</fpage>&#x02013;<lpage>15243</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.232395699</pub-id><pub-id pub-id-type="pmid">12417766</pub-id></citation>
</ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nichols</surname> <given-names>T.</given-names></name> <name><surname>Brett</surname> <given-names>M.</given-names></name> <name><surname>Andersson</surname> <given-names>J.</given-names></name> <name><surname>Wager</surname> <given-names>T.</given-names></name> <name><surname>Poline</surname> <given-names>J. B.</given-names></name></person-group> (<year>2005</year>). <article-title>Valid conjunction inference with the minimum statistic</article-title>. <source>Neuroimage</source> <volume>25</volume>, <fpage>653</fpage>&#x02013;<lpage>660</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.12.005</pub-id><pub-id pub-id-type="pmid">15808966</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ochsner</surname> <given-names>K. N.</given-names></name> <name><surname>Knierim</surname> <given-names>K.</given-names></name> <name><surname>Ludlow</surname> <given-names>D. H.</given-names></name> <name><surname>Hanelin</surname> <given-names>J.</given-names></name> <name><surname>Ramachandran</surname> <given-names>T.</given-names></name> <name><surname>Glover</surname> <given-names>G.</given-names></name> <name><surname>Mackey</surname> <given-names>S. C.</given-names></name></person-group> (<year>2004</year>). <article-title>Reflecting upon feelings: an fMRI study of neural systems supporting the attribution of emotion to self and other</article-title>. <source>J. Cogn. Neurosci</source>. <volume>16</volume>, <fpage>1746</fpage>&#x02013;<lpage>1772</lpage>. <pub-id pub-id-type="doi">10.1162/0898929042947829</pub-id><pub-id pub-id-type="pmid">15701226</pub-id></citation>
</ref>
<ref id="B31">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pessoa</surname> <given-names>L.</given-names></name></person-group> (<year>2005</year>). <article-title>To what extent are emotional visual stimuli processed without attention and awareness?</article-title> <source>Curr. Opin. Neurobiol</source>. <volume>15</volume>, <fpage>188</fpage>&#x02013;<lpage>196</lpage>. <pub-id pub-id-type="doi">10.1016/j.conb.2005.03.002</pub-id><pub-id pub-id-type="pmid">15831401</pub-id></citation>
</ref>
<ref id="B32">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pessoa</surname> <given-names>L.</given-names></name> <name><surname>McKenna</surname> <given-names>M.</given-names></name> <name><surname>Gutierrez</surname> <given-names>E.</given-names></name> <name><surname>Ungerleider</surname> <given-names>L. G.</given-names></name></person-group> (<year>2002</year>). <article-title>Neural processing of emotional faces requires attention</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A</source>. <volume>99</volume>, <fpage>11458</fpage>&#x02013;<lpage>11463</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.172403899</pub-id><pub-id pub-id-type="pmid">12177449</pub-id></citation>
</ref>
<ref id="B33">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ponnet</surname> <given-names>K.</given-names></name> <name><surname>Buysse</surname> <given-names>A.</given-names></name> <name><surname>Roeyers</surname> <given-names>H.</given-names></name> <name><surname>De Clercq</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Mind-reading in young adults with ASD: does structure matter?</article-title> <source>J. Autism Dev. Disord</source>. <volume>38</volume>, <fpage>905</fpage>&#x02013;<lpage>918</lpage>. <pub-id pub-id-type="doi">10.1007/s10803-007-0462-5</pub-id><pub-id pub-id-type="pmid">17929156</pub-id></citation>
</ref>
<ref id="B34">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rizzolatti</surname> <given-names>G.</given-names></name> <name><surname>Craighero</surname> <given-names>L.</given-names></name></person-group> (<year>2004</year>). <article-title>The mirror-neuron system</article-title>. <source>Annu. Rev. Neurosci</source>. <volume>27</volume>, <fpage>169</fpage>&#x02013;<lpage>192</lpage>. <pub-id pub-id-type="doi">10.1146/annurev.neuro.27.070203.144230</pub-id><pub-id pub-id-type="pmid">15217330</pub-id></citation>
</ref>
<ref id="B35">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roeyers</surname> <given-names>H.</given-names></name> <name><surname>Buysse</surname> <given-names>A.</given-names></name> <name><surname>Ponnet</surname> <given-names>K.</given-names></name> <name><surname>Pichal</surname> <given-names>B.</given-names></name></person-group> (<year>2001</year>). <article-title>Advancing advanced mind-reading tests: empathic accuracy in adults with a pervasive developmental disorder</article-title>. <source>J. Child Psychol. Psychiatry</source> <volume>42</volume>, <fpage>271</fpage>&#x02013;<lpage>278</lpage>. <pub-id pub-id-type="pmid">11280423</pub-id></citation>
</ref>
<ref id="B36">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Russell</surname> <given-names>J. A.</given-names></name> <name><surname>Bachorowski</surname> <given-names>J. A.</given-names></name> <name><surname>Fernandez-Dols</surname> <given-names>J. M.</given-names></name></person-group> (<year>2003</year>). <article-title>Facial and vocal expressions of emotion</article-title>. <source>Annu. Rev. Psychol</source>. <volume>54</volume>, <fpage>329</fpage>&#x02013;<lpage>349</lpage>. <pub-id pub-id-type="doi">10.1146/annurev.psych.54.101601.145102</pub-id><pub-id pub-id-type="pmid">12415074</pub-id></citation>
</ref>
<ref id="B37">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saxe</surname> <given-names>R.</given-names></name> <name><surname>Powell</surname> <given-names>L. J.</given-names></name></person-group> (<year>2006</year>). <article-title>It&#x00027;s the thought that counts: specific brain regions for one component of theory of mind</article-title>. <source>Psychol. Sci</source>. <volume>17</volume>, <fpage>692</fpage>&#x02013;<lpage>699</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01768.x</pub-id><pub-id pub-id-type="pmid">16913952</pub-id></citation>
</ref>
<ref id="B38">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Eickhoff</surname> <given-names>S. B.</given-names></name> <name><surname>Cieslik</surname> <given-names>E. C.</given-names></name> <name><surname>Kuzmanovic</surname> <given-names>B.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name></person-group> (<year>2011</year>). <article-title>Shall we do this together? Social gaze influences action control in a comparison group, but not in individuals with high-functioning autism</article-title>. <source>Autism</source> <fpage>1</fpage>&#x02013;<lpage>15</lpage> <pub-id pub-id-type="doi">10.1177/1362361311409258</pub-id><pub-id pub-id-type="pmid">21810910</pub-id></citation>
</ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Timmermans</surname> <given-names>B.</given-names></name> <name><surname>Reddy</surname> <given-names>V.</given-names></name> <name><surname>Costall</surname> <given-names>A.</given-names></name> <name><surname>Bente</surname> <given-names>G.</given-names></name> <name><surname>Schlicht</surname> <given-names>T.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>Toward a second-person neuroscience</article-title>. <source>Behav. Brain Res</source>. (in press).</citation>
</ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Wohlschlaeger</surname> <given-names>A. M.</given-names></name> <name><surname>Kraemer</surname> <given-names>N. C.</given-names></name> <name><surname>Newen</surname> <given-names>A.</given-names></name> <name><surname>Shah</surname> <given-names>N. J.</given-names></name> <name><surname>Fink</surname> <given-names>G. R.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name></person-group> (<year>2006</year>). <article-title>Being with virtual others: neural correlates of social interaction</article-title>. <source>Neuropsychologia</source> <volume>44</volume>, <fpage>718</fpage>&#x02013;<lpage>730</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2005.07.017</pub-id><pub-id pub-id-type="pmid">16171833</pub-id></citation>
</ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schippers</surname> <given-names>M. B.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Mapping the flow of information within the putative mirror neuron system during gesture observation</article-title>. <source>Neuroimage</source> <volume>57</volume>, <fpage>37</fpage>&#x02013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2011.02.018</pub-id><pub-id pub-id-type="pmid">21316466</pub-id></citation>
</ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schultz</surname> <given-names>R. T.</given-names></name> <name><surname>Gauthier</surname> <given-names>I.</given-names></name> <name><surname>Klin</surname> <given-names>A.</given-names></name> <name><surname>Fulbright</surname> <given-names>R. K.</given-names></name> <name><surname>Anderson</surname> <given-names>A. W.</given-names></name> <name><surname>Volkmar</surname> <given-names>F.</given-names></name> <name><surname>Skudlarski</surname> <given-names>P.</given-names></name> <name><surname>Lacadie</surname> <given-names>C.</given-names></name> <name><surname>Cohen</surname> <given-names>D. J.</given-names></name> <name><surname>Gore</surname> <given-names>J. C.</given-names></name></person-group> (<year>2000</year>). <article-title>Abnormal ventral temporal cortical activity during face discrimination among individuals with autism and Asperger syndrome</article-title>. <source>Arch. Gen. Psychiatry</source> <volume>57</volume>, <fpage>331</fpage>&#x02013;<lpage>340</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.57.4.331</pub-id><pub-id pub-id-type="pmid">10768694</pub-id></citation>
</ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schultz</surname> <given-names>R. T.</given-names></name> <name><surname>Grelotti</surname> <given-names>D. J.</given-names></name> <name><surname>Klin</surname> <given-names>A.</given-names></name> <name><surname>Kleinman</surname> <given-names>J.</given-names></name> <name><surname>Van der Gaag</surname> <given-names>C.</given-names></name> <name><surname>Marois</surname> <given-names>R.</given-names></name> <name><surname>Skudlarski</surname> <given-names>P.</given-names></name></person-group> (<year>2003</year>). <article-title>The role of the fusiform face area in social cognition: implications for the pathobiology of autism</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci</source>. <volume>358</volume>, <fpage>415</fpage>&#x02013;<lpage>427</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2002.1208</pub-id><pub-id pub-id-type="pmid">12639338</pub-id></citation>
</ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Slotnick</surname> <given-names>S. D.</given-names></name> <name><surname>Moo</surname> <given-names>L. R.</given-names></name> <name><surname>Segal</surname> <given-names>J. B.</given-names></name> <name><surname>Hart</surname> <given-names>J.</given-names> <suffix>Jr.</suffix></name></person-group> (<year>2003</year>). <article-title>Distinct prefrontal cortex activity associated with item memory and source memory for visual shapes</article-title>. <source>Brain Res. Cogn. Brain Res</source>. <volume>17</volume>, <fpage>75</fpage>&#x02013;<lpage>82</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(03)00082-X</pub-id><pub-id pub-id-type="pmid">12763194</pub-id></citation>
</ref>
<ref id="B44a">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Small</surname> <given-names>D. M.</given-names></name> <name><surname>Gregory</surname> <given-names>M. D.</given-names></name> <name><surname>Mak</surname> <given-names>Y. E.</given-names></name> <name><surname>Gitelman</surname> <given-names>D.</given-names></name> <name><surname>Mesulam</surname> <given-names>M. M.</given-names></name> <name><surname>Parrish</surname> <given-names>T.</given-names></name></person-group> (<year>2003</year>). <article-title>Dissociation of neural representation of intensity and affective valuation in human gustation</article-title>. <source>Neuron</source> <volume>39</volume>, <fpage>701</fpage>&#x02013;<lpage>711</lpage>. <pub-id pub-id-type="doi">10.1016/S0896-6273(03)00467-7</pub-id><pub-id pub-id-type="pmid">12925283</pub-id></citation>
</ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Snodgrass</surname> <given-names>S. E.</given-names></name> <name><surname>Hecht</surname> <given-names>M. A.</given-names></name> <name><surname>Ploutz-Snyder</surname> <given-names>R.</given-names></name></person-group> (<year>1998</year>). <article-title>Interpersonal sensitivity: expressivity or perceptivity?</article-title> <source>J. Pers. Soc. Psychol</source>. <volume>74</volume>, <fpage>238</fpage>&#x02013;<lpage>249</lpage>. <pub-id pub-id-type="pmid">9457785</pub-id></citation>
</ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spreng</surname> <given-names>R. N.</given-names></name> <name><surname>Mar</surname> <given-names>R. A.</given-names></name> <name><surname>Kim</surname> <given-names>A. S.</given-names></name></person-group> (<year>2009</year>). <article-title>The common neural basis of autobiographical memory, prospection, navigation, theory of mind, and the default mode: a quantitative meta-analysis</article-title>. <source>J. Cogn. Neurosci</source>. <volume>21</volume>, <fpage>489</fpage>&#x02013;<lpage>510</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.2008.21029</pub-id><pub-id pub-id-type="pmid">18510452</pub-id></citation>
</ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spunt</surname> <given-names>R. P.</given-names></name> <name><surname>Falk</surname> <given-names>E. B.</given-names></name> <name><surname>Lieberman</surname> <given-names>M. D.</given-names></name></person-group> (<year>2010</year>). <article-title>Dissociable neural systems support retrieval of how and why action knowledge</article-title>. <source>Psychol. Sci</source>. <volume>21</volume>, <fpage>1593</fpage>&#x02013;<lpage>1598</lpage>. <pub-id pub-id-type="doi">10.1177/0956797610386618</pub-id><pub-id pub-id-type="pmid">20959510</pub-id></citation>
</ref>
<ref id="B50">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spunt</surname> <given-names>R. P.</given-names></name> <name><surname>Lieberman</surname> <given-names>M. D.</given-names></name></person-group> (<year>in press</year>). <article-title>The busy social brain: an fMRI study of cognitive load during action observation</article-title>. <source>Psychol. Sci</source>.</citation>
</ref>
<ref id="B49">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spunt</surname> <given-names>R. P.</given-names></name> <name><surname>Lieberman</surname> <given-names>M. D.</given-names></name></person-group> (<year>2012</year>). <article-title>An integrative model of the neural systems supporting the comprehension of observed emotional behavior</article-title>. <source>Neuroimage</source> <volume>59</volume>, <fpage>3050</fpage>&#x02013;<lpage>3059</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2011.10.005</pub-id><pub-id pub-id-type="pmid">22019857</pub-id></citation>
</ref>
<ref id="B51">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van Overwalle</surname> <given-names>F.</given-names></name></person-group> (<year>2009</year>). <article-title>Social cognition and the brain: a meta-analysis</article-title>. <source>Hum. Brain Mapp</source>. <volume>30</volume>, <fpage>829</fpage>&#x02013;<lpage>858</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20547</pub-id><pub-id pub-id-type="pmid">18381770</pub-id></citation>
</ref>
<ref id="B52">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van Overwalle</surname> <given-names>F.</given-names></name> <name><surname>Baetens</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Understanding others&#x00027; actions and goals by mirror and mentalizing systems: a meta-analysis</article-title>. <source>Neuroimage</source> <volume>48</volume>, <fpage>564</fpage>&#x02013;<lpage>584</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.06.009</pub-id><pub-id pub-id-type="pmid">19524046</pub-id></citation>
</ref>
<ref id="B53">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Armony</surname> <given-names>J. L.</given-names></name> <name><surname>Driver</surname> <given-names>J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2001</year>). <article-title>Effects of attention and emotion on face processing in the human brain: an event-related fMRI study</article-title>. <source>Neuron</source> <volume>30</volume>, <fpage>829</fpage>&#x02013;<lpage>841</lpage>. <pub-id pub-id-type="pmid">11430815</pub-id></citation>
</ref>
<ref id="B54">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>A. T.</given-names></name> <name><surname>Lee</surname> <given-names>S. S.</given-names></name> <name><surname>Sigman</surname> <given-names>M.</given-names></name> <name><surname>Dapretto</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Reading affect in the face and voice: neural correlates of interpreting communicative intent in children and adolescents with autism spectrum disorders</article-title>. <source>Arch. Gen. Psychiatry</source> <volume>64</volume>, <fpage>698</fpage>&#x02013;<lpage>708</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.64.6.698</pub-id><pub-id pub-id-type="pmid">17548751</pub-id></citation>
</ref>
<ref id="B55">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Whalen</surname> <given-names>P. J.</given-names></name> <name><surname>Rauch</surname> <given-names>S. L.</given-names></name> <name><surname>Etcoff</surname> <given-names>N. L.</given-names></name> <name><surname>McInerney</surname> <given-names>S. C.</given-names></name> <name><surname>Lee</surname> <given-names>M. B.</given-names></name> <name><surname>Jenike</surname> <given-names>M. A.</given-names></name></person-group> (<year>1998</year>). <article-title>Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge</article-title>. <source>J. Neurosci</source>. <volume>18</volume>, <fpage>411</fpage>&#x02013;<lpage>418</lpage>. <pub-id pub-id-type="pmid">9412517</pub-id></citation>
</ref>
<ref id="B56">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wheatley</surname> <given-names>T.</given-names></name> <name><surname>Milleville</surname> <given-names>S. C.</given-names></name> <name><surname>Martin</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Understanding animate agents: distinct roles for the social network and mirror system</article-title>. <source>Psychol. Sci</source>. <volume>18</volume>, <fpage>469</fpage>&#x02013;<lpage>474</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2007.01923.x</pub-id><pub-id pub-id-type="pmid">17576256</pub-id></citation>
</ref>
<ref id="B57">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wilms</surname> <given-names>M.</given-names></name> <name><surname>Schilbach</surname> <given-names>L.</given-names></name> <name><surname>Pfeiffer</surname> <given-names>U.</given-names></name> <name><surname>Bente</surname> <given-names>G.</given-names></name> <name><surname>Fink</surname> <given-names>G. R.</given-names></name> <name><surname>Vogeley</surname> <given-names>K.</given-names></name></person-group> (<year>2010</year>). <article-title>It&#x00027;s in your eyes: using gaze-contingent stimuli to create truly interactive paradigms for social cognitive and affective neuroscience</article-title>. <source>Soc. Cogn. Affect. Neurosci</source>. <volume>5</volume>, <fpage>98</fpage>. <pub-id pub-id-type="doi">10.1093/scan/nsq024</pub-id><pub-id pub-id-type="pmid">20223797</pub-id></citation>
</ref>
<ref id="B58">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winston</surname> <given-names>J. S.</given-names></name> <name><surname>O&#x00027;Doherty</surname> <given-names>J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Common and distinct neural responses during direct and incidental processing of multiple facial emotions</article-title>. <source>Neuroimage</source> <volume>20</volume>, <fpage>84</fpage>&#x02013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1016/S1053-8119(03)00303-3</pub-id><pub-id pub-id-type="pmid">14527572</pub-id></citation>
</ref>
<ref id="B59">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Bolger</surname> <given-names>N.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2008</year>). <article-title>It takes two: the interpersonal nature of empathic accuracy</article-title>. <source>Psychol. Sci</source>. <volume>19</volume>, <fpage>399</fpage>&#x02013;<lpage>404</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2008.02099.x</pub-id><pub-id pub-id-type="pmid">18399894</pub-id></citation>
</ref>
<ref id="B60">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Bolger</surname> <given-names>N.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Unpacking the informational bases of empathic accuracy</article-title>. <source>Emotion</source> <volume>9</volume>, <fpage>478</fpage>&#x02013;<lpage>487</lpage>. <pub-id pub-id-type="doi">10.1037/a0016551</pub-id><pub-id pub-id-type="pmid">19653768</pub-id></citation>
</ref>
<ref id="B60a">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Davis</surname> <given-names>J.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>Overlapping activity in anterior insula during interoception and emotional experience</article-title>. <source>Neuroimage</source> <volume>62</volume>, <fpage>493</fpage>&#x02013;<lpage>499</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2012.05.012</pub-id><pub-id pub-id-type="pmid">22587900</pub-id></citation>
</ref>
<ref id="B61">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>The need for a cognitive neuroscience of naturalistic social cognition</article-title>. <source>Ann. N.Y. Acad. Sci</source>. <volume>1167</volume>, <fpage>16</fpage>&#x02013;<lpage>30</lpage>. <pub-id pub-id-type="doi">10.1111/j.1749-6632.2009.04601.x</pub-id><pub-id pub-id-type="pmid">19580548</pub-id></citation>
</ref>
<ref id="B62">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2011</year>). <article-title>Reintegrating accuracy into the study of social cognition</article-title>. <source>Psychol. Inq</source>. <volume>22</volume>, <fpage>159</fpage>&#x02013;<lpage>182</lpage>.</citation>
</ref>
<ref id="B63">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>The neuroscience of empathy: progress, pitfalls, and promise</article-title>. <source>Nat. Neurosci</source>. <volume>15</volume>, <fpage>675</fpage>&#x02013;<lpage>680</lpage>.</citation>
</ref>
<ref id="B64">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zaki</surname> <given-names>J.</given-names></name> <name><surname>Weber</surname> <given-names>J.</given-names></name> <name><surname>Bolger</surname> <given-names>N.</given-names></name> <name><surname>Ochsner</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>The neural bases of empathic accuracy</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A</source>. <volume>106</volume>, <fpage>11382</fpage>&#x02013;<lpage>11387</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.0902666106</pub-id><pub-id pub-id-type="pmid">19549849</pub-id></citation>
</ref>
<ref id="B65">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zayas</surname> <given-names>V.</given-names></name> <name><surname>Shoda</surname> <given-names>Y.</given-names></name> <name><surname>Ayduk</surname> <given-names>O. N.</given-names></name></person-group> (<year>2002</year>). <article-title>Personality in context: an interpersonal systems perspective</article-title>. <source>J. Pers</source>. <volume>70</volume>, <fpage>851</fpage>&#x02013;<lpage>900</lpage>. <pub-id pub-id-type="doi">10.1111/1467-6494.05026</pub-id><pub-id pub-id-type="pmid">12498358</pub-id></citation>
</ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup>Eye-gaze and gaze direction are, at some level, social cues (Macrae et al., <xref ref-type="bibr" rid="B24">2002</xref>; Mason et al., <xref ref-type="bibr" rid="B25">2005</xref>) that, in this case, might pertain to the emotions expressed by the individuals in the video, and attending to eye-gaze can engage some of the neural structures commonly associated with social inference (Calder et al., <xref ref-type="bibr" rid="B5">2002</xref>). As such, comparing emotion rating with eye-gaze rating provided an especially conservative contrast that focused specifically on explicit attention to emotion, as opposed to incidental processing of social information (see &#x0201C;Discussion&#x0201D;) or attentional and motoric demands.</p></fn>
</fn-group>
</back>
</article>