<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2019.00034</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Mental Simulation of Facial Expressions: Mu Suppression to the Viewing of Dynamic Neutral Face Videos</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Karakale</surname> <given-names>Ozge</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/636345/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Moore</surname> <given-names>Matthew R.</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/659374/overview"/>
</contrib> 
<contrib contrib-type="author">
<name><surname>Kirk</surname> <given-names>Ian J.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/68740/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>School of Psychology, The University of Auckland</institution>, <addr-line>Auckland</addr-line>, <country>New Zealand</country></aff>
<aff id="aff2"><sup>2</sup><institution>School of Medicine, The University of Auckland</institution>, <addr-line>Auckland</addr-line>, <country>New Zealand</country></aff>
<aff id="aff3"><sup>3</sup><institution>Brain Research New Zealand</institution>, <addr-line>Auckland</addr-line>, <country>New Zealand</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Kaat Alaerts, KU Leuven, Belgium</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Hidetoshi Takahashi, National Center of Neurology and Psychiatry, Japan; Maria Serena Panasiti, Sapienza University of Rome, Italy</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Ozge Karakale <email>okar164&#x00040;aucklanduni.ac.nz</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>08</day>
<month>02</month>
<year>2019</year>
</pub-date>
<pub-date pub-type="collection">
<year>2019</year>
</pub-date>
<volume>13</volume>
<elocation-id>34</elocation-id>
<history>
<date date-type="received">
<day>05</day>
<month>11</month>
<year>2018</year>
</date>
<date date-type="accepted">
<day>22</day>
<month>01</month>
<year>2019</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2019 Karakale, Moore and Kirk.</copyright-statement>
<copyright-year>2019</copyright-year>
<copyright-holder>Karakale, Moore and Kirk</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract><p>The mirror neuron network (MNN) has been proposed as a neural substrate of action understanding. Electroencephalography (EEG) mu suppression has commonly been studied as an index of MNN activity during execution and observation of hand and finger movements. However, in order to establish its role in higher order processes, such as recognizing and sharing emotions, more research using social emotional stimuli is needed. The current study aims to contribute to our understanding of the sensitivity of mu suppression to facial expressions. Modulation of the mu and occipital alpha (8&#x02013;13 Hz) rhythms was calculated in 22 participants while they observed dynamic video stimuli, including emotional (happy and sad) and neutral (mouth opening) facial expressions, and a non-biological stimulus (kaleidoscope pattern). Across the four types of stimuli, only the neutral face was associated with significantly stronger mu suppression than the non-biological stimulus. Occipital alpha suppression was significantly greater in the non-biological stimulus condition than in all the face conditions. Source estimation analysis using standardized low-resolution electromagnetic tomography (sLORETA), comparing the neural sources of mu/alpha modulation between the neutral face and non-biological stimulus conditions, showed more suppression in the central regions, including the supplementary motor and somatosensory areas, than in the more posterior regions. The EEG and source estimation results may indicate that the reduced availability of emotional information in the neutral face condition requires more sensorimotor engagement in deciphering emotion-related information than the full-blown happy or sad expressions, which are more readily recognized.</p></abstract>
<kwd-group>
<kwd>EEG</kwd>
<kwd>mirror neuron</kwd>
<kwd>mu rhythm</kwd>
<kwd>face emotion</kwd>
<kwd>source estimation</kwd>
<kwd>sLORETA</kwd>
</kwd-group>
<counts>
<fig-count count="3"/>
<table-count count="0"/>
<equation-count count="3"/>
<ref-count count="64"/>
<page-count count="9"/>
<word-count count="7493"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction" id="s1">
<title>Introduction</title>
<p>Nonverbal communication is a crucial component of human social behavior, but its neural mechanisms are poorly understood. The ability to understand others&#x02019; mental states from their facial and bodily gestures allows us to respond effectively during social communication. Gallese and Goldman (<xref ref-type="bibr" rid="B21">1998</xref>) proposed a simulation theory of action understanding to account for the complexity of this process. Under this model, on observing an action, the observer subconsciously and automatically employs a specialized neural circuitry to simulate the action using their own motor system, in turn activating mental states associated with execution of the action, and providing insight into the mental state of the actor. The neural substrate of the simulation theory is proposed to be the <italic>mirror neuron</italic> (Gallese and Goldman, <xref ref-type="bibr" rid="B21">1998</xref>).</p>
<p>Mirror neurons were first discovered in the motor areas of the monkey brain (di Pellegrino et al., <xref ref-type="bibr" rid="B16">1992</xref>). They were observed to fire during both execution and observation of actions, such as grasping an object, putting it in the mouth or breaking it. Moreover, the sensory modality through which the action was experienced did not seem to matter for a subset of these neurons: they were triggered by the sound of an action even when the action was not seen (Kohler et al., <xref ref-type="bibr" rid="B35">2002</xref>). The implication of these findings was that mirror neurons could be coding representations of actions, supporting both recognition of the movements involved in an action and inference of the intention behind it. Evidence for a similar mirroring mechanism in the human brain has come from functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) studies (Caspers et al., <xref ref-type="bibr" rid="B9">2010</xref>; Molenberghs et al., <xref ref-type="bibr" rid="B39">2012</xref>) as well as single neuron recordings during surgery in humans (Mukamel et al., <xref ref-type="bibr" rid="B42">2010</xref>).</p>
<p>In addition to metabolic brain imaging and <italic>in vivo</italic> cellular studies, electroencephalography (EEG) studies have measured mu rhythm desynchronization to infer mirroring activity. The mu rhythm, characterized by 8&#x02013;13 Hz oscillations detected over the sensorimotor area, is mostly associated with the functions of the sensorimotor cortex (Niedermeyer, <xref ref-type="bibr" rid="B45">2005</xref>). Mu power increases during physical inactivity and rest, and is suppressed by both execution and observation of movement (Hari and Salmelin, <xref ref-type="bibr" rid="B25">1997</xref>; Cochin et al., <xref ref-type="bibr" rid="B11">1998</xref>, <xref ref-type="bibr" rid="B12">1999</xref>; Fecteau et al., <xref ref-type="bibr" rid="B17">2004</xref>; Muthukumaraswamy et al., <xref ref-type="bibr" rid="B43">2004</xref>; Lepage and Th&#x000E9;oret, <xref ref-type="bibr" rid="B37">2006</xref>). Because of this responsiveness to action observation, the mu rhythm has been proposed to reflect mirror neuron activity related to viewing biological action with or without object interaction, including finger movements (Babiloni et al., <xref ref-type="bibr" rid="B4">1999</xref>; Cochin et al., <xref ref-type="bibr" rid="B12">1999</xref>) and hand grip movements (Muthukumaraswamy et al., <xref ref-type="bibr" rid="B43">2004</xref>), as well as hearing sounds that are linked to actions, such as piano melodies (Wu et al., <xref ref-type="bibr" rid="B63">2016</xref>).</p>
<p>Since the initial discovery of mirror neurons, research has focused on their potential role in social cognitive processes that rely on an ability to understand actions and intentions, such as empathy. Overlapping fMRI activity in brain regions during execution and observation of facial expressions has been suggested to provide evidence for a single mechanism of action representation that allows people to empathize with others (Carr et al., <xref ref-type="bibr" rid="B8">2003</xref>). To further our knowledge of the role of the mirror neuron network (MNN) in social emotional information processing, several groups have used EEG mu suppression as a proxy for MNN activity and examined the network&#x02019;s sensitivity to emotional information, presenting body parts in painful and non-painful situations and finding greater mu suppression in the painful compared to non-painful conditions (Yang et al., <xref ref-type="bibr" rid="B64">2009</xref>; Cheng et al., <xref ref-type="bibr" rid="B10">2014</xref>; Hoenen et al., <xref ref-type="bibr" rid="B28">2015</xref>). In contrast to these findings, which suggest a heightened sensitivity of the mu rhythm to emotional information, others found similar levels of mu modulation during gender discrimination and emotion recognition tasks that entailed viewing point-light displays of walking human figures (Perry et al., <xref ref-type="bibr" rid="B51">2010b</xref>). Facial expressions have been used as stimuli in EEG mu suppression research in only a handful of studies (Moore et al., <xref ref-type="bibr" rid="B40">2012</xref>; Cooper et al., <xref ref-type="bibr" rid="B14">2013</xref>; Rayson et al., <xref ref-type="bibr" rid="B55">2016</xref>, <xref ref-type="bibr" rid="B56">2017</xref>; Moore and Franz, <xref ref-type="bibr" rid="B41">2017</xref>). Further research using facial movements that depict varying levels of emotional information is necessary to investigate the differential sensitivity of the sensorimotor cortex to emotion-related information processing.</p>
<p>It is crucial to note that findings from some mu suppression studies indicate that mu can easily be confounded with occipital alpha activity: alpha suppression at the central electrodes can be similar while viewing biological and non-biological motion, and can even be more pronounced for biological than non-biological motion when the observed action depicts pain. As pointed out by Milston et al. (<xref ref-type="bibr" rid="B38">2013</xref>), most of the studies that have explored the relation between mu suppression and empathy have used stimuli depicting pain only (e.g., Yang et al., <xref ref-type="bibr" rid="B64">2009</xref>; Perry et al., <xref ref-type="bibr" rid="B49">2010a</xref>; Hoenen et al., <xref ref-type="bibr" rid="B28">2015</xref>). Researchers have highlighted that processes other than empathy, such as attention, may be at work while viewing painful stimuli due to their threatening nature (Hoenen et al., <xref ref-type="bibr" rid="B29">2013</xref>) or salience (Perry et al., <xref ref-type="bibr" rid="B49">2010a</xref>). Disentangling mu from alpha can be difficult even in tasks using non-emotional biological motion. For example, Aleksandrov and Tugin (<xref ref-type="bibr" rid="B2">2012</xref>) did not find any systematic differences in mu suppression during observation of hand movements, non-biological objects or mental counting. Similarly, Perry and Bentin (<xref ref-type="bibr" rid="B48">2010</xref>) observed that alpha suppression at the mu and occipital areas was very similar during observation of hand movements toward an object. A recent study by Hobson and Bishop (<xref ref-type="bibr" rid="B27">2016</xref>) showed that different types of baseline used to measure mu suppression engage the attention system differently, directly affecting the degree of suppression recorded. They found that mu and occipital alpha modulation while viewing hand movements and kaleidoscope movements was consistent with MNN activity only when a static video of the image, presented immediately before the dynamic video, was used as the baseline. Due to this posterior alpha confound associated with attentional processes, the baseline and control conditions need to be chosen carefully.</p>
<p>The current study aims to contribute to our understanding of the simulation account by investigating the responsiveness of the sensorimotor cortex to emotional and non-emotional facial expressions. Our goal is to examine the differential sensitivity of the mu rhythm to different types of facial movements. To our knowledge, this is the first study to examine the sensitivity of the mu rhythm, while controlling for occipital alpha activity, to dynamic neutral and emotional facial expressions not depicting pain. A within-trial baseline method was adopted as per Hobson and Bishop (<xref ref-type="bibr" rid="B27">2016</xref>): the 1,100 ms static image epoch was used as the baseline for quantifying activity in the subsequent 2,050 ms dynamic image epoch. It was hypothesized that mu suppression would be greater in the: (1) happy, sad and neutral face conditions than the non-biological stimulus condition; and (2) happy and sad face conditions than the neutral face condition, without a corresponding difference in occipital alpha suppression.</p>
</sec>
<sec sec-type="materials and methods" id="s2">
<title>Materials and Methods</title>
<sec id="s2-1">
<title>Participants</title>
<p>Twenty-five participants (16 female) between the ages of 19 and 36 (<italic>M</italic> = 26.5, <italic>SD</italic> = 6) were recruited through flyers placed around the University of Auckland campus. Each participant was compensated with a $20 supermarket voucher. Prior to data collection, a pre-screening questionnaire was emailed to the volunteers to identify whether they met the criteria for participation. Exclusion criteria included self-reported major head injury, psychiatric diagnosis, psychoactive medication use, or sensorimotor problems. This study was carried out in accordance with the recommendations of the University of Auckland Human Participants Ethics Committee with written informed consent from all participants. All participants gave written informed consent in accordance with the Declaration of Helsinki. The protocol was approved by The University of Auckland Human Participants Ethics Committee.</p>
</sec>
<sec id="s2-2">
<title>Stimuli and Design</title>
<p>EEG was recorded during a 30-min computer task, which entailed the viewing of four types of dynamic image videos: happy face, sad face, neutral face (i.e., mouth opening) and non-biological stimulus (i.e., kaleidoscope). There were four blocks of 40 trials (160 total). In each block, there were 10 happy face, 10 sad face, 10 neutral face and 10 non-biological stimulus videos, presented in random order. Each video was 6,000 ms long. Participants were free to rest between the blocks for as long as they wanted.</p>
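The block and trial structure described above (four blocks of 40 trials, with 10 trials per condition randomly ordered within each block) can be sketched as follows. This is an illustrative sketch only; the condition labels and function names are hypothetical and not taken from the authors' presentation software.

```python
import random

# Illustrative condition labels matching the four stimulus types described above.
CONDITIONS = ["happy", "sad", "neutral", "kaleidoscope"]

def build_blocks(n_blocks=4, per_condition=10, seed=None):
    """Build n_blocks trial lists, each containing per_condition trials of
    every condition, shuffled within the block (random order, as in the task)."""
    rng = random.Random(seed)
    blocks = []
    for _ in range(n_blocks):
        block = [c for c in CONDITIONS for _ in range(per_condition)]
        rng.shuffle(block)  # randomize presentation order within the block
        blocks.append(block)
    return blocks

# 4 blocks x 40 trials = 160 trials in total, as in the experiment.
blocks = build_blocks(seed=1)
```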
<p>Happy and sad face videos were taken from the Amsterdam Dynamic Facial Expression Set (ADFES; van der Schalk et al., <xref ref-type="bibr" rid="B60">2011</xref>). The ADFES is freely available for research from the Psychology Research Unit at the University of Amsterdam. Neutral faces were recorded by OK. Past research has validated mouth opening videos of actors as non-emotional (Rayson et al., <xref ref-type="bibr" rid="B55">2016</xref>). Videos used in the present study were matched to Rayson et al.&#x02019;s (<xref ref-type="bibr" rid="B55">2016</xref>) and the ADFES stimuli in duration, brightness, size, and contrast. Kaleidoscope images presented as the non-biological stimulus were those used in a previous study (Hobson and Bishop, <xref ref-type="bibr" rid="B27">2016</xref>). All stimuli were converted to grayscale.</p>
<p>Participants were instructed to minimize movement throughout the experiment, and blinking during trials. As <xref ref-type="fig" rid="F1">Figure 1</xref> illustrates, each trial started with a 1,000 ms fixation cross against a white background. After the fixation cross, the static image stimulus was presented for 2,000 ms, followed by a 2,000 ms dynamic image in which the expression changed, and ending with a 2,000 ms static image of the last frame of the video. Then, a two-alternative forced-choice response slide showing the correct label alongside one of the other three labels prompted the participant to categorize the stimulus as <italic>happy</italic>, <italic>sad</italic>, <italic>neutral</italic> or <italic>other</italic>. The response slide remained on the screen until the participant gave a response using the keyboard. The participant pressed &#x0201C;d&#x0201D; if the label on the left was correct and &#x0201C;k&#x0201D; if the label on the right was correct. In half of the trials, the correct label was on the right, and in half, on the left. Each trial ended with a 1,000 ms feedback slide. The feedback slide displayed the word &#x0201C;Correct&#x0201D; or &#x0201C;Incorrect&#x0201D; depending on the key press.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>An example of a trial showing the duration of each section in ms. Each condition video was presented for 6,000 ms of which the first 2,000 ms was static, the second 2,000 ms was dynamic, and the last 2,000 ms was static.</p></caption>
<graphic xlink:href="fnhum-13-00034-g0001.tif"/>
</fig>
<p>Accuracy and reaction time were not analyzed. The feedback slide was only used to gauge attention. The highest number of incorrect answers observed for a participant was six (i.e., 4.5%), indicating sustained attention to the stimuli for all participants.</p>
</sec>
<sec id="s2-3">
<title>EEG Data Recording</title>
<p>EEG recording was conducted in an electrically shielded room (IAC Noise Lock Acoustic&#x02014;Model 1375, Hampshire, United Kingdom) using 128-channel Ag/AgCl electrode nets (Tucker, <xref ref-type="bibr" rid="B58">1993</xref>) from Electrical Geodesics Inc. (Eugene, OR, USA). EEG was recorded continuously (1,000 Hz sample rate) with Electrical Geodesics Inc. amplifiers (300-M&#x003A9; input impedance). Electrode impedances were kept below 40 k&#x003A9;, an acceptable level for this system (Ferree et al., <xref ref-type="bibr" rid="B19">2001</xref>). Common vertex (Cz) was used as a reference. Electrolytic gel was applied before the recording started. Each session consisted of two continuous recordings. After the first two blocks, recording was paused and electrolytic gel was re-applied to ensure the impedance was kept low.</p>
</sec>
<sec id="s2-4">
<title>EEG Data Preprocessing and Analysis</title>
<p>EEG processing was performed using EEGLAB, an open-source MATLAB toolbox (Delorme and Makeig, <xref ref-type="bibr" rid="B15">2004</xref>). For each participant, the continuous data were downsampled to 250 Hz and then high-pass filtered at 0.1 Hz. The recordings were segmented into 6,000 ms epochs, one per stimulus video and time-locked to stimulus onset (time zero), thereby discarding between-session data. Line noise occurring at the harmonics of 50 Hz was removed. Bad channels were identified using the EEGLAB pop_rejchan function (absolute threshold or activity probability limit of 5 SD, based on kurtosis) and interpolated. Data were re-referenced to the average of all electrodes. Infomax ICA was run on each of the preprocessed datasets with EEGLAB default settings. Eye movement and large muscle artifact components were visually identified and rejected for each participant. The EEG recordings of three participants were identified as very noisy during the cleaning stage and excluded from further processing.</p>
<p>For each condition, from the 6,000 ms image video, early epochs of 800&#x02013;1,900 ms corresponding to the static image and late epochs of 1,950&#x02013;4,000 ms corresponding to the dynamic image were extracted. The analysis was conducted for the mu/alpha band of 8&#x02013;13 Hz over two central clusters of electrodes, six located around C3 on the left hemisphere (i.e., electrodes 30, 31, 37, 41, 42) and six around C4 on the right hemisphere (i.e., electrodes 80, 87, 93, 103, 105), and over three occipital electrodes (O1, Oz and O2). For each of the 15 electrodes, a Fast Fourier Transform (FFT) was used to calculate the power spectral density (PSD) in each trial, separately for the early and late epochs. For each trial, mu/alpha suppression at each of the 15 electrodes was calculated by taking the ratio of the late epoch PSD to the early epoch PSD. Ratio values rather than difference values were used as the measure of suppression to control for between-individual variability in mu/alpha power arising from differences in scalp thickness and electrode impedance (Cohen, <xref ref-type="bibr" rid="B13">2014</xref>). Across the central and occipital electrode clusters separately, if a trial had a PSD ratio value greater than three scaled median absolute deviations from the median PSD ratio value of the cluster, that trial was excluded as an outlier. For each of the four conditions, the average PSD ratio of the 12 central electrodes was calculated to obtain a single mu value, and of the three occipital electrodes to obtain a single alpha value, resulting in eight power scores (i.e., suppression for the happy, sad and neutral face and non-biological stimulus images at the central and occipital areas) for each participant.</p>
<p>Since ratio data are non-normal, a log transform was used for statistical analysis. A log ratio value of less than zero indicates suppression, zero indicates no change, and greater than zero indicates facilitation.</p>
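The suppression measure described above (band power from a PSD estimate, a late/early ratio, scaled-MAD outlier screening, and a log transform) can be sketched roughly as follows. This is a minimal sketch on synthetic single-channel epochs; `scipy.signal.welch` is used here as a stand-in for the authors' FFT-based PSD estimate, and all parameter choices other than the 250 Hz sample rate and the 8&#x02013;13 Hz band are illustrative.

```python
import numpy as np
from scipy.signal import welch  # stand-in for the FFT-based PSD used in the paper

FS = 250  # Hz, the post-downsampling sample rate

def band_power(epoch, fs=FS, band=(8, 13)):
    """Mean PSD in the mu/alpha band for one epoch of one electrode."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def log_suppression(static_epoch, dynamic_epoch):
    """Log of the late/early PSD ratio:
    < 0 indicates suppression, 0 no change, > 0 facilitation."""
    return np.log(band_power(dynamic_epoch) / band_power(static_epoch))

def mad_outliers(ratios, n_mads=3.0):
    """Flag trials whose PSD ratio lies more than n_mads scaled median
    absolute deviations from the median ratio of the cluster."""
    ratios = np.asarray(ratios, dtype=float)
    med = np.median(ratios)
    mad = 1.4826 * np.median(np.abs(ratios - med))  # scaled MAD
    return np.abs(ratios - med) > n_mads * mad
```

A dynamic epoch with weaker 8&#x02013;13 Hz power than its preceding static baseline yields a negative log ratio, i.e., suppression.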
</sec>
<sec id="s2-5">
<title>Source Estimation</title>
<p>The 128-channel EEG data were analyzed using the standardized low-resolution electromagnetic tomography (sLORETA) source localization method (Pascual-Marqui, <xref ref-type="bibr" rid="B47">2002</xref>; free academic software available at <ext-link ext-link-type="uri" xlink:href="http://www.uzh.ch/keyinst/loreta.htm">http://www.uzh.ch/keyinst/loreta.htm</ext-link>). sLORETA is an inverse solution that produces images of standardized current density at each of 6,239 cortical voxels (spatial resolution 5 mm) in Montreal Neurological Institute (MNI) space (Pascual-Marqui, <xref ref-type="bibr" rid="B47">2002</xref>). sLORETA images of mu/alpha band (8&#x02013;13 Hz) activity during the late epochs of the neutral face and non-biological stimulus conditions were computed for each participant, and then the group averages for the two conditions were extracted. Mu/alpha band power associated with the late epochs of the neutral face and non-biological stimulus conditions was compared. A whole-brain analysis was conducted to provide evidence that the reduced mu/alpha band power during the late epoch in the neutral face condition, compared to the non-biological stimulus condition, was localized to the central rather than posterior regions, indicating stimulus-related differences in sensory and motor activity rather than cortex-wide activity tapping attention. Voxel-wise <italic>t</italic>-tests were performed on the frequency band-wise normalized and log-transformed sLORETA images. For all <italic>t</italic>-tests, the variance of each image was smoothed by combining half the variance at each voxel with half the mean variance for the image. Correction for multiple testing was applied using statistical nonparametric mapping (SnPM) with 5,000 permutations.</p>
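The SnPM-style correction for multiple testing described above can be illustrated with a generic max-statistic sign-flip permutation test over voxels. This is a minimal sketch of the general technique, not the sLORETA implementation; the array names and test parameters are hypothetical.

```python
import numpy as np

def max_stat_permutation(diffs, n_perm=5000, seed=0):
    """One-sample max-statistic permutation test across voxels.

    diffs: (n_subjects, n_voxels) array of paired differences per voxel
    (e.g., neutral face minus non-biological stimulus log power).
    Under the null hypothesis the sign of each subject's difference is
    exchangeable, so we randomly flip signs and record the maximum |t|
    across voxels to build a null distribution that controls the
    family-wise error rate over all voxels.
    """
    rng = np.random.default_rng(seed)
    n_subj, n_vox = diffs.shape

    def tvals(x):
        # one-sample t statistic per voxel
        return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n_subj))

    t_obs = tvals(diffs)
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        flips = rng.choice([-1.0, 1.0], size=(n_subj, 1))  # sign-flip per subject
        max_null[i] = np.abs(tvals(diffs * flips)).max()
    # corrected p per voxel: fraction of permutations whose max |t|
    # meets or exceeds the observed |t|
    p_corr = (max_null[None, :] >= np.abs(t_obs)[:, None]).mean(1)
    return t_obs, p_corr
```

Because the null distribution is built from the maximum statistic over all voxels, any voxel surviving the corrected threshold is significant family-wise, which is the logic behind the 5,000-permutation SnPM correction.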
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec id="s3-1">
<title>EEG Results</title>
<p>All statistical analyses were performed using <italic>RStudio</italic> (RStudio Team, <xref ref-type="bibr" rid="B52">2016</xref>).</p>
<p>Data from 22 participants were included in the analysis. As explained in the &#x0201C;Materials and Methods&#x0201D; section, there were four conditions (i.e., happy face, sad face, neutral face, non-biological stimulus) and two brain regions (i.e., central, occipital) of interest. Before hypothesis testing, a <italic>t</italic>-test for each condition at each brain region was conducted to ensure that the PSD was significantly reduced during the late compared to the early epoch (all <italic>p</italic>-values &#x0003C; 0.001). Upon confirming suppression in each condition at both brain regions, a 4 &#x000D7; 2 repeated measures ANOVA was conducted. Mauchly&#x02019;s test indicated that the assumption of sphericity was met for the condition variable. There was a non-significant main effect of condition (<italic>F</italic><sub>(3,63)</sub> = 2.74, <italic>p</italic> = 0.051, <inline-formula><mml:math id="M1"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.115), and a non-significant main effect of region (<italic>F</italic><sub>(1,21)</sub> = 1.520, <italic>p</italic> = 0.231, <inline-formula><mml:math id="M2"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.067). However, interpretation of these main effects is qualified by the significant interaction between condition and region (<italic>F</italic><sub>(3,63)</sub> = 10.734, <italic>p</italic> &#x0003C; 0.001, <inline-formula><mml:math id="M3"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.338). The interaction effect was investigated further with two sets of pairwise comparisons across conditions at each brain region. 
Benjamini and Hochberg&#x02019;s (<xref ref-type="bibr" rid="B5">1995</xref>) false discovery rate (FDR) correction was applied to correct for multiple comparisons between suppression values across the conditions at the central (<italic>p</italic> &#x0003C; 0.05, FDR corrected) and occipital regions (<italic>p</italic> &#x0003C; 0.05, FDR corrected). At the central region, only the neutral face condition showed significantly greater suppression than the non-biological stimulus (<italic>p</italic> &#x0003C; 0.05). Neutral face also showed greater central suppression than the sad face condition (<italic>p</italic> &#x0003C; 0.05). At the occipital region, suppression was significantly greater in the non-biological stimulus condition than all the other conditions (all <italic>p</italic>-values &#x0003C; 0.05). The distribution of the data points can be seen in <xref ref-type="fig" rid="F2">Figure 2</xref>. Three participants had at least one ratio score greater than 1.5 times the interquartile range below the 25th or above the 75th quartile. Removing them did not change the pattern of results, so the analyses are reported including these outliers.</p>
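The FDR correction applied to these pairwise comparisons is Benjamini and Hochberg's step-up procedure, which can be sketched generically as follows; the p-values in the usage example are illustrative, not the study's.

```python
import numpy as np

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean mask of hypotheses rejected while controlling the
    false discovery rate at level alpha: find the largest k such that
    p_(k) <= (k/m) * alpha, and reject the k smallest p-values.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m   # step-up thresholds (k/m) * alpha
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])       # largest index meeting its threshold
        reject[order[:k + 1]] = True           # reject all smaller p-values too
    return reject

# Illustrative example: with these p-values, the three smallest survive at FDR 0.05.
rejected = fdr_bh([0.01, 0.013, 0.014, 0.19, 0.35, 0.5, 0.63, 0.67, 0.75, 0.81])
```

Note the step-up logic: a p-value above its own threshold can still be rejected if a larger p-value meets its threshold, which makes the procedure less conservative than Bonferroni correction.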
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Distribution of the individual mean log ratio scores of 22 participants (dots) in the happy, sad, neutral face and non-biological stimulus conditions at the central and occipital regions. Outliers ( &#x0003E;1.5&#x000D7; interquartile range) are represented by the red disks. Inside the boxplots, dots represent the means and horizontal lines represent the medians. The density plots around the data points represent the kernel probability density of the data at different values. CEN, central region; OCC, occipital region. Significant differences are marked by an asterisk [<italic>p</italic> &#x0003C; 0.05, false discovery rate (FDR) corrected].</p></caption>
<graphic xlink:href="fnhum-13-00034-g0002.tif"/>
</fig>
</sec>
<sec id="s3-2">
<title>Source Estimation Results</title>
<p>The neural sources of the difference in the mu/alpha band current density power between the neutral and non-biological stimulus conditions during the late epoch (neutral face minus non-biological stimulus) were analyzed using sLORETA with a one-tailed test (neutral face &#x0003C; non-biological stimulus). Exceedance proportion test output from sLORETA analysis was used to identify the voxels at which the difference in mu/alpha power between the two conditions was significant (<italic>p</italic> &#x0003C; 0.05). Based on the exceedance proportion test results which showed a threshold of &#x02212;3.599 for a <italic>p</italic>-value of 0.0524, differences in alpha power were localized to the fusiform gyrus (BA20) <italic>t</italic> = &#x02212;4.03 (<italic>X</italic> = &#x02212;55, <italic>Y</italic> = &#x02212;40, <italic>Z</italic> = &#x02212;30; MNI coordinates), primary somatosensory cortex (BA3) <italic>t</italic> = &#x02212;3.80 (<italic>X</italic> = &#x02212;40, <italic>Y</italic> = &#x02212;25, <italic>Z</italic> = 40), prefrontal cortex (BA9) <italic>t</italic> = &#x02212;5.14 (<italic>X</italic> = 10, <italic>Y</italic> = 45, <italic>Z</italic> = 35), and medial premotor cortex (supplementary motor area; BA6) <italic>t</italic> = &#x02212;3.99 (<italic>X</italic> = 10, <italic>Y</italic> = &#x02212;30, <italic>Z</italic> = 70; see <xref ref-type="fig" rid="F3">Figure 3</xref>). In the color scale, blue indicates less alpha power while red indicates the opposite.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>Current density power analysis in the mu/alpha band (8&#x02013;13 Hz), averaged across 22 participants, between the neutral face and non-biological stimulus conditions during the late epoch found significant voxels (<italic>p</italic> &#x0003C; 0.05) best matched to the supplementary motor area (top) and the primary somatosensory area (bottom). Horizontal (left), sagittal (middle), and coronal (right) sections through the voxel with the maximal <italic>t</italic>-statistic (local maximum) are displayed. Blue indicates less power in the alpha band in the neutral face than the non-biological stimulus condition.</p></caption>
<graphic xlink:href="fnhum-13-00034-g0003.tif"/>
</fig>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>The current study investigated the modulation of the mu rhythm while participants observed videos of emotional and neutral face movements and non-biological stimulus movements. Mu suppression, but not occipital alpha suppression, was predicted to be greater in the face conditions than in the non-biological stimulus condition, with greater suppression for the emotional faces than for the neutral face condition. In contrast to our prediction, only the neutral faces were associated with stronger mu suppression than the non-biological stimulus condition. A lack of difference in mu/alpha band power between the emotional faces and the non-biological stimulus at the central region made it difficult to distinguish mu from posterior alpha modulation during emotional face observation. Greater suppression in the neutral face than in the non-biological stimulus condition at the central region, accompanied by an opposite pattern at the occipital region, suggests that mu rhythm modulation associated with neutral face processing is distinct from the attenuation of overall alpha power associated with information processing and attention. Similar opposing trends of alpha and mu suppression between biological and non-biological movement were observed by Hobson and Bishop (<xref ref-type="bibr" rid="B27">2016</xref>). Greater occipital alpha suppression in the non-biological stimulus than in the neutral face condition may be explained by low-level visual differences between the two conditions, such as the contrast and the frequency domain information in the stimuli, and/or disparate demands on attention.</p>
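The mu suppression measure discussed throughout is conventionally quantified as the log of the ratio of 8&#x02013;13 Hz power in an experimental condition to that in a baseline or comparison condition. The sketch below is our illustration of that standard index, not the study's analysis code; the sampling rate and the synthetic 10 Hz signal are assumptions for the example.

```python
# Minimal sketch of the conventional mu suppression index:
# log(band power in condition / band power in baseline);
# values below 0 indicate suppression (less mu/alpha power).
import numpy as np

FS = 250  # sampling rate in Hz (assumed for this example)

def band_power(signal, fs, lo=8.0, hi=13.0):
    """Power in the [lo, hi] Hz band estimated from the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum()

def mu_suppression_index(condition, baseline, fs=FS):
    """Log ratio of condition to baseline band power; negative = suppression."""
    return np.log(band_power(condition, fs) / band_power(baseline, fs))

# Synthetic example: a 10 Hz oscillation attenuated to half amplitude
# during the condition, so band power drops to a quarter of baseline.
t = np.arange(0, 2.0, 1.0 / FS)
baseline = np.sin(2 * np.pi * 10 * t)
condition = 0.5 * np.sin(2 * np.pi * 10 * t)
idx = mu_suppression_index(condition, baseline)  # log(0.25), i.e., negative
```

The log transform makes the index symmetric around zero, so suppression and enhancement of equal magnitude yield values of equal size and opposite sign.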
<p>In addition to the results from the scalp-recorded EEG activity, source analysis data provide further support for a localized rather than an overall difference in mu/alpha band power between the neutral face and non-biological stimulus conditions, suggesting different levels of activity between conditions in the face-related (i.e., fusiform gyrus) and MNN areas, specifically, the primary somatosensory cortex, prefrontal cortex and supplementary motor area. Greater activity in the fusiform gyrus in response to faces than to the non-biological stimulus was expected, as this area responds more to faces than objects (Haxby et al., <xref ref-type="bibr" rid="B26">2000</xref>). The premotor areas, including the supplementary motor area, and the primary somatosensory cortex are the key regions implicated in sensorimotor simulation during action observation (for a review, see Wood et al., <xref ref-type="bibr" rid="B61">2016</xref>) and motor imagery (Burianov&#x000E1; et al., <xref ref-type="bibr" rid="B7">2013</xref>; Filgueiras et al., <xref ref-type="bibr" rid="B20">2018</xref>). The premotor cortex has been a primary region investigated in studies of action observation (Buccino et al., <xref ref-type="bibr" rid="B6">2001</xref>; Johnson-Frey et al., <xref ref-type="bibr" rid="B30">2003</xref>; Raos et al., <xref ref-type="bibr" rid="B53">2004</xref>, <xref ref-type="bibr" rid="B54">2007</xref>). While the motor representations of actions are stored in the premotor areas, the somatosensory areas may be involved in storing tactile and proprioceptive representations of these actions (Gazzola and Keysers, <xref ref-type="bibr" rid="B23">2009</xref>). 
In addition to the role of somatosensory activity in hand actions (Avikainen et al., <xref ref-type="bibr" rid="B3">2002</xref>; Raos et al., <xref ref-type="bibr" rid="B53">2004</xref>), there is evidence for the involvement of somatosensory representations in our ability to simulate basic emotions while observing facial expressions (Adolphs et al., <xref ref-type="bibr" rid="B1">2000</xref>). The review by Wood et al. (<xref ref-type="bibr" rid="B61">2016</xref>) highlights the role of sensory simulation, in addition to motor simulation, in emotion recognition, pointing to a large overlap between the brain areas involved in the production and observation of facial expressions. Signaling from the somatosensory cortex to the premotor cortex may be a necessary step for action understanding and imitation (Gazzola and Keysers, <xref ref-type="bibr" rid="B23">2009</xref>). This signaling may explain why the present source estimation results show significantly less mu/alpha band power in these two brain areas in the neutral face movement condition than in the non-biological stimulus condition.</p>
<p>We offer a number of possible explanations for the EEG results showing the strongest mu suppression to the neutral face movement in the form of mouth opening. First, the results may be attributed to the sensitivity of the sensorimotor cortex to human-object interaction. Most research that has investigated the role of the MNN in action observation involves hand and finger movements that almost always suggest some sort of interaction with an object, such as a pincer movement with the thumb and the index finger (e.g., Cochin et al., <xref ref-type="bibr" rid="B12">1999</xref>), manipulating objects (e.g., Gazzola and Keysers, <xref ref-type="bibr" rid="B23">2009</xref>), or bringing food to the mouth (Ferrari et al., <xref ref-type="bibr" rid="B18">2003</xref>). In addition to limb movements, viewing oro-facial movements has also been observed to induce a decrease in mu power, with the greatest suppression for viewing object-directed actions compared with undirected sucking and biting movements, and the least suppression for viewing speech-like mouth movements (Muthukumaraswamy et al., <xref ref-type="bibr" rid="B44">2006</xref>). In the present study, the sensorimotor cortex could have been engaged by the mouth opening gesture, which may have been perceived as an action associated with eating, an action implying interaction with an object (i.e., food), thereby supporting intention understanding (i.e., eating). Second, the MNN may be involved in the recognition of deliberate, voluntary gestures rather than involuntary communicative actions. Yet, mu suppression is reported to be modulated by contextual information, such as the actor&#x02019;s familiarity (Oberman et al., <xref ref-type="bibr" rid="B46">2008</xref>), their reward value (Gros et al., <xref ref-type="bibr" rid="B24">2015</xref>), or the gaming context in which hand gestures are viewed (Perry et al., <xref ref-type="bibr" rid="B50">2011</xref>). 
In addition, viewing facial gestures that do not suggest object interaction or deliberate action also seems to modulate mu rhythm (Moore et al., <xref ref-type="bibr" rid="B40">2012</xref>; Rayson et al., <xref ref-type="bibr" rid="B55">2016</xref>, <xref ref-type="bibr" rid="B56">2017</xref>; Moore and Franz, <xref ref-type="bibr" rid="B41">2017</xref>). Thus, explanations which restrict mu suppression to voluntary or object-related actions are unlikely.</p>
<p>A third explanation is that different types of facial movements may tap different MNN areas. An fMRI experiment by van der Gaag et al. (<xref ref-type="bibr" rid="B59">2007</xref>) found bilateral inferior frontal operculum activation when participants viewed emotional facial expressions but somatosensory activation for neutral movements (i.e., blowing up the cheeks). The authors attributed their findings to distinct processing pathways, more visceral in the former and more proprioceptive in the latter. A similar differential pathway may explain the current findings. Alternatively, if a single mirroring pathway underlies all types of facial movements, the greater ambiguity of the action and/or the emotion in the mouth opening image may engage the MNN more than the full-blown, easy-to-recognize emotional expressions do. In other words, when emotion information is presented at high intensity, the cognitive task of recognition may not be demanding enough to activate the MNN, thereby bypassing the system altogether, as indexed by absent or reduced mu suppression.</p>
<p>Based on recent findings from connectivity research (see, for example, Gardner et al., <xref ref-type="bibr" rid="B22">2015</xref>), our last explanation argues that, rather than a global increase or decrease of activity across the network as a whole, a differential modulation of the signaling between the key MNN nodes is more likely to be at work during action observation. There is evidence for the existence of a subgroup of neurons in the human supplementary motor area that is excited by execution, but inhibited by observation, of hand grasping actions and facial emotional expressions (Mukamel et al., <xref ref-type="bibr" rid="B42">2010</xref>). These observation-inhibited neurons may be the mechanism for the self-other discrimination process related to observing others&#x02019; actions, and the strength of their activity may modulate the amount of input from premotor areas to the sensorimotor cortex during action observation (Mukamel et al., <xref ref-type="bibr" rid="B42">2010</xref>; Woodruff et al., <xref ref-type="bibr" rid="B62">2011</xref>). Readily recognizable emotion-related information may activate the observation-inhibited mirror neurons in the premotor areas, leading to less excitatory input to the sensorimotor cortex. On the other hand, neutral facial movements that lack social and emotional information, as in mouth opening, may not activate observation-inhibited neurons as much as easily recognizable expressions do, resulting in stronger excitatory input to the sensorimotor cortex. Thus, in the face of subtle expressions, increased sensorimotor activity may aid action and emotion recognition.</p>
<p>Signaling between and within the key MNN areas during action observation and execution has recently been approached from a Bayesian perspective that suggests the existence of an updating mechanism which continuously attempts to minimize the difference (i.e., the error) between the predicted action and the observed or executed action to achieve an understanding of the most likely cause of an action (Keysers and Perrett, <xref ref-type="bibr" rid="B32">2004</xref>; Kilner et al., <xref ref-type="bibr" rid="B34">2007b</xref>). According to a predictive coding model of mirror neurons, when the mismatch between the predicted and observed actions of others is large due to the unfamiliarity, unusualness and unexpectedness of the observed action, the network generates a new prediction model, resulting in stronger motor activation (Kilner et al., <xref ref-type="bibr" rid="B33">2007a</xref>,<xref ref-type="bibr" rid="B34">b</xref>). In line with this account, several studies have reported greater mu suppression in infants during observation of extraordinary actions (e.g., turning on a lamp with one&#x02019;s forehead or lifting a cup to the ear) compared to ordinary actions (e.g., turning on a lamp with one&#x02019;s hand or lifting a cup to the mouth), suggesting that as the deviation of the observed action from the expected action increases, motor activation increases (Stapel et al., <xref ref-type="bibr" rid="B57">2010</xref>; Langeloh et al., <xref ref-type="bibr" rid="B36">2018</xref>). In the current study, the unfamiliarity of the mouth-opening movement as a neutral gesture may have resulted in a greater error signal between the predicted, usual neutral gesture the participants would expect to see, and the observed, unusual neutral gesture they were instructed to categorize as such. Additional predictions that required updating in the mouth opening condition may have activated the sensorimotor areas more than the familiar and ordinary happy and sad gestures. 
Future research may examine the coordinated activity of the involved brain regions using connectivity analyses to quantify the differences in their associations or dependencies under different conditions.</p>
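The predictive-coding account sketched above can be summarized with a simple error-update scheme. This is an illustrative formulation in our own notation, not equations taken from the cited work:

```latex
% Prediction error between the observed action signal and the current prediction
\epsilon_t = s_t^{\mathrm{obs}} - \hat{s}_t
% The prediction is revised in proportion to the error (learning rate \kappa),
% and motor activation (indexed by mu suppression) scales with |\epsilon_t|:
\hat{s}_{t+1} = \hat{s}_t + \kappa\,\epsilon_t, \qquad 0 < \kappa \le 1
```

Under this reading, an unfamiliar gesture such as neutral mouth opening yields a large \(\epsilon_t\), requiring more updating and hence stronger sensorimotor engagement than familiar happy or sad expressions.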
<sec id="s4-1">
<title>Limitations</title>
<p>There are several important limitations of the current study that must be noted. First, low-level visual properties, such as the contrast and the frequency domain composition of the images, were not matched between the face and the non-biological stimulus conditions. Future studies should aim to match the contrast and frequency components of stimuli across conditions in order to mitigate the effect of these non-task-related factors on mu/alpha activity. Second, in the face videos, every actor performed only one facial expression. This might have led the participants to learn the movement that followed each static image, leading to habituation across the blocks. Using the same actors for different facial expressions might help avoid habituation-driven mu/alpha activity changes. Another important limitation is the uncontrolled degree of movement viewed in each condition. Variability in the amount of movement displayed in the videos may have influenced mu and alpha power modulation across conditions. Thus, it is possible that the greater mu suppression to neutral faces reflects the more pronounced movement in the mouth opening action relative to the happy and sad expressions, rather than differences in social and emotional content. Furthermore, face videos used as stimuli may not induce the mu modulation that would naturally be observed in real-life settings. Finally, the limited sample size and the lack of an <italic>a priori</italic> power analysis call for replication studies to shed light on the modulatory influence of observed facial movements on the mu rhythm.</p>
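The missing <italic>a priori</italic> power analysis noted above can be approximated after the fact by simulation. The sketch below is a hedged illustration, not an analysis from the study: the effect size, simulation count, and critical value are our assumptions (2.0796 is the standard two-tailed .05 cutoff for <italic>df</italic> = 21, matching the study's <italic>n</italic> = 22).

```python
# Monte Carlo power estimate for a paired (one-sample) t-test with n = 22.
# Illustrative only: effect size and simulation settings are assumptions.
import math
import random
import statistics

def simulated_power(n=22, effect_size=0.8, t_crit=2.0796, sims=2000, seed=1):
    """Fraction of simulated experiments where |t| exceeds the critical value
    (2.0796 = two-tailed alpha = .05 cutoff for df = n - 1 = 21)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        # Standardized paired differences: mean = effect_size, sd = 1.
        diffs = [rng.gauss(effect_size, 1.0) for _ in range(n)]
        t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
        hits += abs(t) > t_crit
    return hits / sims

power = simulated_power()  # high power for a large effect at n = 22
```

Running the same function over a grid of effect sizes would show how quickly power degrades for the smaller effects plausible in mu suppression studies, which is the substance of the limitation.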
</sec>
</sec>
<sec sec-type="conclusion" id="s5">
<title>Conclusion</title>
<p>In conclusion, ambiguity or complexity of emotional information may result in greater activity in the sensorimotor areas if difficulty of the emotion recognition task requires a stronger engagement of the simulation system. Present findings provide support for the involvement of the MNN in face simulation, and indicate a complex relationship between sensorimotor activity and facial expression processing. Current data call for further research on the observation-related activity within and between the key brain areas involved in mimicry and social information processing. The explanations offered above which attribute the observed effect to the ambiguity of emotion may be addressed in future studies by comparing the level of activity in the premotor, motor and somatosensory areas in response to social stimuli depicting different intensities of various emotions. High spatial resolution neuroimaging techniques, such as fMRI, can be employed to investigate the involvement of the main MNN areas as well as deeper brain regions in the simulation of ambiguous motor and emotion information.</p>
</sec>
<sec id="s6">
<title>Data Availability</title>
<p>The raw per-participant EEG datasets for this study cannot be made publicly available because ethical clearance for sharing raw individual data on public repositories has not been received, and it is not possible to obtain it retrospectively. Preprocessed EEG data, in a form as close to the raw data as possible, will be made available to researchers upon request.</p>
</sec>
<sec id="s7">
<title>Author Contributions</title>
<p>OK performed data acquisition, analysis, and interpretation of the data, and wrote the first and final drafts of the manuscript. IK contributed substantially to the design of the experiment and critical revision of the manuscript. MM contributed substantially to data analysis and critical revision of the manuscript. All authors (OK, MM, IK) approved the final draft of the manuscript for publication.</p>
</sec>
<sec id="s8">
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>We thank Veema Lodhia for assistance during data collection and all the participants who took part in the study. This article has been published as a pre-print in bioRxiv (Karakale et al., <xref ref-type="bibr" rid="B31">2019</xref>).</p>
</ack>
<fn-group>
<fn fn-type="financial-disclosure">
<p><bold>Funding.</bold> This work was supported by The University of Auckland Postgraduate Research Student Support (PReSS) Account (OK).</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adolphs</surname> <given-names>R.</given-names></name> <name><surname>Damasio</surname> <given-names>H.</given-names></name> <name><surname>Tranel</surname> <given-names>D.</given-names></name> <name><surname>Cooper</surname> <given-names>G.</given-names></name> <name><surname>Damasio</surname> <given-names>A. R.</given-names></name></person-group> (<year>2000</year>). <article-title>A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping</article-title>. <source>J. Neurosci.</source> <volume>20</volume>, <fpage>2683</fpage>&#x02013;<lpage>2690</lpage>. <pub-id pub-id-type="doi">10.1523/jneurosci.20-07-02683.2000</pub-id><pub-id pub-id-type="pmid">10729349</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aleksandrov</surname> <given-names>A. A.</given-names></name> <name><surname>Tugin</surname> <given-names>S. M.</given-names></name></person-group> (<year>2012</year>). <article-title>Changes in the mu rhythm in different types of motor activity and on observation of movements</article-title>. <source>Neurosci. Behav. Physiol.</source> <volume>42</volume>, <fpage>302</fpage>&#x02013;<lpage>307</lpage>. <pub-id pub-id-type="doi">10.1007/s11055-012-9566-2</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Avikainen</surname> <given-names>S.</given-names></name> <name><surname>Forss</surname> <given-names>N.</given-names></name> <name><surname>Hari</surname> <given-names>R.</given-names></name></person-group> (<year>2002</year>). <article-title>Modulated activation of the human SI and SII cortices during observation of hand actions</article-title>. <source>Neuroimage</source> <volume>15</volume>, <fpage>640</fpage>&#x02013;<lpage>646</lpage>. <pub-id pub-id-type="doi">10.1006/nimg.2001.1029</pub-id><pub-id pub-id-type="pmid">11848707</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Babiloni</surname> <given-names>C.</given-names></name> <name><surname>Carducci</surname> <given-names>F.</given-names></name> <name><surname>Cincotti</surname> <given-names>F.</given-names></name> <name><surname>Rossini</surname> <given-names>P. M.</given-names></name> <name><surname>Neuper</surname> <given-names>C.</given-names></name> <name><surname>Pfurtscheller</surname> <given-names>G.</given-names></name> <etal/></person-group>. (<year>1999</year>). <article-title>Human movement-related potentials vs desynchronization of EEG alpha rhythm: a high-resolution EEG study</article-title>. <source>Neuroimage</source> <volume>10</volume>, <fpage>658</fpage>&#x02013;<lpage>665</lpage>. <pub-id pub-id-type="doi">10.1006/nimg.1999.0504</pub-id><pub-id pub-id-type="pmid">10600411</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Benjamini</surname> <given-names>Y.</given-names></name> <name><surname>Hochberg</surname> <given-names>Y.</given-names></name></person-group> (<year>1995</year>). <article-title>Controlling the false discovery rate: a practical and powerful approach to multiple testing</article-title>. <source>J. R Stat. Soc. Series B Methodol.</source> <volume>57</volume>, <fpage>289</fpage>&#x02013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1111/j.2517-6161.1995.tb02031.x</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Buccino</surname> <given-names>G.</given-names></name> <name><surname>Binkofski</surname> <given-names>F.</given-names></name> <name><surname>Fink</surname> <given-names>G. R.</given-names></name> <name><surname>Fadiga</surname> <given-names>L.</given-names></name> <name><surname>Fogassi</surname> <given-names>L.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name> <etal/></person-group>. (<year>2001</year>). <article-title>Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study</article-title>. <source>Eur. J. Neurosci.</source> <volume>13</volume>, <fpage>400</fpage>&#x02013;<lpage>404</lpage>. <pub-id pub-id-type="doi">10.1111/j.1460-9568.2001.01385.x</pub-id><pub-id pub-id-type="pmid">11168545</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Burianov&#x000E1;</surname> <given-names>H.</given-names></name> <name><surname>Marstaller</surname> <given-names>L.</given-names></name> <name><surname>Sowman</surname> <given-names>P.</given-names></name> <name><surname>Tesan</surname> <given-names>G.</given-names></name> <name><surname>Rich</surname> <given-names>A. N.</given-names></name> <name><surname>Williams</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Multimodal functional imaging of motor imagery using a novel paradigm</article-title>. <source>Neuroimage</source> <volume>71</volume>, <fpage>50</fpage>&#x02013;<lpage>58</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2013.01.001</pub-id><pub-id pub-id-type="pmid">23319043</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carr</surname> <given-names>L.</given-names></name> <name><surname>Iacoboni</surname> <given-names>M.</given-names></name> <name><surname>Dubeau</surname> <given-names>M.-C.</given-names></name> <name><surname>Mazziotta</surname> <given-names>J. C.</given-names></name> <name><surname>Lenzi</surname> <given-names>G. L.</given-names></name></person-group> (<year>2003</year>). <article-title>Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas</article-title>. <source>Proc. Natl. Acad. Sci. U S A</source> <volume>100</volume>, <fpage>5497</fpage>&#x02013;<lpage>5502</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.0935845100</pub-id><pub-id pub-id-type="pmid">12682281</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caspers</surname> <given-names>S.</given-names></name> <name><surname>Zilles</surname> <given-names>K.</given-names></name> <name><surname>Laird</surname> <given-names>A. R.</given-names></name> <name><surname>Eickhoff</surname> <given-names>S. B.</given-names></name></person-group> (<year>2010</year>). <article-title>ALE meta-analysis of action observation and imitation in the human brain</article-title>. <source>Neuroimage</source> <volume>50</volume>, <fpage>1148</fpage>&#x02013;<lpage>1167</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.12.112</pub-id><pub-id pub-id-type="pmid">20056149</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cheng</surname> <given-names>Y.</given-names></name> <name><surname>Chen</surname> <given-names>C.</given-names></name> <name><surname>Decety</surname> <given-names>J.</given-names></name></person-group> (<year>2014</year>). <article-title>An EEG/ERP investigation of the development of empathy in early and middle childhood</article-title>. <source>Dev. Cogn. Neurosci.</source> <volume>10</volume>, <fpage>160</fpage>&#x02013;<lpage>169</lpage>. <pub-id pub-id-type="doi">10.1016/j.dcn.2014.08.012</pub-id><pub-id pub-id-type="pmid">25261920</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cochin</surname> <given-names>S.</given-names></name> <name><surname>Barthelemy</surname> <given-names>C.</given-names></name> <name><surname>Lejeune</surname> <given-names>B.</given-names></name> <name><surname>Roux</surname> <given-names>S.</given-names></name> <name><surname>Martineau</surname> <given-names>J.</given-names></name></person-group> (<year>1998</year>). <article-title>Perception of motion and qEEG activity in human adults</article-title>. <source>Electroencephalogr. Clin. Neurophysiol.</source> <volume>107</volume>, <fpage>287</fpage>&#x02013;<lpage>295</lpage>. <pub-id pub-id-type="doi">10.1016/s0013-4694(98)00071-6</pub-id><pub-id pub-id-type="pmid">9872446</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cochin</surname> <given-names>S.</given-names></name> <name><surname>Barthelemy</surname> <given-names>C.</given-names></name> <name><surname>Roux</surname> <given-names>S.</given-names></name> <name><surname>Martineau</surname> <given-names>J.</given-names></name></person-group> (<year>1999</year>). <article-title>Observation and execution of movement: similarities demonstrated by quantified electroencephalography</article-title>. <source>Eur. J. Neurosci.</source> <volume>11</volume>, <fpage>1839</fpage>&#x02013;<lpage>1842</lpage>. <pub-id pub-id-type="doi">10.1046/j.1460-9568.1999.00598.x</pub-id><pub-id pub-id-type="pmid">10215938</pub-id></citation></ref>
<ref id="B13"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Cohen</surname> <given-names>M. X.</given-names></name></person-group> (<year>2014</year>). <source>Analyzing Neural Time Series Data: Theory and Practice.</source> <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cooper</surname> <given-names>N. R.</given-names></name> <name><surname>Simpson</surname> <given-names>A.</given-names></name> <name><surname>Till</surname> <given-names>A.</given-names></name> <name><surname>Simmons</surname> <given-names>K.</given-names></name> <name><surname>Puzzo</surname> <given-names>I.</given-names></name></person-group> (<year>2013</year>). <article-title>Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults</article-title>. <source>Front. Hum. Neurosci.</source> <volume>7</volume>:<fpage>159</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2013.00159</pub-id><pub-id pub-id-type="pmid">23630489</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Delorme</surname> <given-names>A.</given-names></name> <name><surname>Makeig</surname> <given-names>S.</given-names></name></person-group> (<year>2004</year>). <article-title>EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis</article-title>. <source>J. Neurosci. Methods</source> <volume>134</volume>, <fpage>9</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2003.10.009</pub-id><pub-id pub-id-type="pmid">15102499</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>di Pellegrino</surname> <given-names>G.</given-names></name> <name><surname>Fadiga</surname> <given-names>L.</given-names></name> <name><surname>Fogassi</surname> <given-names>L.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name> <name><surname>Rizzolatti</surname> <given-names>G.</given-names></name></person-group> (<year>1992</year>). <article-title>Understanding motor events: a neurophysiological study</article-title>. <source>Exp. Brain Res.</source> <volume>91</volume>, <fpage>176</fpage>&#x02013;<lpage>180</lpage>. <pub-id pub-id-type="doi">10.1007/bf00230027</pub-id><pub-id pub-id-type="pmid">1301372</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fecteau</surname> <given-names>S.</given-names></name> <name><surname>Carmant</surname> <given-names>L.</given-names></name> <name><surname>Tremblay</surname> <given-names>C.</given-names></name> <name><surname>Robert</surname> <given-names>M.</given-names></name> <name><surname>Bouthillier</surname> <given-names>A.</given-names></name> <name><surname>Th&#x000E9;oret</surname> <given-names>H.</given-names></name></person-group> (<year>2004</year>). <article-title>A motor resonance mechanism in children? Evidence from subdural electrodes in a 36-month-old child</article-title>. <source>Neuroreport</source> <volume>15</volume>, <fpage>2625</fpage>&#x02013;<lpage>2627</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-200412030-00013</pub-id><pub-id pub-id-type="pmid">15570165</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ferrari</surname> <given-names>P. F.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name> <name><surname>Rizzolatti</surname> <given-names>G.</given-names></name> <name><surname>Fogassi</surname> <given-names>L.</given-names></name></person-group> (<year>2003</year>). <article-title>Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex</article-title>. <source>Eur. J. Neurosci.</source> <volume>17</volume>, <fpage>1703</fpage>&#x02013;<lpage>1714</lpage>. <pub-id pub-id-type="doi">10.1046/j.1460-9568.2003.02601.x</pub-id><pub-id pub-id-type="pmid">12752388</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ferree</surname> <given-names>T. C.</given-names></name> <name><surname>Clay</surname> <given-names>M. T.</given-names></name> <name><surname>Tucker</surname> <given-names>D. M.</given-names></name></person-group> (<year>2001</year>). <article-title>The spatial resolution of scalp EEG</article-title>. <source>Neurocomputing</source> <volume>38</volume>, <fpage>1209</fpage>&#x02013;<lpage>1216</lpage>. <pub-id pub-id-type="doi">10.1016/s0925-2312(01)00568-9</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Filgueiras</surname> <given-names>A.</given-names></name> <name><surname>Conde</surname> <given-names>E. F. Q.</given-names></name> <name><surname>Hall</surname> <given-names>C. R.</given-names></name></person-group> (<year>2018</year>). <article-title>The neural basis of kinesthetic and visual imagery in sports: an ALE meta-analysis</article-title>. <source>Brain Imaging Behav.</source> <volume>12</volume>, <fpage>1513</fpage>&#x02013;<lpage>1523</lpage>. <pub-id pub-id-type="doi">10.1007/s11682-017-9813-9</pub-id><pub-id pub-id-type="pmid">29260381</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gallese</surname> <given-names>V.</given-names></name> <name><surname>Goldman</surname> <given-names>A.</given-names></name></person-group> (<year>1998</year>). <article-title>Mirror neurons and the simulation theory of mind-reading</article-title>. <source>Trends Cogn. Sci.</source> <volume>2</volume>, <fpage>493</fpage>&#x02013;<lpage>501</lpage>. <pub-id pub-id-type="doi">10.1016/s1364-6613(98)01262-5</pub-id><pub-id pub-id-type="pmid">21227300</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gardner</surname> <given-names>T.</given-names></name> <name><surname>Goulden</surname> <given-names>N.</given-names></name> <name><surname>Cross</surname> <given-names>E. S.</given-names></name></person-group> (<year>2015</year>). <article-title>Dynamic modulation of the action observation network by movement familiarity</article-title>. <source>J. Neurosci.</source> <volume>35</volume>, <fpage>1561</fpage>&#x02013;<lpage>1572</lpage>. <pub-id pub-id-type="doi">10.1523/jneurosci.2942-14.2015</pub-id><pub-id pub-id-type="pmid">25632133</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gazzola</surname> <given-names>V.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>The observation and execution of actions share motor and somatosensory voxels in all tested subjects: single-subject analyses of unsmoothed fMRI data</article-title>. <source>Cereb. Cortex</source> <volume>19</volume>, <fpage>1239</fpage>&#x02013;<lpage>1255</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhn181</pub-id><pub-id pub-id-type="pmid">19020203</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gros</surname> <given-names>I. T.</given-names></name> <name><surname>Panasiti</surname> <given-names>M. S.</given-names></name> <name><surname>Chakrabarti</surname> <given-names>B.</given-names></name></person-group> (<year>2015</year>). <article-title>The plasticity of the mirror system: how reward learning modulates cortical motor simulation of others</article-title>. <source>Neuropsychologia</source> <volume>70</volume>, <fpage>255</fpage>&#x02013;<lpage>262</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2015.02.033</pub-id><pub-id pub-id-type="pmid">25744871</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hari</surname> <given-names>R.</given-names></name> <name><surname>Salmelin</surname> <given-names>R.</given-names></name></person-group> (<year>1997</year>). <article-title>Human cortical oscillations: a neuromagnetic view through the skull</article-title>. <source>Trends Neurosci.</source> <volume>20</volume>, <fpage>44</fpage>&#x02013;<lpage>49</lpage>. <pub-id pub-id-type="doi">10.1016/s0166-2236(96)10065-5</pub-id><pub-id pub-id-type="pmid">9004419</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haxby</surname> <given-names>J. V.</given-names></name> <name><surname>Hoffman</surname> <given-names>E. A.</given-names></name> <name><surname>Gobbini</surname> <given-names>M. I.</given-names></name></person-group> (<year>2000</year>). <article-title>The distributed human neural system for face perception</article-title>. <source>Trends Cogn. Sci.</source> <volume>4</volume>, <fpage>223</fpage>&#x02013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1016/s1364-6613(00)01482-0</pub-id><pub-id pub-id-type="pmid">10827445</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hobson</surname> <given-names>H. M.</given-names></name> <name><surname>Bishop</surname> <given-names>D. V.</given-names></name></person-group> (<year>2016</year>). <article-title>Mu suppression&#x02013;a good measure of the human mirror neuron system?</article-title> <source>Cortex</source> <volume>82</volume>, <fpage>290</fpage>&#x02013;<lpage>310</lpage>. <pub-id pub-id-type="doi">10.1016/j.cortex.2016.03.019</pub-id><pub-id pub-id-type="pmid">27180217</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hoenen</surname> <given-names>M.</given-names></name> <name><surname>L&#x000FC;bke</surname> <given-names>K. T.</given-names></name> <name><surname>Pause</surname> <given-names>B. M.</given-names></name></person-group> (<year>2015</year>). <article-title>Somatosensory mu activity reflects imagined pain intensity of others</article-title>. <source>Psychophysiology</source> <volume>52</volume>, <fpage>1551</fpage>&#x02013;<lpage>1558</lpage>. <pub-id pub-id-type="doi">10.1111/psyp.12522</pub-id><pub-id pub-id-type="pmid">26379210</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hoenen</surname> <given-names>M.</given-names></name> <name><surname>Schain</surname> <given-names>C.</given-names></name> <name><surname>Pause</surname> <given-names>B. M.</given-names></name></person-group> (<year>2013</year>). <article-title>Down-modulation of mu-activity through empathic top-down processes</article-title>. <source>Soc. Neurosci.</source> <volume>8</volume>, <fpage>515</fpage>&#x02013;<lpage>524</lpage>. <pub-id pub-id-type="doi">10.1080/17470919.2013.833550</pub-id><pub-id pub-id-type="pmid">24028313</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson-Frey</surname> <given-names>S. H.</given-names></name> <name><surname>Maloof</surname> <given-names>F. R.</given-names></name> <name><surname>Newman-Norlund</surname> <given-names>R.</given-names></name> <name><surname>Farrer</surname> <given-names>C.</given-names></name> <name><surname>Inati</surname> <given-names>S.</given-names></name> <name><surname>Grafton</surname> <given-names>S. T.</given-names></name></person-group> (<year>2003</year>). <article-title>Actions or hand-object interactions? Human inferior frontal cortex and action observation</article-title>. <source>Neuron</source> <volume>39</volume>, <fpage>1053</fpage>&#x02013;<lpage>1058</lpage>. <pub-id pub-id-type="doi">10.1016/s0896-6273(03)00524-5</pub-id><pub-id pub-id-type="pmid">12971903</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Karakale</surname> <given-names>O.</given-names></name> <name><surname>Moore</surname> <given-names>M. R.</given-names></name> <name><surname>Kirk</surname> <given-names>I. J.</given-names></name></person-group> (<year>2019</year>). <article-title>Mental simulation of facial expressions: mu suppression to the viewing of dynamic neutral face videos</article-title>. <source>bioRxiv</source> <volume>457846</volume>. <pub-id pub-id-type="doi">10.1101/457846</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keysers</surname> <given-names>C.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2004</year>). <article-title>Demystifying social cognition: a Hebbian perspective</article-title>. <source>Trends Cogn. Sci.</source> <volume>8</volume>, <fpage>501</fpage>&#x02013;<lpage>507</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2004.09.005</pub-id><pub-id pub-id-type="pmid">15491904</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kilner</surname> <given-names>J. M.</given-names></name> <name><surname>Friston</surname> <given-names>K. J.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>2007a</year>). <article-title>The mirror-neuron system: a Bayesian perspective</article-title>. <source>Neuroreport</source> <volume>18</volume>, <fpage>619</fpage>&#x02013;<lpage>623</lpage>. <pub-id pub-id-type="doi">10.1097/wnr.0b013e3281139ed0</pub-id><pub-id pub-id-type="pmid">17413668</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kilner</surname> <given-names>J. M.</given-names></name> <name><surname>Friston</surname> <given-names>K. J.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>2007b</year>). <article-title>Predictive coding: an account of the mirror neuron system</article-title>. <source>Cogn. Process.</source> <volume>8</volume>, <fpage>159</fpage>&#x02013;<lpage>166</lpage>. <pub-id pub-id-type="doi">10.1007/s10339-007-0170-2</pub-id><pub-id pub-id-type="pmid">17429704</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kohler</surname> <given-names>E.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name> <name><surname>Umilta</surname> <given-names>M. A.</given-names></name> <name><surname>Fogassi</surname> <given-names>L.</given-names></name> <name><surname>Gallese</surname> <given-names>V.</given-names></name> <name><surname>Rizzolatti</surname> <given-names>G.</given-names></name></person-group> (<year>2002</year>). <article-title>Hearing sounds, understanding actions: action representation in mirror neurons</article-title>. <source>Science</source> <volume>297</volume>, <fpage>846</fpage>&#x02013;<lpage>848</lpage>. <pub-id pub-id-type="doi">10.1126/science.1070311</pub-id><pub-id pub-id-type="pmid">12161656</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Langeloh</surname> <given-names>M.</given-names></name> <name><surname>Buttelmann</surname> <given-names>D.</given-names></name> <name><surname>Matthes</surname> <given-names>D.</given-names></name> <name><surname>Grassmann</surname> <given-names>S.</given-names></name> <name><surname>Pauen</surname> <given-names>S.</given-names></name> <name><surname>Hoehl</surname> <given-names>S.</given-names></name></person-group> (<year>2018</year>). <article-title>Reduced mu power in response to unusual actions is context-dependent in 1-year-olds</article-title>. <source>Front. Psychol.</source> <volume>9</volume>:<fpage>36</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2018.00036</pub-id><pub-id pub-id-type="pmid">29441034</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lepage</surname> <given-names>J. F.</given-names></name> <name><surname>Th&#x000E9;oret</surname> <given-names>H.</given-names></name></person-group> (<year>2006</year>). <article-title>EEG evidence for the presence of an action observation-execution matching system in children</article-title>. <source>Eur. J. Neurosci.</source> <volume>23</volume>, <fpage>2505</fpage>&#x02013;<lpage>2510</lpage>. <pub-id pub-id-type="doi">10.1111/j.1460-9568.2006.04769.x</pub-id><pub-id pub-id-type="pmid">16706857</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Milston</surname> <given-names>S. I.</given-names></name> <name><surname>Vanman</surname> <given-names>E. J.</given-names></name> <name><surname>Cunnington</surname> <given-names>R.</given-names></name></person-group> (<year>2013</year>). <article-title>Cognitive empathy and motor activity during observed actions</article-title>. <source>Neuropsychologia</source> <volume>51</volume>, <fpage>1103</fpage>&#x02013;<lpage>1108</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2013.02.020</pub-id><pub-id pub-id-type="pmid">23499724</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Molenberghs</surname> <given-names>P.</given-names></name> <name><surname>Cunnington</surname> <given-names>R.</given-names></name> <name><surname>Mattingley</surname> <given-names>J. B.</given-names></name></person-group> (<year>2012</year>). <article-title>Brain regions with mirror properties: a meta-analysis of 125 human fMRI studies</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>36</volume>, <fpage>341</fpage>&#x02013;<lpage>349</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2011.07.004</pub-id><pub-id pub-id-type="pmid">21782846</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moore</surname> <given-names>A.</given-names></name> <name><surname>Gorodnitsky</surname> <given-names>I.</given-names></name> <name><surname>Pineda</surname> <given-names>J.</given-names></name></person-group> (<year>2012</year>). <article-title>EEG mu component responses to viewing emotional faces</article-title>. <source>Behav. Brain Res.</source> <volume>226</volume>, <fpage>309</fpage>&#x02013;<lpage>316</lpage>. <pub-id pub-id-type="doi">10.1016/j.bbr.2011.07.048</pub-id><pub-id pub-id-type="pmid">21835208</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moore</surname> <given-names>M. R.</given-names></name> <name><surname>Franz</surname> <given-names>E. A.</given-names></name></person-group> (<year>2017</year>). <article-title>Mu rhythm suppression is associated with the classification of emotion in faces</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>17</volume>, <fpage>224</fpage>&#x02013;<lpage>234</lpage>. <pub-id pub-id-type="doi">10.3758/s13415-016-0476-6</pub-id><pub-id pub-id-type="pmid">27815729</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mukamel</surname> <given-names>R.</given-names></name> <name><surname>Ekstrom</surname> <given-names>A. D.</given-names></name> <name><surname>Kaplan</surname> <given-names>J.</given-names></name> <name><surname>Iacoboni</surname> <given-names>M.</given-names></name> <name><surname>Fried</surname> <given-names>I.</given-names></name></person-group> (<year>2010</year>). <article-title>Single-neuron responses in humans during execution and observation of actions</article-title>. <source>Curr. Biol.</source> <volume>20</volume>, <fpage>750</fpage>&#x02013;<lpage>756</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2010.02.045</pub-id><pub-id pub-id-type="pmid">20381353</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Muthukumaraswamy</surname> <given-names>S. D.</given-names></name> <name><surname>Johnson</surname> <given-names>B. W.</given-names></name> <name><surname>McNair</surname> <given-names>N. A.</given-names></name></person-group> (<year>2004</year>). <article-title>Mu rhythm modulation during observation of an object-directed grasp</article-title>. <source>Brain Res. Cogn. Brain Res.</source> <volume>19</volume>, <fpage>195</fpage>&#x02013;<lpage>201</lpage>. <pub-id pub-id-type="doi">10.1016/j.cogbrainres.2003.12.001</pub-id><pub-id pub-id-type="pmid">15019715</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Muthukumaraswamy</surname> <given-names>S. D.</given-names></name> <name><surname>Johnson</surname> <given-names>B. W.</given-names></name> <name><surname>Gaetz</surname> <given-names>W. C.</given-names></name> <name><surname>Cheyne</surname> <given-names>D. O.</given-names></name></person-group> (<year>2006</year>). <article-title>Neural processing of observed oro-facial movements reflects multiple action encoding strategies in the human brain</article-title>. <source>Brain Res.</source> <volume>1071</volume>, <fpage>105</fpage>&#x02013;<lpage>112</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2005.11.053</pub-id><pub-id pub-id-type="pmid">16405872</pub-id></citation></ref>
<ref id="B45"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Niedermeyer</surname> <given-names>E.</given-names></name></person-group> (<year>2005</year>). &#x0201C;<article-title>The normal EEG of the waking adult</article-title>,&#x0201D; in <source>Electroencephalography: Basic Principles, Clinical Applications and Related Fields</source>, eds <person-group person-group-type="editor"><name><surname>Niedermeyer</surname> <given-names>E.</given-names></name> <name><surname>Lopes da Silva</surname> <given-names>F. H.</given-names></name></person-group> (<publisher-loc>Baltimore</publisher-loc>: <publisher-name>Williams and Wilkins</publisher-name>), <fpage>167</fpage>&#x02013;<lpage>192</lpage>. <pub-id pub-id-type="doi">10.1016/b978-0-323-04233-8.50007-x</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oberman</surname> <given-names>L. M.</given-names></name> <name><surname>Ramachandran</surname> <given-names>V. S.</given-names></name> <name><surname>Pineda</surname> <given-names>J. A.</given-names></name></person-group> (<year>2008</year>). <article-title>Modulation of mu suppression in children with autism spectrum disorders in response to familiar or unfamiliar stimuli: the mirror neuron hypothesis</article-title>. <source>Neuropsychologia</source> <volume>46</volume>, <fpage>1558</fpage>&#x02013;<lpage>1565</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2008.01.010</pub-id><pub-id pub-id-type="pmid">18304590</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pascual-Marqui</surname> <given-names>R. D.</given-names></name></person-group> (<year>2002</year>). <article-title>Standardized low-resolution brain electromagnetic tomography (sLORETA): technical details</article-title>. <source>Methods Find. Exp. Clin. Pharmacol.</source> <volume>24</volume>, <fpage>5</fpage>&#x02013;<lpage>12</lpage>. <pub-id pub-id-type="pmid">12575463</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perry</surname> <given-names>A.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Does focusing on hand-grasping intentions modulate electroencephalogram &#x003BC; and &#x003B1; suppressions?</article-title> <source>Neuroreport</source> <volume>21</volume>, <fpage>1050</fpage>&#x02013;<lpage>1054</lpage>. <pub-id pub-id-type="doi">10.1097/wnr.0b013e32833fcb71</pub-id><pub-id pub-id-type="pmid">20838261</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perry</surname> <given-names>A.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Bartal</surname> <given-names>I. B.-A.</given-names></name> <name><surname>Lamm</surname> <given-names>C.</given-names></name> <name><surname>Decety</surname> <given-names>J.</given-names></name></person-group> (<year>2010a</year>). <article-title>&#x0201C;Feeling&#x0201D; the pain of those who are different from us: modulation of EEG in the mu/alpha range</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>10</volume>, <fpage>493</fpage>&#x02013;<lpage>504</lpage>. <pub-id pub-id-type="doi">10.3758/CABN.10.4.493</pub-id><pub-id pub-id-type="pmid">21098810</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perry</surname> <given-names>A.</given-names></name> <name><surname>Troje</surname> <given-names>N. F.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name></person-group> (<year>2010b</year>). <article-title>Exploring motor system contributions to the perception of social information: evidence from EEG activity in the mu/alpha frequency range</article-title>. <source>Soc. Neurosci.</source> <volume>5</volume>, <fpage>272</fpage>&#x02013;<lpage>284</lpage>. <pub-id pub-id-type="doi">10.1080/17470910903395767</pub-id><pub-id pub-id-type="pmid">20169504</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perry</surname> <given-names>A.</given-names></name> <name><surname>Stein</surname> <given-names>L.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Motor and attentional mechanisms involved in social interaction&#x02014;Evidence from mu and alpha EEG suppression</article-title>. <source>Neuroimage</source> <volume>58</volume>, <fpage>895</fpage>&#x02013;<lpage>904</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2011.06.060</pub-id><pub-id pub-id-type="pmid">21742042</pub-id></citation></ref>
<ref id="B52"><citation citation-type="book"><person-group person-group-type="author"><name><surname>RStudio Team.</surname></name></person-group> (<year>2016</year>). <source>RStudio: Integrated Development for R</source>. <publisher-loc>Boston, MA</publisher-loc>: <publisher-name>RStudio, Inc.</publisher-name> Available online at: <ext-link ext-link-type="uri" xlink:href="http://www.rstudio.com/">http://www.rstudio.com/</ext-link> </citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Raos</surname> <given-names>V.</given-names></name> <name><surname>Evangeliou</surname> <given-names>M. N.</given-names></name> <name><surname>Savaki</surname> <given-names>H. E.</given-names></name></person-group> (<year>2004</year>). <article-title>Observation of action: grasping with the mind&#x02019;s hand</article-title>. <source>Neuroimage</source> <volume>23</volume>, <fpage>193</fpage>&#x02013;<lpage>201</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.04.024</pub-id><pub-id pub-id-type="pmid">15325366</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Raos</surname> <given-names>V.</given-names></name> <name><surname>Evangeliou</surname> <given-names>M. N.</given-names></name> <name><surname>Savaki</surname> <given-names>H. E.</given-names></name></person-group> (<year>2007</year>). <article-title>Mental simulation of action in the service of action perception</article-title>. <source>J. Neurosci.</source> <volume>27</volume>, <fpage>12675</fpage>&#x02013;<lpage>12683</lpage>. <pub-id pub-id-type="doi">10.1523/jneurosci.2988-07.2007</pub-id><pub-id pub-id-type="pmid">18003847</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rayson</surname> <given-names>H.</given-names></name> <name><surname>Bonaiuto</surname> <given-names>J. J.</given-names></name> <name><surname>Ferrari</surname> <given-names>P. F.</given-names></name> <name><surname>Murray</surname> <given-names>L.</given-names></name></person-group> (<year>2016</year>). <article-title>Mu desynchronization during observation and execution of facial expressions in 30-month-old children</article-title>. <source>Dev. Cogn. Neurosci.</source> <volume>19</volume>, <fpage>279</fpage>&#x02013;<lpage>287</lpage>. <pub-id pub-id-type="doi">10.1016/j.dcn.2016.05.003</pub-id><pub-id pub-id-type="pmid">27261926</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rayson</surname> <given-names>H.</given-names></name> <name><surname>Bonaiuto</surname> <given-names>J. J.</given-names></name> <name><surname>Ferrari</surname> <given-names>P. F.</given-names></name> <name><surname>Murray</surname> <given-names>L.</given-names></name></person-group> (<year>2017</year>). <article-title>Early maternal mirroring predicts infant motor system activation during facial expression observation</article-title>. <source>Sci. Rep.</source> <volume>7</volume>:<fpage>11738</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-017-12097-w</pub-id><pub-id pub-id-type="pmid">28916786</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stapel</surname> <given-names>J. C.</given-names></name> <name><surname>Hunnius</surname> <given-names>S.</given-names></name> <name><surname>van Elk</surname> <given-names>M.</given-names></name> <name><surname>Bekkering</surname> <given-names>H.</given-names></name></person-group> (<year>2010</year>). <article-title>Motor activation during observation of unusual versus ordinary actions in infancy</article-title>. <source>Soc. Neurosci.</source> <volume>5</volume>, <fpage>451</fpage>&#x02013;<lpage>460</lpage>. <pub-id pub-id-type="doi">10.1080/17470919.2010.490667</pub-id><pub-id pub-id-type="pmid">20602285</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tucker</surname> <given-names>D. M.</given-names></name></person-group> (<year>1993</year>). <article-title>Spatial sampling of head electrical fields: the geodesic sensor net</article-title>. <source>Electroencephalogr. Clin. Neurophysiol.</source> <volume>87</volume>, <fpage>154</fpage>&#x02013;<lpage>163</lpage>. <pub-id pub-id-type="doi">10.1016/0013-4694(93)90121-b</pub-id><pub-id pub-id-type="pmid">7691542</pub-id></citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Gaag</surname> <given-names>C.</given-names></name> <name><surname>Minderaa</surname> <given-names>R. B.</given-names></name> <name><surname>Keysers</surname> <given-names>C.</given-names></name></person-group> (<year>2007</year>). <article-title>Facial expressions: what the mirror neuron system can and cannot tell us</article-title>. <source>Soc. Neurosci.</source> <volume>2</volume>, <fpage>179</fpage>&#x02013;<lpage>222</lpage>. <pub-id pub-id-type="doi">10.1080/17470910701376878</pub-id><pub-id pub-id-type="pmid">18633816</pub-id></citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Schalk</surname> <given-names>J.</given-names></name> <name><surname>Hawk</surname> <given-names>S. T.</given-names></name> <name><surname>Fischer</surname> <given-names>A. H.</given-names></name> <name><surname>Doosje</surname> <given-names>B.</given-names></name></person-group> (<year>2011</year>). <article-title>Moving faces, looking places: validation of the Amsterdam Dynamic Facial Expression Set (ADFES)</article-title>. <source>Emotion</source> <volume>11</volume>, <fpage>907</fpage>&#x02013;<lpage>920</lpage>. <pub-id pub-id-type="doi">10.1037/a0023853</pub-id><pub-id pub-id-type="pmid">21859206</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wood</surname> <given-names>A.</given-names></name> <name><surname>Rychlowska</surname> <given-names>M.</given-names></name> <name><surname>Korb</surname> <given-names>S.</given-names></name> <name><surname>Niedenthal</surname> <given-names>P.</given-names></name></person-group> (<year>2016</year>). <article-title>Fashioning the face: sensorimotor simulation contributes to facial expression recognition</article-title>. <source>Trends Cogn. Sci.</source> <volume>20</volume>, <fpage>227</fpage>&#x02013;<lpage>240</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2015.12.010</pub-id><pub-id pub-id-type="pmid">26876363</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Woodruff</surname> <given-names>C. C.</given-names></name> <name><surname>Martin</surname> <given-names>T.</given-names></name> <name><surname>Bilyk</surname> <given-names>N.</given-names></name></person-group> (<year>2011</year>). <article-title>Differences in self- and other-induced mu suppression are correlated with empathic abilities</article-title>. <source>Brain Res.</source> <volume>1405</volume>, <fpage>69</fpage>&#x02013;<lpage>76</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2011.05.046</pub-id><pub-id pub-id-type="pmid">21741034</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>C. C.</given-names></name> <name><surname>Hamm</surname> <given-names>J. P.</given-names></name> <name><surname>Lim</surname> <given-names>V. K.</given-names></name> <name><surname>Kirk</surname> <given-names>I. J.</given-names></name></person-group> (<year>2016</year>). <article-title>Mu rhythm suppression demonstrates action representation in pianists during passive listening of piano melodies</article-title>. <source>Exp. Brain Res.</source> <volume>234</volume>, <fpage>2133</fpage>&#x02013;<lpage>2139</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-016-4615-7</pub-id><pub-id pub-id-type="pmid">26993491</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>C.-Y.</given-names></name> <name><surname>Decety</surname> <given-names>J.</given-names></name> <name><surname>Lee</surname> <given-names>S.</given-names></name> <name><surname>Chen</surname> <given-names>C.</given-names></name> <name><surname>Cheng</surname> <given-names>Y.</given-names></name></person-group> (<year>2009</year>). <article-title>Gender differences in the mu rhythm during empathy for pain: an electroencephalographic study</article-title>. <source>Brain Res.</source> <volume>1251</volume>, <fpage>176</fpage>&#x02013;<lpage>184</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2008.11.062</pub-id><pub-id pub-id-type="pmid">19083993</pub-id></citation></ref>
</ref-list>
</back>
</article>