<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.3 20210610//EN" "JATS-journalpublishing1-3-mathml3.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ali="http://www.niso.org/schemas/ali/1.0/" article-type="brief-report" dtd-version="1.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title-group>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
</journal-title-group>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2026.1754371</article-id>
<article-version article-version-type="Version of Record" vocab="NISO-RP-8-2008"/>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Brief Research Report</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Complex motor imagery in elite female ice hockey players: a cortical arena of imagination revealed by magnetoencephalography</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Potts</surname>
<given-names>Audrey Alice</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/3384454"/>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Data curation" vocab-term-identifier="https://credit.niso.org/contributor-roles/data-curation/">Data curation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="visualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/visualization/">Visualization</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="investigation" vocab-term-identifier="https://credit.niso.org/contributor-roles/investigation/">Investigation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="conceptualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/conceptualization/">Conceptualization</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Project administration" vocab-term-identifier="https://credit.niso.org/contributor-roles/project-administration/">Project administration</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Formal analysis" vocab-term-identifier="https://credit.niso.org/contributor-roles/formal-analysis/">Formal analysis</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; review &#x0026; editing" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-review-editing/">Writing &#x2013; review &#x0026; editing</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="methodology" vocab-term-identifier="https://credit.niso.org/contributor-roles/methodology/">Methodology</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="validation" vocab-term-identifier="https://credit.niso.org/contributor-roles/validation/">Validation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; original draft" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-original-draft/">Writing &#x2013; original draft</role>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Garcia Dominguez</surname>
<given-names>Luis</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/3384240"/>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Formal analysis" vocab-term-identifier="https://credit.niso.org/contributor-roles/formal-analysis/">Formal analysis</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="conceptualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/conceptualization/">Conceptualization</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="visualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/visualization/">Visualization</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="software" vocab-term-identifier="https://credit.niso.org/contributor-roles/software/">Software</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="methodology" vocab-term-identifier="https://credit.niso.org/contributor-roles/methodology/">Methodology</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="validation" vocab-term-identifier="https://credit.niso.org/contributor-roles/validation/">Validation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; original draft" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-original-draft/">Writing &#x2013; original draft</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="resources" vocab-term-identifier="https://credit.niso.org/contributor-roles/resources/">Resources</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="investigation" vocab-term-identifier="https://credit.niso.org/contributor-roles/investigation/">Investigation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Data curation" vocab-term-identifier="https://credit.niso.org/contributor-roles/data-curation/">Data curation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Project administration" vocab-term-identifier="https://credit.niso.org/contributor-roles/project-administration/">Project administration</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; review &#x0026; editing" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-review-editing/">Writing &#x2013; review &#x0026; editing</role>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Gold</surname>
<given-names>David</given-names>
</name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; review &#x0026; editing" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-review-editing/">Writing &#x2013; review &#x0026; editing</role>
</contrib>
<contrib contrib-type="author">
<name>
<surname>McAndrews</surname>
<given-names>Mary Pat</given-names>
</name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<xref ref-type="aff" rid="aff4"><sup>4</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/176252"/>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; review &#x0026; editing" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-review-editing/">Writing &#x2013; review &#x0026; editing</role>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Wennberg</surname>
<given-names>Richard</given-names>
</name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff5"><sup>5</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/762689"/>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="supervision" vocab-term-identifier="https://credit.niso.org/contributor-roles/supervision/">Supervision</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; review &#x0026; editing" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-review-editing/">Writing &#x2013; review &#x0026; editing</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="software" vocab-term-identifier="https://credit.niso.org/contributor-roles/software/">Software</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Writing &#x2013; original draft" vocab-term-identifier="https://credit.niso.org/contributor-roles/writing-original-draft/">Writing &#x2013; original draft</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="resources" vocab-term-identifier="https://credit.niso.org/contributor-roles/resources/">Resources</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="investigation" vocab-term-identifier="https://credit.niso.org/contributor-roles/investigation/">Investigation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Project administration" vocab-term-identifier="https://credit.niso.org/contributor-roles/project-administration/">Project administration</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Formal analysis" vocab-term-identifier="https://credit.niso.org/contributor-roles/formal-analysis/">Formal analysis</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="Data curation" vocab-term-identifier="https://credit.niso.org/contributor-roles/data-curation/">Data curation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="validation" vocab-term-identifier="https://credit.niso.org/contributor-roles/validation/">Validation</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="methodology" vocab-term-identifier="https://credit.niso.org/contributor-roles/methodology/">Methodology</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="conceptualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/conceptualization/">Conceptualization</role>
<role vocab="credit" vocab-identifier="https://credit.niso.org/" vocab-term="visualization" vocab-term-identifier="https://credit.niso.org/contributor-roles/visualization/">Visualization</role>
</contrib>
</contrib-group>
<aff id="aff1"><label>1</label><institution>Mitchell Goldhar Magnetoencephalography (MEG) Unit, Krembil Brain Institute, University Health Network</institution>, <city>Toronto</city>, <state>ON</state>, <country country="ca">Canada</country></aff>
<aff id="aff2"><label>2</label><institution>Department of Psychiatry, University of Toronto</institution>, <city>Toronto</city>, <state>ON</state>, <country country="ca">Canada</country></aff>
<aff id="aff3"><label>3</label><institution>Division of Clinical and Computational Neuroscience, Krembil Brain Institute, University Health Network</institution>, <city>Toronto</city>, <state>ON</state>, <country country="ca">Canada</country></aff>
<aff id="aff4"><label>4</label><institution>Department of Psychology, University of Toronto</institution>, <city>Toronto</city>, <state>ON</state>, <country country="ca">Canada</country></aff>
<aff id="aff5"><label>5</label><institution>Division of Neurology, Department of Medicine, University of Toronto</institution>, <city>Toronto</city>, <state>ON</state>, <country country="ca">Canada</country></aff>
<author-notes>
<corresp id="c001"><label>&#x002A;</label>Correspondence: Richard Wennberg, <email xlink:href="mailto:r.wennberg@utoronto.ca">r.wennberg@utoronto.ca</email></corresp>
</author-notes>
<pub-date publication-format="electronic" date-type="pub" iso-8601-date="2026-02-27">
<day>27</day>
<month>02</month>
<year>2026</year>
</pub-date>
<pub-date publication-format="electronic" date-type="collection">
<year>2026</year>
</pub-date>
<volume>20</volume>
<elocation-id>1754371</elocation-id>
<history>
<date date-type="received">
<day>25</day>
<month>11</month>
<year>2025</year>
</date>
<date date-type="rev-recd">
<day>10</day>
<month>01</month>
<year>2026</year>
</date>
<date date-type="accepted">
<day>22</day>
<month>01</month>
<year>2026</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2026 Potts, Garcia Dominguez, Gold, McAndrews and Wennberg.</copyright-statement>
<copyright-year>2026</copyright-year>
<copyright-holder>Potts, Garcia Dominguez, Gold, McAndrews and Wennberg</copyright-holder>
<license>
<ali:license_ref start_date="2026-02-27">https://creativecommons.org/licenses/by/4.0/</ali:license_ref>
<license-p>This is an open-access article distributed under the terms of the <ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution License (CC BY)</ext-link>. The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</license-p>
</license>
</permissions>
<abstract>
<p>Motor imagery (MI), or &#x201C;visualization,&#x201D; as practiced by elite athletes to improve performance, provides a model of how covert thought&#x2014;imagination&#x2014;can affect subsequent behavior. In this exploratory magnetoencephalography (MEG) study, we aimed to identify the brain regions involved in complex MI in a small sample of elite female ice hockey players experienced in visualization. Using an experimental block design, the athletes visualized a specific PETTLEP (physical, environment, task, timing, learning, emotion, perspective)-guided scripted ice hockey play while being monitored with MEG. A frequency-domain beamformer was then calculated to contrast the MEG data from the imagery condition with two different control (resting state or mental counting) conditions. Significance was assessed using a cluster-based permutation test. The beamforming results identified a principal hub of neural activity during the imagery condition in a posterior left hemisphere cortical region surrounding the intraparietal sulcus. The same brain region was reliably activated in all eight participants and may hypothetically demarcate the neural substrate of this type of conscious thought.</p>
</abstract>
<kwd-group>
<kwd>cognition</kwd>
<kwd>dreaming</kwd>
<kwd>MEG</kwd>
<kwd>neuroscience</kwd>
<kwd>sport</kwd>
<kwd>visualization</kwd>
</kwd-group>
<funding-group>
<funding-statement>The author(s) declared that financial support was not received for this work and/or its publication.</funding-statement>
</funding-group>
<counts>
<fig-count count="3"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="31"/>
<page-count count="9"/>
<word-count count="6068"/>
</counts>
<custom-meta-group>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Interacting Minds and Brains</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="sec1">
<title>Introduction</title>
<p>Sometimes referred to in sports as &#x201C;visualization,&#x201D; motor imagery (MI) involves the imagining and vivid mental rehearsal of movements without overt motor output (<xref ref-type="bibr" rid="ref19">Moran et al., 2012</xref>). It is commonly classified according to predominant type as kinesthetic (&#x201C;feeling&#x201D;) or visual (&#x201C;seeing&#x201D;), with visual taking either the first- or third-person perspective. In practice, multiple modalities may be used concurrently. Psychological evidence for the use and effectiveness of visualization in sports is well established (<xref ref-type="bibr" rid="ref31">Weinberg, 2008</xref>). Early sport psychology research focused on methods for optimizing MI practices to maximize performance, leading to the development of the well-known kinesthetic method referred to by the acronym PETTLEP (<xref ref-type="bibr" rid="ref11">Holmes and Collins, 2001</xref>). Conceptualized as a neuroscientific approach to MI in sport, the PETTLEP model proposed that seven key features&#x2014;physical, environment, task, timing, learning, emotion, and perspective&#x2014;be incorporated into an athlete&#x2019;s imagery practices to be most effective. Grounded in the theory of functional equivalence, which presumes that overt and covert movement share similar patterns of activation in the brain (<xref ref-type="bibr" rid="ref15">Lang, 1979</xref>; <xref ref-type="bibr" rid="ref12">Jeannerod, 1995</xref>; <xref ref-type="bibr" rid="ref14">Kraeutner et al., 2014</xref>), parallel to visual imagery and perception (<xref ref-type="bibr" rid="ref4">Farah, 1989</xref>; <xref ref-type="bibr" rid="ref22">Pearson et al., 2015</xref>), PETTLEP has been widely adopted as a tool in sport psychology, although questions persist regarding underlying neurophysiological mechanisms and the concept of neuroanatomical functional equivalence (<xref ref-type="bibr" rid="ref30">Wakefield et al., 2013</xref>).</p>
<p>For &#x201C;open&#x201D; sports, defined by unpredictable environments and a need for athletes to react rapidly to complex behaviors, visualization of potential scenarios requires a multimodal approach to MI that integrates manifold aspects of human perception, experience, and action. Ice hockey is a fast, open team sport in which players make rapid decisions in response to a virtually unlimited variety of situations, drawing on cognitive&#x2014;conscious or subconscious&#x2014;associations between multiple types of sensory stimuli and motor schema. One might intuitively expect that MI would be most useful in &#x201C;closed&#x201D; sports (e.g., diving, figure skating, and golf), and experimental studies of MI have commonly investigated simple movements (e.g., sequential finger tapping and object grasping) in a laboratory setting or, in sport-specific contexts, have focused on closed sport actions (e.g., golf swing) (<xref ref-type="bibr" rid="ref2">Decety et al., 1994</xref>; <xref ref-type="bibr" rid="ref24">Ross et al., 2003</xref>; <xref ref-type="bibr" rid="ref9">Guillot et al., 2008</xref>; <xref ref-type="bibr" rid="ref14">Kraeutner et al., 2014</xref>).</p>
<p>We were particularly interested in the more complex form of MI used by open sport athletes, where PETTLEP-based &#x201C;scripts&#x201D; of multifaceted scenarios are used as the foundation for an athlete&#x2019;s imagery practice. At a fundamental level, complex MI provides a model of how covert thought&#x2014;imagination&#x2014;can affect subsequent behavior, a mind&#x2013;brain phenomenon that presents an opportunity to explore the relationship between consciousness and action (<xref ref-type="bibr" rid="ref19">Moran et al., 2012</xref>). The specific goal of this exploratory study was to determine if the brain region(s) most involved in sustaining this form of dynamic imagination could be revealed by magnetoencephalography (MEG).</p>
</sec>
<sec sec-type="materials|methods" id="sec2">
<title>Materials and methods</title>
<p>The script used in this study of eight elite female ice hockey players incorporated all PETTLEP features with as much detail as possible (see <xref rid="app1" ref-type="app">Appendix</xref>: Visualization Script). Using an experimental block design, the athletes visualized a specific 2-on-1 ice hockey play (a goaltender used a different script) while their brain activity was monitored using MEG. In brief, a 3-min baseline recording (resting state, no task) was followed by 10 trials of 30&#x202F;s of imagery, alternating with 30&#x202F;s of a mental counting task. MEG beamforming was then used to contrast the data from the imagery condition with the two different control (resting state or mental counting) conditions.</p>
<sec id="sec3">
<title>Participants</title>
<p>A convenience sample of eight right-handed female elite ice hockey players (age range 20&#x2013;26&#x202F;years, active or less than 2&#x202F;years retired at the time of study) was recruited from the Canadian Women&#x2019;s Hockey League, the Canadian Women&#x2019;s National Ice Hockey Team, the National Collegiate Athletic Association, or U Sports to participate in the study. All of the athletes were experienced practitioners of complex sport-specific MI. Participants rehearsed their imagery script the night before and/or the morning of the MEG recording, and all of them reported that they felt comfortable with the script and confident in their ability to imagine it. Participants provided written informed consent, and the study was approved by the local institutional Research Ethics Board (19-5,755).</p>
</sec>
<sec id="sec4">
<title>Study design</title>
<p>There were three parts to the study. (1) The Movement Imagery Questionnaire&#x2014;Revised second version (MIQ-RS): completed by all participants to assess baseline visual and kinesthetic (seeing and feeling) imagery ability (see <xref rid="SM1" ref-type="supplementary-material">Supplementary material</xref>). (2) A 13-min MEG recording (after ~20&#x202F;min of setup and calibration): participants underwent MEG recording inside a magnetically shielded room, beginning with 3&#x202F;min of baseline resting state acquisition, followed by ten 30-s trials during which they performed the MI/visualization procedure, alternating with ten 30-s trials of a control task wherein participants performed a mental counting exercise (counting up by 3s &#x201C;in their head&#x201D;). Auditory &#x201C;GO&#x201D; cues triggered the 30-s visualization trials, and auditory &#x201C;STOP&#x201D; cues triggered the 30-s mental counting trials. (3) A structural magnetic resonance imaging (MRI) scan of the brain.</p>
<p>For this exploratory study, it was not clear what the optimal control condition would be to characterize the core components of the MI task. Areas of intrinsic network activity that may be detected at rest include bilateral sensorimotor and default mode regions, both of which might be expected to overlap with regions engaged by the MI task, given the frequent occurrence of motor imagination and episodic simulation (&#x201C;day-dreaming&#x201D;) during the resting state. Conversely, an active control task, such as mental counting, necessitates sustained engagement and attention but would be less likely to engage the visuomotor, executive, or emotional components of the MI task. Thus, we decided to conduct the same analyses with each of these controls.</p>
</sec>
<sec id="sec5">
<title>Motor imagery</title>
<p>As described above, the specific ice hockey visualization procedure that participants imagined upon hearing the &#x201C;GO&#x201D; cues during their MEG recordings was created using the PETTLEP guide to best mimic the real-life experience (see <xref rid="app1" ref-type="app">Appendix</xref>).</p>
</sec>
<sec id="sec6">
<title>MEG recordings</title>
<p>Recordings were acquired using an Elekta Neuromag TRIUX 306-channel MEG system (Helsinki, Finland). Sampling frequency was 1,000&#x202F;Hz with an online filter bandwidth of 0.1&#x2013;330&#x202F;Hz. Offline data processing commenced with artifact suppression using the default parameters (10-s time window, subspace correlation: 0.980) of the spatiotemporal signal space separation algorithm and head movement compensation, as implemented within the Elekta MaxFilter suite. Head position inside the MEG sensor array was determined from five coils attached to the scalp; a Polhemus 3D-Fastrak system (Colchester, United States) was used to digitize head shape and location of the head position indicator coils.</p>
</sec>
<sec id="sec7">
<title>MEG analyses</title>
<sec id="sec8">
<title>Forward model</title>
<p>To build the source model and co-register the individual anatomies to a single template, T1-weighted anatomical MRI scans (TR&#x202F;=&#x202F;8&#x202F;ms, TE&#x202F;=&#x202F;3&#x202F;ms, 146 slices, 220&#x202F;mm FOV, 256&#x00D7;256 matrix, 0.9&#x00D7;0.9&#x00D7;1.0&#x202F;mm voxels) were acquired for each participant using a GE 3-Tesla Signa MR System (Chicago, United States). Co-registration between MEG and MRI was performed using fiducial-based alignment supplemented with head-shape matching. Each MRI scan was warped to a template defined in a normalized space to allow for group analysis of functional data. Lead fields were computed using a single-shell volume conduction model (<xref ref-type="bibr" rid="ref20">Nolte, 2003</xref>) derived from each participant&#x2019;s segmented MRI scan. This model was chosen for its numerical stability and validity, given the magnetic transparency of the skull. Individual MRI scans were spatially normalized to the MNI template using non-linear spatial normalization (<xref ref-type="bibr" rid="ref1">Ashburner and Friston, 2005</xref>) to facilitate group-level averaging. The source space comprised a regular 3D grid with 5-mm spacing between adjacent nodes.</p>
</sec>
<sec id="sec9">
<title>Data processing</title>
<p>The gradiometer data from the MI task and the two control conditions (resting state and mental counting) were segmented into 1-s trials. Imagery and control epochs were then screened for artifacts using summary statistics (variance and kurtosis) computed across channels; epochs that appeared as gross outliers on visual inspection of these statistics, including those containing prominent ocular or muscle artifacts, were rejected. The mean epoch rejection rates were similarly low across all conditions: imagery 7.0% (median 4.3%, range 0.7&#x2013;21.7%), counting 6.6% (median 4.0%, range 1.7&#x2013;16.3%), and resting 5.2% (median 3.1%, range 1.7&#x2013;15.6%). This data-adaptive approach was chosen over fixed amplitude thresholds to account for inter-individual variability in baseline noise levels.</p>
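<p>The data-adaptive screening described above can be sketched as follows. This is an illustrative Python outline, not the authors' pipeline: the function name, the pooling of statistics over channels, and the z-score cutoff are assumptions made for the example.</p>

```python
# Illustrative sketch of variance/kurtosis-based epoch screening
# (assumed helper name and z-score cutoff; not the authors' code).
import numpy as np
from scipy.stats import kurtosis, zscore

def flag_outlier_epochs(epochs, z_cutoff=3.0):
    """epochs: array of shape (n_epochs, n_channels, n_samples)."""
    # One summary value per epoch: variance and kurtosis pooled over channels
    var_stat = epochs.var(axis=-1).mean(axis=-1)         # (n_epochs,)
    kurt_stat = kurtosis(epochs, axis=-1).mean(axis=-1)  # (n_epochs,)
    # Flag epochs for which either statistic is a gross outlier
    bad = (np.abs(zscore(var_stat)) > z_cutoff) | (np.abs(zscore(kurt_stat)) > z_cutoff)
    return np.flatnonzero(bad)

rng = np.random.default_rng(0)
data = rng.standard_normal((60, 10, 1000))
data[7] *= 20.0  # simulate one high-amplitude artifact epoch
print(flag_outlier_epochs(data))  # epoch 7 appears among the flagged indices
```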
<p>Power spectra were then computed for each condition, and the average power across all sensors was calculated. The log ratio of power between conditions, in decibels, was used to depict changes in sensor space as topographic maps across frequencies, as described previously (<xref ref-type="bibr" rid="ref6">Garcia Dominguez et al., 2023</xref>). To determine the frequency of interest (FOI), we calculated the average power changes across all sensors and performed paired <italic>t</italic>-tests at each frequency bin (1&#x2013;60&#x202F;Hz). This data-driven approach identified a continuous range of significant differences (<italic>p</italic>&#x202F;&#x003C;&#x202F;0.05) between 15 and 21&#x202F;Hz in the beta frequency band, centered at 18&#x202F;Hz for both contrast conditions. We expressly used this maximization strategy to optimize the sensitivity of the spatial beamformer to the strongest observed physiological signal, rather than to test a hypothesis regarding beta band specificity. In this study, at the FOI, the MI condition showed a notable desynchronization (power decrease) relative to both the mental counting control and the resting state baseline.</p>
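<p>The FOI search described above amounts to a per-bin paired comparison of sensor-averaged power. The following Python sketch illustrates the logic on simulated spectra; the analyses in this study were performed in MATLAB/FieldTrip, and all names and numbers here (including the simulated beta effect) are illustrative assumptions.</p>

```python
# Illustrative sketch of the FOI search: sensor-averaged power per
# condition, dB log-ratio, and a paired t-test at every 1-Hz bin.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_subj, freqs = 8, np.arange(1, 61)
# Hypothetical per-participant power spectra, already averaged over sensors
p_imagery = rng.lognormal(mean=0.0, sigma=0.1, size=(n_subj, freqs.size))
p_control = rng.lognormal(mean=0.0, sigma=0.1, size=(n_subj, freqs.size))
p_imagery[:, 14:21] *= 0.6  # simulate beta desynchronization at 15-21 Hz

log_ratio_db = 10 * np.log10(p_imagery / p_control)  # dB change per bin
t, p = ttest_rel(p_imagery, p_control, axis=0)       # paired test per bin
foi = freqs[p < 0.05]
print(foi)  # includes the simulated 15-21 Hz bins
```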
</sec>
<sec id="sec10">
<title>Inverse model</title>
<p>A frequency-domain beamformer&#x2014;dynamic imaging of coherent sources (DICS; <xref ref-type="bibr" rid="ref8">Gross et al., 2001</xref>)&#x2014;was then calculated at the FOI over the warped 3D grid. A common spatial filter was computed by pooling imagery and control conditions. This filter was then applied to each condition separately, resulting in a power estimate for each participant, node, and condition (<xref ref-type="bibr" rid="ref6">Garcia Dominguez et al., 2023</xref>). Since each node is mapped to the template brain, the outcome can be averaged over equivalent nodes across participants and can be plotted on the template brain directly. Because there are two conditions to contrast, we chose to express the beamformer power contrast as the relative change in power of control (resting state or mental counting) versus imagery over every node, calculated as [<italic>P</italic>(control) &#x2013; <italic>P</italic>(imagery)]/<italic>P</italic>(imagery), where <italic>P</italic> is power. This formulation allows decreases in power during imagery to appear as positive changes in source space figures. The DICS beamformer and frequency analyses were implemented using FieldTrip (<xref ref-type="bibr" rid="ref21">Oostenveld et al., 2011</xref>).</p>
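<p>A brief numeric illustration of the contrast [P(control) &#x2013; P(imagery)]/P(imagery) may help: a power decrease during imagery (desynchronization) yields a positive value, while a power increase yields a negative one. The function name below is introduced only for this example.</p>

```python
# Worked example of the relative power contrast used for the
# source-space figures (hypothetical helper name).
def relative_change(p_control, p_imagery):
    return (p_control - p_imagery) / p_imagery

print(round(relative_change(1.2, 1.0), 3))  # 0.2: power 20% lower during imagery
print(round(relative_change(0.9, 1.0), 3))  # -0.1: power higher during imagery
```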
</sec>
<sec id="sec11">
<title>Statistical analysis</title>
<p>To assess the significance of power changes between conditions, a non-parametric cluster-based permutation test was performed (<xref ref-type="bibr" rid="ref16">Maris and Oostenveld, 2007</xref>). Given the sample size (<italic>n</italic>&#x202F;=&#x202F;8), we implemented a custom exact permutation test using MATLAB (R2024b, The MathWorks, Inc.). First, dependent samples <italic>t</italic>-tests were computed at each voxel of the source grid, comparing the visualization and control conditions. Voxels exceeding a threshold of <italic>p</italic> &#x003C; 0.05 (uncorrected) were clustered based on spatial adjacency. Spatial clustering was performed on the interpolated 3D source grid (26 &#x00D7; 33 &#x00D7; 28 voxels) using 26-point connectivity (MATLAB bwconncomp function). The cluster-level statistic was defined as the sum of the <italic>t</italic>-values within each cluster. To evaluate significance, a null distribution was generated from all possible permutations of the condition labels (2<sup>8</sup>&#x202F;=&#x202F;256 permutations): for each permutation, the signs of the condition differences were flipped for one of the possible subsets of participants, and the maximum cluster statistic was recorded. The empirical cluster statistic was considered significant if it exceeded the 95th percentile of this exact null distribution (<italic>p</italic>&#x202F;&#x003C;&#x202F;0.05, two-tailed).</p>
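<p>The exact sign-flip procedure can be sketched as follows. This Python outline is illustrative only (the study used MATLAB): cluster formation is simplified here to 1-D adjacency rather than 26-point connectivity on the 3-D grid, and all function names and the simulated data are assumptions.</p>

```python
# Minimal sketch of an exact sign-flip permutation test over cluster
# statistics (n = 8, so 2**8 = 256 flip patterns). Cluster formation is
# simplified to contiguous runs on a 1-D voxel axis for illustration.
import itertools
import numpy as np
from scipy.stats import t as t_dist

def cluster_stats(diffs, t_crit):
    """Sum of t-values over contiguous supra-threshold voxels (1-D)."""
    n = diffs.shape[0]
    tvals = diffs.mean(0) / (diffs.std(0, ddof=1) / np.sqrt(n))
    above = np.abs(tvals) > t_crit
    sums, cur = [], 0.0
    for sig, tv in zip(above, tvals):
        if sig:
            cur += tv
        elif cur:
            sums.append(cur)
            cur = 0.0
    if cur:
        sums.append(cur)
    return sums

def exact_permutation_test(diffs):
    n = diffs.shape[0]                 # participants
    t_crit = t_dist.ppf(0.975, n - 1)  # two-tailed p < 0.05 voxel threshold
    observed = cluster_stats(diffs, t_crit)
    null_max = []
    for signs in itertools.product((1.0, -1.0), repeat=n):  # all 2**n flips
        flipped = diffs * np.array(signs)[:, None]
        s = cluster_stats(flipped, t_crit)
        null_max.append(max(np.abs(s)) if s else 0.0)
    crit = np.percentile(null_max, 95)
    return [(c, np.abs(c) > crit) for c in observed]

rng = np.random.default_rng(2)
diffs = rng.standard_normal((8, 40)) * 0.5
diffs[:, 10:16] += 2.0  # strong simulated effect in voxels 10-15
print(exact_permutation_test(diffs))  # the large cluster is significant
```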
</sec>
</sec>
</sec>
<sec sec-type="results" id="sec12">
<title>Results</title>
<p>All participants demonstrated good imagery ability on the MIQ-RS (<xref rid="SM1" ref-type="supplementary-material">Supplementary material</xref>) and reported no difficulty with visualization inside the magnetically shielded room.</p>
<p>The MEG beamforming results obtained at the group level, comparing relative change in power at the FOI for the different conditions (imagery versus resting state or mental counting), revealed prominent desynchronization&#x2014;that is, decreased oscillatory beta power, reflecting neuronal activation (<xref ref-type="bibr" rid="ref17">Miller et al., 2014</xref>)&#x2014;during the imagery condition in the left posterior hemisphere. This effect was maximal around the intraparietal sulcus and extended to the inferior aspect of the superior parietal lobule and the superior aspect of the inferior parietal lobule.</p>
<p><xref ref-type="fig" rid="fig1">Figure 1</xref> shows the contrast between imagery and the resting state baseline control.</p>
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption>
<p>Localization of neural activation during complex ice hockey motor imagery, group-level results, resting state contrast. <bold>(a)</bold> Topographic plots of average log-transformed spectral power differences in sensor space. Red indicates decreased power (neuronal activation) during imagery. <bold>(b)</bold> Cortical surface rendering of MEG beamformer results shows maximal activation in the left intraparietal sulcus and adjacent parietal association cortex and, to a lesser extent, pre- and post-central gyri. Color scales indicate percentiles of relative power change. <bold>(c)</bold> Source solution maximum from <bold>(b)</bold> is displayed on coronal and axial MRI slices, thresholded at the 99th percentile (top) and unthresholded (bottom).</p>
</caption>
<graphic xlink:href="fnhum-20-1754371-g001.tif" mimetype="image" mime-subtype="tiff">
<alt-text content-type="machine-generated">Panel A shows twenty-four MEG topographic plots at different frequencies from zero to 23.4 Hertz, each with color gradients from blue to red representing varying activity levels. Panel B contains two color-coded 3D brain models, one from above and one from the side, indicating activity percentiles using a range from blue to dark red. Panel C displays four MRI brain images with overlaid colored activation maps and intersecting reference lines, where color intensity highlights regions of interest in the left hemisphere.</alt-text>
</graphic>
</fig>
<p><xref ref-type="fig" rid="fig2">Figure 2</xref> shows the contrast between imagery and the mental counting control.</p>
<fig position="float" id="fig2">
<label>Figure 2</label>
<caption>
<p>Localization of neural activation during complex ice hockey motor imagery, group-level results, mental counting contrast. <bold>(a)</bold> Topographic plots of average log-transformed spectral power differences in sensor space. Red indicates decreased power (neuronal activation) during imagery. <bold>(b)</bold> Cortical surface rendering of MEG beamformer results shows maximal activation in left parietal association cortex and sensorimotor primary cortex. <bold>(c)</bold> Source solution maximum from <bold>(b)</bold> displayed on coronal and axial MRI slices, thresholded at the 99th percentile (top) and unthresholded (bottom). Color scales indicate percentiles of relative power change.</p>
</caption>
<graphic xlink:href="fnhum-20-1754371-g002.tif" mimetype="image" mime-subtype="tiff">
<alt-text content-type="machine-generated">Figure containing three panels depicting MEG and neuroimaging data. Panel a shows a grid of topographic head maps at various frequencies with color gradients indicating brain activity levels, ranging from blue (low) to red (high). Panel b presents two three-dimensional renderings of brain surfaces with color mapping to highlight percentiles of activity. Panel c displays four MRI brain images with crosshair overlays, marking highlighted regions in the left hemisphere using a colored heatmap.</alt-text>
</graphic>
</fig>
<p>The statistical analysis confirmed that the observed visualization-related desynchronization was significant in amplitude and spatial extent for both contrasts. Specifically, the cluster-based permutation test identified a significant cluster in the posterior left hemisphere parietal region (<italic>p</italic>&#x202F;&#x003C;&#x202F;0.01, corrected for multiple comparisons).</p>
<p>As suggested in the group-level figures, along with the maximal left parietal activation, significant desynchronization was also present during the imagery condition in the left &#x003E; right pre- and post-central gyri. This activation in primary sensorimotor cortex was most evident when the imagery condition was compared to the mental counting control, as opposed to the resting state, implying that the resting state contained greater baseline sensorimotor neural activity than the more focused non-motor mental counting state.</p>
<p>Remarkably, the same pattern of maximal activation in left posterior hemisphere cortex was evident at the individual level in all eight participants. This consistent finding included the goaltender, who used a different visualization script, indicating that the left parietal locus of activation did not depend on the precise content of an individual&#x2019;s MI but rather reflected the dynamic cognitive process of visualization itself.</p>
<p>At the individual level, lesser significant activation in the left and/or right sensorimotor cortex was observed in six out of eight participants (three unilateral left, two bilateral, and one unilateral right). Some of the athletes also showed lesser significant activation in adjacent left occipital cortex (two out of eight participants) and/or in contralateral right parietal cortex (two out of eight participants, most apparent in the goaltender, <xref ref-type="fig" rid="fig3">Figure 3</xref>).</p>
<fig position="float" id="fig3">
<label>Figure 3</label>
<caption>
<p>Individual-level MEG source localization during motor imagery versus mental counting control. <bold>(a)</bold> Participant 4, professional, 25&#x202F;years old. Left parietal activation is maximal with comparatively weaker activation in the sensorimotor cortices. <bold>(b)</bold> Participant 2, NCAA, 25&#x202F;years old. Left parieto-occipital and sensorimotor cortex activation. <bold>(c)</bold> Participant 3, U Sports, 20&#x202F;years old. Left parieto-occipital activation is greater than right parietal activation. <bold>(d)</bold> Participant 5, NCAA, 21&#x202F;years old, goaltender (different visualization script). Left parietal activation is greater than right parietal activation. All panels display beamformer source contrast maps color-coded by percentile of relative power decrease.</p>
</caption>
<graphic xlink:href="fnhum-20-1754371-g003.tif" mimetype="image" mime-subtype="tiff">
<alt-text content-type="machine-generated">Four 3D brain renderings labeled a, b, c, and d display color-coded activation levels based on a vertical scale from dark red (99.9 percent) to blue (0 percent), highlighting variable regional intensities across the cortex.</alt-text>
</graphic>
</fig>
</sec>
<sec sec-type="discussion" id="sec13">
<title>Discussion</title>
<p>Complex PETTLEP-guided visualization in this group of elite female ice hockey players was primarily associated with functional activation of left hemisphere parietal lobe areas involved in somatosensation, kinetic visuospatial processing, and multimodal sensorimotor integration (<xref ref-type="bibr" rid="ref4">Farah, 1989</xref>; <xref ref-type="bibr" rid="ref26">Sirigu et al., 1996</xref>). A similar activation pattern was evident when the MI task was contrasted against resting state or active counting conditions, indicating its specificity to the cognitive operations involved in this type of visualization. Despite, or perhaps because of, the complexity of the MI, this region was reliably activated in all participants and could be identified by MEG even at the individual level. Moreover, the same left parietal region was maximally activated in the goaltender, who used a different visualization script, demonstrating that it was not the precise imagery content (beyond being a dynamic multisensorial hockey-specific moving image) but rather the conscious performance of complex MI&#x2014;the imagining process itself&#x2014;that was detected by MEG beamforming.</p>
<p>Previous studies of MI using positron emission tomography (PET) or functional MRI (fMRI) have identified multiple additional cortical and subcortical regions of brain activation, including frontal premotor and supplementary motor areas, the cingulate gyrus, the caudate nucleus, and the cerebellum (<xref ref-type="bibr" rid="ref10">H&#x00E9;tu et al., 2013</xref>; <xref ref-type="bibr" rid="ref13">Jiang et al., 2015</xref>; <xref ref-type="bibr" rid="ref5">Filgueiras et al., 2018</xref>), none of which showed significant activation identifiable by MEG beamforming in the present study. The differences in localization across studies and imaging methods could conceivably reflect differences between complex and simple MI, between open and closed sports, between good imagers and less adept individuals (<xref ref-type="bibr" rid="ref9">Guillot et al., 2008</xref>), or perhaps especially between PET/fMRI and MEG beamforming.</p>
<p>Unlike PET or fMRI, MEG beamforming provided a direct measure of neural activity with millisecond temporal resolution, which could explain the more discrete localization found with MEG, as non-parietal areas identified by PET/fMRI could represent downstream network connections of the principal locus of image generation. The fast temporal resolution of MEG, combined with the short sliding window of the beamforming technique, may have been ideal for the identification of consistent, prolonged activations during visualization. In this context, the beamformer, in essence, acts as a low-pass filter to de-emphasize transient or spatially inconsistent activations that might be more likely to appear as metabolic changes in PET or fMRI (see <xref rid="SM1" ref-type="supplementary-material">Supplementary material</xref> for more discussion of this issue).</p>
<p>It is conceivable that complex MI in elite hockey players would not significantly activate anterior motor planning areas if the athletes subconsciously &#x201C;filter out&#x201D; or automatize the basic motor skills they are already expertly familiar with during visualization. Experienced athletes expend less energy and activate fewer brain areas than novice athletes during MI and sport-related decision-making (<xref ref-type="bibr" rid="ref24">Ross et al., 2003</xref>; <xref ref-type="bibr" rid="ref18">Milton et al., 2007</xref>; <xref ref-type="bibr" rid="ref5">Filgueiras et al., 2018</xref>). At the elite level, where these differences would be expected to be most pronounced, complex MI may be limited largely to the cognitive aspects of imagination and localized primarily to posterior left hemisphere parietal lobe cortex. Dynamic sensorimotor integration in this left parietal region during visualization could facilitate novel connections and optimize the cognitive hub of existing performance-related cortical&#x2013;subcortical neural networks, providing a possible mechanism whereby covert PETTLEP-guided thought can improve an athlete&#x2019;s subsequent performance.</p>
<p>The MEG identification of a left parietal cortical hub for complex MI provides both neurophysiological and neuroanatomical support for certain theoretical models derived from cognitive neuroscience; specifically, hierarchical internal forward (<xref ref-type="bibr" rid="ref28">Tian and Poeppel, 2010</xref>) and predictive-processing (<xref ref-type="bibr" rid="ref23">Ridderinkhof and Brass, 2015</xref>) models, which propose that &#x201C;efference copies&#x201D; or &#x201C;emulations&#x201D; of imagined actions are generated and dynamically updated in posterior parietal cortex, upstream from frontal motor planning and subcortical motor coordination areas. Viewed in terms of network connectivity, the continuously active parietal integration hub may transiently recruit other areas not needing sustained activation during imagery, but these areas would either be unidentified by MEG beamforming or show only mild or individual-level activation (e.g., primary sensorimotor, left occipital, or right parietal cortices).</p>
<p>Our findings, while preliminary, are hypothesis-generating, in particular about the brain mechanisms underlying image generation, dreaming, and phenomenal consciousness. Historically, neuropsychological lesion studies using componential analyses to identify focal deficits in image generation found a significant association with lesions in the left posterior hemisphere (<xref ref-type="bibr" rid="ref3">Farah, 1984</xref>; <xref ref-type="bibr" rid="ref27">Stangalino et al., 1995</xref>). Interestingly, some of the affected patients also reported a loss of dreaming (<xref ref-type="bibr" rid="ref3">Farah, 1984</xref>), and a left posterior hemisphere lateralization of dreaming was later supported in a neurological review of these and other cases (<xref ref-type="bibr" rid="ref7">Greenberg and Farah, 1986</xref>).</p>
<p>Dreaming is quite obviously a natural sleep analog to complex MI, and our demonstration of a similar neuroanatomical localization for complex MI supports and refines the classical imagery localization hypotheses. Localization of the neural correlates of dreaming has been a subject of interest in modern consciousness studies, with the identification of a posterior hemisphere &#x201C;hot zone&#x201D; for dreaming used to support the postulate of a similar neural locus for all phenomenal consciousness (<xref ref-type="bibr" rid="ref29">Tononi et al., 2016</xref>; <xref ref-type="bibr" rid="ref25">Siclari et al., 2017</xref>). Complex MI is a form of phenomenal consciousness that applies covert thought as a tool to alter subsequent motor behavior. The means by which thought can later influence human behavior, whether in the case of PETTLEP-guided MI for sports performance or as long-reported anecdotally in the case of dreaming, is an unresolved mind&#x2013;brain problem. Nevertheless, the physical substrate of the process appears to predominantly involve the posterior cortex of the left hemisphere.</p>
<sec id="sec14">
<title>Limitations</title>
<p>Our exploratory study aimed simply to determine if MEG could reliably detect evidence of localized neural activity during PETTLEP-guided complex MI. It was neither conceived as a pilot study nor as a feasibility study aiming to answer other hypothesis-driven questions. For example, it was not intended to compare putative differences in activation patterns between complex and simple MI, open and closed sports, or expert and novice athletes and imagers. The small and unusually homogeneous convenience sample of elite female hockey players experienced in visualization was purposely chosen to maximize the chance of identifying a visualization-related MEG signal; the robustness and reliability of the beamforming results were unexpected. It must be emphasized that the presented findings are limited to this specific type of MI and expert individual, and it cannot be presumed that similarly robust and reliable MEG localization would be found in other athletes or in the general population. Nevertheless, although this was not our aim, our results could be taken as support for the feasibility of using the same MEG beamforming methods in future studies designed to compare activation patterns across different types of MI or in athletes and non-athletes with varying levels of expertise.</p>
</sec>
</sec>
</body>
<back>
<sec sec-type="data-availability" id="sec15">
<title>Data availability statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p>
</sec>
<sec sec-type="ethics-statement" id="sec16">
<title>Ethics statement</title>
<p>The studies involving humans were approved by University Health Network (UHN) Research Ethics Board (University of Toronto). The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.</p>
</sec>
<sec sec-type="author-contributions" id="sec17">
<title>Author contributions</title>
<p>AP: Data curation, Visualization, Investigation, Conceptualization, Project administration, Formal analysis, Writing &#x2013; review &#x0026; editing, Methodology, Validation, Writing &#x2013; original draft. LG: Formal analysis, Conceptualization, Visualization, Software, Methodology, Validation, Writing &#x2013; original draft, Resources, Investigation, Data curation, Project administration, Writing &#x2013; review &#x0026; editing. DG: Writing &#x2013; review &#x0026; editing. MM: Writing &#x2013; review &#x0026; editing. RW: Supervision, Writing &#x2013; review &#x0026; editing, Software, Writing &#x2013; original draft, Resources, Investigation, Project administration, Formal analysis, Data curation, Validation, Methodology, Conceptualization, Visualization.</p>
</sec>
<sec sec-type="COI-statement" id="sec18">
<title>Conflict of interest</title>
<p>The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
<p>The author MM declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.</p>
</sec>
<sec sec-type="ai-statement" id="sec19">
<title>Generative AI statement</title>
<p>The author(s) declared that Generative AI was not used in the creation of this manuscript.</p>
<p>Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.</p>
</sec>
<sec sec-type="disclaimer" id="sec20">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<sec sec-type="supplementary-material" id="sec21">
<title>Supplementary material</title>
<p>The Supplementary material for this article can be found online at: <ext-link xlink:href="https://www.frontiersin.org/articles/10.3389/fnhum.2026.1754371/full#supplementary-material" ext-link-type="uri">https://www.frontiersin.org/articles/10.3389/fnhum.2026.1754371/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Data_Sheet_1.PDF" id="SM1" mimetype="application/PDF" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="ref1"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Ashburner</surname><given-names>J.</given-names></name> <name><surname>Friston</surname><given-names>K. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Unified segmentation</article-title>. <source>NeuroImage</source> <volume>26</volume>, <fpage>839</fpage>&#x2013;<lpage>851</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.02.018</pub-id></mixed-citation></ref>
<ref id="ref2"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Decety</surname><given-names>J.</given-names></name> <name><surname>Perani</surname><given-names>D.</given-names></name> <name><surname>Jeannerod</surname><given-names>M.</given-names></name> <name><surname>Bettinardi</surname><given-names>V.</given-names></name> <name><surname>Tadary</surname><given-names>B.</given-names></name> <name><surname>Woods</surname><given-names>R.</given-names></name> <etal/></person-group>. (<year>1994</year>). <article-title>Mapping motor representations with positron emission tomography</article-title>. <source>Nature</source> <volume>371</volume>, <fpage>600</fpage>&#x2013;<lpage>602</lpage>. doi: <pub-id pub-id-type="doi">10.1038/371600a0</pub-id>, <pub-id pub-id-type="pmid">7935791</pub-id></mixed-citation></ref>
<ref id="ref3"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Farah</surname><given-names>M. J.</given-names></name></person-group> (<year>1984</year>). <article-title>The neurological basis of mental imagery: a componential analysis</article-title>. <source>Cognition</source> <volume>18</volume>, <fpage>245</fpage>&#x2013;<lpage>272</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0010-0277(84)90026-x</pub-id>, <pub-id pub-id-type="pmid">6396031</pub-id></mixed-citation></ref>
<ref id="ref4"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Farah</surname><given-names>M. J.</given-names></name></person-group> (<year>1989</year>). <article-title>The neural basis of mental imagery</article-title>. <source>Trends Neurosci.</source> <volume>12</volume>, <fpage>395</fpage>&#x2013;<lpage>399</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0166-2236(89)90079-9</pub-id></mixed-citation></ref>
<ref id="ref5"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Filgueiras</surname><given-names>A.</given-names></name> <name><surname>Quintas Conde</surname><given-names>E. F.</given-names></name> <name><surname>Hall</surname><given-names>C. R.</given-names></name></person-group> (<year>2018</year>). <article-title>The neural basis of kinesthetic and visual imagery in sports: an ALE meta-analysis</article-title>. <source>Brain Imaging Behav.</source> <volume>12</volume>, <fpage>1513</fpage>&#x2013;<lpage>1523</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s11682-017-9813-9</pub-id>, <pub-id pub-id-type="pmid">29260381</pub-id></mixed-citation></ref>
<ref id="ref6"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Garcia Dominguez</surname><given-names>L.</given-names></name> <name><surname>Tarazi</surname><given-names>A.</given-names></name> <name><surname>Valiante</surname><given-names>T.</given-names></name> <name><surname>Wennberg</surname><given-names>R.</given-names></name></person-group> (<year>2023</year>). <article-title>Beamforming seizures from the temporal lobe using magnetoencephalography</article-title>. <source>Can. J. Neurol. Sci.</source> <volume>50</volume>, <fpage>201</fpage>&#x2013;<lpage>213</lpage>. doi: <pub-id pub-id-type="doi">10.1017/cjn.2022.1</pub-id></mixed-citation></ref>
<ref id="ref7"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Greenberg</surname><given-names>M. S.</given-names></name> <name><surname>Farah</surname><given-names>M. J.</given-names></name></person-group> (<year>1986</year>). <article-title>The laterality of dreaming</article-title>. <source>Brain Cogn.</source> <volume>5</volume>, <fpage>307</fpage>&#x2013;<lpage>321</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0278-2626(86)90034-5</pub-id></mixed-citation></ref>
<ref id="ref8"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Gross</surname><given-names>J.</given-names></name> <name><surname>Kujala</surname><given-names>J.</given-names></name> <name><surname>Hamalainen</surname><given-names>M.</given-names></name> <name><surname>Timmermann</surname><given-names>L.</given-names></name> <name><surname>Schnitzler</surname><given-names>A.</given-names></name> <name><surname>Salmelin</surname><given-names>R.</given-names></name></person-group> (<year>2001</year>). <article-title>Dynamic imaging of coherent sources: studying neural interactions in the human brain</article-title>. <source>Proc. Natl. Acad. Sci. USA</source> <volume>98</volume>, <fpage>694</fpage>&#x2013;<lpage>699</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.98.2.694</pub-id>, <pub-id pub-id-type="pmid">11209067</pub-id></mixed-citation></ref>
<ref id="ref9"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Guillot</surname><given-names>A.</given-names></name> <name><surname>Collet</surname><given-names>C.</given-names></name> <name><surname>Nguyen</surname><given-names>V. A.</given-names></name> <name><surname>Malouin</surname><given-names>F.</given-names></name> <name><surname>Richards</surname><given-names>C.</given-names></name> <name><surname>Doyon</surname><given-names>J.</given-names></name></person-group> (<year>2008</year>). <article-title>Functional neuroanatomical networks associated with expertise in motor imagery</article-title>. <source>NeuroImage</source> <volume>41</volume>, <fpage>1471</fpage>&#x2013;<lpage>1483</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2008.03.042</pub-id>, <pub-id pub-id-type="pmid">18479943</pub-id></mixed-citation></ref>
<ref id="ref10"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>H&#x00E9;tu</surname><given-names>S.</given-names></name> <name><surname>Gr&#x00E9;goire</surname><given-names>M.</given-names></name> <name><surname>Saimpont</surname><given-names>A.</given-names></name> <name><surname>Coll</surname><given-names>M. P.</given-names></name> <name><surname>Eug&#x00E8;ne</surname><given-names>F.</given-names></name> <name><surname>Michon</surname><given-names>P. E.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>The neural network of motor imagery: an ALE meta-analysis</article-title>. <source>Neurosci. Biobehav. Rev.</source> <volume>37</volume>, <fpage>930</fpage>&#x2013;<lpage>949</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neubiorev.2013.03.017</pub-id>, <pub-id pub-id-type="pmid">23583615</pub-id></mixed-citation></ref>
<ref id="ref11"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Holmes</surname><given-names>P. S.</given-names></name> <name><surname>Collins</surname><given-names>D. J.</given-names></name></person-group> (<year>2001</year>). <article-title>The PETTLEP approach to motor imagery: a functional equivalence model for sports psychologists</article-title>. <source>J. Appl. Sport Psychol.</source> <volume>13</volume>, <fpage>60</fpage>&#x2013;<lpage>83</lpage>. doi: <pub-id pub-id-type="doi">10.1080/10413200109339004</pub-id></mixed-citation></ref>
<ref id="ref12"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Jeannerod</surname><given-names>M.</given-names></name></person-group> (<year>1995</year>). <article-title>Mental imagery in the motor context</article-title>. <source>Neuropsychologia</source> <volume>33</volume>, <fpage>1419</fpage>&#x2013;<lpage>1432</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0028-3932(95)00073-c</pub-id>, <pub-id pub-id-type="pmid">8584178</pub-id></mixed-citation></ref>
<ref id="ref13"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Jiang</surname><given-names>D.</given-names></name> <name><surname>Edwards</surname><given-names>M. G.</given-names></name> <name><surname>Mullins</surname><given-names>P.</given-names></name> <name><surname>Callow</surname><given-names>N.</given-names></name></person-group> (<year>2015</year>). <article-title>The neural substrates for the different modalities of movement imagery</article-title>. <source>Brain Cogn.</source> <volume>97</volume>, <fpage>22</fpage>&#x2013;<lpage>31</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.bandc.2015.04.005</pub-id>, <pub-id pub-id-type="pmid">25956141</pub-id></mixed-citation></ref>
<ref id="ref14"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Kraeutner</surname><given-names>S.</given-names></name> <name><surname>Gionfriddo</surname><given-names>A.</given-names></name> <name><surname>Bardouille</surname><given-names>T.</given-names></name> <name><surname>Boe</surname><given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Motor imagery-based brain activity parallels that of motor execution: evidence from magnetic source imaging of cortical oscillations</article-title>. <source>Brain Res.</source> <volume>1588</volume>, <fpage>81</fpage>&#x2013;<lpage>91</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.brainres.2014.09.001</pub-id>, <pub-id pub-id-type="pmid">25251592</pub-id></mixed-citation></ref>
<ref id="ref15"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Lang</surname><given-names>P. J.</given-names></name></person-group> (<year>1979</year>). <article-title>A bio-informational theory of emotional imagery</article-title>. <source>Psychophysiology</source> <volume>16</volume>, <fpage>495</fpage>&#x2013;<lpage>512</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1469-8986.1979.tb01511.x</pub-id>, <pub-id pub-id-type="pmid">515293</pub-id></mixed-citation></ref>
<ref id="ref16"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Maris</surname><given-names>E.</given-names></name> <name><surname>Oostenveld</surname><given-names>R.</given-names></name></person-group> (<year>2007</year>). <article-title>Nonparametric statistical testing of EEG- and MEG-data</article-title>. <source>J. Neurosci. Methods</source> <volume>164</volume>, <fpage>177</fpage>&#x2013;<lpage>190</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jneumeth.2007.03.024</pub-id>, <pub-id pub-id-type="pmid">17517438</pub-id></mixed-citation></ref>
<ref id="ref17"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Miller</surname><given-names>K. J.</given-names></name> <name><surname>Honey</surname><given-names>C. J.</given-names></name> <name><surname>Hermes</surname><given-names>D.</given-names></name> <name><surname>Rao</surname><given-names>R. P.</given-names></name> <name><surname>denNijs</surname><given-names>M.</given-names></name> <name><surname>Ojemann</surname><given-names>J. G.</given-names></name></person-group> (<year>2014</year>). <article-title>Broadband changes in the cortical surface potential track activation of functionally diverse neuronal populations</article-title>. <source>NeuroImage</source> <volume>85 Pt 2</volume>, <fpage>711</fpage>&#x2013;<lpage>720</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2013.08.070</pub-id>, <pub-id pub-id-type="pmid">24018305</pub-id></mixed-citation></ref>
<ref id="ref18"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Milton</surname><given-names>J.</given-names></name> <name><surname>Solodkin</surname><given-names>A.</given-names></name> <name><surname>Hlu&#x0161;t&#x00ED;k</surname><given-names>P.</given-names></name> <name><surname>Small</surname><given-names>S. L.</given-names></name></person-group> (<year>2007</year>). <article-title>The mind of expert motor performance is cool and focused</article-title>. <source>NeuroImage</source> <volume>35</volume>, <fpage>804</fpage>&#x2013;<lpage>813</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.01.003</pub-id>, <pub-id pub-id-type="pmid">17317223</pub-id></mixed-citation></ref>
<ref id="ref19"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Moran</surname><given-names>A.</given-names></name> <name><surname>Guillot</surname><given-names>A.</given-names></name> <name><surname>Macintyre</surname><given-names>T.</given-names></name> <name><surname>Collet</surname><given-names>C.</given-names></name></person-group> (<year>2012</year>). <article-title>Re-imagining motor imagery: building bridges between cognitive neuroscience and sport psychology</article-title>. <source>Br. J. Psychol.</source> <volume>103</volume>, <fpage>224</fpage>&#x2013;<lpage>247</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.2044-8295.2011.02068.x</pub-id>, <pub-id pub-id-type="pmid">22506748</pub-id></mixed-citation></ref>
<ref id="ref20"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Nolte</surname><given-names>G.</given-names></name></person-group> (<year>2003</year>). <article-title>The magnetic lead field theorem in the quasi-static approximation and its use for magnetoencephalography forward calculation in realistic volume conductors</article-title>. <source>Phys. Med. Biol.</source> <volume>48</volume>, <fpage>3637</fpage>&#x2013;<lpage>3652</lpage>. doi: <pub-id pub-id-type="doi">10.1088/0031-9155/48/22/002</pub-id>, <pub-id pub-id-type="pmid">14680264</pub-id></mixed-citation></ref>
<ref id="ref21"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Oostenveld</surname><given-names>R.</given-names></name> <name><surname>Fries</surname><given-names>P.</given-names></name> <name><surname>Maris</surname><given-names>E.</given-names></name> <name><surname>Schoffelen</surname><given-names>J. M.</given-names></name></person-group> (<year>2011</year>). <article-title>FieldTrip: open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data</article-title>. <source>Comput. Intell. Neurosci.</source> <volume>2011</volume>:<fpage>156869</fpage>. doi: <pub-id pub-id-type="doi">10.1155/2011/156869</pub-id>, <pub-id pub-id-type="pmid">21253357</pub-id></mixed-citation></ref>
<ref id="ref22"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Pearson</surname><given-names>J.</given-names></name> <name><surname>Naselaris</surname><given-names>T.</given-names></name> <name><surname>Holmes</surname><given-names>E. A.</given-names></name> <name><surname>Kosslyn</surname><given-names>S. M.</given-names></name></person-group> (<year>2015</year>). <article-title>Mental imagery: functional mechanisms and clinical applications</article-title>. <source>Trends Cogn. Sci.</source> <volume>19</volume>, <fpage>590</fpage>&#x2013;<lpage>602</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2015.08.003</pub-id>, <pub-id pub-id-type="pmid">26412097</pub-id></mixed-citation></ref>
<ref id="ref23"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Ridderinkhof</surname><given-names>K. R.</given-names></name> <name><surname>Brass</surname><given-names>M.</given-names></name></person-group> (<year>2015</year>). <article-title>How kinesthetic motor imagery works: a predictive-processing theory of visualization in sports and motor expertise</article-title>. <source>J. Physiol. Paris</source> <volume>109</volume>, <fpage>53</fpage>&#x2013;<lpage>63</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jphysparis.2015.02.003</pub-id>, <pub-id pub-id-type="pmid">25817985</pub-id></mixed-citation></ref>
<ref id="ref24"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Ross</surname><given-names>J. S.</given-names></name> <name><surname>Tkach</surname><given-names>J.</given-names></name> <name><surname>Ruggieri</surname><given-names>P. M.</given-names></name> <name><surname>Lieber</surname><given-names>M.</given-names></name> <name><surname>Lapresto</surname><given-names>E.</given-names></name></person-group> (<year>2003</year>). <article-title>The mind&#x2019;s eye: functional MR imaging evaluation of golf motor imagery</article-title>. <source>Am. J. Neuroradiol.</source> <volume>24</volume>, <fpage>1036</fpage>&#x2013;<lpage>1044</lpage>, <pub-id pub-id-type="pmid">12812924</pub-id></mixed-citation></ref>
<ref id="ref25"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Siclari</surname><given-names>F.</given-names></name> <name><surname>Baird</surname><given-names>B.</given-names></name> <name><surname>Perogamvros</surname><given-names>L.</given-names></name> <name><surname>Bernardi</surname><given-names>G.</given-names></name> <name><surname>LaRocque</surname><given-names>J. J.</given-names></name> <name><surname>Riedner</surname><given-names>B.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>The neural correlates of dreaming</article-title>. <source>Nat. Neurosci.</source> <volume>20</volume>, <fpage>872</fpage>&#x2013;<lpage>878</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nn.4545</pub-id>, <pub-id pub-id-type="pmid">28394322</pub-id></mixed-citation></ref>
<ref id="ref26"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Sirigu</surname><given-names>A.</given-names></name> <name><surname>Duhamel</surname><given-names>J. R.</given-names></name> <name><surname>Cohen</surname><given-names>L.</given-names></name> <name><surname>Pillon</surname><given-names>B.</given-names></name> <name><surname>Dubois</surname><given-names>B.</given-names></name> <name><surname>Agid</surname><given-names>Y.</given-names></name></person-group> (<year>1996</year>). <article-title>The mental representation of hand movements after parietal cortex damage</article-title>. <source>Science</source> <volume>273</volume>, <fpage>1564</fpage>&#x2013;<lpage>1568</lpage>. doi: <pub-id pub-id-type="doi">10.1126/science.273.5281.1564</pub-id>, <pub-id pub-id-type="pmid">8703221</pub-id></mixed-citation></ref>
<ref id="ref27"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Stangalino</surname><given-names>C.</given-names></name> <name><surname>Semenza</surname><given-names>C.</given-names></name> <name><surname>Mondini</surname><given-names>S.</given-names></name></person-group> (<year>1995</year>). <article-title>Generating visual mental images: deficit after brain damage</article-title>. <source>Neuropsychologia</source> <volume>33</volume>, <fpage>1473</fpage>&#x2013;<lpage>1483</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0028-3932(95)00076-f</pub-id>, <pub-id pub-id-type="pmid">8584181</pub-id></mixed-citation></ref>
<ref id="ref28"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Tian</surname><given-names>X.</given-names></name> <name><surname>Poeppel</surname><given-names>D.</given-names></name></person-group> (<year>2010</year>). <article-title>Mental imagery of speech and movement implicates the dynamics of internal forward models</article-title>. <source>Front. Psychol.</source> <volume>1</volume>:<fpage>166</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2010.00166</pub-id>, <pub-id pub-id-type="pmid">21897822</pub-id></mixed-citation></ref>
<ref id="ref29"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Tononi</surname><given-names>G.</given-names></name> <name><surname>Boly</surname><given-names>M.</given-names></name> <name><surname>Massimini</surname><given-names>M.</given-names></name> <name><surname>Koch</surname><given-names>C.</given-names></name></person-group> (<year>2016</year>). <article-title>Integrated information theory: from consciousness to its physical substrate</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>17</volume>, <fpage>450</fpage>&#x2013;<lpage>461</lpage>. doi: <pub-id pub-id-type="doi">10.1038/nrn.2016.44</pub-id>, <pub-id pub-id-type="pmid">27225071</pub-id></mixed-citation></ref>
<ref id="ref30"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Wakefield</surname><given-names>C.</given-names></name> <name><surname>Smith</surname><given-names>D.</given-names></name> <name><surname>Moran</surname><given-names>A. P.</given-names></name> <name><surname>Holmes</surname><given-names>P.</given-names></name></person-group> (<year>2013</year>). <article-title>Functional equivalence or behavioral matching? A critical reflection on 15 years of research using the PETTLEP model of motor imagery</article-title>. <source>Int. Rev. Sport Exerc. Psychol.</source> <volume>6</volume>, <fpage>105</fpage>&#x2013;<lpage>121</lpage>. doi: <pub-id pub-id-type="doi">10.1080/1750984X.2012.724437</pub-id></mixed-citation></ref>
<ref id="ref31"><mixed-citation publication-type="journal"><person-group person-group-type="author"><name><surname>Weinberg</surname><given-names>R.</given-names></name></person-group> (<year>2008</year>). <article-title>Does imagery work? Effects on performance and mental skills</article-title>. <source>J. Imag. Res. Sport Phys. Act.</source> <volume>3</volume>:<fpage>1932</fpage>. doi: <pub-id pub-id-type="doi">10.2202/1932-0191.1025</pub-id></mixed-citation></ref>
</ref-list>
<fn-group>
<fn fn-type="custom" custom-type="edited-by" id="fn0001">
<p>Edited by: <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/396763/overview">Nayara Mota</ext-link>, Universidade Cat&#x00F3;lica do Salvador, Brazil</p>
</fn>
<fn fn-type="custom" custom-type="reviewed-by" id="fn0002">
<p>Reviewed by: <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/667215/overview">Johannes Vorwerk</ext-link>, University of Innsbruck, Austria</p>
<p><ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/2963655/overview">Zhengxiang Cai</ext-link>, Carnegie Mellon University, United States</p>
</fn>
</fn-group>
<app-group>
<app id="app1"><label>Appendix:</label> <title>Visualization script</title><sec id="sec22">
<title>Set up: perspective in first person</title>
<p>Imagine yourself playing a game against an important opponent (choose an opponent). Imagine yourself in your uniform and equipment. Imagine the feeling of being dressed and holding your stick. Imagine you are in the third period of this game and the score is tied 1&#x2013;1. Imagine yourself sitting on the bench watching the play go by, waiting for your next opportunity to go out and sacrifice for the team. You are sweaty, breathing heavily, and your adrenaline level is high. Picture the sounds of the crowd, your coach, and your teammates around you. Imagine the cool air from the rink hitting the sweat on your skin.</p>
<p>Imagined scenario (<italic>what participants imagined during the &#x201C;GO&#x201D; cues&#x002A;</italic>)</p>
<p>Your coach calls you to play and you hop over the boards on the line change. You are on the (left or right, depending on your handedness) wing, and your defender passes you the puck as you begin to skate toward the opponent&#x2019;s zone. You cradle the puck with ease and pick up speed. You are flying down the ice, handling the puck on your blade with your head up, reading the play and the players around you. You see that you are on a two-on-one with a teammate against the opposing defender. Your heart is pumping, but you are composed, ready to seize the moment. At this point, you are at the opposing team&#x2019;s blue line and you have to make a play. You notice their defender cheats toward the teammate on the two-on-one with you. You decide to fake the pass to your teammate, turning your head and stick toward them to fake out the goalie and the defender, but at the last second you turn toward the net and fire a hard shot with all your muscle and power into the top (left for right-handed players, right for left-handed players) corner for a goal. The crowd jumps to its feet, and you feel a rush of excitement and joy as you throw up your hands to celebrate the goal, skating over to your teammates for a celebratory hug or bear tackle.</p>
<p>Reminders:</p>
<list list-type="bullet">
<list-item>
<p>Try to imagine the situation in its entirety, with as much detail and physiological awareness as possible.</p>
</list-item>
<list-item>
<p>Imagine the sounds, sensations, smells, tastes, emotions, and sights. Channel your focus, heartbeat, sweat, adrenaline, excitement, breath, physical movements, muscle tension, strength, balance, power, the puck on your stick, your hands around the stick, the skates cutting into the ice, your teammates calling, your coaches talking, the crowd cheering, the wind in your face and hair, the deception in your body positioning and any other details that relate to the visualized situation.</p>
</list-item>
<list-item>
<p>Keep the PETTLEP model in mind: physical, environment, task, timing, learning, emotion, perspective.</p>
</list-item>
</list>
<p>&#x002A;One participant, a goaltender, imagined a different scenario specific to her position: a penalty-killing situation (her team playing one player short), with the opposing team controlling the puck in her own team&#x2019;s zone and moving it from one opponent to another through a series of passes in a structured power-play formation, while her own repositioning movements mirror the changing location of the puck handler. The imagined play culminates in a final cross-ice pass just in front of the goal crease and a one-timer that she saves with a strong lateral push and power slide, having read the play perfectly and allowing no rebound.</p>
<p>The goaltender&#x2019;s scenario presumably differed most from the other participants&#x2019; scenario in the task and learning elements of the PETTLEP model (and possibly the emotion element, if preventing a goal is more emotionally taxing than scoring one, as is likely in hockey).</p>
</sec></app>
</app-group>
</back>
</article>