<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="discussion">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neurosci.</journal-id>
<journal-title>Frontiers in Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-453X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnins.2022.859887</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Opinion</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Active Brain-Computer Interfacing for Healthy Users</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Shishkin</surname> <given-names>Sergei L.</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/84662/overview"/>
</contrib>
</contrib-group>
<aff><institution>MEG Center, Moscow State University of Psychology and Education</institution>, <addr-line>Moscow</addr-line>, <country>Russia</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Giovanni Mirabella, University of Brescia, Italy</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Luca Falciati, University of Brescia, Italy</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Sergei L. Shishkin <email>sergshishkin&#x00040;mail.ru</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Neural Technology, a section of the journal Frontiers in Neuroscience</p></fn></author-notes>
<pub-date pub-type="epub">
<day>25</day>
<month>04</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>16</volume>
<elocation-id>859887</elocation-id>
<history>
<date date-type="received">
<day>21</day>
<month>01</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>30</day>
<month>03</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Shishkin.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Shishkin</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license> </permissions>
<kwd-group>
<kwd>brain-computer interfaces</kwd>
<kwd>active BCI</kwd>
<kwd>human-computer interaction</kwd>
<kwd>human-machine interfaces</kwd>
<kwd>healthy users</kwd>
</kwd-group>
<contract-num rid="cn001">22-29-01361</contract-num>
<contract-sponsor id="cn001">Russian Science Foundation<named-content content-type="fundref-id">10.13039/501100006769</named-content></contract-sponsor>
<counts>
<fig-count count="0"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="47"/>
<page-count count="4"/>
<word-count count="3521"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>Brain-computer interface (BCI) research and development continues to grow. In particular, BCI patent applications have been increasing exponentially in recent years (Greenberg et al., <xref ref-type="bibr" rid="B19">2021</xref>). The situation differs, however, across kinds of BCI: <italic>invasive</italic> and <italic>non-invasive, active</italic> and <italic>passive</italic>, especially regarding possible use by healthy users. <italic>Invasive</italic> BCIs provide the best performance, and may even provide access to early stages of motor decision formation, enabling faster interaction than conventional input devices (Mirabella and Lebedev, <xref ref-type="bibr" rid="B27">2017</xref>), but they are associated with high risk and cost, and are unlikely to become available to healthy users in the near future. Existing <italic>non-invasive</italic> BCIs have low bandwidth, speed, and accuracy, which is why only <italic>passive</italic>, not active, BCIs have been considered a prospective technology for healthy users in the roadmap of brain/neural-computer interaction (<xref ref-type="bibr" rid="B4">BNCI Horizon 2020</xref>, <xref ref-type="bibr" rid="B4">2015</xref>; Brunner et al., <xref ref-type="bibr" rid="B6">2015</xref>). Passive BCIs are those that use &#x0201C;brain activity arising without the purpose of voluntary control&#x0201D; (Zander and Kothe, <xref ref-type="bibr" rid="B45">2011</xref>). As they do not demand the user&#x00027;s attention, their low speed of interaction can be acceptable (Current Research in Neuroadaptive Technology, <xref ref-type="bibr" rid="B12">2021</xref>).</p>
<p>In contrast, a user of an <italic>active</italic> BCI controls an application explicitly, via conscious control of his or her brain activity (Zander and Kothe, <xref ref-type="bibr" rid="B45">2011</xref>)<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref>. These BCIs have to compete with manual input devices (keyboard, mouse, touchscreen) and emerging touchless alternatives (voice-, gesture-, and gaze-based), as they play the same role in human-computer interaction (HCI) (Lance et al., <xref ref-type="bibr" rid="B23">2012</xref>; van Erp et al., <xref ref-type="bibr" rid="B41">2012</xref>). Although attempts to dramatically improve the performance of non-invasive BCIs by advancing brain sensor technology have been announced (most noticeably, Facebook&#x00027;s plans to enable fast text input &#x0201C;directly from your brain&#x0201D;&#x02014;Constine, <xref ref-type="bibr" rid="B10">2017</xref>), electroencephalography (EEG) remains the only widely used technology, and performance is still below what is provided by electromechanical input devices. For example, the best reported average activation time of a non-invasive asynchronous &#x0201C;brain switch&#x0201D; (a BCI requiring a low false positive rate but enabling detection of only one discrete command) is about 1.5 s (Zheng et al., <xref ref-type="bibr" rid="B47">2022</xref>). Moreover, while some non-medical active BCIs use well-established non-invasive BCI paradigms&#x02014;the motor imagery BCI, the P300 BCI, the steady-state visual evoked potential (SSVEP) BCI, and the code-modulated visual evoked potential (c-VEP) BCI&#x02014;many projects rely on even less precise control based on learned modulation of EEG rhythms (Nijholt, <xref ref-type="bibr" rid="B28">2019</xref>; Prpa and Pasquier, <xref ref-type="bibr" rid="B34">2019</xref>; Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>). Because of this low performance, active BCIs remain practical mainly for people who cannot use other input, such as paralyzed individuals.</p>
<p>Nevertheless, attempts to develop active BCIs for healthy people continue. In this Opinion, I briefly overview the application areas for which they are currently developed, then try to identify what motivates these attempts and what the near-term prospects are.</p>
</sec>
<sec id="s2">
<title>Applications</title>
<p>What types of non-medical applications of active BCIs have been developed and studied in recent years? In my view, most of them fall into one of several groups:</p>
<p><italic>1. Games</italic>&#x02014;BCI gaming remains the most studied application of active BCIs for healthy users (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>). In this application, the input imprecision inherent to non-invasive BCIs is not always as critical as in most real-life applications, and can even serve as a part of intentionally constructed uncertainty within the gameplay (Nijholt et al., <xref ref-type="bibr" rid="B29">2009</xref>). Commercial EEG devices for gaming have been produced for more than 10 years, and the games developed for them are becoming increasingly user-friendly (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>). Both active and passive BCIs are studied as means to interact with games, but both are still far from becoming a widely accepted game input, partly due to low performance. The low popularity of BCI games in the gamer community may also be related to insufficient attention to studying interaction in BCI games and to developing relevant game design and software and hardware solutions (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>; Cattan, <xref ref-type="bibr" rid="B7">2021</xref>).</p>
<p><italic>2. Art</italic>&#x02014;Another BCI application for healthy users is the use of BCIs by enthusiast artists in performances and in creating pieces of art, i.e., &#x0201C;brain art&#x0201D; (Nijholt, <xref ref-type="bibr" rid="B28">2019</xref>) or &#x0201C;BCI art&#x0201D; (Prpa and Pasquier, <xref ref-type="bibr" rid="B34">2019</xref>). These projects are very diverse (Brain Art, <xref ref-type="bibr" rid="B5">2019</xref>; Bernal et al., <xref ref-type="bibr" rid="B2">2021</xref>) but, unfortunately, rarely documented in the scientific literature (Prpa and Pasquier, <xref ref-type="bibr" rid="B34">2019</xref>; Friedman, <xref ref-type="bibr" rid="B17">2020</xref>). Of the 61 BCI art projects surveyed by Prpa and Pasquier (<xref ref-type="bibr" rid="B34">2019</xref>), mostly described in non-scientific sources such as YouTube videos, 18 used active or reactive control (Table 3.4 in Prpa and Pasquier, <xref ref-type="bibr" rid="B34">2019</xref>). For brain art, as for BCI games, robustness and efficiency may be considered less important than experience (Nijholt et al., <xref ref-type="bibr" rid="B30">2022</xref>).</p>
<p><italic>3. Autonomous-driving vehicles</italic>&#x02014;BCI control of autonomous vehicles is increasingly considered for healthy users (Rehman et al., <xref ref-type="bibr" rid="B36">2018</xref>; Chai et al., <xref ref-type="bibr" rid="B9">2021</xref>; Hekmatmanesh et al., <xref ref-type="bibr" rid="B21">2021</xref>). One such BCI, presented by Mercedes-Benz in their concept car (Rosso, <xref ref-type="bibr" rid="B37">2021</xref>), enabled &#x0201C;selecting the navigation destination by thought control, switching the ambient light in the interior or changing the radio station&#x0201D; (Mercedes-Benz VISION AVTR, <xref ref-type="bibr" rid="B26">2021</xref>).</p>
<p><italic>4. Augmented and virtual reality (AR/VR)</italic>&#x02014;While these technologies are improving quickly, input in AR/VR is still far from perfect. Active BCIs therefore have some chance to compete, either as a general-purpose AR/VR input method or in connection with BCI games and BCI art (Putze, <xref ref-type="bibr" rid="B35">2019</xref>; Cattan et al., <xref ref-type="bibr" rid="B8">2020</xref>; Paszkiel, <xref ref-type="bibr" rid="B32">2020</xref>; Wen et al., <xref ref-type="bibr" rid="B44">2021</xref>). Notably, NextMind, the company that provided their BCI for the above-mentioned Mercedes car (Rosso, <xref ref-type="bibr" rid="B37">2021</xref>), was recently purchased by an AR developer (Heath, <xref ref-type="bibr" rid="B20">2022</xref>).</p>
<p>Attempts have also been made to develop BCIs that enable additional input when both arms are busy (&#x0201C;third arm&#x0201D;; Penaloza and Nishio, <xref ref-type="bibr" rid="B33">2018</xref>), or even replace normal input devices in some tasks by providing more effortless and fluent control (&#x0201C;wish mouse,&#x0201D; Shishkin et al., <xref ref-type="bibr" rid="B40">2016</xref>). In these areas, BCI performance remains well below what is acceptable for practical applications.</p>
</sec>
<sec id="s3">
<title>Motivations</title>
<p>Why do some BCI developers expect that healthy users would prefer BCIs over other input technologies that are more accurate, faster, and more robust?</p>
<p><italic>1. Practical reasons</italic>&#x02014;AR/VR and, less obviously, autonomous-driving cars are special cases where traditional input means do not fit the technology well. Here, BCIs compete with emerging control approaches based on movements of the head, body, hands (gestures), and gaze, each of which has its own shortcomings. Moreover, if a user already wears a head-mounted display, adding BCI control to it does not necessarily entail a significant increase in price or inconvenience. In an autonomous-driving car, the price increase would be even less noticeable; in this case, there is also a range of tasks where response time and accuracy are not critical (see the Mercedes example above). However, in almost all applications, productivity and efficiency are not what non-invasive BCIs are valued for (I refrain here from discussing neurofeedback-based training, which is typically based on technologies somewhat different from BCIs&#x02014;the only exception, to my knowledge, is Arvaneh et al., <xref ref-type="bibr" rid="B1">2019</xref>).</p>
<p><italic>2. Experience</italic>&#x02014;In HCI, productivity and efficiency are not the only things valued; increasingly, so are various aspects of interaction experience, such as &#x0201C;affect, comfort, family, community, or playfulness,&#x0201D; where BCI technologies have certain advantages (Bernal et al., <xref ref-type="bibr" rid="B2">2021</xref>; Nijholt et al., <xref ref-type="bibr" rid="B30">2022</xref>). In some cases, BCI-based interaction brings a highly paradoxical experience: for example, a long-known feature of control based on the alpha rhythm is that &#x0201C;the more you try, the less likely is to succeed&#x0201D; (Lucier and Simon, <xref ref-type="bibr" rid="B25">1980</xref>, cited by Prpa and Pasquier, <xref ref-type="bibr" rid="B34">2019</xref>, p. 102). User experience is especially important for BCI art (Nijholt, <xref ref-type="bibr" rid="B28">2019</xref>; Nijholt et al., <xref ref-type="bibr" rid="B30">2022</xref>), but also for BCI games and AR/VR (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>; Cattan, <xref ref-type="bibr" rid="B7">2021</xref>; Nijholt et al., <xref ref-type="bibr" rid="B30">2022</xref>), and even for autonomous driving (where the goal for a BCI is &#x0201C;to further enhance driving comfort in the future&#x0201D; and to open up &#x0201C;revolutionary possibilities for intuitive interaction with the vehicle,&#x0201D; Mercedes-Benz VISION AVTR, <xref ref-type="bibr" rid="B26">2021</xref>).</p>
<p>The unique BCI experience in BCI art and in some BCI games can be partly associated with one interesting feature of BCI-based control, not found in computer input methods that exclude passive interaction: <italic>an active BCI makes passive BCI control possible, and vice versa</italic>. As Anton Nijholt explained: &#x0201C;Obviously, when a subject is told to wear a BCI cap he or she can become aware and learn how changes are related to a mental state and can turn passive BCI into active BCI by producing different mental states. A subject&#x00027;s active and reactive BCI performance can be dependent on his or her mental state&#x0201D; (Nijholt, <xref ref-type="bibr" rid="B28">2019</xref>, p. 6). It is tempting to hypothesize that this &#x0201C;fuzziness&#x0201D; of conscious control may open the door for the user&#x00027;s unconscious to cause desirable but suppressed actions. This could help artists express something that is difficult to express in other ways, and possibly lead to unusual, engaging experiences in games. To my knowledge, such &#x0201C;fuzziness&#x0201D; has never been addressed in experimental research.</p>
<p>Moreover, the experience of healthy users of active BCI control has so far been studied very little (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>; Cattan, <xref ref-type="bibr" rid="B7">2021</xref>). The most systematic study, to my knowledge, was conducted by Schmid and Jox (<xref ref-type="bibr" rid="B39">2021</xref>), who engaged (apart from professional BCI researchers and developers) only three participants with regular BCI use experience (BCI gamers).</p>
</sec>
<sec id="s4">
<title>Perspectives</title>
<p>As the previous two sections suggest, the development of active BCIs for healthy users has continued in recent years, but the focus has been on applications in which user experience is valued more than productivity and efficiency. Greater attention from researchers and developers to experience-related issues could therefore do much to make these BCIs viable in the near future (Vasiljevic and de Miranda, <xref ref-type="bibr" rid="B42">2020</xref>; Cattan, <xref ref-type="bibr" rid="B7">2021</xref>).</p>
<p>Even though the unique experience of interaction mediated by active BCIs provides certain advantages in their competition with traditional input means, improved BCI performance is still highly desirable. One possible route is the use of deep neural networks as BCI classifiers (Craik et al., <xref ref-type="bibr" rid="B11">2019</xref>; Roy et al., <xref ref-type="bibr" rid="B38">2019</xref>). However, such classifiers often have many parameters and therefore can rarely be well-trained on single-session data. The current trend of increasing availability of large datasets, on which more advanced classifiers can be trained, may therefore enable significant performance improvements. Further development of transfer learning (e.g., Zanini et al., <xref ref-type="bibr" rid="B46">2017</xref>; Fahimi et al., <xref ref-type="bibr" rid="B14">2019</xref>; Dehghani et al., <xref ref-type="bibr" rid="B13">2021</xref>) and, more recently, meta-learning (Li et al., <xref ref-type="bibr" rid="B24">2021</xref>; Bhosale et al., <xref ref-type="bibr" rid="B3">2022</xref>; Wei et al., <xref ref-type="bibr" rid="B43">2022</xref>) approaches makes it possible to apply a classifier trained on large multi-subject datasets to data from new users. Additional opportunities can be found in combining different BCI modalities and in creating hybrid systems based on the joint use of a BCI and other input devices (Wen et al., <xref ref-type="bibr" rid="B44">2021</xref>).</p>
<p>Improved performance may make feasible modifications of existing BCI paradigms that provide a more intense experience. In BCI games, for example, better classification may help turn the P300 paradigm into its single-trial (Finke et al., <xref ref-type="bibr" rid="B16">2009</xref>; Ganin et al., <xref ref-type="bibr" rid="B18">2013</xref>) and single-stimulus (Fedorova et al., <xref ref-type="bibr" rid="B15">2014</xref>) modifications, enabling tighter integration with gameplay and higher immersion (Kaplan et al., <xref ref-type="bibr" rid="B22">2013</xref>); the &#x0201C;quasi-movement&#x0201D; paradigm (Nikulin et al., <xref ref-type="bibr" rid="B31">2008</xref>) may offer easier training and, possibly, a more intense experience than the traditional motor imagery BCI.</p>
<p>If passive BCIs become widely used by healthy users, their hardware could also be used for active BCIs. Similarly, wide use of gaze-based control by healthy users may make hybrid interfaces combining gaze and EEG more affordable.</p>
<p>In summary, while non-invasive active BCIs for healthy users are not currently a mature technology, further efforts by researchers and developers may soon lead to the creation of affordable products.</p>
</sec>
<sec id="s5">
<title>Author Contributions</title>
<p>The author confirms being the sole contributor of this work and has approved it for publication.</p>
</sec>
<sec sec-type="funding-information" id="s6">
<title>Funding</title>
<p>This work was supported by the Russian Science Foundation, grant 22-29-01361.</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s7">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Arvaneh</surname> <given-names>M.</given-names></name> <name><surname>Robertson</surname> <given-names>I. H.</given-names></name> <name><surname>Ward</surname> <given-names>T. E.</given-names></name></person-group> (<year>2019</year>). <article-title>A P300-based brain-computer interface for improving attention</article-title>. <source>Front. Hum. Neurosci</source>. <volume>12</volume>, <fpage>524</fpage>. <pub-id pub-id-type="doi">10.3389/fnhum.2018.00524</pub-id><pub-id pub-id-type="pmid">30662400</pub-id></citation></ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bernal</surname> <given-names>G.</given-names></name> <name><surname>Montgomery</surname> <given-names>S. M.</given-names></name> <name><surname>Maes</surname> <given-names>P.</given-names></name></person-group> (<year>2021</year>). <article-title>Brain-computer interfaces, open-source, and democratizing the future of augmented consciousness</article-title>. <source>Front. Comput. Sci.</source> <volume>3</volume>, <fpage>661300</fpage>. <pub-id pub-id-type="doi">10.3389/fcomp.2021.661300</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bhosale</surname> <given-names>S.</given-names></name> <name><surname>Chakraborty</surname> <given-names>R.</given-names></name> <name><surname>Kopparapu</surname> <given-names>S. K.</given-names></name></person-group> (<year>2022</year>). <article-title>Calibration free meta learning based approach for subject independent EEG emotion recognition</article-title>. <source>Biomed. Signal Process. Control</source> <volume>72</volume>, <fpage>103289</fpage>. <pub-id pub-id-type="doi">10.1016/j.bspc.2021.103289</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="web"><person-group person-group-type="author"><collab>BNCI Horizon 2020.</collab></person-group> (<year>2015</year>). <source>Roadmap - The Future in Brain/Neural-Computer Interaction: Horizon 2020</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="http://bnci-horizon-2020.eu/roadmap">http://bnci-horizon-2020.eu/roadmap</ext-link> (accessed February 21, 2022).</citation>
</ref>
<ref id="B5">
<citation citation-type="book"><person-group person-group-type="author"><collab>Brain Art: Brain-Computer Interfaces for Artistic Expression</collab></person-group>. (<year>2019</year>). Ed. by <person-group person-group-type="editor"><name><surname>Nijholt</surname> <given-names>A.</given-names></name></person-group> <publisher-name>Springer Nature Switzerland AG</publisher-name>.</citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brunner</surname> <given-names>C.</given-names></name> <name><surname>Birbaumer</surname> <given-names>N.</given-names></name> <name><surname>Blankertz</surname> <given-names>B.</given-names></name> <name><surname>Guger</surname> <given-names>C.</given-names></name> <name><surname>K&#x000FC;bler</surname> <given-names>A.</given-names></name> <name><surname>Mattia</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>BNCI Horizon 2020: towards a roadmap for the BCI community</article-title>. <source>Brain Comput. Interfaces</source> <volume>2</volume>, <fpage>1</fpage>&#x02013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1080/2326263X.2015.1008956</pub-id></citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cattan</surname> <given-names>G</given-names></name></person-group>. (<year>2021</year>). <article-title>The use of brain&#x02013;computer interfaces in games is not ready for the general public</article-title>. <source>Front. Comput. Sci.</source> <volume>3</volume>, <fpage>628773</fpage>. <pub-id pub-id-type="doi">10.3389/fcomp.2021.628773</pub-id></citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cattan</surname> <given-names>G.</given-names></name> <name><surname>Andreev</surname> <given-names>A.</given-names></name> <name><surname>Visinoni</surname> <given-names>E.</given-names></name></person-group> (<year>2020</year>). <article-title>Recommendations for integrating a P300-based brain&#x02013;computer interface in virtual reality environments for gaming: an update</article-title>. <source>Computers</source> <volume>9</volume>, <fpage>92</fpage>. <pub-id pub-id-type="doi">10.3390/computers9040092</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Chai</surname> <given-names>Z.</given-names></name> <name><surname>Nie</surname> <given-names>T.</given-names></name> <name><surname>Becker</surname> <given-names>J.</given-names></name></person-group> (<year>2021</year>). <article-title>Top ten challenges facing autonomous driving</article-title>, in <source>Autonomous Driving Changes the Future</source> (<publisher-loc>Singapore</publisher-loc>: <publisher-name>Springer</publisher-name>). <pub-id pub-id-type="doi">10.1007/978-981-15-6728-5</pub-id></citation>
</ref>
<ref id="B10">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Constine</surname> <given-names>J</given-names></name></person-group>. (<year>2017</year>). <source>Facebook Is Building Brain-Computer Interfaces for Typing and Skin-Hearing, TechCrunch, April 19, 2017</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://techcrunch.com/2017/04/19/facebook-brain-interface/">https://techcrunch.com/2017/04/19/facebook-brain-interface/</ext-link> (accessed January 20, 2022).</citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Craik</surname> <given-names>A.</given-names></name> <name><surname>He</surname> <given-names>Y.</given-names></name> <name><surname>Contreras-Vidal</surname> <given-names>J. L.</given-names></name></person-group> (<year>2019</year>). <article-title>Deep learning for electroencephalogram (EEG) classification tasks: a review</article-title>. <source>J. Neural Eng</source>. <volume>16</volume>, <fpage>031001</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/ab0ab5</pub-id><pub-id pub-id-type="pmid">30808014</pub-id></citation></ref>
<ref id="B12">
<citation citation-type="book"><person-group person-group-type="author"><collab>Current Research in Neuroadaptive Technology</collab></person-group>. (<year>2021</year>). Ed. by <person-group person-group-type="editor"><name><surname>Fairclough</surname> <given-names>S. H.</given-names></name> <name><surname>Zander</surname> <given-names>T. O.</given-names></name></person-group> <publisher-name>Elsevier</publisher-name>.</citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dehghani</surname> <given-names>M.</given-names></name> <name><surname>Mobaien</surname> <given-names>A.</given-names></name> <name><surname>Boostani</surname> <given-names>R.</given-names></name></person-group> (<year>2021</year>). <article-title>A deep neural network-based transfer learning to enhance the performance and learning speed of BCI systems</article-title>. <source>Brain Comput. Interfaces</source> <volume>8</volume>, <fpage>14</fpage>&#x02013;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.1080/2326263X.2021.1943955</pub-id></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fahimi</surname> <given-names>F.</given-names></name> <name><surname>Zhang</surname> <given-names>Z.</given-names></name> <name><surname>Goh</surname> <given-names>W. B.</given-names></name> <name><surname>Lee</surname> <given-names>T. S.</given-names></name> <name><surname>Ang</surname> <given-names>K. K.</given-names></name> <name><surname>Guan</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). <article-title>Inter-subject transfer learning with an end-to-end deep convolutional neural network for EEG-based BCI</article-title>. <source>J. Neural Eng</source>. <volume>16</volume>, <fpage>026007</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/aaf3f6</pub-id><pub-id pub-id-type="pmid">30524056</pub-id></citation></ref>
<ref id="B15">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Fedorova</surname> <given-names>A. A.</given-names></name> <name><surname>Shishkin</surname> <given-names>S. L.</given-names></name> <name><surname>Nuzhdin</surname> <given-names>Y. O.</given-names></name> <name><surname>Faskhiev</surname> <given-names>M. N.</given-names></name> <name><surname>Vasilyevskaya</surname> <given-names>A. M.</given-names></name> <name><surname>Ossadtchi</surname> <given-names>A. E.</given-names></name> <etal/></person-group>. (<year>2014</year>). <article-title>A fast &#x0201C;single-stimulus&#x0201D; brain switch</article-title>, in <source>Proc. 6th Int. Brain-Computer Interface Conference</source>, eds <person-group person-group-type="editor"><name><surname>Muller-Putz</surname> <given-names>G.</given-names></name> <name><surname>Huggins</surname> <given-names>J. </given-names></name> <name><surname>Steyrl</surname> <given-names>D</given-names></name></person-group>. (<publisher-loc>Graz</publisher-loc>: <publisher-name>Verlag der Technischen Universitat Graz</publisher-name>). <pub-id pub-id-type="doi">10.3217/978-3-85125-378-8-52</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Finke</surname> <given-names>A.</given-names></name> <name><surname>Lenhardt</surname> <given-names>A.</given-names></name> <name><surname>Ritter</surname> <given-names>H.</given-names></name></person-group> (<year>2009</year>). <article-title>The MindGame: a P300-based brain-computer interface game</article-title>. <source>Neural Netw.</source> <volume>22</volume>, <fpage>1329</fpage>&#x02013;<lpage>1333</lpage>. <pub-id pub-id-type="doi">10.1016/j.neunet.2009.07.003</pub-id><pub-id pub-id-type="pmid">19635654</pub-id></citation></ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Friedman</surname> <given-names>D</given-names></name></person-group>. (<year>2020</year>). <article-title>Brain art: brain-computer interfaces for artistic expression</article-title>. <source>Brain Comput. Interfaces</source> <volume>7</volume>, <fpage>36</fpage>&#x02013;<lpage>37</lpage>. <pub-id pub-id-type="doi">10.1080/2326263X.2020.1756573</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ganin</surname> <given-names>I. P.</given-names></name> <name><surname>Shishkin</surname> <given-names>S. L.</given-names></name> <name><surname>Kaplan</surname> <given-names>A. Y.</given-names></name></person-group> (<year>2013</year>). <article-title>A P300-based brain-computer interface with stimuli on moving objects: four-session single-trial and triple-trial tests with a game-like task design</article-title>. <source>PLoS ONE</source> <volume>8</volume>, <fpage>e77755</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0077755</pub-id><pub-id pub-id-type="pmid">24302977</pub-id></citation></ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Greenberg</surname> <given-names>A.</given-names></name> <name><surname>Cohen</surname> <given-names>A.</given-names></name> <name><surname>Grewal</surname> <given-names>M.</given-names></name></person-group> (<year>2021</year>). <article-title>Patent landscape of brain&#x02013;machine interface technology</article-title>. <source>Nat. Biotechnol.</source> <volume>39</volume>, <fpage>1194</fpage>&#x02013;<lpage>1199</lpage>. <pub-id pub-id-type="doi">10.1038/s41587-021-01071-7</pub-id><pub-id pub-id-type="pmid">34621062</pub-id></citation></ref>
<ref id="B20">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Heath</surname> <given-names>A</given-names></name></person-group>. (<year>2022</year>). <source>Snap Buys Brain-Computer Interface Startup for Future AR Glasses</source>. The Verge, March 23, 2022. Available online at: <ext-link ext-link-type="uri" xlink:href="https://www.theverge.com/2022/3/23/22991667/snap-buys-nextmind-brain-computer-interface-spectacles-ar-glasses">https://www.theverge.com/2022/3/23/22991667/snap-buys-nextmind-brain-computer-interface-spectacles-ar-glasses</ext-link> (accessed March 24, 2022).</citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hekmatmanesh</surname> <given-names>A.</given-names></name> <name><surname>Nardelli</surname> <given-names>P. H.</given-names></name> <name><surname>Handroos</surname> <given-names>H.</given-names></name></person-group> (<year>2021</year>). <article-title>Review of the state-of-the-art of brain-controlled vehicles</article-title>. <source>IEEE Access</source> <volume>9</volume>, <fpage>110173</fpage>&#x02013;<lpage>110193</lpage>. <pub-id pub-id-type="doi">10.1109/ACCESS.2021.3100700</pub-id></citation></ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kaplan</surname> <given-names>A. Y.</given-names></name> <name><surname>Shishkin</surname> <given-names>S. L.</given-names></name> <name><surname>Ganin</surname> <given-names>I. P.</given-names></name> <name><surname>Basyul</surname> <given-names>I. A.</given-names></name> <name><surname>Zhigalov</surname> <given-names>A. Y.</given-names></name></person-group> (<year>2013</year>). <article-title>Adapting the P300-based brain-computer interface for gaming: a review</article-title>. <source>IEEE Trans. Comput. Intellig. AI Games</source> <volume>5</volume>, <fpage>141</fpage>&#x02013;<lpage>149</lpage>. <pub-id pub-id-type="doi">10.1109/TCIAIG.2012.2237517</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lance</surname> <given-names>B. J.</given-names></name> <name><surname>Kerick</surname> <given-names>S. E.</given-names></name> <name><surname>Ries</surname> <given-names>A. J.</given-names></name> <name><surname>Oie</surname> <given-names>K. S.</given-names></name> <name><surname>McDowell</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>Brain&#x02013;computer interface technologies in the coming decades</article-title>. <source>Proc. IEEE</source> <volume>100</volume>, <fpage>1585</fpage>&#x02013;<lpage>1599</lpage>. <pub-id pub-id-type="doi">10.1109/JPROC.2012.2184830</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>D.</given-names></name> <name><surname>Ortega</surname> <given-names>P.</given-names></name> <name><surname>Wei</surname> <given-names>X.</given-names></name> <name><surname>Faisal</surname> <given-names>A.</given-names></name></person-group> (<year>2021</year>). <article-title>Model-agnostic meta-learning for EEG motor imagery decoding in brain-computer-interfacing</article-title>, in <source>10th International IEEE/EMBS Conference on Neural Engineering (NER)</source>, <fpage>527</fpage>&#x02013;<lpage>530</lpage>. <pub-id pub-id-type="doi">10.1109/NER49283.2021.9441077</pub-id></citation>
</ref>
<ref id="B25">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Lucier</surname> <given-names>A.</given-names></name> <name><surname>Simon</surname> <given-names>D.</given-names></name></person-group> (<year>1980</year>). <source>Chambers. Scores by Alvin Lucier, interviews with the composer by Douglas Simon</source>. <publisher-name>Wesleyan University Press</publisher-name>.</citation>
</ref>
<ref id="B26">
<citation citation-type="web"><person-group person-group-type="author"><collab>Mercedes-Benz VISION AVTR: Operating the User Interface With the Power of Thought.</collab></person-group> (<year>2021</year>). Available online at: <ext-link ext-link-type="uri" xlink:href="https://media.daimler.com/marsMediaSite/en/instance/ko/Mercedes-Benz-VISION-AVTR-operating-the-user-interface-with-the-power-of-thought.xhtml?oid=51228086">https://media.daimler.com/marsMediaSite/en/instance/ko/Mercedes-Benz-VISION-AVTR-operating-the-user-interface-with-the-power-of-thought.xhtml?oid=51228086</ext-link> (accessed January 20, 2022).</citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mirabella</surname> <given-names>G.</given-names></name> <name><surname>Lebedev</surname> <given-names>M.A.</given-names></name></person-group> (<year>2017</year>). <article-title>Interfacing to the brain&#x00027;s motor decisions</article-title>. <source>J. Neurophysiol.</source> <volume>117</volume>, <fpage>1305</fpage>&#x02013;<lpage>1319</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00051.2016</pub-id><pub-id pub-id-type="pmid">28003406</pub-id></citation></ref>
<ref id="B28">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Nijholt</surname> <given-names>A</given-names></name></person-group>. (<year>2019</year>). <article-title>Introduction: brain-computer interfaces for artistic expression</article-title>, in <source>Brain Art: Brain-Computer Interfaces for Artistic Expression</source>, ed <person-group person-group-type="editor"><name><surname>Nijholt</surname> <given-names>A</given-names></name></person-group>. (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer Nature Switzerland AG</publisher-name>), <fpage>1</fpage>&#x02013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-14323-7_1</pub-id></citation>
</ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nijholt</surname> <given-names>A.</given-names></name> <name><surname>Bos</surname> <given-names>D. P.</given-names></name> <name><surname>Reuderink</surname> <given-names>B.</given-names></name></person-group> (<year>2009</year>). <article-title>Turning shortcomings into challenges: brain&#x02013;computer interfaces for games</article-title>. <source>Entertain. Comput</source>. <volume>1</volume>, <fpage>85</fpage>&#x02013;<lpage>94</lpage>. <pub-id pub-id-type="doi">10.1016/j.entcom.2009.09.007</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nijholt</surname> <given-names>A.</given-names></name> <name><surname>Contreras-Vidal</surname> <given-names>J. L.</given-names></name> <name><surname>Jeunet</surname> <given-names>C.</given-names></name> <name><surname>V&#x000E4;ljam&#x000E4;e</surname> <given-names>A.</given-names></name></person-group> (<year>2022</year>). <article-title>Editorial: brain-computer interfaces for non-clinical (home, sports, art, entertainment, education, well-being) applications</article-title>. <source>Front. Comput. Sci.</source> <volume>4</volume>, <fpage>860619</fpage>. <pub-id pub-id-type="doi">10.3389/fcomp.2022.860619</pub-id></citation>
</ref>
<ref id="B31">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nikulin</surname> <given-names>V. V.</given-names></name> <name><surname>Hohlefeld</surname> <given-names>F. U.</given-names></name> <name><surname>Jacobs</surname> <given-names>A. M.</given-names></name> <name><surname>Curio</surname> <given-names>G.</given-names></name></person-group> (<year>2008</year>). <article-title>Quasi-movements: a novel motor&#x02013;cognitive phenomenon</article-title>. <source>Neuropsychologia</source> <volume>46</volume>, <fpage>727</fpage>&#x02013;<lpage>742</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2007.10.008</pub-id><pub-id pub-id-type="pmid">18035381</pub-id></citation></ref>
<ref id="B32">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Paszkiel</surname> <given-names>S</given-names></name></person-group>. (<year>2020</year>). <article-title>Using BCI and VR technology in neurogaming</article-title>, in <source>Analysis and Classification of EEG Signals for Brain&#x02013;Computer Interfaces</source> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer</publisher-name>), <fpage>93</fpage>&#x02013;<lpage>99</lpage>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://link.springer.com/chapter/10.1007/978-3-030-30581-9_11">https://link.springer.com/chapter/10.1007/978-3-030-30581-9_11</ext-link> (accessed February 21, 2022).</citation>
</ref>
<ref id="B33">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Penaloza</surname> <given-names>C. I.</given-names></name> <name><surname>Nishio</surname> <given-names>S.</given-names></name></person-group> (<year>2018</year>). <article-title>BMI control of a third arm for multitasking</article-title>. <source>Sci. Robot</source>. <volume>3</volume>, eaat1228. <pub-id pub-id-type="doi">10.1126/scirobotics.aat1228</pub-id><pub-id pub-id-type="pmid">33141729</pub-id></citation></ref>
<ref id="B34">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Prpa</surname> <given-names>M.</given-names></name> <name><surname>Pasquier</surname> <given-names>P.</given-names></name></person-group> (<year>2019</year>). <article-title>Brain-computer interfaces in contemporary art: a state of the art and taxonomy</article-title>, in <source>Brain Art: Brain-Computer Interfaces for Artistic Expression</source>, ed <person-group person-group-type="editor"><name><surname>Nijholt</surname> <given-names>A</given-names></name></person-group>. (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer Nature Switzerland AG</publisher-name>), <fpage>1</fpage>&#x02013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-14323-7_3</pub-id></citation>
</ref>
<ref id="B35">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Putze</surname> <given-names>F</given-names></name></person-group>. (<year>2019</year>). <article-title>Methods and tools for using BCI with augmented and virtual reality</article-title>, in <source>Brain Art: Brain-Computer Interfaces for Artistic Expression</source>, ed <person-group person-group-type="editor"><name><surname>Nijholt</surname> <given-names>A</given-names></name></person-group>. (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer Nature Switzerland AG</publisher-name>), <fpage>433</fpage>&#x02013;<lpage>446</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-14323-7_16</pub-id></citation>
</ref>
<ref id="B36">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Rehman</surname> <given-names>A. U.</given-names></name> <name><surname>Ghaffarianhoseini</surname> <given-names>A.</given-names></name> <name><surname>Naismith</surname> <given-names>N.</given-names></name> <name><surname>Zhang</surname> <given-names>T.</given-names></name> <name><surname>Doan</surname> <given-names>D. T.</given-names></name> <name><surname>Tookey</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>A review: harnessing immersive technologies prowess for autonomous vehicles</article-title>, in <source>Proceedings of the 18th International Conference on Construction Applications of Virtual Reality (CONVR2018)</source>, eds <person-group person-group-type="editor"><name><surname>Amor</surname> <given-names>R.</given-names></name> <name><surname>Dimyadi</surname> <given-names>J.</given-names></name></person-group> (<publisher-loc>Auckland</publisher-loc>: <publisher-name>The University of Auckland</publisher-name>), <fpage>545</fpage>&#x02013;<lpage>555</lpage>.</citation>
</ref>
<ref id="B37">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Rosso</surname> <given-names>C</given-names></name></person-group>. (<year>2021</year>). <source>Autos to Integrate AI-Based Brain-Computer Interfaces (BCIs)</source>. Psychology Today. Available online at: <ext-link ext-link-type="uri" xlink:href="https://www.psychologytoday.com/us/blog/the-future-brain/202109/autos-integrate-ai-based-brain-computer-interfaces-bcis-0">https://www.psychologytoday.com/us/blog/the-future-brain/202109/autos-integrate-ai-based-brain-computer-interfaces-bcis-0</ext-link> (accessed January 20, 2022).</citation>
</ref>
<ref id="B38">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roy</surname> <given-names>Y.</given-names></name> <name><surname>Banville</surname> <given-names>H.</given-names></name> <name><surname>Albuquerque</surname> <given-names>I.</given-names></name> <name><surname>Gramfort</surname> <given-names>A.</given-names></name> <name><surname>Falk</surname> <given-names>T. H.</given-names></name> <name><surname>Faubert</surname> <given-names>J.</given-names></name></person-group> (<year>2019</year>). <article-title>Deep learning-based electroencephalography analysis: a systematic review</article-title>. <source>J. Neural Eng</source>. <volume>16</volume>, <fpage>051001</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/ab260c</pub-id><pub-id pub-id-type="pmid">31151119</pub-id></citation></ref>
<ref id="B39">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Schmid</surname> <given-names>J. R.</given-names></name> <name><surname>Jox</surname> <given-names>R. J.</given-names></name></person-group> (<year>2021</year>). <article-title>The power of thoughts: a qualitative interview study with healthy users of brain-computer interfaces</article-title>, in <source>Clinical Neurotechnology Meets Artificial Intelligence</source> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer</publisher-name>), <fpage>117</fpage>&#x02013;<lpage>126</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-64590-8_9</pub-id></citation>
</ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shishkin</surname> <given-names>S. L.</given-names></name> <name><surname>Nuzhdin</surname> <given-names>Y. O.</given-names></name> <name><surname>Svirin</surname> <given-names>E. P.</given-names></name> <name><surname>Trofimov</surname> <given-names>A. G.</given-names></name> <name><surname>Fedorova</surname> <given-names>A. A.</given-names></name> <name><surname>Kozyrskiy</surname> <given-names>B. L.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>EEG negativity in fixations used for gaze-based control: toward converting intentions into actions with an eye-brain-computer interface</article-title>. <source>Front. Neurosci</source>. <volume>10</volume>, <fpage>528</fpage>. <pub-id pub-id-type="doi">10.3389/fnins.2016.00528</pub-id><pub-id pub-id-type="pmid">27917105</pub-id></citation></ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>van Erp</surname> <given-names>J.</given-names></name> <name><surname>Lotte</surname> <given-names>F.</given-names></name> <name><surname>Tangermann</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>Brain-computer interfaces: beyond medical applications</article-title>. <source>Computer</source> <volume>45</volume>, <fpage>26</fpage>&#x02013;<lpage>34</lpage>. <pub-id pub-id-type="doi">10.1109/MC.2012.107</pub-id></citation>
</ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vasiljevic</surname> <given-names>G. A.</given-names></name> <name><surname>de Miranda</surname> <given-names>L. C.</given-names></name></person-group> (<year>2020</year>). <article-title>Brain&#x02013;computer interface games based on consumer-grade EEG devices: a systematic literature review</article-title>. <source>Int. J. Hum. Comput. Interact</source>. <volume>36</volume>, <fpage>105</fpage>&#x02013;<lpage>142</lpage>. <pub-id pub-id-type="doi">10.1080/10447318.2019.1612213</pub-id></citation>
</ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wei</surname> <given-names>W.</given-names></name> <name><surname>Qiu</surname> <given-names>S.</given-names></name> <name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Mao</surname> <given-names>J.</given-names></name> <name><surname>He</surname> <given-names>H.</given-names></name></person-group> (<year>2022</year>). <article-title>ERP prototypical matching net: a meta-learning method for zero-calibration RSVP-based image retrieval</article-title>. <source>J. Neural Eng</source>. <volume>19</volume>, <fpage>026028</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/ac5eb7</pub-id><pub-id pub-id-type="pmid">35299166</pub-id></citation></ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wen</surname> <given-names>D.</given-names></name> <name><surname>Liang</surname> <given-names>B.</given-names></name> <name><surname>Zhou</surname> <given-names>Y.</given-names></name> <name><surname>Chen</surname> <given-names>H.</given-names></name> <name><surname>Jung</surname> <given-names>T.-P.</given-names></name></person-group> (<year>2021</year>). <article-title>The current research of combining multi-modal brain-computer interfaces with virtual reality</article-title>. <source>IEEE J. Biomed. Health Informatics</source> <volume>25</volume>, <fpage>3278</fpage>&#x02013;<lpage>3287</lpage>. <pub-id pub-id-type="doi">10.1109/JBHI.2020.3047836</pub-id><pub-id pub-id-type="pmid">33373308</pub-id></citation></ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zander</surname> <given-names>T. O.</given-names></name> <name><surname>Kothe</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general</article-title>. <source>J. Neural Eng</source>. <volume>8</volume>, <fpage>025005</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2560/8/2/025005</pub-id><pub-id pub-id-type="pmid">21436512</pub-id></citation></ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zanini</surname> <given-names>P.</given-names></name> <name><surname>Congedo</surname> <given-names>M.</given-names></name> <name><surname>Jutten</surname> <given-names>C.</given-names></name> <name><surname>Said</surname> <given-names>S.</given-names></name> <name><surname>Berthoumieu</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>Transfer learning: a Riemannian geometry framework with applications to brain&#x02013;computer interfaces</article-title>. <source>IEEE Trans. Biomed. Eng</source>. <volume>65</volume>, <fpage>1107</fpage>&#x02013;<lpage>1116</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2017.2742541</pub-id><pub-id pub-id-type="pmid">28841546</pub-id></citation></ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zheng</surname> <given-names>L.</given-names></name> <name><surname>Pei</surname> <given-names>W.</given-names></name> <name><surname>Gao</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>L.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name></person-group> (<year>2022</year>). <article-title>A high-performance brain switch based on code-modulated visual evoked potentials</article-title>. <source>J. Neural Eng.</source> <volume>19</volume>, <fpage>016002</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/ac494f</pub-id><pub-id pub-id-type="pmid">34996051</pub-id></citation></ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup>Zander and Kothe (<xref ref-type="bibr" rid="B45">2011</xref>) suggested a distinction between <italic>active</italic> and <italic>reactive</italic> BCIs, the latter depending on &#x0201C;brain activity arising in reaction to external stimulation, which is indirectly modulated by the user&#x0201D;. Here, I use the term &#x0201C;active BCI&#x0201D; for both of these BCI types, as both enable explicit, intentional control with an active role for the user.</p></fn>
</fn-group>
</back>
</article>