<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2013.00883</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Visual attention for a desktop virtual environment with ambient scent</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Toet</surname> <given-names>Alexander</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Schaik</surname> <given-names>Martin G. van</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>TNO</institution> <country>Soesterberg, Netherlands</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Information and Computing Sciences, University Utrecht</institution> <country>Utrecht, Netherlands</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: <italic>Ilona Croy, University of Gothenburg, Sweden</italic></p></fn>
<fn fn-type="edited-by"><p>Reviewed by: <italic>Johannes Frasnelli, Universit&#x000E9; de Montr&#x000E9;al, Canada; Han-Seok Seo, University of Arkansas, USA</italic></p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: <italic>Alexander Toet, TNO, Kampweg 5, 3769 DE Soesterberg, Netherlands e-mail: <email>lex.toet@tno.nl; lextoet@gmail.com</email></italic></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to Cognitive Science, a section of the journal Frontiers in Psychology.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>26</day>
<month>11</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="collection">
<year>2013</year>
</pub-date>
<volume>4</volume>
<elocation-id>883</elocation-id>
<history>
<date date-type="received">
<day>11</day>
<month>09</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>06</day>
<month>11</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2013 Toet and van Schaik.</copyright-statement>
<copyright-year>2013</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/3.0/"><p> This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>In the current study, participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk, together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and fewer signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations, the results provide no indication that the presence of an ambient odor affected the participants&#x02019; visual attention for signs of disorder or their emotional response. However, the paradigm used in the present study does not allow us to draw any conclusions in this respect. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user&#x02019;s attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE).</p>
</abstract>
<kwd-group>
<kwd>attention</kwd>
<kwd>ambient odor</kwd>
<kwd>semantic congruency</kwd>
<kwd>affective congruency</kwd>
<kwd>virtual environment</kwd>
</kwd-group>
<counts>
<fig-count count="2"/>
<table-count count="1"/>
<equation-count count="0"/>
<ref-count count="88"/>
<page-count count="11"/>
<word-count count="0"/>
</counts>
</article-meta>
</front>
<body>
<sec>
<title>INTRODUCTION</title>
<sec>
<title>BACKGROUND</title>
<p>Desktop virtual environments (VEs) are increasingly deployed to study future design plans and the possible effects of environmental qualities and interventions on human behavior and feelings of safety in built environments with signs of public disorder (<xref ref-type="bibr" rid="B12">Cozens et al., 2003</xref>; <xref ref-type="bibr" rid="B45">Park et al., 2008</xref>, <xref ref-type="bibr" rid="B46">2010</xref>; <xref ref-type="bibr" rid="B62">Toet and van Schaik, 2012</xref>). Desktop VEs offer cost-effective, safe, controlled, and flexible environments that allow researchers to investigate human responses to a wide range of environmental factors without the constraints, distractions, and dangers of the real world (e.g., <xref ref-type="bibr" rid="B42">Nasar and Cubukcu, 2011</xref>). They are relatively cheap, widely available, and easy to use, and most users are familiar with these displays and their interaction devices. Desktop VEs are also preferred for the communication of design and intervention plans because they can be made accessible to a large number of users through internet applications (<xref ref-type="bibr" rid="B13">Dang et al., 2012</xref>). For these applications, it is essential that users perceive the desktop VE in much the same way as they would perceive its real world counterpart. Previous studies have shown that environmental characteristics like lighting, sound, and dynamic elements similarly affect the perception of desktop VEs and real environments (<xref ref-type="bibr" rid="B6">Bishop and Rohrmann, 2003</xref>; <xref ref-type="bibr" rid="B28">Houtkamp et al., 2008</xref>). Ambient scent is another important environmental characteristic that is currently lacking in most VEs. 
Ambient scent is known to significantly affect our perception of real environments (<xref ref-type="bibr" rid="B76">Wrzesniewski et al., 1999</xref>), and people have strong expectations about the way an environment should smell (<xref ref-type="bibr" rid="B26">Henshaw and Bruce, 2012</xref>). It has also been shown that ambient odor can increase the sense of presence in immersive VEs (<xref ref-type="bibr" rid="B16">Dinh et al., 1999</xref>; <xref ref-type="bibr" rid="B69">Washburn et al., 2003</xref>; <xref ref-type="bibr" rid="B64">Tortell et al., 2007</xref>). Thus, ambient odors may be an effective tool to tune the user perception of less immersive desktop VEs (e.g., by evoking implicit associations).</p>
<p>Despite the importance of scent in our everyday life, olfaction is rarely applied in VEs (<xref ref-type="bibr" rid="B4">Baus and Bouchard, 2010</xref>). Recent technological developments enable the effective and localized dispersion and control of scents (<xref ref-type="bibr" rid="B78">Yanagida et al., 2003</xref>, <xref ref-type="bibr" rid="B79">2004</xref>, <xref ref-type="bibr" rid="B77">2005</xref>; <xref ref-type="bibr" rid="B80">Yu et al., 2003</xref>; <xref ref-type="bibr" rid="B43">Oshima et al., 2007</xref>; for reviews see <xref ref-type="bibr" rid="B52">Richard et al., 2006</xref>; <xref ref-type="bibr" rid="B53">Riener and Harders, 2012</xref>), thereby providing VE researchers and developers with the ability to utilize scent to create compelling VEs (<xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>). Enhancing VEs with olfactory stimuli may enhance user experience by heightening the sense of reality (<xref ref-type="bibr" rid="B10">Chalmers et al., 2009</xref>; <xref ref-type="bibr" rid="B22">Ghinea and Ademoye, 2011</xref>). It has indeed been shown that the addition of olfactory cues to an immersive VE can increase the user&#x02019;s sense of presence, memory and perceived realism of the simulated environment (<xref ref-type="bibr" rid="B16">Dinh et al., 1999</xref>; <xref ref-type="bibr" rid="B69">Washburn et al., 2003</xref>; <xref ref-type="bibr" rid="B64">Tortell et al., 2007</xref>). However, it is still unknown whether ambient scents can influence the attention for details in a desktop VE (<xref ref-type="bibr" rid="B22">Ghinea and Ademoye, 2011</xref>).</p>
<p>In a previous study we found that signs of disorder influence the affective appraisal of a desktop VE in largely the same way as they influence the appraisal of its real world counterpart (<xref ref-type="bibr" rid="B62">Toet and van Schaik, 2012</xref>). However, it appeared that participants focused more on signs of disorder in a desktop VE than in a similar real world environment. This discrepancy, which may seriously degrade the ecological validity of VEs for the aforementioned applications, was partly reduced by the addition of a realistic soundscape to the VE simulation. We argued that in the real world the saliency of signs of public disorder is modulated by various environmental factors that are typically lacking in a desktop VE, such as ambient sounds and tactile or olfactory cues. For instance, their saliency may be ameliorated by the sound of birds, a soft warm breeze, sun, and pleasant ambient smells of fresh air and vegetation, or enhanced by loud noise, strong cold wind, or unpleasant (e.g., garbage and urine) smells. In this study we investigated whether ambient odors can influence the visual attention for these details in a desktop VE.</p>
</sec>
<sec>
<title>VISUAL-OLFACTORY INTERACTIONS</title>
<p>Interactions between olfaction and vision appear to be widespread. Neuroimaging studies have shown that interaction between olfaction and vision occurs at multiple levels of information processing (<xref ref-type="bibr" rid="B23">Gottfried and Dolan, 2003</xref>; <xref ref-type="bibr" rid="B44">&#x000D6;sterbauer et al., 2005</xref>; <xref ref-type="bibr" rid="B68">Walla, 2008</xref>; <xref ref-type="bibr" rid="B58">Seubert et al., 2013</xref>). Also, it was found that stimulation of the human visual cortex enhances odor discrimination (<xref ref-type="bibr" rid="B30">Jadauji et al., 2012</xref>). Linking the perceptions of odors and colors appears to occur mainly in the amygdala and the orbitofrontal cortex (OFC; <xref ref-type="bibr" rid="B23">Gottfried and Dolan, 2003</xref>; <xref ref-type="bibr" rid="B44">&#x000D6;sterbauer et al., 2005</xref>).</p>
<p>The amygdala is a central perceptual node where information from olfactory, visual, auditory, and tactile modalities converges (<xref ref-type="bibr" rid="B81">Zald, 2003</xref>). It is an integral component of a distributed affective circuit in the mammalian brain that mediates both positive and negative affect and the processing of reward-predicting cues (<xref ref-type="bibr" rid="B41">Murray, 2007</xref>). Recent evidence suggests that the amygdala also plays a central causal role in the modulation of visual attention (<xref ref-type="bibr" rid="B67">Vuilleumier, 2005</xref>; <xref ref-type="bibr" rid="B71">Williams et al., 2005</xref>; <xref ref-type="bibr" rid="B18">Duncan and Feldman Barrett, 2007</xref>; for a recent overview see <xref ref-type="bibr" rid="B50">Pourtois et al., 2013</xref>). The amygdala enhances the visual saliency of affective targets (<xref ref-type="bibr" rid="B18">Duncan and Feldman Barrett, 2007</xref>). This implies that the activation state of the amygdala determines whether affective features or objects are prioritized. Since the amygdala responds to both positive and negative valenced odors (but not to neutral odors: <xref ref-type="bibr" rid="B72">Winston et al., 2005</xref>), olfactory induced amygdala activity may boost visual attention for affectively congruent (potentially threatening or rewarding) targets (<xref ref-type="bibr" rid="B67">Vuilleumier, 2005</xref>; <xref ref-type="bibr" rid="B37">Mohanty et al., 2009</xref>; <xref ref-type="bibr" rid="B29">Jacobs et al., 2012</xref>).</p>
<p>There is ample evidence for the visual modulation of olfactory perception. A neutral suprathreshold odor is rated significantly more pleasant after viewing positive pictures and significantly less pleasant and more intense after seeing unpleasant pictures (<xref ref-type="bibr" rid="B49">Pollatos et al., 2007</xref>). A visual feature that has a particularly strong influence on odor perception is color (<xref ref-type="bibr" rid="B83">Zellner, 2013</xref>). Color enhances the perceived intensity of odors (independent of color appropriateness: <xref ref-type="bibr" rid="B85">Zellner and Kautz, 1990</xref>). Color also modulates the hedonic value of odors: both the neural response in brain areas encoding the hedonic value of smells (<xref ref-type="bibr" rid="B44">&#x000D6;sterbauer et al., 2005</xref>) and the subjectively judged pleasantness of color-odor combinations (<xref ref-type="bibr" rid="B84">Zellner et al., 1991</xref>) increase with perceived color-odor appropriateness. Odors are detected faster and more accurately in the presence of semantically congruent colors (<xref ref-type="bibr" rid="B84">Zellner et al., 1991</xref>) or pictures (<xref ref-type="bibr" rid="B23">Gottfried and Dolan, 2003</xref>; <xref ref-type="bibr" rid="B15">Dematt&#x000E8; et al., 2009</xref>), while incongruent colors and shape cues reduce odor discrimination accuracy (<xref ref-type="bibr" rid="B15">Dematt&#x000E8; et al., 2009</xref>). Color-smell associations can be so compelling that color can even completely change the quality of the perceived odor (a white wine is perceived as having the odor of a red wine when artificially colored red: <xref ref-type="bibr" rid="B40">Morrot et al., 2001</xref>). 
Visual-olfactory interactions appear to be automatic: color and shape cues affect the accuracy of odor discrimination, even when the information is task irrelevant and when participants are explicitly instructed to ignore these cues (<xref ref-type="bibr" rid="B15">Dematt&#x000E8; et al., 2009</xref>). Specific odor components of complex odor mixtures that are congruent with a presented color are perceived as more prominent, suggesting that color directs olfactory attention to color-associated components (<xref ref-type="bibr" rid="B2">Arao et al., 2012</xref>). Functional magnetic resonance imaging studies have shown neurophysiological correlates of olfactory response modulation by color cues: activity in caudal regions of the OFC and in the insular cortex increases progressively with perceived odor-color congruency (<xref ref-type="bibr" rid="B44">&#x000D6;sterbauer et al., 2005</xref>).</p>
<p>In contrast to the large amount of evidence for the visual modulation of olfactory perception, there are fewer reports of the reverse. Recently, however, evidence has been presented that olfactory input can indeed modulate visual perception. Fear-related chemical signals modulate visual emotion perception in an emotion-specific way (<xref ref-type="bibr" rid="B86">Zhou and Chen, 2009</xref>), while unpleasant odors reduce the perceived attractiveness of faces (<xref ref-type="bibr" rid="B14">Dematt&#x000E8; et al., 2007</xref>). Olfactory cues also bias the dynamic process of binocular rivalry: an odorant that is congruent with one of the competing images prolongs the time that image is visible and shortens its suppression time (<xref ref-type="bibr" rid="B87">Zhou et al., 2010</xref>, <xref ref-type="bibr" rid="B88">2012</xref>). Finally, subliminal olfactory cues modulate visual sex discriminations made on the basis of biological motion cues: ambiguous point-light walkers are more often judged as males in the presence of unconsciously perceived male sweat (<xref ref-type="bibr" rid="B24">Hacker et al., 2013</xref>). Hence, there is now sufficient evidence for the modulation of visual perception by olfactory input.</p>
</sec>
<sec>
<title>OLFACTION AND VISUAL ATTENTION</title>
<p>An organism continuously and simultaneously receives an overload of multisensory input from its environment. Because of limitations in processing capacity, simultaneous stimuli cannot be fully analyzed in parallel and thus compete for processing resources in order to gain access to higher cognitive stages and awareness. Attention serves as a gating mechanism to prioritize and enhance sensory information that is relevant for survival such as threats (<xref ref-type="bibr" rid="B21">Fox et al., 2002</xref>; <xref ref-type="bibr" rid="B31">Koster et al., 2004</xref>; <xref ref-type="bibr" rid="B70">Williams et al., 2006</xref>; <xref ref-type="bibr" rid="B34">Lin et al., 2009</xref>) or rewards (<xref ref-type="bibr" rid="B1">Anderson, 2013</xref>), while suppressing irrelevant information. Attentional selection is typically driven by stimulus saliency, novelty, and reward-related associations (<xref ref-type="bibr" rid="B1">Anderson, 2013</xref>). Attention acts upon and modulates information in each sensory modality (visual, auditory, olfactory, etc.; <xref ref-type="bibr" rid="B73">Woldorff et al., 1993</xref>; <xref ref-type="bibr" rid="B82">Zelano et al., 2005</xref>). Information from different sensory modalities is pre-attentively integrated into a unified coherent percept, resulting in multimodal internal representations in which attention can be directed (<xref ref-type="bibr" rid="B17">Driver and Spence, 1998</xref>). 
As a result, tactile (<xref ref-type="bibr" rid="B66">Van der Burg et al., 2009</xref>), auditory (<xref ref-type="bibr" rid="B65">Van der Burg et al., 2008</xref>), and olfactory (<xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>; <xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>; <xref ref-type="bibr" rid="B19">Durand et al., 2013</xref>) cues can boost the saliency of visual features, even when the cues provide no information about the location or nature of the visual feature. Thus, ambient odors (even at sub-threshold levels) can modulate visual attention (<xref ref-type="bibr" rid="B39">Morrin and Ratneshwar, 2000</xref>; <xref ref-type="bibr" rid="B35">Michael et al., 2003</xref>, <xref ref-type="bibr" rid="B36">2005</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>), even in 4-month-old infants (<xref ref-type="bibr" rid="B19">Durand et al., 2013</xref>). Recent studies have shown that odors can reflexively direct visual attention to <italic>semantically congruent</italic> visual objects (<xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>; <xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>). Objects that are semantically congruent with a presented odor are looked at faster and more frequently than other objects in a scene (<xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>), even if participants are not aware that an odor has been presented (<xref ref-type="bibr" rid="B55">Seigneuric et al., 2010</xref>). 
It appears that crossmodal odor-object associations are automatically activated, without the need for explicit odor identification (<xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>), thus boosting the saliency of the corresponding visual object (<xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>). Ambient odors also bias visual attention to favor stimuli that are <italic>affectively congruent</italic> to their hedonic quality (a case of affect-biased attention: <xref ref-type="bibr" rid="B61">Todd et al., 2012</xref>). Pleasant odors facilitate the processing of positive visual cues (<xref ref-type="bibr" rid="B33">Lepp&#x000E4;nen and Hietanen, 2003</xref>), while unpleasant odors facilitate the processing of negative cues (<xref ref-type="bibr" rid="B20">Ehrlichman and Halpern, 1988</xref>) and inhibit the processing of positive cues (<xref ref-type="bibr" rid="B33">Lepp&#x000E4;nen and Hietanen, 2003</xref>). The pre-attentive affective bias induced by ambient unpleasant odors probably serves the ecological purpose of facilitating threat detection (<xref ref-type="bibr" rid="B32">Krusemark and Li, 2012</xref>).</p>
</sec>
<sec>
<title>CURRENT STUDY</title>
<p>The current study was performed to test if exposure to ambient odor can modulate the visual attention to signs of disorder in a desktop VE representing an urban area. Participants performed a walking tour through the VE while being exposed to either room air (control group), tar (typically perceived as unpleasant and frequently associated with burned or waste material), or the odor of freshly cut grass (typically perceived as pleasant and frequently associated with natural or fresh material). Whenever they noticed signs of disorder during their walk, they reported their detection and their emotional response. The scent of cut grass had semantically congruent visual and auditory representations in the simulation, since the VE showed abundant greenery and contained the occasional sound of grass mowers in the associated soundtrack. The scent of tar could be associated with the occasional sounds of construction activities (e.g., hammering, sawing) in the soundtrack of the VE, and was affectively congruent with derelict areas in general. Since people tend to respond to an environment as a whole (a &#x0201C;molar&#x0201D; environment) rather than to its individual features (<xref ref-type="bibr" rid="B7">Bitner, 1992</xref>; <xref ref-type="bibr" rid="B5">Bell et al., 2010</xref>; <xref ref-type="bibr" rid="B8">Brosch et al., 2010</xref>; <xref ref-type="bibr" rid="B27">Houtkamp, 2012</xref>), and since affective qualities are prioritized in this categorization process (<xref ref-type="bibr" rid="B8">Brosch et al., 2010</xref>), the presence of an ambient scent with an affective (pleasant or unpleasant) loading was expected to bias visual attention toward or away from signs of disorder in the VE. 
More specifically, it was hypothesized that (H1) participants in the ambient tar (unpleasant) odor condition would report more signs of public disorder than participants in the control condition, because the unpleasant odor would bias visual attention to visual cues with a negative affective connotation. In contrast, it was expected that (H2) participants in the cut grass (pleasant) odor condition would report fewer signs of public disorder than participants in the control condition, because the smell of cut grass would bias their attention to the &#x02013; semantically congruent &#x02013; greenery and thereby distract them from the negative cues.</p>
</sec>
</sec>
<sec id="s1" sec-type="materials|methods">
<title>MATERIALS AND METHODS</title>
<sec>
<title>VIRTUAL ENVIRONMENT</title>
<p>A small area in the town of Soesterberg, The Netherlands (with a rectangular shape and a total extent of about 200 m &#x000D7; 200 m; coordinates 52&#x000B0;7&#x02032; N, 5&#x000B0;17&#x02032;34&#x02033; E) was simulated in 3D using the Unreal Tournament 2004 game-engine v2.5 (Epic Games Inc.; for further details on the VE model and its contents see <xref ref-type="bibr" rid="B62">Toet and van Schaik, 2012</xref>). The area is enclosed by roads on four sides and contains blocks of houses, two squares with parking places, benches, and statues, two playgrounds with benches, and a network of pathways connecting the squares and playgrounds (see <bold>Figure <xref ref-type="fig" rid="F1">1</xref></bold>). All houses have a garden in the back, typically enclosed with a wooden fence, with an exit door to a pathway. The pathways are typically covered with tarmac, and bordered on both sides with trees and shrubs. The houses are generally well maintained and quite uniform. The pathways and parks are reasonably well kept. The walking route (designated by arrows drawn on the ground) had no intersections and covered most of the area. To simulate a state of public disorder 42 test items were distributed over 34 different locations in the VE. The items signaled three different classes of social incivilities: Neglect (24 items), Vandalism (one item), and Crime (17 items; see <bold>Table <xref ref-type="table" rid="T1">1</xref></bold>; <xref ref-type="bibr" rid="B47">Perkins et al., 1992</xref>; <xref ref-type="bibr" rid="B9">Caughy et al., 2001</xref>), and had social connotations ranging from indifference (e.g., litter, trash, dog droppings) and loitering (e.g., empty beer cans, cigarette butts, fast food wrappers) to vandalism (broken bus shelter windows) and predatory crime (smashed car windows, crime watch signs, CCTV cameras, and camera surveillance signs).</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p><bold>Screen shots of the virtual environment, showing locations with litter (A&#x02013;E), garbage (F,J), bicycle and car parts (G&#x02013;J), warning signs (J&#x02013;M), cameras (K), and evidence of car burglary (M) and vandalism (N)</bold>.</p></caption>
<graphic xlink:href="fpsyg-04-00883-g001.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Experimental items, their connotations of physical and social disorder, and the experimental classification.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<th valign="top" align="left">Experimental items (no.)</th>
<th valign="top" align="left">Social connotations</th>
<th valign="top" align="left">Experimental class (no. of items)</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Garbage bags (2)</td>
<td valign="top" align="left">Neglect, indifference (Litter)</td>
<td valign="top" align="left">Neglect (24)</td>
</tr>
<tr>
<td valign="top" align="left">Cardboard boxes (1)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Newspapers, flyers (2)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Plastic shopping bags (2)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Dog droppings (3)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Bicycle frame (1)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Bicycle wheels (2)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Cigarette butts (1)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Empty beer cans (7)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Fast-food wrappers, boxes, paper cups (1)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Old car tires (2)<hr/></td>
<td valign="top" align="left"><hr/></td>
<td valign="top" align="left"><hr/></td>
</tr>
<tr>
<td valign="top" align="left">Bus shelter with broken windows (1)<hr/></td>
<td valign="top" align="left">Vandalism<hr/></td>
<td valign="top" align="left">Vandalism (1)<hr/></td>
</tr>
<tr>
<td valign="top" align="left">Smashed car windows and signs warning for car burglary (6)</td>
<td valign="top" align="left">Car burglary</td>
<td valign="top" align="left">Crime (17)</td>
</tr>
<tr>
<td valign="top" align="left">Neighborhood crime watch signs (3)</td>
<td valign="top" align="left">Home burglary</td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Signs that homes are protected by private security services (2)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">Signs that homes are protected by dogs (2)</td>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">CCTV security cameras and signs (4)</td>
<td valign="top" align="left">Predatory crime</td>
<td valign="top" align="left"></td></tr>
</tbody></table>
<table-wrap-foot>
<attrib>Numbers in brackets indicate the number of test items present in the VE.</attrib>
</table-wrap-foot>
</table-wrap>
<p>The simulation was performed on Dell Precision 490 PC computers, equipped with Dell 19&#x02033; monitors. Logitech Rumblepad 2 Gamepads were used for navigation. Users moved through the VE from a first-person viewing perspective, with walking motion supporting forward and backward movement and left and right rotation. User movement speed was fixed, and collision detection was enabled to prevent users from walking through objects. A non-repeating soundscape that was characteristic of the environment was composed from sounds (birds twittering, cars passing by, children shouting, hammering and drilling, and dogs barking) recorded at several locations and at different times in the corresponding real environment. The soundscape was presented through Sennheiser eH 150 headphones. A previous study showed that this soundscape effectively increased the ecological validity of the VE (<xref ref-type="bibr" rid="B62">Toet and van Schaik, 2012</xref>).</p>
</sec>
<sec>
<title>ODOR SELECTION</title>
<p>The scent of freshly cut grass was selected as a semantically congruent pleasant odor in this study. This scent is generally considered to be stimulating and refreshing (the smell of freshly cut grass ranks among the top five preferred smells in several recent independent large-scale polls in Britain: <xref ref-type="bibr" rid="B51">Reynolds, 2012</xref>; <xref ref-type="bibr" rid="B25">Henning, 2013</xref>). Since the VE used in this study shows a lot of grass and vegetation, the scent of grass may direct attention toward the greenery (<xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>; <xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>). The smell of cut grass was created by mixing ethanol with cis-3-hexenol (leaf alcohol) in a 9:1 ratio. The associations that could be elicited by this scent in combination with the VE were investigated by presenting it to a panel of 10 participants while they were viewing the VE. The scent was presented in small glass tubes containing a cotton swab with three to four drops of the solution and sniffed by the participants approximately 5&#x02033; from their nose. Nine out of 10 participants reported associations with greenery (four mentioned grass, three named freshly cut leaves, and one mentioned broken twigs). All participants judged the scent to be pleasant.</p>
<p>An affectively congruent unpleasant scent was selected in a pilot test from a set of eight candidate aversive smells. The candidate smells were Burned Wood (RS/420), Reptile (RS/424), Diesel Fumes (RS/423), Metal (RS/426), Dusty (RS/425), Tar (RS/401), Cow Manure, and Natural Gas (all obtained from RetroScent, Rotterdam, The Netherlands: <ext-link ext-link-type="uri" xlink:href="http://www.geurmachine.nl"></ext-link>). The scents were identified by randomly assigned numbers, presented in small glass tubes containing a cotton swab with 3&#x02013;4 drops of aroma oil, and sniffed by the 10 participants of the pilot test in random order, approximately 5&#x02033; from their nose, while viewing the VE. The degree to which each scent fitted the VE (how environmentally appropriate the scent was for the VE) was evaluated on an 11-point Likert scale (ranging from 0 = <italic>absolutely not</italic> to 10 = <italic>definitely</italic>). Tar received the highest mean score (7.4), followed by Dusty (5.7). In addition, although the exact nature of the tar smell was not identified by any of the testers, 8 out of 10 spontaneously reported associations with fire and burned material, while it was unanimously judged to be a very unpleasant scent that could occur in an environment such as the one represented by the VE.</p>
<p>A second pilot test served to investigate the spontaneous associations that may be elicited by the two selected scents (grass and tar) independent of visual feedback. Three small glass tubes containing a cotton swab with three to four drops of either the grass odor solution, the tar aroma oil, or clear tap water were presented in random order to 10 participants (who did not take part in the first pilot test). The tap water condition served as a control condition. The participants sniffed the samples approximately five inches from their nose, and rated their pleasantness and familiarity on 5-point Likert scales (ranging from 0 = <italic>absolutely not</italic> to 4 = <italic>very much</italic>). The grass smell received the highest mean pleasantness rating (3.6), followed by tap water (2.6), while the tar smell received the lowest mean pleasantness rating (0.2). The tar smell received the highest mean familiarity score (2.9), followed by tap water (2.0), and grass (1.9). For the tar smell, 6 out of 10 participants reported associations with smoke, fire, and burned material, while two participants associated this smell with industrial activities, and two others reported associations with garages and garbage dumps. For the grass smell, 5 out of 10 participants reported associations with nature, flowers, pine trees, or leaves, one was reminded of fruit, while four participants associated it with air refreshers or cleaning material. Hence, the tar smell was frequently perceived as an unpleasant smell and associated with negative (burned or waste) material, while the grass smell was predominantly considered a pleasant smell associated with positive (natural) material.</p>
</sec>
<sec>
<title>ODOR DIFFUSION</title>
<p>Scents were diffused in the room (about 25 m<sup>2</sup>) through a commercial electronic dispenser (1-3 RS-Classic Scentvertiser, RetroScent, Rotterdam, The Netherlands: <ext-link ext-link-type="uri" xlink:href="http://www.geurmachine.nl"></ext-link>). No odor was applied in the control condition. The dispenser was placed out of the participant&#x02019;s sight behind a screen. The participants could not hear the sound of the dispenser when they wore their headphones and listened to the soundscape of the VE. The experimenter turned on the dispenser after the participants had started their tour through the VE and he turned it off before they were instructed to take off their headphones. Odor was intermittently diffused (with a cycle period of 1 min) during the experiment so that the participants received fluctuating concentrations over time, thus preventing full adaptation.</p>
<p>It is likely that both aversive and pleasant odors turn on the sensory-driven attentional systems even at subthreshold levels to facilitate the detection and analysis of behaviorally relevant stimuli (<xref ref-type="bibr" rid="B32">Krusemark and Li, 2012</xref>). In this study olfactory stimulation was therefore intentionally performed at a near-threshold level to preclude the possibility of top-down influence on visual perception (e.g., the use of explicit search strategies), thereby narrowing the effects down to bottom-up, sensory-driven attentional systems facilitating threat or reward detection. Ideally, the odor intensity should be sufficiently strong to be just noticeable when attended to. The odor intensity used in this study was between low and intermediate, corresponding to a mean level between 3 and 5 on a 10-point scale. A pilot experiment was performed to determine a setting of the dispenser and a duty cycle that resulted in a mean rating of 5.</p>
<p>The room in which the test was performed was well ventilated prior to each session. Only one scent per day was diffused to avoid mixing odors, and the lab was fully ventilated overnight to remove any lingering trace of the scent. Before beginning the study each morning, the room was &#x0201C;sniff-tested&#x0201D; by the two experimenters; no odors were detected to have remained in the room.</p>
</sec>
<sec>
<title>INSTRUMENTS</title>
<sec>
<title>General questionnaire</title>
<p>As the results may be influenced by the characteristics of the participants, they were asked to complete a <italic>General Questionnaire</italic> including socio-demographic measures (sex, age, and education). Education was clustered into four groups: middle-level education, higher-level education, academic education, and other types of education.</p>
</sec>
<sec>
<title>Mental state questionnaire</title>
<p>A 7-item <italic>Mental State Questionnaire</italic> (adapted from <xref ref-type="bibr" rid="B59">Spielberger, 1983</xref>), consisting of four negative (<italic>agitated, angry, anxious, distressed</italic>), two neutral (<italic>calm, relaxed</italic>), and one positive (<italic>cheerful</italic>) emotional terms served to assess the emotions elicited by the individual incivilities. On each encounter with a sign of disorder during their walk participants reported their emotional reaction by selecting one of the seven items (&#x0201C;<italic>I feel&#x022EF;</italic>&#x0201D;).</p>
</sec>
<sec>
<title>Post-experiment questionnaire</title>
<p>A 4-item <italic>Post-Experiment Questionnaire</italic> contained three questions investigating the extent to which the ambient temperature, illumination, and atmosphere in the room were characteristic for the VE (these three items were scored on a 5-point Likert scale, ranging from 1 = <italic>completely disagree</italic> to 5 = <italic>completely agree</italic>) and an open question (&#x0201C;<italic>Was there anything else you noticed during the experiment</italic>?&#x0201D;) to test if the participants had noticed the ambient scent in the room.</p>
</sec>
</sec>
<sec>
<title>EXPERIMENTAL PROCEDURE</title>
<p>After their arrival at the laboratory, the participants first read and signed an informed consent form. Next, they filled out the <italic>General Questionnaire</italic>. Then they read the following instructions:</p>
<p>&#x0201C;<italic>The experiment concerns an area of Soesterberg near the TNO lab, and will take about 45 minutes. Citizens living in this area are concerned about the increasing social disorder in their neighborhood. They intend to draft a plan of action to confront this problem. After making an inventory of the different types of incivilities occurring in their neighborhood, the citizens will prioritize the order in which these should be addressed. To enable a large number of people to give their opinion on the social disorder in this area, the concerned citizens have commissioned a realistic and highly detailed computer model of their neighborhood.</italic></p>
<p><italic>It is your task to make a tour through this virtual model and assess the social disorder in this neighborhood. Your route is marked by arrows drawn on the ground. Each time you notice signs of incivilities (e.g., litter, dog droppings, broken car windows, etc.) during your inspection tour, you are requested to:</italic></p>
<list list-type="simple" prefix-word="simple">
<list-item><label><italic>1.</italic></label><p> <italic>Make a snapshot of each sign of incivilities you notice (by pressing key F12).</italic></p></list-item>
<list-item><label><italic>2.</italic></label><p> <italic>Enter a brief description of the incivility on your questionnaire.</italic></p></list-item>
<list-item><label><italic>3.</italic></label><p> <italic>Report your current mental state by choosing one of the 7 emotional terms on the &#x02018;Mental State Questionnaire&#x02019; (agitated, angry, anxious, distressed, calm, relaxed, cheerful).&#x0201D;</italic></p></list-item></list>
<p>Next, the experimenter verified if the participants had understood their instructions, and started the simulation. The experimenter then explained the function of the gamepad, and gave the participant the opportunity to practice maneuvering through the VE for about 5 min. At the end of this practice period the experimenter checked if the participant was able to perform the required maneuvers, and whether the participant paid attention to the arrows on the ground and the signs of disorder. Then, the experimenter gave the participants the printed questionnaires which they could use to fill out their reports, and positioned the point-of-view in the VE at the starting location, facing the direction of the route. The participants then put on their headphones and started their walkthrough, which they performed at their own pace. Each time the participants noticed signs of disorder during their walk they reported the item they had noticed and their current mental state. During the test, the experimenter was seated behind a screen in the room and intermittently turned on the odor dispenser at one minute intervals, maintaining a slightly fluctuating near threshold ambient odor level. Finally, after finishing their walkthrough, the participants filled out the <italic>Post-Experiment Questionnaire</italic>.</p>
<p>The experimental protocol was reviewed and approved by the TNO internal review board on experiments with human participants (TNO Toetsings Commissie Proefpersoon Experimenten, Soesterberg, The Netherlands), and was in accordance with the Helsinki Declaration of 1975, as revised in 2000 (<xref ref-type="bibr" rid="B74">World Medical Association, 2000</xref>). The participants provided their written informed consent prior to testing. The participants received a modest financial compensation for their participation.</p>
</sec>
<sec>
<title>PARTICIPANTS</title>
<p>The experiment was performed by 69 participants (3 groups of 23 each) who were selected from the TNO database of volunteers: 39 males and 30 females, aged 43 &#x000B1; 18 years. The selection criteria guaranteed that they were not familiar with the urban area represented by the VE, that they had no problems with their sense of smell, and that they all had normal (or corrected-to-normal) vision with no color deficiencies. Also, they were unaware of the aim of the experiment. The participants&#x02019; mean age, level of education, and computer proficiency and game experience were approximately the same for all three (no-ambient smell, ambient tar odor, and ambient grass odor) experimental conditions.</p>
</sec>
<sec>
<title>DATA ANALYSIS</title>
<p>The emotional responses reported for the detected signs of disorder (from the <italic>Mental State Questionnaires</italic>) were clustered for each of the three classes of experimental items: neglect, vandalism, and crime. Analysis of variance (ANOVA) was used to test the relationships between the main variables. Chi-squared tests were performed to determine whether observed frequencies were significantly different from expected frequencies. The statistical analyses were performed with IBM SPSS 20.0 for Windows. For all analyses a probability level of <italic>p</italic> &#x0003C; 0.05 was considered to be statistically significant.</p>
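<p>As a plain-Python illustration of the chi-squared test of independence described above (the analyses themselves were run in IBM SPSS 20.0), the sketch below computes the Pearson statistic for a hypothetical 3 &#x000D7; 3 table of emotional responses (negative/neutral/positive) by item class (Neglect/Vandalism/Crime); the counts are invented for illustration and are not the study&#x02019;s data.</p>

```python
# Illustrative sketch of the Pearson chi-squared test of independence.
# The contingency table below contains invented counts, not the study's data.

def chi_squared(table):
    """Return the Pearson chi-squared statistic and degrees of freedom
    for a two-dimensional contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts: rows = Neglect, Vandalism, Crime;
# columns = negative, neutral, positive emotional responses.
table = [[30, 50, 20],
         [45, 25, 10],
         [40, 20, 5]]
chi2, df = chi_squared(table)  # a 3 x 3 table gives df = 4, as in the paper
```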
</sec>
</sec>
<sec>
<title>RESULTS</title>
<p>Chi-squared tests showed a significant difference (&#x003C7;<sup>2</sup> = 18.94; df = 4; <italic>p</italic> &#x02264; 0.05) between the observed and expected frequencies of the emotional responses (negative, neutral, or positive) associated with the reported items (signs of incivilities) in the classes Neglect, Vandalism, and Crime. Items in the classes Vandalism and Crime were more frequently associated with negative emotional responses than items in the class Neglect.</p>
<p><bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold> shows the detection performance for items signaling<italic> Neglect </italic>and<italic> Crime</italic> in each of the three experimental conditions. To enable a comparison of the performance between the different experimental classes (each represented by a different number of test items) the results are expressed in percentages (for the sake of completeness this figure also provides the mean number of detected items for each condition). <bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold> clearly shows that the relative detection performance is lower for signals of crime than for signals of neglect in all conditions.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p><bold>Percentage of detected items signaling<italic> Neglect </italic>(a total of 24 items)<italic> </italic>and<italic> Crime</italic> (a total of 17 items) in each of the three experimental ambient scent conditions (no odor, grass odor, tar odor).</bold> The labels inside the bars represent the mean number of detected items. The error bars represent the standard error in the mean.</p></caption>
<graphic xlink:href="fpsyg-04-00883-g002.tif"/>
</fig>
<p>A one-way ANOVA showed that the mean numbers of detected items signaling<italic> </italic>respectively<italic> Neglect </italic>and<italic> Crime </italic>did not differ significantly<italic> </italic>between the three ambient odor conditions. More specifically, there were no significant differences between the <italic>Control</italic> and <italic>Grass</italic> (respectively <italic>F</italic><sub>1,42</sub> = 0.57, <italic>p</italic> = 0.45 and <italic>F</italic><sub>1,37</sub>= 1.76, <italic>p</italic> = 0.19), <italic>Control</italic> and <italic>Tar</italic> (respectively <italic>F</italic><sub>1,45</sub>= 3.10, <italic>p</italic> = 0.09 and <italic>F</italic><sub>1,36</sub>= 0.96, <italic>p</italic> = 0.33) and between the <italic>Tar</italic> and <italic>Grass</italic> (respectively <italic>F</italic><sub>1,42</sub>= 0.79, <italic>p</italic> = 0.38 and <italic>F</italic><sub>1,38</sub>= 0.01, <italic>p</italic> = 0.93) conditions. Hence, the hypotheses (H1 and H2) that participants in the pleasant (unpleasant) odor condition would notice fewer (more) signs of disorder than participants in the odorless control condition are not supported by the present data. Compared to the control (odorless) condition, participants reported the same mean number (percentage) of signs of disorder in both (tar and grass) ambient odor conditions. In addition, there appears to be no effect of the hedonic tone of the ambient odor on visual attention toward neglect or crime objects. Also, ambient scent did not affect participants&#x02019; subjectively reported emotional state. Since there were no main or interaction effects of age and level of education, these factors were omitted from later analyses.</p>
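<p>The pairwise comparisons above each contrast two conditions, so the between-groups degrees of freedom is 1 (e.g., <italic>F</italic><sub>1,42</sub>). A minimal stdlib-only sketch of the underlying one-way ANOVA F statistic is given below; the groups are invented toy data, not the study&#x02019;s detection counts.</p>

```python
# Minimal sketch of the one-way ANOVA F statistic behind the pairwise
# condition comparisons; the input groups here are invented toy data.

def one_way_f(*groups):
    """Return the F statistic and (df_between, df_within) for a one-way
    ANOVA over two or more groups of observations."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-groups and within-groups sums of squares
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, (df_between, df_within)

# Hypothetical two-condition comparison (e.g., Control vs. Grass):
f_stat, (df_b, df_w) = one_way_f([1, 2, 3], [4, 5, 6])
```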
<p>The VE contained multiple objects representing<italic> Neglect </italic>and<italic> Crime, </italic>but<italic> </italic>only a single object signaling <italic>Vandalism</italic> (a broken bus shelter). Since this item was rather conspicuous it was never missed by any of the participants. Hence, the results for this item have no discriminative value and are therefore not further discussed in this study.</p>
<p>In response to the open question in the <italic>Post-Experiment Questionnaire</italic> one participant (out of 23) claimed to have noticed a Lysol smell in the room in the control condition. In the tar odor condition one participant (out of 23) reported having noticed a smell, but was unable to identify its nature and did not link the odor to the exploration task. No participant noticed a smell in the grass odor condition.</p>
</sec>
<sec>
<title>DISCUSSION</title>
<p>Based on the present results we cannot conclude whether a subliminal ambient scent can affect the perception of the VE. The finding that ambient scent did not seem to affect participants&#x02019; subjectively reported emotional state agrees with similar findings from related earlier studies, which observed that pleasant ambient scents did not affect self-reported mood and arousal (<xref ref-type="bibr" rid="B39">Morrin and Ratneshwar, 2000</xref>, <xref ref-type="bibr" rid="B38">2003</xref>; <xref ref-type="bibr" rid="B60">Teller and Dennis, 2011</xref>).</p>
<p>Contrary to our expectations the presence of the ambient odors also did not bias the participants&#x02019; attention for the experimental items. Thus, we found no indication that ambient smell of a given nature selectively biases visual attention to details in a desktop VE. The design of the current study does not allow us to determine whether the fact that we did not observe an effect is due to (1) the absence of an effect or (2) the limited power of the study design itself. In any case, it appears that ambient smell may only have limited effectiveness as a tool to direct a user&#x02019;s attention to specific details in a desktop VE. This result is somewhat surprising given the substantial amount of evidence that odors draw visual attention to congruent visual objects (e.g., <xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>; <xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>). However, the present result agrees with earlier reports that ambient scent has no effect on shopping behavior (<xref ref-type="bibr" rid="B54">Schifferstein and Blok, 2002</xref>; <xref ref-type="bibr" rid="B60">Teller and Dennis, 2011</xref>). It has in fact been argued that previous reports of significant effects of ambient scents on perception, emotions, and behavior in shopping environments need to be taken with care, since most previous studies typically did not control for different sources of bias (<xref ref-type="bibr" rid="B60">Teller and Dennis, 2011</xref>). Our results also agree with those of <xref ref-type="bibr" rid="B54">Schifferstein and Blok (2002)</xref>, who found that the scent of freshly cut grass did not affect sales of thematically (in-) congruent products. They argue that ambient scent is probably more diagnostic for the physical environment of the observer than for the particular items in that environment. 
This suggests that ambient scent may only effectively guide visual attention when there is a close link between the affective or semantic qualities of the scent and visual features in the VE. Although there may be a semantic link between the scent of cut grass and the greenery shown in the VE, the link between the scent of tar and signs of disorder is probably less evident. Also, more immersive VEs may be required to automatically establish associations between ambient scents and the VE itself. In case of desktop VEs, a close spatiotemporal link between the contents of the desktop VE and the scents with which they are supposed to be associated may be required to effectively establish diagnostic associations (i.e., smells and visual features may need to appear and disappear together to effectively induce the illusion that the smells actually emanate from the objects shown on the screen) that guide a user&#x02019;s attention.</p>
<p>Experimental items signaling vandalism (e.g., a damaged bus shelter) and crime (e.g., home protection signs and cameras) more frequently evoked negative affective appraisals than items representing neglect (e.g., litter, dog droppings, old bicycle parts). This finding agrees with the discriminant validity of different types of perceptual incivilities that is also found in the real world (e.g., between crime and social incivilities: <xref ref-type="bibr" rid="B75">Worrall, 2006</xref>; <xref ref-type="bibr" rid="B3">Armstrong and Katz, 2010</xref>). In reality, signs of crime are also more likely to evoke negative appraisals since they are typically associated with the risk of personal victimization (<xref ref-type="bibr" rid="B48">Phillips and Smith, 2004</xref>). This finding suggests that the affective appraisal of the VE had at least some ecological validity.</p>
<p>In all experimental conditions, the relative detection performance for signals of crime was lower than for signals of neglect. This is probably due to the fact that most signals of crime (i.e., the warning signs and CCTV cameras) were positioned at eye height or higher in the VE (e.g., attached to trees, lamp posts, or walls), while the signals of neglect were on the ground or on low supports (statues). Although participants were informed about the nature of the signals of disorder, and shown examples during their introduction to the experiment, they may have focused primarily on the signs of neglect on the ground and may have paid less attention to signals higher up in the scene. The fact that the walking route was indicated by arrows drawn on the ground may also have induced a bias for downward perception.</p>
<p>Summarizing, the present study does not allow us to conclude whether ambient odors may be an effective tool to direct a user&#x02019;s attention to specific (congruent) objects in a desktop VE (e.g., by evoking implicit associations).</p>
<sec>
<title>LIMITATIONS OF THE PRESENT STUDY</title>
<p>In previous studies on the effects of odor on visual attention participants freely inspected visual scenes without any explicit instructions, and odor induced attentional bias became manifest in spontaneous fixation behavior (<xref ref-type="bibr" rid="B55">Seigneuric et al., 2010</xref>; <xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>). In the current study the participants were explicitly instructed to look for signs of disorder in the VE. The cognitive effort associated with this strict assignment may have overruled any odor induced attentional bias effects. However, the fact that only a fraction of the targets was actually noticed suggests that there was still room for odor modulated performance enhancement.</p>
<p>The walking route through the VE was indicated by arrows drawn on the ground, which may have induced a bias for visual search near the ground. Unfortunately, fixation behavior was not measured in this study, so this hypothesis cannot be verified.</p>
<p>The scent of grass had an explicit visual representation in the VE, while the scent of tar could only implicitly be linked to visual (litter) and auditory (construction sounds) elements in the VE. Future studies should preferably employ scents that have explicit and unequivocal visual counterparts in the VE. Also, a range of both (1) neutral odors or odors with the same valence but different semantic connotations, and (2) odors of different valence but without any semantic counterparts in the VE should be deployed to enable a distinction between effects induced by hedonic or semantic congruency.</p>
<p>There was only one sign of vandalism in this study (the broken bus shelter) which was also highly salient. As a result this item had no discriminant value. Future studies should include a larger number of test items for each experimental class, with different (including low) visual saliencies. The attention enhancing effects of olfactory cues may be more prominent for targets with low visual saliencies.</p>
<p>The participants in this study reported that they had no problems with their sense of smell at the time of this experiment. Also, there were no entries in the TNO database of volunteers that any olfactory deficiencies had been noted during their participation in previous smell experiments. However, since we did not explicitly test their sense of smell in the current experiment there is no guarantee that they all had normal olfactory function.</p>
</sec>
<sec>
<title>SUGGESTIONS FOR FUTURE RESEARCH</title>
<p>It would be interesting to test whether the finding that specific odors can reflexively direct visual attention to <italic>semantically congruent</italic> visual objects (<xref ref-type="bibr" rid="B57">Seo et al., 2010</xref>; <xref ref-type="bibr" rid="B63">Tomono et al., 2011</xref>; <xref ref-type="bibr" rid="B56">Seigneuric et al., 2012</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2013</xref>) can also be replicated with dynamic desktop VEs. To effectively guide a user&#x02019;s attention dynamic olfactory displays are probably required so that a close spatiotemporal link may be established between the contents of the VE and the scents with which they are supposed to be associated.</p>
<p>Future studies should also register eye movements, since human fixation behavior may provide valuable additional information to subjectively reported results. Also, future studies should track the exact path of the participants through the VE. It is in principle possible that participants use scent cues to adjust their distance to certain items in the VE (e.g., that they show an approach or avoidance behavior, maintaining a larger distance to unpleasant smelling items, and coming closer to pleasant smelling items). Since distance affects the visual saliency and detectability of targets this may affect the results. Path deviations are not likely to be a significant confounding factor in the present study, since most parts of the route were rather narrow and did not leave much room for deviations.</p>
<p>It has previously been shown that the addition of olfactory cues to an immersive VE increases the user&#x02019;s sense of presence and perceived realism of the simulated environment, and ultimately their memory for details therein (<xref ref-type="bibr" rid="B16">Dinh et al., 1999</xref>; <xref ref-type="bibr" rid="B69">Washburn et al., 2003</xref>; <xref ref-type="bibr" rid="B64">Tortell et al., 2007</xref>). It would therefore be interesting to investigate whether an odor induced visual attention bias may also become manifest for desktop VEs when memory for details is tested instead of the number of detections. From the abovementioned previous studies we expect that participants in a pleasant (unpleasant) odor condition will remember fewer (more) signs of disorder than participants in an odorless control condition after completing their inspection tour of the VE.</p>
</sec>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ref-list>
<title>REFERENCES</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anderson</surname> <given-names>B. A.</given-names></name></person-group> (<year>2013</year>). <article-title>A value-driven mechanism of attentional selection.</article-title> <source><italic>J. Vis.</italic></source> <volume>13</volume> <issue>7</issue><pub-id pub-id-type="doi"> 10.1167/13.3.7</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Arao</surname> <given-names>M.</given-names></name> <name><surname>Suzuki</surname> <given-names>M.</given-names></name> <name><surname>Katayama</surname> <given-names>J.</given-names></name> <name><surname>Akihiro</surname> <given-names>Y.</given-names></name></person-group> (<year>2012</year>). <article-title>An odorant congruent with a colour cue is selectively perceived in an odour mixture.</article-title> <source><italic>Perception</italic></source> <volume>41</volume> <fpage>474</fpage>&#x02013;<lpage>482</lpage>. <pub-id pub-id-type="doi">10.1068/p7152</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Armstrong</surname> <given-names>T.</given-names></name> <name><surname>Katz</surname> <given-names>C.</given-names></name></person-group> (<year>2010</year>). <article-title>Further evidence on the discriminant validity of perceptual incivilities measures.</article-title> <source><italic>Justice Q.</italic></source> <volume>27</volume> <fpage>280</fpage>&#x02013;<lpage>304</lpage>. <pub-id pub-id-type="doi">10.1080/07418820802506198</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Baus</surname> <given-names>O.</given-names></name> <name><surname>Bouchard</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>The sense of olfaction: its characteristics and its possible applications in virtual environments.</article-title> <source><italic>J. Cyber Ther. Rehab.</italic></source> <volume>3</volume> <fpage>31</fpage>&#x02013;<lpage>50</lpage>.</citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bell</surname> <given-names>P. A.</given-names></name> <name><surname>Greene</surname> <given-names>T. C.</given-names></name> <name><surname>Fisher</surname> <given-names>J. D.</given-names></name> <name><surname>Baum</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <source><italic>Environmental Psychology</italic>.</source> <edition>5th Edn</edition>. <publisher-loc>London</publisher-loc>: <publisher-name>Lawrence Erlbaum Associates</publisher-name>.</citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bishop</surname> <given-names>I. D.</given-names></name> <name><surname>Rohrmann</surname> <given-names>B.</given-names></name></person-group> (<year>2003</year>). <article-title>Subjective responses to simulated and real environments: a comparison.</article-title> <source><italic>Landsc. Urban Plan.</italic></source> <volume>65</volume> <fpage>261</fpage>&#x02013;<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1016/S0169-2046(03)00070-7</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bitner</surname> <given-names>M. J.</given-names></name></person-group> (<year>1992</year>). <article-title>Servicescapes: the impact of physical surroundings on customers and employees.</article-title> <source><italic>J. Mark.</italic></source> <volume>56</volume> <fpage>57</fpage>&#x02013;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.2307/1252042</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brosch</surname> <given-names>T.</given-names></name> <name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Sander</surname> <given-names>D.</given-names></name></person-group> (<year>2010</year>). <article-title>The perception and categorisation of emotional stimuli: a review.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>24</volume> <fpage>377</fpage>&#x02013;<lpage>400</lpage>. <pub-id pub-id-type="doi">10.1080/02699930902975754</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caughy</surname> <given-names>M. O.</given-names></name> <name><surname>O&#x02019;Campo</surname> <given-names>P. J.</given-names></name> <name><surname>Patterson</surname> <given-names>J.</given-names></name></person-group> (<year>2001</year>). <article-title>A brief observational measure for urban neighborhoods.</article-title> <source><italic>Health Place</italic></source> <volume>7</volume> <fpage>225</fpage>&#x02013;<lpage>236</lpage>. <pub-id pub-id-type="doi">10.1016/S1353-8292(01)00012-0</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chalmers</surname> <given-names>A.</given-names></name> <name><surname>Debattista</surname> <given-names>K.</given-names></name> <name><surname>Ramic-Brkic</surname> <given-names>B.</given-names></name></person-group> (<year>2009</year>). <article-title>Towards high-fidelity multi-sensory virtual environments.</article-title> <source><italic>Vis. Comput.</italic></source> <volume>25</volume> <fpage>1101</fpage>&#x02013;<lpage>1108</lpage>. <pub-id pub-id-type="doi">10.1007/s00371-009-0389-2</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>K.</given-names></name> <name><surname>Zhou</surname> <given-names>B.</given-names></name> <name><surname>Chen</surname> <given-names>S.</given-names></name> <name><surname>He</surname> <given-names>S.</given-names></name> <name><surname>Zhou</surname> <given-names>W.</given-names></name></person-group> (<year>2013</year>). <article-title>Olfaction spontaneously highlights visual saliency map.</article-title> <source><italic>Proc. Biol. Sci.</italic></source> <volume>280</volume> <fpage>1</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1098/rspb.2013.1729</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cozens</surname> <given-names>P.</given-names></name> <name><surname>Neal</surname> <given-names>R.</given-names></name> <name><surname>Whitaker</surname> <given-names>J.</given-names></name> <name><surname>Hillier</surname> <given-names>D.</given-names></name></person-group> (<year>2003</year>). <article-title>Investigating personal safety at railway stations using &#x0201C;virtual reality&#x0201D; technology.</article-title> <source><italic>Facilities</italic></source> <volume>21</volume> <fpage>188</fpage>&#x02013;<lpage>194</lpage>. <pub-id pub-id-type="doi">10.1108/02632770310489936</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dang</surname> <given-names>A.</given-names></name> <name><surname>Liang</surname> <given-names>W.</given-names></name> <name><surname>Chi</surname> <given-names>W.</given-names></name></person-group> (<year>2012</year>). <article-title>&#x0201C;Review of VR application in digital urban planning and managing,&#x0201D; in</article-title> <source><italic>Geospatial Techniques in Urban Planning</italic></source> <role>ed.</role> <person-group person-group-type="editor"><name><surname>Shen</surname> <given-names>Z.</given-names></name></person-group> (<publisher-loc>Berlin</publisher-loc>: <publisher-name>Springer</publisher-name>) <fpage>131</fpage>&#x02013;<lpage>154</lpage>.</citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dematt&#x000E8;</surname> <given-names>M. L.</given-names></name> <name><surname>Osterbauer</surname> <given-names>R.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2007</year>). <article-title>Olfactory cues modulate facial attractiveness.</article-title> <source><italic>Chem. Sens.</italic></source> <volume>32</volume> <fpage>603</fpage>&#x02013;<lpage>610</lpage>. <pub-id pub-id-type="doi">10.1093/chemse/bjm030</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dematt&#x000E8;</surname> <given-names>M. L.</given-names></name> <name><surname>Sanabria</surname> <given-names>D.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>Olfactory discrimination: when vision matters?</article-title> <source><italic>Chem. Sens.</italic></source> <volume>34</volume> <fpage>103</fpage>&#x02013;<lpage>109</lpage>. <pub-id pub-id-type="doi">10.1093/chemse/bjn055</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dinh</surname> <given-names>H. Q.</given-names></name> <name><surname>Walker</surname> <given-names>N.</given-names></name> <name><surname>Hodges</surname> <given-names>L. F.</given-names></name> <name><surname>Song</surname> <given-names>C.</given-names></name> <name><surname>Kobayashi</surname> <given-names>A.</given-names></name></person-group> (<year>1999</year>). <article-title>&#x0201C;Evaluating the importance of multi-sensory input on memory and the sense of presence in virtual environments,&#x0201D; in</article-title> <source><italic>Proceedings of the Virtual Reality Annual International Symposium</italic></source> (<publisher-loc>Piscataway, NJ</publisher-loc>: <publisher-name>IEEE Press</publisher-name>) <fpage>222</fpage>&#x02013;<lpage>228</lpage>.</citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Driver</surname> <given-names>J.</given-names></name> <name><surname>Spence</surname> <given-names>C.</given-names></name></person-group> (<year>1998</year>). <article-title>Cross-modal links in spatial attention.</article-title> <source><italic>Phil. Trans. R. Soc. B Biol. Sci.</italic></source> <volume>353</volume> <fpage>1319</fpage>&#x02013;<lpage>1331</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.1998.0286</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Duncan</surname> <given-names>S.</given-names></name> <name><surname>Feldman Barrett</surname> <given-names>L.</given-names></name></person-group> (<year>2007</year>). <article-title>The role of the amygdala in visual awareness.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>11</volume> <fpage>190</fpage>&#x02013;<lpage>192</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2007.01.007</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Durand</surname> <given-names>K.</given-names></name> <name><surname>Baudouin</surname> <given-names>J. Y.</given-names></name> <name><surname>Lewkowicz</surname> <given-names>D. J.</given-names></name> <name><surname>Goubet</surname> <given-names>N.</given-names></name> <name><surname>Schaal</surname> <given-names>B.</given-names></name></person-group> (<year>2013</year>). <article-title>Eye-catching odors: olfaction elicits sustained gazing to faces and eyes in 4-month-old infants.</article-title> <source><italic>PLoS ONE</italic></source> <volume>8</volume>:<elocation-id>e70677</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0070677</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ehrlichman</surname> <given-names>H.</given-names></name> <name><surname>Halpern</surname> <given-names>J. N.</given-names></name></person-group> (<year>1988</year>). <article-title>Affect and memory: effects of pleasant and unpleasant odors on retrieval of happy and unhappy memories.</article-title> <source><italic>J. Pers. Soc. Psychol.</italic></source> <volume>55</volume> <fpage>769</fpage>&#x02013;<lpage>779</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.55.5.769</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>E.</given-names></name> <name><surname>Russo</surname> <given-names>R.</given-names></name> <name><surname>Dutton</surname> <given-names>K.</given-names></name></person-group> (<year>2002</year>). <article-title>Attentional bias for threat: evidence for delayed disengagement from emotional faces.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>16</volume> <fpage>355</fpage>&#x02013;<lpage>379</lpage>. <pub-id pub-id-type="doi">10.1080/02699930143000527</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ghinea</surname> <given-names>G.</given-names></name> <name><surname>Ademoye</surname> <given-names>O. A.</given-names></name></person-group> (<year>2011</year>). <article-title>Olfaction-enhanced multimedia: perspectives and challenges.</article-title> <source><italic>Multimed. Tools Appl.</italic></source> <volume>55</volume> <fpage>601</fpage>&#x02013;<lpage>626</lpage>. <pub-id pub-id-type="doi">10.1007/s11042-010-0581-4</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gottfried</surname> <given-names>J. A.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2003</year>). <article-title>The nose smells what the eye sees: crossmodal visual facilitation of human olfactory perception.</article-title> <source><italic>Neuron</italic></source> <volume>39</volume> <fpage>375</fpage>&#x02013;<lpage>386</lpage>. <pub-id pub-id-type="doi">10.1016/S0896-6273(03)00392-1</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hacker</surname> <given-names>G.</given-names></name> <name><surname>Brooks</surname> <given-names>A.</given-names></name> <name><surname>van der Zwan</surname> <given-names>R.</given-names></name></person-group> (<year>2013</year>). <article-title>Sex discriminations made on the basis of ambiguous visual cues can be affected by the presence of an olfactory cue.</article-title> <source><italic>BMC Psychol.</italic></source> <volume>1</volume>:<issue>10</issue>. <pub-id pub-id-type="doi">10.1186/2050-7283-1-10</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Henning</surname> <given-names>E.</given-names></name></person-group> (<year>2013</year>). <article-title><italic>Britain&#x02019;s Favourite Smells</italic>.</article-title> <comment>Available at: <ext-link ext-link-type="uri" xlink:href="http://blog.ambius.com/britains-favourite-smells"></ext-link></comment></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Henshaw</surname> <given-names>V.</given-names></name> <name><surname>Bruce</surname> <given-names>N.</given-names></name></person-group> (<year>2012</year>). <article-title>&#x0201C;Smell and sound expectation and the ambiances of English cities,&#x0201D; in</article-title> <source><italic>Proceedings of the 2nd International Congress on Ambiances</italic></source> (<comment>Montreal</comment>) <fpage>449</fpage>&#x02013;<lpage>454</lpage>.</citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Houtkamp</surname> <given-names>J. M.</given-names></name></person-group> (<year>2012</year>). <article-title><italic>Affective Appraisal of Virtual Environments</italic>.</article-title> <publisher-name>Ph.D. Thesis. University Utrecht</publisher-name> <publisher-loc>Utrecht</publisher-loc>. <comment>Available at: <ext-link ext-link-type="uri" xlink:href="http://igitur-archive.library.uu.nl/dissertations/2012-0620-200449/UUindex.html"></ext-link></comment></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Houtkamp</surname> <given-names>J. M.</given-names></name> <name><surname>Schuurink</surname> <given-names>E. L.</given-names></name> <name><surname>Toet</surname> <given-names>A.</given-names></name></person-group> (<year>2008</year>). <article-title>&#x0201C;Thunderstorms in my computer: the effect of visual dynamics and sound in a 3D environment,&#x0201D; in</article-title> <source><italic>Proceedings of the International Conference on Visualisation in Built and Rural Environments BuiltViz&#x02019;08</italic></source> <role>eds</role> <person-group person-group-type="editor"><name><surname>Bannatyne</surname> <given-names>M.</given-names></name> <name><surname>Counsell</surname> <given-names>J.</given-names></name></person-group> (<publisher-loc>Los Alamitos</publisher-loc>: <publisher-name>IEEE Computer Society</publisher-name>) <fpage>11</fpage>&#x02013;<lpage>17</lpage>.</citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jacobs</surname> <given-names>R. H.</given-names></name> <name><surname>Renken</surname> <given-names>R.</given-names></name> <name><surname>Aleman</surname> <given-names>A.</given-names></name> <name><surname>Cornelissen</surname> <given-names>F. W.</given-names></name></person-group> (<year>2012</year>). <article-title>The amygdala, top-down effects, and selective attention to features.</article-title> <source><italic>Neurosci. Biobehav. Rev.</italic></source> <volume>36</volume> <fpage>2069</fpage>&#x02013;<lpage>2084</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2012.05.011</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jadauji</surname> <given-names>J. B.</given-names></name> <name><surname>Djordjevic</surname> <given-names>J.</given-names></name> <name><surname>Lundstr&#x000F6;m</surname> <given-names>J. N.</given-names></name> <name><surname>Pack</surname> <given-names>C. C.</given-names></name></person-group> (<year>2012</year>). <article-title>Modulation of olfactory perception by visual cortex stimulation.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>32</volume> <fpage>3095</fpage>&#x02013;<lpage>3100</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.6022-11.2012</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Koster</surname> <given-names>E. H.</given-names></name> <name><surname>Crombez</surname> <given-names>G.</given-names></name> <name><surname>Van Damme</surname> <given-names>S.</given-names></name> <name><surname>Verschuere</surname> <given-names>B.</given-names></name> <name><surname>De Houwer</surname> <given-names>J.</given-names></name></person-group> (<year>2004</year>). <article-title>Does imminent threat capture and hold attention?</article-title> <source><italic>Emotion</italic></source> <volume>4</volume> <fpage>312</fpage>&#x02013;<lpage>317</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.4.3.312</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Krusemark</surname> <given-names>E.</given-names></name> <name><surname>Li</surname> <given-names>W.</given-names></name></person-group> (<year>2012</year>). <article-title>Enhanced olfactory sensory perception of threat in anxiety: an event-related fMRI study.</article-title> <source><italic>Chemosens. Percept.</italic></source> <volume>5</volume> <fpage>37</fpage>&#x02013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1007/s12078-011-9111-7</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lepp&#x000E4;nen</surname> <given-names>J. M.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name></person-group> (<year>2003</year>). <article-title>Affect and face perception: odors modulate the recognition advantage of happy faces.</article-title> <source><italic>Emotion</italic></source> <volume>3</volume> <fpage>315</fpage>&#x02013;<lpage>326</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.3.4.315</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lin</surname> <given-names>J. Y.</given-names></name> <name><surname>Murray</surname> <given-names>S. O.</given-names></name> <name><surname>Boynton</surname> <given-names>G. M.</given-names></name></person-group> (<year>2009</year>). <article-title>Capture of attention to threatening stimuli without perceptual awareness.</article-title> <source><italic>Curr. Biol.</italic></source> <volume>19</volume> <fpage>1118</fpage>&#x02013;<lpage>1122</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2009.05.021</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Michael</surname> <given-names>G. A.</given-names></name> <name><surname>Jacquot</surname> <given-names>L.</given-names></name> <name><surname>Millot</surname> <given-names>J.-L.</given-names></name> <name><surname>Brand</surname> <given-names>G.</given-names></name></person-group> (<year>2003</year>). <article-title>Ambient odors modulate visual attentional capture.</article-title> <source><italic>Neurosci. Lett.</italic></source> <volume>352</volume> <fpage>221</fpage>&#x02013;<lpage>225</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2003.08.068</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Michael</surname> <given-names>G. A.</given-names></name> <name><surname>Jacquot</surname> <given-names>L.</given-names></name> <name><surname>Millot</surname> <given-names>J.-L.</given-names></name> <name><surname>Brand</surname> <given-names>G.</given-names></name></person-group> (<year>2005</year>). <article-title>Ambient odors influence the amplitude and time course of visual distraction.</article-title> <source><italic>Behav. Neurosci.</italic></source> <volume>119</volume> <fpage>708</fpage>&#x02013;<lpage>715</lpage>. <pub-id pub-id-type="doi">10.1037/0735-7044.119.3.708</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mohanty</surname> <given-names>A.</given-names></name> <name><surname>Egner</surname> <given-names>T.</given-names></name> <name><surname>Monti</surname> <given-names>J. M.</given-names></name> <name><surname>Mesulam</surname> <given-names>M. M.</given-names></name></person-group> (<year>2009</year>). <article-title>Search for a threatening target triggers limbic guidance of spatial attention.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>29</volume> <fpage>10563</fpage>&#x02013;<lpage>10572</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.1170-09.2009</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morrin</surname> <given-names>M.</given-names></name> <name><surname>Ratneshwar</surname> <given-names>S.</given-names></name></person-group> (<year>2003</year>). <article-title>Does it make sense to use scents to enhance brand memory?</article-title> <source><italic>J. Market. Res.</italic></source> <volume>40</volume> <fpage>10</fpage>&#x02013;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.1509/jmkr.40.1.10.19128</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morrin</surname> <given-names>M.</given-names></name> <name><surname>Ratneshwar</surname> <given-names>S.</given-names></name></person-group> (<year>2000</year>). <article-title>The impact of ambient scent on evaluation, attention, and memory for familiar and unfamiliar brands.</article-title> <source><italic>J. Bus. Res.</italic></source> <volume>49</volume> <fpage>157</fpage>&#x02013;<lpage>165</lpage>. <pub-id pub-id-type="doi">10.1016/S0148-2963(99)00006-5</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morrot</surname> <given-names>G.</given-names></name> <name><surname>Brochet</surname> <given-names>F.</given-names></name> <name><surname>Dubourdieu</surname> <given-names>D.</given-names></name></person-group> (<year>2001</year>). <article-title>The color of odors.</article-title> <source><italic>Brain Lang.</italic></source> <volume>79</volume> <fpage>309</fpage>&#x02013;<lpage>320</lpage>. <pub-id pub-id-type="doi">10.1006/brln.2001.2493</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Murray</surname> <given-names>E. A.</given-names></name></person-group> (<year>2007</year>). <article-title>The amygdala, reward and emotion.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>11</volume> <fpage>489</fpage>&#x02013;<lpage>497</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2007.08.013</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nasar</surname> <given-names>J. L.</given-names></name> <name><surname>Cubukcu</surname> <given-names>E.</given-names></name></person-group> (<year>2011</year>). <article-title>Evaluative appraisals of environmental mystery and surprise.</article-title> <source><italic>Environ. Behav.</italic></source> <volume>43</volume> <fpage>387</fpage>&#x02013;<lpage>414</lpage>. <pub-id pub-id-type="doi">10.1177/0013916510364500</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oshima</surname> <given-names>C.</given-names></name> <name><surname>Wada</surname> <given-names>A.</given-names></name> <name><surname>Ando</surname> <given-names>H.</given-names></name> <name><surname>Matsuo</surname> <given-names>N.</given-names></name> <name><surname>Abe</surname> <given-names>S.</given-names></name> <name><surname>Yanagida</surname> <given-names>Y.</given-names></name></person-group> (<year>2007</year>). <article-title>&#x0201C;Improved delivery of olfactory stimulus to keep drivers awake,&#x0201D; in</article-title> <source><italic>Workshop on DSP for in-Vehicle and Mobile Systems</italic></source> <publisher-loc>Istanbul</publisher-loc>.</citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>&#x000D6;sterbauer</surname> <given-names>R. A.</given-names></name> <name><surname>Matthews</surname> <given-names>P. M.</given-names></name> <name><surname>Jenkinson</surname> <given-names>M.</given-names></name> <name><surname>Beckmann</surname> <given-names>C. F.</given-names></name> <name><surname>Hansen</surname> <given-names>P. C.</given-names></name> <name><surname>Calvert</surname> <given-names>G. A.</given-names></name></person-group> (<year>2005</year>). <article-title>The color of scents: chromatic stimuli modulate odor responses in the human brain.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>93</volume> <fpage>3434</fpage>&#x02013;<lpage>3441</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00555.2004</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Park</surname> <given-names>A. J.</given-names></name> <name><surname>Calvert</surname> <given-names>T.</given-names></name> <name><surname>Brantingham</surname> <given-names>P. L.</given-names></name> <name><surname>Brantingham</surname> <given-names>P. J.</given-names></name></person-group> (<year>2008</year>). <article-title>The use of virtual and mixed reality environments for urban behavioural studies.</article-title> <source><italic>Psychnol. J.</italic></source> <volume>6</volume> <fpage>119</fpage>&#x02013;<lpage>130</lpage>.</citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Park</surname> <given-names>A. J.</given-names></name> <name><surname>Spicer</surname> <given-names>V.</given-names></name> <name><surname>Guterres</surname> <given-names>M.</given-names></name> <name><surname>Brantingham</surname> <given-names>P. L.</given-names></name> <name><surname>Jenion</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>&#x0201C;Testing perception of crime in a virtual environment,&#x0201D; in</article-title> <source><italic>Proceedings of the 2010 IEEE International Conference on Intelligence and Security Informatics (ISI)</italic></source> (<publisher-loc>Piscataway, NJ</publisher-loc>: <publisher-name>IEEE Press</publisher-name>) <fpage>7</fpage>&#x02013;<lpage>12</lpage>.</citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perkins</surname> <given-names>D. D.</given-names></name> <name><surname>Meeks</surname> <given-names>J. W.</given-names></name> <name><surname>Taylor</surname> <given-names>R. B.</given-names></name></person-group> (<year>1992</year>). <article-title>The physical environment of street blocks and resident perceptions of crime and disorder: implications for theory and measurement.</article-title> <source><italic>J. Environ. Psychol.</italic></source> <volume>12</volume> <fpage>21</fpage>&#x02013;<lpage>34</lpage>. <pub-id pub-id-type="doi">10.1016/S0272-4944(05)80294-4</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Phillips</surname> <given-names>T.</given-names></name> <name><surname>Smith</surname> <given-names>P.</given-names></name></person-group> (<year>2004</year>). <article-title>Emotional and behavioral responses to everyday incivility: challenging the fear/avoidance paradigm.</article-title> <source><italic>J. Sociol.</italic></source> <volume>40</volume> <fpage>378</fpage>&#x02013;<lpage>399</lpage>. <pub-id pub-id-type="doi">10.1177/1440783304048382</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pollatos</surname> <given-names>O.</given-names></name> <name><surname>Kopietz</surname> <given-names>R.</given-names></name> <name><surname>Linn</surname> <given-names>J.</given-names></name> <name><surname>Albrecht</surname> <given-names>J.</given-names></name> <name><surname>Sakar</surname> <given-names>V.</given-names></name> <name><surname>Anzinger</surname> <given-names>A.</given-names></name><etal/></person-group> (<year>2007</year>). <article-title>Emotional stimulation alters olfactory sensitivity and odor judgment.</article-title> <source><italic>Chem. Sens.</italic></source> <volume>32</volume> <fpage>583</fpage>&#x02013;<lpage>589</lpage>. <pub-id pub-id-type="doi">10.1093/chemse/bjm027</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Schettino</surname> <given-names>A.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2013</year>). <article-title>Brain mechanisms for emotional influences on perception and attention: what is magic and what is not.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>92</volume> <fpage>492</fpage>&#x02013;<lpage>512</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2012.02.007</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Reynolds</surname> <given-names>E.</given-names></name></person-group> (<year>2012</year>). <article-title>Our favourite smells: cut grass, aftershave, a freshly cleaned house, baking and a Sunday roast.</article-title> <source><italic>Daily Mail</italic></source>. <comment>Available at: <ext-link ext-link-type="uri" xlink:href="http://www.dailymail.co.uk/femail/article-2157519/"></ext-link> (accessed June 11, 2012)</comment>.</citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Richard</surname> <given-names>E.</given-names></name> <name><surname>Tijou</surname> <given-names>A.</given-names></name> <name><surname>Richard</surname> <given-names>P.</given-names></name> <name><surname>Ferrier</surname> <given-names>J.-L.</given-names></name></person-group> (<year>2006</year>). <article-title>Multi-modal virtual environments for education with haptic and olfactory feedback.</article-title> <source><italic>Virtual Reality</italic></source> <volume>10</volume> <fpage>207</fpage>&#x02013;<lpage>225</lpage>. <pub-id pub-id-type="doi">10.1007/s10055-006-0040-8</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Riener</surname> <given-names>R.</given-names></name> <name><surname>Harders</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>&#x0201C;Olfactory and gustatory aspects,&#x0201D; in</article-title> <source><italic>Virtual Reality in Medicine</italic></source> <role>eds</role> <person-group person-group-type="editor"><name><surname>Riener</surname> <given-names>R.</given-names></name> <name><surname>Harders</surname> <given-names>M.</given-names></name></person-group> (<publisher-loc>London</publisher-loc>: <publisher-name>Springer</publisher-name>) <fpage>149</fpage>&#x02013;<lpage>159</lpage>.</citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schifferstein</surname> <given-names>H. N. J.</given-names></name> <name><surname>Blok</surname> <given-names>S. T.</given-names></name></person-group> (<year>2002</year>). <article-title>The signal function of thematically (in)congruent ambient scents in a retail environment.</article-title> <source><italic>Chem. Sens.</italic></source> <volume>27</volume> <fpage>539</fpage>&#x02013;<lpage>549</lpage>. <pub-id pub-id-type="doi">10.1093/chemse/27.6.539</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seigneuric</surname> <given-names>A.</given-names></name> <name><surname>Durand</surname> <given-names>K.</given-names></name> <name><surname>Jiang</surname> <given-names>T.</given-names></name> <name><surname>Baudouin</surname> <given-names>J. Y.</given-names></name> <name><surname>Schaal</surname> <given-names>B.</given-names></name></person-group> (<year>2010</year>). <article-title>The nose tells it to the eyes: crossmodal associations between olfaction and vision.</article-title> <source><italic>Perception</italic></source> <volume>39</volume> <fpage>1541</fpage>&#x02013;<lpage>1554</lpage>. <pub-id pub-id-type="doi">10.1068/p6740</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seigneuric</surname> <given-names>A.</given-names></name> <name><surname>Durand</surname> <given-names>K.</given-names></name> <name><surname>Jiang</surname> <given-names>T.</given-names></name> <name><surname>Baudouin</surname> <given-names>J. Y.</given-names></name> <name><surname>Schaal</surname> <given-names>B.</given-names></name></person-group> (<year>2012</year>). <article-title>The nose tells it to the eyes: crossmodal associations between olfaction and vision.</article-title> <source><italic>Perception</italic></source> <volume>39</volume> <fpage>1541</fpage>&#x02013;<lpage>1554</lpage>. <pub-id pub-id-type="doi">10.1068/p6740</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seo</surname> <given-names>H. S.</given-names></name> <name><surname>Roidl</surname> <given-names>E.</given-names></name> <name><surname>M&#x000FC;ller</surname> <given-names>F.</given-names></name> <name><surname>Negoias</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Odors enhance visual attention to congruent objects.</article-title> <source><italic>Appetite</italic></source> <volume>54</volume> <fpage>544</fpage>&#x02013;<lpage>549</lpage>. <pub-id pub-id-type="doi">10.1016/j.appet.2010.02.011</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seubert</surname> <given-names>J.</given-names></name> <name><surname>Freiherr</surname> <given-names>J.</given-names></name> <name><surname>Djordjevic</surname> <given-names>J.</given-names></name> <name><surname>Lundstr&#x000F6;m</surname> <given-names>J. N.</given-names></name></person-group> (<year>2013</year>). <article-title>Statistical localization of human olfactory cortex.</article-title> <source><italic>Neuroimage</italic></source> <volume>66</volume> <fpage>333</fpage>&#x02013;<lpage>342</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2012.10.030</pub-id></citation></ref>
<ref id="B59"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Spielberger</surname> <given-names>C. D.</given-names></name></person-group> (<year>1983</year>). <source><italic>State-Trait Anxiety Inventory for Adults</italic></source>. <publisher-loc>Mountain View, CA</publisher-loc>: <publisher-name>Consulting Psychologists Press</publisher-name>.</citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Teller</surname> <given-names>C.</given-names></name> <name><surname>Dennis</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>The effect of ambient scent on consumers&#x02019; perception, emotions and behaviour: a critical review.</article-title> <source><italic>J. Mark. Manage.</italic></source> <volume>28</volume> <fpage>14</fpage>&#x02013;<lpage>36</lpage>. <pub-id pub-id-type="doi">10.1080/0267257X.2011.560719</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Todd</surname> <given-names>R. M.</given-names></name> <name><surname>Cunningham</surname> <given-names>W. A.</given-names></name> <name><surname>Anderson</surname> <given-names>A. K.</given-names></name> <name><surname>Thompson</surname> <given-names>E.</given-names></name></person-group> (<year>2012</year>). <article-title>Affect-biased attention as emotion regulation.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>16</volume> <fpage>365</fpage>&#x02013;<lpage>372</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2012.06.003</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Toet</surname> <given-names>A.</given-names></name> <name><surname>van Schaik</surname> <given-names>M. G.</given-names></name></person-group> (<year>2012</year>). <article-title>Effects of signals of disorder on fear of crime in real and virtual environments.</article-title> <source><italic>J. Environ. Psychol.</italic></source> <volume>32</volume> <fpage>260</fpage>&#x02013;<lpage>276</lpage>. <pub-id pub-id-type="doi">10.1016/j.jenvp.2012.04.001</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tomono</surname> <given-names>A.</given-names></name> <name><surname>Kanda</surname> <given-names>K.</given-names></name> <name><surname>Otake</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Effect of smell presentation on individuals with regard to eye catching and memory.</article-title> <source><italic>Electron. Comm. Jpn.</italic></source> <volume>94</volume> <fpage>9</fpage>&#x02013;<lpage>19</lpage>. <pub-id pub-id-type="doi">10.1002/ecj.10319</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tortell</surname> <given-names>R.</given-names></name> <name><surname>Luigi</surname> <given-names>D. P.</given-names></name> <name><surname>Dozois</surname> <given-names>A.</given-names></name> <name><surname>Bouchard</surname> <given-names>S.</given-names></name> <name><surname>Morie</surname> <given-names>J. F.</given-names></name> <name><surname>Ilan</surname> <given-names>D.</given-names></name></person-group> (<year>2007</year>). <article-title>The effects of scent and game play experience on memory of a virtual environment.</article-title> <source><italic>Virtual Real.</italic></source> <volume>11</volume> <fpage>61</fpage>&#x02013;<lpage>68</lpage>. <pub-id pub-id-type="doi">10.1007/s10055-006-0056-0</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van der Burg</surname> <given-names>E.</given-names></name> <name><surname>Olivers</surname> <given-names>C. N.</given-names></name> <name><surname>Bronkhorst</surname> <given-names>A. W.</given-names></name> <name><surname>Theeuwes</surname> <given-names>J.</given-names></name></person-group> (<year>2008</year>). <article-title>Pip and pop: nonspatial auditory signals improve spatial visual search.</article-title> <source><italic>J. Exp. Psychol. Hum. Percept. Perform.</italic></source> <volume>34</volume> <fpage>1053</fpage>&#x02013;<lpage>1065</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.34.5.1053</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Van der Burg</surname> <given-names>E.</given-names></name> <name><surname>Olivers</surname> <given-names>C. N.</given-names></name> <name><surname>Bronkhorst</surname> <given-names>A. W.</given-names></name> <name><surname>Theeuwes</surname> <given-names>J.</given-names></name></person-group> (<year>2009</year>). <article-title>Poke and pop: tactile-visual synchrony increases visual saliency.</article-title> <source><italic>Neurosci. Lett.</italic></source> <volume>450</volume> <fpage>60</fpage>&#x02013;<lpage>64</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2008.11.002</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2005</year>). <article-title>How brains beware: neural mechanisms of emotional attention.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>9</volume> <fpage>585</fpage>&#x02013;<lpage>594</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2005.10.011</pub-id></citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walla</surname> <given-names>P.</given-names></name></person-group> (<year>2008</year>). <article-title>Olfaction and its dynamic influence on word and face processing: cross-modal integration.</article-title> <source><italic>Prog. Neurobiol.</italic></source> <volume>84</volume> <fpage>192</fpage>&#x02013;<lpage>209</lpage>. <pub-id pub-id-type="doi">10.1016/j.pneurobio.2007.10.005</pub-id></citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Washburn</surname> <given-names>D. A.</given-names></name> <name><surname>Jones</surname> <given-names>L. M.</given-names></name> <name><surname>Satya</surname> <given-names>R. V.</given-names></name> <name><surname>Bowers</surname> <given-names>C. A.</given-names></name> <name><surname>Cortes</surname> <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>Olfactory use in virtual environment training.</article-title> <source><italic>Model. Simulat. Mag.</italic></source> <volume>2</volume> <fpage>19</fpage>&#x02013;<lpage>25</lpage>.</citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Williams</surname> <given-names>L. M.</given-names></name> <name><surname>Palmer</surname> <given-names>D.</given-names></name> <name><surname>Liddell</surname> <given-names>B. J.</given-names></name> <name><surname>Song</surname> <given-names>L.</given-names></name> <name><surname>Gordon</surname> <given-names>E.</given-names></name></person-group> (<year>2006</year>). <article-title>The &#x02018;when&#x02019; and &#x02018;where&#x02019; of perceiving signals of threat versus non-threat.</article-title> <source><italic>Neuroimage</italic></source> <volume>31</volume> <fpage>458</fpage>&#x02013;<lpage>467</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.12.009</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Williams</surname> <given-names>M. A.</given-names></name> <name><surname>McGlone</surname> <given-names>F.</given-names></name> <name><surname>Abbott</surname> <given-names>D. F.</given-names></name> <name><surname>Mattingley</surname> <given-names>J. B.</given-names></name></person-group> (<year>2005</year>). <article-title>Differential amygdala responses to happy and fearful facial expressions depend on selective attention.</article-title> <source><italic>Neuroimage</italic></source> <volume>24</volume> <fpage>417</fpage>&#x02013;<lpage>425</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.08.017</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winston</surname> <given-names>J. S.</given-names></name> <name><surname>Gottfried</surname> <given-names>J. A.</given-names></name> <name><surname>Kilner</surname> <given-names>J. M.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Integrated neural representations of odor intensity and affective valence in human amygdala.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>25</volume> <fpage>8903</fpage>&#x02013;<lpage>8907</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.1569-05.2005</pub-id></citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Woldorff</surname> <given-names>M. G.</given-names></name> <name><surname>Gallen</surname> <given-names>C. C.</given-names></name> <name><surname>Hampson</surname> <given-names>S. A.</given-names></name> <name><surname>Hillyard</surname> <given-names>S. A.</given-names></name> <name><surname>Pantev</surname> <given-names>C.</given-names></name> <name><surname>Sobel</surname> <given-names>D.</given-names></name><etal/></person-group> (<year>1993</year>). <article-title>Modulation of early sensory processing in human auditory cortex during auditory selective attention.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>90</volume> <fpage>8722</fpage>&#x02013;<lpage>8726</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.90.18.8722</pub-id></citation></ref>
<ref id="B74"><citation citation-type="journal"><collab>World Medical Association.</collab> (<year>2000</year>). <article-title>World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects.</article-title> <source><italic>J. Am. Med. Assoc.</italic></source> <volume>284</volume> <fpage>3043</fpage>&#x02013;<lpage>3045</lpage>. <pub-id pub-id-type="doi">10.1001/jama.284.23.3043</pub-id></citation></ref>
<ref id="B75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Worrall</surname> <given-names>J. L.</given-names></name></person-group> (<year>2006</year>). <article-title>The discriminant validity of perceptual incivility measures.</article-title> <source><italic>Justice Q.</italic></source> <volume>23</volume> <fpage>360</fpage>&#x02013;<lpage>383</lpage>. <pub-id pub-id-type="doi">10.1080/07418820600869137</pub-id></citation></ref>
<ref id="B76"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wrzesniewski</surname> <given-names>A.</given-names></name> <name><surname>McCauley</surname> <given-names>C.</given-names></name> <name><surname>Rozin</surname> <given-names>P.</given-names></name></person-group> (<year>1999</year>). <article-title>Odor and affect: individual differences in the impact of odor on liking for places, things and people.</article-title> <source><italic>Chem. Senses</italic></source> <volume>24</volume> <fpage>713</fpage>&#x02013;<lpage>721</lpage>. <pub-id pub-id-type="doi">10.1093/chemse/24.6.713</pub-id></citation></ref>
<ref id="B77"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Yanagida</surname> <given-names>Y.</given-names></name> <name><surname>Adachi</surname> <given-names>T.</given-names></name> <name><surname>Miyasato</surname> <given-names>T.</given-names></name> <name><surname>Tomono</surname> <given-names>A.</given-names></name> <name><surname>Kawato</surname> <given-names>S.</given-names></name> <name><surname>Noma</surname> <given-names>H.</given-names></name><etal/></person-group> (<year>2005</year>). <article-title>&#x0201C;Integrating a projection-based olfactory display with interactive audio-visual contents,&#x0201D; in</article-title> <source><italic>HCI International 2005</italic></source> <publisher-loc>Las Vegas</publisher-loc>.</citation></ref>
<ref id="B78"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Yanagida</surname> <given-names>Y.</given-names></name> <name><surname>Kawato</surname> <given-names>S.</given-names></name> <name><surname>Noma</surname> <given-names>H.</given-names></name> <name><surname>Tetsutani</surname> <given-names>N.</given-names></name> <name><surname>Tomono</surname> <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>&#x0201C;A nose-tracked, personal olfactory display,&#x0201D; in</article-title> <source><italic>International Conference on Computer Graphics and Interactive Techniques. ACM SIGGRAPH 2003 Sketches &#x00026; Applications</italic></source> (<publisher-loc>New York</publisher-loc>: <publisher-name>ACM</publisher-name>) <issue>1</issue>.</citation></ref>
<ref id="B79"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Yanagida</surname> <given-names>Y.</given-names></name> <name><surname>Kawato</surname> <given-names>S.</given-names></name> <name><surname>Noma</surname> <given-names>H.</given-names></name> <name><surname>Tomono</surname> <given-names>A.</given-names></name> <name><surname>Tetsutani</surname> <given-names>N.</given-names></name></person-group> (<year>2004</year>). <article-title>&#x0201C;Projection-based olfactory display with nose tracking,&#x0201D; in</article-title> <source><italic>Proceedings of IEEE Virtual Reality 2004</italic></source> (<publisher-loc>Piscataway, NJ</publisher-loc>: <publisher-name>IEEE</publisher-name>) <fpage>43</fpage>&#x02013;<lpage>50</lpage>.</citation></ref>
<ref id="B80"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Yu</surname> <given-names>J.</given-names></name> <name><surname>Yanagida</surname> <given-names>Y.</given-names></name> <name><surname>Kawato</surname> <given-names>S.</given-names></name> <name><surname>Tetsutani</surname> <given-names>N.</given-names></name></person-group> (<year>2003</year>). <article-title>&#x0201C;Air cannon design for projection-based olfactory display,&#x0201D; in</article-title> <source><italic>Proceedings of the 13th International Conference on Artificial Reality and Telexistence</italic></source> <publisher-loc>Tokyo</publisher-loc> <fpage>136</fpage>&#x02013;<lpage>142</lpage>.</citation></ref>
<ref id="B81"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zald</surname> <given-names>D. H.</given-names></name></person-group> (<year>2003</year>). <article-title>The human amygdala and the emotional evaluation of sensory stimuli.</article-title> <source><italic>Brain Res. Rev.</italic></source> <volume>41</volume> <fpage>88</fpage>&#x02013;<lpage>123</lpage>. <pub-id pub-id-type="doi">10.1016/S0165-0173(02)00248-5</pub-id></citation></ref>
<ref id="B82"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zelano</surname> <given-names>C.</given-names></name> <name><surname>Bensafi</surname> <given-names>M.</given-names></name> <name><surname>Porter</surname> <given-names>J.</given-names></name> <name><surname>Mainland</surname> <given-names>J.</given-names></name> <name><surname>Johnson</surname> <given-names>B.</given-names></name> <name><surname>Bremner</surname> <given-names>E.</given-names></name><etal/></person-group> (<year>2005</year>). <article-title>Attentional modulation in human primary olfactory cortex.</article-title> <source><italic>Nat. Neurosci.</italic></source> <volume>8</volume> <fpage>114</fpage>&#x02013;<lpage>120</lpage>. <pub-id pub-id-type="doi">10.1038/nn1368</pub-id></citation></ref>
<ref id="B83"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zellner</surname> <given-names>D. A.</given-names></name></person-group> (<year>2013</year>). <article-title>Color-odor interactions: a review and model.</article-title> <source><italic>Chemosens. Percept.</italic></source> <fpage>1</fpage>&#x02013;<lpage>15</lpage>. <pub-id pub-id-type="doi">10.1007/s12078-013-9154-z</pub-id></citation></ref>
<ref id="B84"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zellner</surname> <given-names>D. A.</given-names></name> <name><surname>Bartoli</surname> <given-names>A. M.</given-names></name> <name><surname>Eckard</surname> <given-names>R.</given-names></name></person-group> (<year>1991</year>). <article-title>Influence of color on odor identification and liking ratings.</article-title> <source><italic>Am. J. Psychol.</italic></source> <volume>104</volume> <fpage>547</fpage>&#x02013;<lpage>561</lpage>. <pub-id pub-id-type="doi">10.2307/1422940</pub-id></citation></ref>
<ref id="B85"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zellner</surname> <given-names>D. A.</given-names></name> <name><surname>Kautz</surname> <given-names>M. A.</given-names></name></person-group> (<year>1990</year>). <article-title>Color affects perceived odor intensity.</article-title> <source><italic>J. Exp. Psychol. Hum. Percept. Perform.</italic></source> <volume>16</volume> <fpage>391</fpage>&#x02013;<lpage>397</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.16.2.391</pub-id></citation></ref>
<ref id="B86"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>W.</given-names></name> <name><surname>Chen</surname> <given-names>D.</given-names></name></person-group> (<year>2009</year>). <article-title>Fear-related chemosignals modulate recognition of fear in ambiguous facial expressions.</article-title> <source><italic>Psychol. Sci.</italic></source> <volume>20</volume> <fpage>177</fpage>&#x02013;<lpage>183</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9280.2009.02263.x</pub-id></citation></ref>
<ref id="B87"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>W.</given-names></name> <name><surname>Jiang</surname> <given-names>Y.</given-names></name> <name><surname>He</surname> <given-names>S.</given-names></name> <name><surname>Chen</surname> <given-names>D.</given-names></name></person-group> (<year>2010</year>). <article-title>Olfaction modulates visual perception in binocular rivalry.</article-title> <source><italic>Curr. Biol.</italic></source> <volume>20</volume> <fpage>1356</fpage>&#x02013;<lpage>1358</lpage>. <pub-id pub-id-type="doi">10.1016/j.cub.2010.05.059</pub-id></citation></ref>
<ref id="B88"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>W.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Wang</surname> <given-names>L.</given-names></name> <name><surname>Chen</surname> <given-names>D.</given-names></name></person-group> (<year>2012</year>). <article-title>Nostril-specific olfactory modulation of visual perception in binocular rivalry.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>32</volume> <fpage>17225</fpage>&#x02013;<lpage>17229</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.2649-12.2012</pub-id></citation></ref>
</ref-list>
</back>
</article>