<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neurosci.</journal-id>
<journal-title>Frontiers in Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neurosci.</abbrev-journal-title>
<issn pub-type="ppub">1662-4548</issn>
<issn pub-type="epub">1662-453X</issn>
<publisher>
<publisher-name>Frontiers Research Foundation</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnins.2011.00130</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Review Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Impact of Size and Delay on Neural Activity in the Rat Limbic Corticostriatal System</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Roesch</surname> <given-names>Matthew R.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001">&#x0002A;</xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Bryden</surname> <given-names>Daniel W.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology and Program in Neuroscience and Cognitive Science, University of Maryland</institution> <country>College Park, MD, USA</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Tobias Kalenscher, Heinrich-Heine University Duesseldorf, Germany</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Bernd Weber, Rheinische Friedrich-Wilhelms-Universit&#x000E4;t, Germany; Martin O&#x02019;Neill, University of Cambridge, UK</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Matthew R. Roesch, Department of Psychology and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, MD 20742, USA. e-mail: <email>mroesch&#x00040;umd.edu</email></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to Frontiers in Decision Neuroscience, a specialty of Frontiers in Neuroscience.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>07</day>
<month>12</month>
<year>2011</year>
</pub-date>
<pub-date pub-type="collection">
<year>2011</year>
</pub-date>
<volume>5</volume>
<elocation-id>130</elocation-id>
<history>
<date date-type="received">
<day>31</day>
<month>05</month>
<year>2011</year>
</date>
<date date-type="accepted">
<day>04</day>
<month>11</month>
<year>2011</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2011 Roesch and Bryden.</copyright-statement>
<copyright-year>2011</copyright-year>
<license license-type="open-access" xlink:href="http://www.frontiersin.org/licenseagreement"><p>This is an open-access article subject to a non-exclusive license between the authors and Frontiers Media SA, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and other Frontiers conditions are complied with.</p></license>
</permissions>
<abstract>
<p>A number of factors influence an animal&#x02019;s economic decisions. The two most commonly studied are the magnitude of and delay to reward. To investigate how these factors are represented in the firing rates of single neurons, we devised a behavioral task that independently manipulated the expected delay to and size of reward. Rats perceived the differently delayed and sized rewards as having different values and were more motivated under short-delay and big-reward conditions than under long-delay and small-reward conditions, as measured by percent choice, accuracy, and reaction time. Since the creation of this task, we have recorded from several different brain areas, including orbitofrontal cortex, striatum, amygdala, substantia nigra pars reticulata, and midbrain dopamine neurons. Here, we review and compare those data, with a substantial focus on areas that have been shown to be critical for performance on classic time-discounting procedures, and provide a potential mechanism by which these areas might interact when animals decide between differently delayed rewards. We found that most brain areas in the cortico-limbic circuit encode both the magnitude of and delay to reward delivery in one form or another, but only a few encode them together at the single-neuron level.</p>
</abstract>
<kwd-group>
<kwd>discounting</kwd>
<kwd>value</kwd>
<kwd>dopamine</kwd>
<kwd>orbitofrontal</kwd>
<kwd>striatum</kwd>
<kwd>amygdala</kwd>
<kwd>substantia nigra</kwd>
</kwd-group>
<counts>
<fig-count count="6"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="142"/>
<page-count count="13"/>
<word-count count="12541"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction">
<title>Introduction</title>
<p>Animals prefer an immediate reward over a delayed reward even when the delayed reward is more economically valuable in the long run. In the lab, the neural mechanisms underlying this aspect of decision-making are often studied in tasks that ask animals or humans to choose between a small reward delivered immediately and a large reward delivered after some delay (Herrnstein, <xref ref-type="bibr" rid="B49">1961</xref>; Ainslie, <xref ref-type="bibr" rid="B1">1974</xref>; Thaler, <xref ref-type="bibr" rid="B128">1981</xref>; Kahneman and Tversky, <xref ref-type="bibr" rid="B66">1984</xref>; Rodriguez and Logue, <xref ref-type="bibr" rid="B106">1988</xref>; Loewenstein, <xref ref-type="bibr" rid="B78">1992</xref>; Evenden and Ryan, <xref ref-type="bibr" rid="B35">1996</xref>; Richards et al., <xref ref-type="bibr" rid="B105">1997</xref>; Ho et al., <xref ref-type="bibr" rid="B50">1999</xref>; Cardinal et al., <xref ref-type="bibr" rid="B20">2001</xref>; Mobini et al., <xref ref-type="bibr" rid="B86">2002</xref>; Winstanley et al., <xref ref-type="bibr" rid="B137">2004b</xref>; Kalenscher et al., <xref ref-type="bibr" rid="B68">2005</xref>; Kalenscher and Pennartz, <xref ref-type="bibr" rid="B67">2008</xref>; Ballard and Knutson, <xref ref-type="bibr" rid="B3">2009</xref>; Figner et al., <xref ref-type="bibr" rid="B37">2010</xref>). As the delay to the large reward becomes longer, subjects tend to discount the value of the large reward, biasing their choice behavior toward the small, immediate reward (temporal discounting). This choice behavior is considered impulsive because over the course of many trials it would be more economical to wait for the larger reward. 
Impulsive choice is exacerbated in several disorders such as drug addiction, attention-deficit/hyperactivity disorder, and schizophrenia, altering the breakpoint at which subjects abandon the large-delayed reward for the more immediate reward (Ernst et al., <xref ref-type="bibr" rid="B34">1998</xref>; Jentsch and Taylor, <xref ref-type="bibr" rid="B62">1999</xref>; Monterosso et al., <xref ref-type="bibr" rid="B92">2001</xref>; Bechara et al., <xref ref-type="bibr" rid="B6">2002</xref>; Coffey et al., <xref ref-type="bibr" rid="B26">2003</xref>; Heerey et al., <xref ref-type="bibr" rid="B48">2007</xref>; Roesch et al., <xref ref-type="bibr" rid="B109">2007c</xref>; Dalley et al., <xref ref-type="bibr" rid="B30">2008</xref>). Although considerable attention has been paid to the neuroanatomical and pharmacological basis of temporally discounted reward and impulsivity, few have examined the neural correlates involved. Specifically, few have asked how delays impact neural encoding in brain areas known to be involved in reinforcement learning and decision-making, and how that encoding might relate to less abstract manipulations of value such as magnitude. To address this issue we developed an inter-temporal choice task suitable for behavioral recording studies in rats (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>, <xref ref-type="bibr" rid="B107">2007a</xref>,<xref ref-type="bibr" rid="B108">b</xref>, <xref ref-type="bibr" rid="B114">2009</xref>, <xref ref-type="bibr" rid="B111">2010b</xref>; Takahashi et al., <xref ref-type="bibr" rid="B126">2009</xref>; Calu et al., <xref ref-type="bibr" rid="B18">2010</xref>; Stalnaker et al., <xref ref-type="bibr" rid="B124">2010</xref>; Bryden et al., <xref ref-type="bibr" rid="B15">2011</xref>).</p>
<p>In this task, rats were trained to nosepoke into a central odor port. After 0.5&#x02009;s, one of three odors was presented. One odor signaled for the rat to go left (forced-choice), another signaled to go right (forced-choice), and the third odor signaled that the rat was free to choose either the left or right well (free-choice) to receive liquid sucrose reward. The two wells were located below the odor port as illustrated in Figure <xref ref-type="fig" rid="F1">1</xref>B. After responding at the well, rats had to wait 0.5 or 1&#x02013;7&#x02009;s to receive reward, depending on trial type (Figure <xref ref-type="fig" rid="F1">1</xref>A). The task was designed to allow for equal samples of leftward and rightward responses (forced-choice) while at the same time providing a direct measure of the animal&#x02019;s preference (free-choice). In addition, the use of free- and forced-choice trials has allowed us to determine whether the brain processes free choice differently from forced instrumental responding and whether observed neural correlates reflect sensory or motor processing.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>Size and Delay Behavioral Choice Task</bold>. <bold>(A)</bold> Sequence of events on each trial across the four blocks in which we manipulated the time to reward or the size of reward. Trials were signaled by illumination of the panel lights inside the box. When these lights were on, nosepoke into the odor port resulted in delivery of the odor cue to a small hemicylinder located behind this opening. One of three different odors was delivered to the port on each trial, in a pseudorandom order. At odor offset, the rat had 3&#x02009;s to make a response at one of the two fluid wells located below the port. One odor instructed the rat to go to the left to get reward, a second odor instructed the rat to go to the right to get reward, and a third odor indicated that the rat could obtain reward at either well. At the start of each recording session, one well was randomly designated as short (a 0.5&#x02009;s delay before reward) and the other as long (a 1&#x02013;7&#x02009;s delay before reward) (block 1). In the second block of trials these contingencies were switched (block 2). In blocks 3 and 4, delays were held constant (0.5&#x02009;s) and reward size was manipulated. sh&#x02009;&#x0003D;&#x02009;short; bg&#x02009;&#x0003D;&#x02009;big; lo&#x02009;&#x0003D;&#x02009;long; sm&#x02009;&#x0003D;&#x02009;small. <bold>(B)</bold> Picture of apparatus. <bold>(C)</bold> Percent licking behavior averaged over all recording sessions during trials when a small reward was delayed versus when a small reward was delivered after 0.5&#x02009;s. Licking is aligned to well entry (left) and reward delivery (right). Adapted from Roesch et al. (<xref ref-type="bibr" rid="B115">2006</xref>, <xref ref-type="bibr" rid="B108">2007b</xref>) and Takahashi et al. (<xref ref-type="bibr" rid="B126">2009</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g001.tif"/>
</fig>
<p>At the start of each session, we shifted the rats&#x02019; response bias to the left or to the right by increasing the delay preceding reward delivery in one of the two fluid wells (1&#x02013;7&#x02009;s). During delay blocks, each well yielded one bolus of 10% sucrose solution. After &#x0223C;60&#x02013;80 trials, the response direction associated with the delayed well switched unexpectedly. Thus, the response direction that had been associated with the short delay became long, whereas the response direction associated with the long delay in the first block of trials became short. During delay blocks, the intertrial intervals were normalized so that the lengths of short- and long-delay trials were equal; thus there was no overall benefit to choosing the short delay, but as we will describe, rats did so regardless.</p>
<p>These contingencies continued for &#x0223C;60&#x02013;80 trials, at which time both delays were set to 0.5&#x02009;s and the well that had been associated with the long delay now produced a large reward (two to three boli). These contingencies were again switched in the fourth block of trials. Trial block switches were not cued; thus animals had to detect changes in reward contingencies and update their behavior from block to block.</p>
<p>It is important to emphasize that reward size and delay were varied independently, unlike common delay discounting tasks. Whereas other studies have investigated the neuronal coding of temporally discounted reward in paradigms that have manipulated size and delay simultaneously, our task allows us to dissociate correlates related to size and delay to better understand how each manipulation of value is coded independently from the other. As we will show below, rats prefer or value short over long delays to reward and large over small reward as indicated by choice performance. We felt it was necessary to dissociate size correlates from delay correlates because certain disorders and brain manipulations have been shown to impair size and delay processing independently, sometimes in an opposing manner (Roesch et al., <xref ref-type="bibr" rid="B109">2007c</xref>). Although we do not manipulate the size of the reward along with the length of the delay preceding reward delivery in the traditional sense, any effects on choice behavior and neural firing must be dependent on the delay and reflect how time spent waiting for a reward reduces the value of reward. The depreciation of the reward value due to delay has been referred to as the temporally discounted value of the reward (Kalenscher and Pennartz, <xref ref-type="bibr" rid="B67">2008</xref>).</p>
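The temporally discounted value invoked above is most often formalized with the hyperbolic model V = A/(1 + kD), where A is reward magnitude, D is the delay, and k is a free discounting-rate parameter (Mazur's form, widely used in the discounting literature cited here). A minimal sketch for intuition; the value of k is illustrative and not estimated from these data:

```python
def discounted_value(amount, delay_s, k=0.5):
    """Hyperbolic temporal discounting: V = A / (1 + k * D).

    amount  -- reward magnitude (e.g., number of sucrose boli)
    delay_s -- delay to reward in seconds
    k       -- discounting rate; 0.5 is a hypothetical value for illustration
    """
    return amount / (1.0 + k * delay_s)

# One bolus after 0.5 s retains more subjective value than one bolus after 7 s,
# which is why a rat's choices shift toward the short-delay well.
immediate = discounted_value(1.0, 0.5)   # 0.8
delayed = discounted_value(1.0, 7.0)     # ~0.22
assert immediate > delayed
```

Because the task manipulates D (delay blocks) and A (size blocks) in separate blocks, behavioral and neural effects can be attributed to one term of this expression at a time rather than to their product.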
<p>In each of the studies that we describe below, rats were significantly more accurate and faster on high-value forced-choice trials (large reward and short delay) than on low-value forced-choice trials (small reward and long delay) (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>, <xref ref-type="bibr" rid="B107">2007a</xref>,<xref ref-type="bibr" rid="B108">b</xref>, <xref ref-type="bibr" rid="B114">2009</xref>, <xref ref-type="bibr" rid="B111">2010b</xref>; Takahashi et al., <xref ref-type="bibr" rid="B126">2009</xref>; Calu et al., <xref ref-type="bibr" rid="B18">2010</xref>; Stalnaker et al., <xref ref-type="bibr" rid="B124">2010</xref>; Bryden et al., <xref ref-type="bibr" rid="B15">2011</xref>). On free-choice trials, rats chose high over low value and switched their preference rapidly after block changes. Thus, rats discounted delayed rewards, choosing them less often and working less hard to obtain them. Preference for the immediate over the delayed reward was not significantly different from preference for the large over the small reward.</p>
<p>In addition, behavioral measures have illustrated that delayed rewards were less predictable than more immediate rewards in this task (Takahashi et al., <xref ref-type="bibr" rid="B126">2009</xref>). Even after learning, the rats could not predict the delayed reward with great precision. Licking increased rapidly prior to the small, more immediate reward and showed no change prior to delivery of the delayed small reward (Figure <xref ref-type="fig" rid="F1">1</xref>C; Takahashi et al., <xref ref-type="bibr" rid="B126">2009</xref>). Instead, rats&#x02019; licking behavior increased around 0.5&#x02009;s after well entry on delayed trials (Figure <xref ref-type="fig" rid="F1">1</xref>C) corresponding to the time when delivery of immediate reward would have happened in the preceding block of trials (Figure <xref ref-type="fig" rid="F1">1</xref>C). Thus, rats anticipated delivery of immediate reward, even on long delay trials and, although they knew that the delayed reward would eventually arrive, they could not predict exactly when. Similar findings have been described in primates (Kobayashi and Schultz, <xref ref-type="bibr" rid="B75">2008</xref>).</p>
<p>In this article, we review neural correlates related to performance of this task from several brain areas, with a stronger focus on those areas that have been shown to disrupt behavior on standard delay discounting tasks after lesions, inactivation, or other pharmacological manipulations (Cardinal et al., <xref ref-type="bibr" rid="B22">2004</xref>; Floresco et al., <xref ref-type="bibr" rid="B40">2008</xref>).</p>
</sec>
<sec>
<title>Orbitofrontal Cortex</title>
<p>Impulsive choice in humans has long been attributed to damage of orbitofrontal cortex (OFC), but the role that OFC plays in inter-temporal choice remains unclear. OFC lesions can either decrease or increase discounting, depending on experimental design and lesion location (Mobini et al., <xref ref-type="bibr" rid="B86">2002</xref>; Winstanley et al., <xref ref-type="bibr" rid="B137">2004b</xref>; Rudebeck et al., <xref ref-type="bibr" rid="B116">2006</xref>; Winstanley, <xref ref-type="bibr" rid="B135">2007</xref>; Churchwell et al., <xref ref-type="bibr" rid="B25">2009</xref>; Sellitto et al., <xref ref-type="bibr" rid="B121">2010</xref>; Zeeb et al., <xref ref-type="bibr" rid="B142">2010</xref>; Mar et al., <xref ref-type="bibr" rid="B80">2011</xref>). From these data it is clear that OFC is involved in inter-temporal choice, suggesting that it must carry information related to the length of the delay preceding reward delivery.</p>
<p>To investigate how delay and size were encoded in OFC, we recorded from single neurons while rats performed the task described above (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>). Consistent with previous work, lateral OFC neurons fired in anticipation of delayed reward. As illustrated in Figure <xref ref-type="fig" rid="F2">2</xref>A, many single neurons fired continuously until the delayed reward was delivered, resulting in higher levels of activity for rewards that were delayed (Figure <xref ref-type="fig" rid="F2">2</xref>A; bottom; gray).</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>Orbitofrontal cortex (OFC)</bold>. <bold>(A)</bold> Single cell example of reward expectancy activity. <bold>(B)</bold> Single cell example of a neuron that exhibits reduced activity when rewards are delayed compared to when rewards are delivered immediately (black). Activity is plotted for the last 10 trials in a block in which reward was delivered in the cell&#x02019;s preferred direction after 0.5&#x02009;s (black) followed by trials in which the reward was delayed by 1&#x02013;4&#x02009;s (gray). Each row represents a single trial, each tick mark represents a single action potential and the colored lines indicate when reward was delivered. <bold>(C)</bold> Averaged firing rate of all OFC neurons that fired significantly (<italic>p</italic>&#x02009;&#x0003C;&#x02009;0.05) more strongly during a 1-s period after reward delivery compared to baseline (adapted from Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g002.tif"/>
</fig>
<p>Surprisingly, the majority of OFC neurons in our study did not show this pattern of activity (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>). Most OFC neurons did not maintain firing across the delay as illustrated by the single cell example in Figure <xref ref-type="fig" rid="F2">2</xref>B. Under short delay conditions, this neuron fired in anticipation of and during the delivery of immediate reward (top; black). When the reward was delayed (gray), activity declined until the delayed reward was delivered, and thus, did not bridge the gap between the response and reward as in the previous example (Figure <xref ref-type="fig" rid="F2">2</xref>A). Interestingly, activity seemed to reflect the expectation of reward by continuing to fire when the reward would have been delivered on previous trials (i.e., 0.5&#x02009;s after the response). This old expectancy signal slowly dissipated with learning (Figure <xref ref-type="fig" rid="F2">2</xref>B).</p>
<p>Thus, it appears that although many OFC neurons maintained representations of the reward across the delay, most did not. Overall activity across the population of reward-responsive neurons was stronger during delivery of immediate reward as compared to delayed reward (Figure <xref ref-type="fig" rid="F2">2</xref>C). These changes in firing likely had a profound impact on inter-temporal choice. Indeed, firing of OFC neurons was correlated with the tendency for the rat to choose the short delay on future free-choice trials (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>, <xref ref-type="bibr" rid="B107">2007a</xref>).</p>
<p>We suspect that these two types of signals play very different roles during performance of standard delay discounting paradigms (Roesch et al., <xref ref-type="bibr" rid="B115">2006</xref>, <xref ref-type="bibr" rid="B107">2007a</xref>). Reward expectancy signals that maintain a representation of the delayed delivery of reward (Figure <xref ref-type="fig" rid="F2">2</xref>A) might be critical for facilitating the formation of associative representations in other brain regions during learning. For example, it has been shown that input from OFC is important for rapid changes in cue&#x02013;outcome encoding in basolateral amygdala (Saddoris et al., <xref ref-type="bibr" rid="B117">2005</xref>) and for prediction error signaling by dopamine (DA) neurons in the ventral tegmental area (VTA). Loss of cue&#x02013;outcome encoding in downstream areas after OFC lesions may be due to the loss of expectancy signals generated in OFC (Schoenbaum and Roesch, <xref ref-type="bibr" rid="B118">2005</xref>). If the purpose of expectancy signals in OFC is to maintain a representation of the reward when it is delayed so that downstream areas can develop cue&#x02013;outcome or response&#x02013;outcome associations, then animals with OFC lesions would be less likely to choose those cues or responses when they result in the delayed reward. This interpretation is consistent with reports that lesions of OFC can cause more impulsive responding (Rudebeck et al., <xref ref-type="bibr" rid="B116">2006</xref>).</p>
<p>The majority of OFC neurons fired more strongly for immediate reward (Figure <xref ref-type="fig" rid="F2">2</xref>B). These neurons likely represent when an immediate reward is or is about to be delivered. When the reward is delayed, this expectation of immediate reward is violated and a negative prediction error is generated in downstream areas. Negative prediction error signals would subsequently weaken associations between cues and responses that predict the now delayed reward. These changes would drive behavior away from responses that predict the delayed reward. Elimination of this signal could make animals less likely to abandon the delayed reward as has been shown in previous studies (Winstanley et al., <xref ref-type="bibr" rid="B137">2004b</xref>).</p>
<p>To add to this complexity, a recent study suggests that different regions of OFC might serve opposing functions related to inter-temporal choice (Mar et al., <xref ref-type="bibr" rid="B80">2011</xref>). In this study, Mar and colleagues showed that lesions of medial OFC make rats discount slower, encouraging responding to the larger delayed reward, whereas lateral OFC lesions make rats discount faster, decreasing preference for the larger delayed reward. How does this relate to our data? It suggests that neurons that bridge the gap during the delay preceding reward delivery might be more prominent in lateral OFC. This hypothesis is consistent with human imaging studies showing a positive correlation between OFC activation and preference for delayed reward (McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Hariri et al., <xref ref-type="bibr" rid="B44">2006</xref>; Boettiger et al., <xref ref-type="bibr" rid="B11">2007</xref>; Mar et al., <xref ref-type="bibr" rid="B80">2011</xref>). These data also suggest that neurons that exhibit reduced activity for delayed reward, firing more strongly for immediate reward, might be more prominent in medial OFC. This hypothesis is consistent with human imaging studies showing that activation of medial OFC is positively correlated with preference of more immediate reward (McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Hariri et al., <xref ref-type="bibr" rid="B44">2006</xref>; Mar et al., <xref ref-type="bibr" rid="B80">2011</xref>). Future studies will have to examine whether this theory is true and/or if other signals might be involved in generating the opposing symptoms observed after medial and lateral OFC lesions.</p>
<p>Notably, a number of other prefrontal cortical areas are thought to be involved in processing delayed reward. Most of this work has come from human imaging studies and from studies examining neural activity in monkeys. For example, Kim et al. (<xref ref-type="bibr" rid="B74">2008</xref>) found that single neurons in monkey prefrontal cortex (PFC) were modulated by both the expected size of and delay to reward in a task in which monkeys chose between targets that predicted both magnitude and delay. Human studies have supported these findings and have further suggested that PFC, unlike OFC, might be more critical in evaluating rewards that are more extensively delayed (e.g., months to years; McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Tanaka et al., <xref ref-type="bibr" rid="B127">2004</xref>; Kable and Glimcher, <xref ref-type="bibr" rid="B65">2007</xref>; Ballard and Knutson, <xref ref-type="bibr" rid="B3">2009</xref>; Figner et al., <xref ref-type="bibr" rid="B37">2010</xref>).</p>
</sec>
<sec>
<title>Basolateral Amygdala</title>
<p>Much of the evidence we have described for the general role of OFC in anticipating future events and consequences can also be found in studies of amygdalar function, in particular of the basolateral amygdala (ABL; Jones and Mishkin, <xref ref-type="bibr" rid="B64">1972</xref>; Kesner and Williams, <xref ref-type="bibr" rid="B71">1995</xref>; Hatfield et al., <xref ref-type="bibr" rid="B47">1996</xref>; Malkova et al., <xref ref-type="bibr" rid="B79">1997</xref>; Bechara et al., <xref ref-type="bibr" rid="B5">1999</xref>; Parkinson et al., <xref ref-type="bibr" rid="B102">2001</xref>; Cousens and Otto, <xref ref-type="bibr" rid="B27">2003</xref>; Winstanley et al., <xref ref-type="bibr" rid="B139">2004d</xref>). This is perhaps not surprising given the strong reciprocal connections between OFC and ABL and the role that ABL is proposed to play in associative learning. ABL also appears to play a critical role during inter-temporal choice. Rats with ABL lesions are more impulsive when rewards are delayed, abandoning the delayed reward more quickly than controls (Winstanley et al., <xref ref-type="bibr" rid="B137">2004b</xref>; Cardinal, <xref ref-type="bibr" rid="B19">2006</xref>; Churchwell et al., <xref ref-type="bibr" rid="B25">2009</xref>; Ghods-Sharifi et al., <xref ref-type="bibr" rid="B43">2009</xref>).</p>
<p>As in many studies, activity patterns observed in ABL during performance of our task were similar to those observed in OFC; neurons represented predicted outcomes at the time of cue presentation and in anticipation of reward (Roesch et al., <xref ref-type="bibr" rid="B111">2010b</xref>). The two areas differed in that signals related to reward anticipation and delivery did not appear to be as reduced in ABL as they were in OFC when rewards were delayed. This is evident from a comparison of the population histograms from the two areas (OFC: Figure <xref ref-type="fig" rid="F2">2</xref>C versus ABL: Figure <xref ref-type="fig" rid="F3">3</xref>A). In ABL, unlike in OFC, neurons that fired significantly more strongly for immediate reward did not outnumber those that fired more strongly for delayed reward.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p><bold>Basolateral amygdala (ABL)</bold>. <bold>(A)</bold> Average firing rate for all reward-responsive neurons in ABL on the last 10 trials during immediate (gray) and delayed (black) reward after learning. <bold>(B)</bold> Activity in ABL was correlated with odor port orienting as defined by the speed at which rats initiated trials after house light illumination during the first and last 10 trials in blocks 2&#x02013;4. These data were normalized to the maximum and inverted. Error bars indicate SEM (adapted from Roesch et al., <xref ref-type="bibr" rid="B111">2010b</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g003.tif"/>
</fig>
<p>Another difference between ABL and OFC was that neurons in ABL also fired more strongly when reward was delivered unexpectedly. For example, many ABL neurons fired strongly when the big reward was delivered at the start of blocks 3 and 4 (Figure <xref ref-type="fig" rid="F1">1</xref>A). Although the mainstream view holds that amygdala is important for acquiring and storing associative information (LeDoux, <xref ref-type="bibr" rid="B77">2000</xref>; Murray, <xref ref-type="bibr" rid="B94">2007</xref>), these data and others like them suggest that amygdala may also support other functions related to associative learning, such as detecting the need for increased attention when reward expectations are violated (Gallagher et al., <xref ref-type="bibr" rid="B42">1990</xref>; Holland and Gallagher, <xref ref-type="bibr" rid="B52">1993b</xref>, <xref ref-type="bibr" rid="B54">1999</xref>; Breiter et al., <xref ref-type="bibr" rid="B13">2001</xref>; Yacubian et al., <xref ref-type="bibr" rid="B141">2006</xref>; Belova et al., <xref ref-type="bibr" rid="B7">2007</xref>; Tye et al., <xref ref-type="bibr" rid="B130">2010</xref>). Consistent with this hypothesis, we have shown that activity during unexpected reward delivery and omission was correlated with changes in attention that occur at the start of trial blocks (Figure <xref ref-type="fig" rid="F3">3</xref>B) and that inactivation of ABL makes rats less likely to detect changes in reward contingencies (Roesch et al., <xref ref-type="bibr" rid="B111">2010b</xref>).</p>
<p>Unfortunately, it is still unclear what sustained activity during the delay represents in ABL. Sustained activity in ABL might reflect unexpected omission of reward, signaling to the rat to attend more thoroughly to that location so that new learning might occur. It might also serve to help maintain learned associations and/or to learn new associations when delays are introduced between responses and reward delivery. Consistent with this hypothesis, ABL lesions have been shown to reduce the selectivity of neural firing in OFC and ventral striatum (VS; Schoenbaum et al., <xref ref-type="bibr" rid="B119">2003</xref>; Ambroggi et al., <xref ref-type="bibr" rid="B2">2008</xref>). If ABL&#x02019;s role is to help maintain expectancies or attention across the gap between responding and delivery of delayed reward, then loss of this signal would increase impulsive choice as has been shown by other labs (Winstanley et al., <xref ref-type="bibr" rid="B137">2004b</xref>).</p>
<p>Finally, it is worth noting that other parts of the amygdala might be critical for inter-temporal decision-making. Most prominent is the central nucleus of amygdala (CeA), which is critical for changes in attention or variations in event processing that occur during learning when rewards downshift from high to low value (Holland and Gallagher, <xref ref-type="bibr" rid="B51">1993a</xref>,<xref ref-type="bibr" rid="B53">c</xref>, <xref ref-type="bibr" rid="B55">2006</xref>; Holland and Kenmuir, <xref ref-type="bibr" rid="B56">2005</xref>; Bucci and Macleod, <xref ref-type="bibr" rid="B16">2007</xref>). We have recently shown that downshifts in value, including when rewards are unexpectedly delayed, increase firing in CeA during learning (Calu et al., <xref ref-type="bibr" rid="B18">2010</xref>). Changes in firing were correlated with behavioral measures of attention observed when reward contingencies were violated, which were lost after CeA inactivation (Calu et al., <xref ref-type="bibr" rid="B18">2010</xref>). Surprisingly, inactivation of CeA did not impact temporal choice in our task (Calu et al., <xref ref-type="bibr" rid="B18">2010</xref>). This might reflect control of behavior via detection of unexpected reward delivery which happens concurrently with unexpected reward omission during each block switch. To the best of our knowledge, it is unknown how CeA lesions would impact performance on the standard small-immediate versus large-delayed reward temporal discounting task, but we suspect that rats would be less impulsive.</p>
</sec>
<sec>
<title>Dopamine Neurons in Ventral Tegmental Area</title>
<p>Manipulations of DA can either increase or decrease how much animals discount delayed reward (Cardinal et al., <xref ref-type="bibr" rid="B21">2000</xref>, <xref ref-type="bibr" rid="B22">2004</xref>; Wade et al., <xref ref-type="bibr" rid="B132">2000</xref>; Kheramin et al., <xref ref-type="bibr" rid="B73">2004</xref>; Roesch et al., <xref ref-type="bibr" rid="B109">2007c</xref>); however, few studies have examined how DA neurons respond when rewards are unexpectedly delayed or delivered after long delay (Fiorillo et al., <xref ref-type="bibr" rid="B38">2008</xref>; Kobayashi and Schultz, <xref ref-type="bibr" rid="B75">2008</xref>; Schultz, <xref ref-type="bibr" rid="B120">2010</xref>). As in previous work, unexpected manipulation of reward size in our task impacted firing of DA neurons in VTA. Activity increased when reward was unexpectedly larger (positive prediction error) and decreased when it was unexpectedly smaller (negative prediction error); likewise, cue-evoked activity was higher when the odor predicted large reward and lower when it predicted small reward. Thus, consistent with previous work, the activity of DA neurons appeared to signal errors in reward prediction during the presentation of unconditioned and conditioned stimuli (Mirenowicz and Schultz, <xref ref-type="bibr" rid="B83">1994</xref>; Montague et al., <xref ref-type="bibr" rid="B91">1996</xref>; Hollerman and Schultz, <xref ref-type="bibr" rid="B57">1998a</xref>,<xref ref-type="bibr" rid="B58">b</xref>; Waelti et al., <xref ref-type="bibr" rid="B133">2001</xref>; Fiorillo et al., <xref ref-type="bibr" rid="B39">2003</xref>; Tobler et al., <xref ref-type="bibr" rid="B129">2003</xref>; Nakahara et al., <xref ref-type="bibr" rid="B95">2004</xref>; Bayer and Glimcher, <xref ref-type="bibr" rid="B4">2005</xref>; Pan et al., <xref ref-type="bibr" rid="B101">2005</xref>; Morris et al., <xref ref-type="bibr" rid="B93">2006</xref>).</p>
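<p>The prediction-error account of DA firing described here is commonly formalized as a temporal-difference (TD) error. The sketch below is illustrative only; the reward and value numbers are assumptions, not quantities fitted to these recordings.</p>

```python
def td_error(reward, v_current, v_next, gamma=0.9):
    """Temporal-difference prediction error: delta = r + gamma*V(s') - V(s).

    A positive delta mimics DA bursting to unexpectedly large or early reward;
    a negative delta mimics the pause to unexpectedly small or omitted reward.
    """
    return reward + gamma * v_next - v_current

# Unexpected reward in an unvalued state: positive prediction error (DA burst).
burst = td_error(reward=1.0, v_current=0.0, v_next=0.0)   # +1.0
# Omission of a fully expected reward: negative prediction error (DA pause).
pause = td_error(reward=0.0, v_current=1.0, v_next=0.0)   # -1.0
```

As learning proceeds and V(s) comes to predict the reward, delta at reward delivery shrinks toward zero, matching the transfer of DA responses from reward to predictive cues described in the text.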
<p>DA neurons also signaled errors in reward prediction when rewards were delayed (Roesch et al., <xref ref-type="bibr" rid="B108">2007b</xref>). Delivery of an unexpected immediate reward elicited a strong DA response (Figure <xref ref-type="fig" rid="F4">4</xref>B; immediate reward; red blotch; first 10 trials), which was subsequently replaced by firing to cues that predicted the short delay after learning (Figure <xref ref-type="fig" rid="F4">4</xref>B; last 10 trials). That is, activity was stronger at the end of the block (dashed blue line) than during the first several trials of that same block (solid blue line) just after odor presentation (Figure <xref ref-type="fig" rid="F4">4</xref>C). Overall, population activity was the strongest during cues that predicted the immediate reward (Figures <xref ref-type="fig" rid="F4">4</xref>A&#x02013;C; Roesch et al., <xref ref-type="bibr" rid="B108">2007b</xref>). Moreover, neurons that tended to fire more strongly for immediate reward also fired more strongly for cues that predicted large reward (Figure <xref ref-type="fig" rid="F4">4</xref>E).</p>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p><bold>Dopamine (DA)</bold>. <bold>(A)</bold> Average firing rate of dopamine neurons over forced- and free-choice trials. Color indicates the length of the delay preceding reward delivery from 0.5 to 7&#x02009;s. Activity is aligned on odor onset (left) and well entry (right). <bold>(B,D)</bold> Heat plots showing average activity of all cue/reward-responsive dopamine neurons during the first and last forced-choice trials in the second delay block when rewards are presented earlier <bold>(B)</bold> or later <bold>(D)</bold> than expected. Activity is shown aligned on odor onset and reward delivery. Hotter colors equal higher firing rates. <bold>(C)</bold> Average firing over short and long delay trials aligned on odor onset. Dashed and solid lines represent activity during early and late periods of learning. Gray bar indicates analysis epoch for &#x0201C;E.&#x0201D; <bold>(E)</bold> Cue-evoked activity in reward-responsive dopamine neurons covaries with the delay and size of the predicted reward and its relative value. Comparison of the difference in firing rate on high- versus low-value trials for each cue/reward-responsive DA neuron, calculated separately for &#x0201C;delay&#x0201D; (short minus long) and &#x0201C;reward&#x0201D; blocks (big minus small). Colored dots represent those neurons that showed a significant difference in firing between &#x0201C;high&#x0201D; and &#x0201C;low&#x0201D; conditions (<italic>t</italic>-test; <italic>p</italic>&#x02009;&#x0003C;&#x02009;0.05; Blue: delay; Green: reward; Black: both reward and delay). Data are taken after learning (last 15 trials; adapted from Roesch et al., <xref ref-type="bibr" rid="B108">2007b</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g004.tif"/>
</fig>
<p>When rewards were unexpectedly delayed, DA neurons were inhibited at the time when the reward should have arrived on short delay trials (Figure <xref ref-type="fig" rid="F4">4</xref>D; omitted reward; first 10 trials). Again, this negative prediction error signal transferred to cues that predicted the delayed reward after learning (Figure <xref ref-type="fig" rid="F4">4</xref>D; last 10 trials). That is, cue-related activity was still strong at the start of the block, before the rat realized that the cue no longer signaled short delay. During odor sampling, activity was weakest when the cue signaled the longest delay (Figure <xref ref-type="fig" rid="F4">4</xref>A; 7&#x02009;s; cue onset).</p>
<p>Finally, consistent with delayed rewards being unpredictable (Figure <xref ref-type="fig" rid="F1">1</xref>C), rewards delivered after long delay elicited strong firing (Figure <xref ref-type="fig" rid="F4">4</xref>A; 7&#x02009;s; cue onset and Figure <xref ref-type="fig" rid="F4">4</xref>D; delayed reward). Activity did not, however, increase with each successive increase in delay beyond 2&#x02009;s, likely because rats updated their expectations as the delay period grew second by second. All of these findings are consistent with the notion that activity in midbrain DA signals errors in reward prediction.</p>
<p>Importantly, our results are consistent with work in humans and primates. Human fMRI studies demonstrate that VTA&#x02019;s efferents are active when participants are making decisions related to more immediate reward (McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Tanaka et al., <xref ref-type="bibr" rid="B127">2004</xref>; Kable and Glimcher, <xref ref-type="bibr" rid="B65">2007</xref>; Ballard and Knutson, <xref ref-type="bibr" rid="B3">2009</xref>). Direct recordings from primate DA neurons during performance of a simple Pavlovian task are also consistent with our results (Fiorillo et al., <xref ref-type="bibr" rid="B38">2008</xref>; Kobayashi and Schultz, <xref ref-type="bibr" rid="B75">2008</xref>). As in our study, activity during delivery of delayed reward was positively correlated with the delay preceding it, reflecting the uncertainty or unpredictability of the delayed reward. This might reflect the possibility that longer delays are harder to time (Church and Gibbon, <xref ref-type="bibr" rid="B24">1982</xref>; Kobayashi and Schultz, <xref ref-type="bibr" rid="B75">2008</xref>). Consistent with this interpretation, monkeys could not accurately predict the delivery of the delayed reward, as measured by anticipatory licking (Kobayashi and Schultz, <xref ref-type="bibr" rid="B75">2008</xref>). Also consistent with our work, activity during sampling of cues that predicted reward was discounted by the expected delay. Specifically, the activity of DA neurons resembled the hyperbolic function typical of animal temporal discounting studies, reflecting stronger discounting of delayed reward when delays were relatively short.</p>
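<p>The hyperbolic discounting function referred to above is conventionally written V = A/(1 + kD), where A is reward amount, D is delay, and k is a free discounting parameter. A small sketch (the value of k here is an arbitrary assumption, not one estimated from these data) shows why discounting is steepest over relatively short delays:</p>

```python
def hyperbolic_value(amount, delay, k=0.5):
    """Hyperbolic temporal discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

# With an (assumed) k = 0.5, the value of a unit reward falls much faster
# over the first second of delay than over a later second:
drop_early = hyperbolic_value(1.0, 0) - hyperbolic_value(1.0, 1)  # ~0.33
drop_late = hyperbolic_value(1.0, 6) - hyperbolic_value(1.0, 7)   # ~0.03
```

This steeper early slope is what "stronger discounting of delayed reward when delays were relatively short" captures in the DA responses described above.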
<p>Thus, across species, it is clear that signals related to prediction errors are modulated by cues that predict delayed reward. Such modulation must act on downstream neurons in cortex and basal ganglia to promote and suppress behavior during inter-temporal choice. Prominent in the current literature is the idea that DA transmission ultimately impacts behavioral output by influencing basal ganglia output structures such as SNr via modulation of D1 and D2 type receptors in dorsal striatum (DS; Bromberg-Martin et al., <xref ref-type="bibr" rid="B14">2010</xref>; Hong and Hikosaka, <xref ref-type="bibr" rid="B59">2011</xref>). Indeed, we and others have recently shown that DS and SNr neurons incorporate anticipated delay into their response-selective firing during and prior to the decision to move (Stalnaker et al., <xref ref-type="bibr" rid="B124">2010</xref>; Bryden et al., <xref ref-type="bibr" rid="B15">2011</xref>; Cai et al., <xref ref-type="bibr" rid="B17">2011</xref>).</p>
<p>We propose that bursting of DA neurons to rewards that are delivered earlier than expected and the cues that come to predict them would activate the D1-mediated direct pathway, directing behavior toward the more immediate reward (Bromberg-Martin et al., <xref ref-type="bibr" rid="B14">2010</xref>). Low levels of dopamine, as observed when rewards are unexpectedly delayed, would activate the D2-mediated indirect pathway so that movement toward the well that elicited the delayed reward would be suppressed (Frank, <xref ref-type="bibr" rid="B41">2005</xref>; Bromberg-Martin et al., <xref ref-type="bibr" rid="B14">2010</xref>). Consistent with this hypothesis, it has been shown that high and low DA receptor activation promotes potentiation of the direct and indirect pathway, respectively (Shen et al., <xref ref-type="bibr" rid="B123">2008</xref>), and that striatal D1 receptor blockade selectively impairs movements to rewarded targets, whereas D2 receptor blockade selectively suppresses movements to non-rewarded locations (Nakamura and Hikosaka, <xref ref-type="bibr" rid="B96">2006</xref>).</p>
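<p>The Go/NoGo logic proposed in this paragraph can be caricatured in a few lines. This is a deliberately toy sketch; the threshold and DA levels are purely illustrative assumptions, not a claim about actual receptor dynamics.</p>

```python
def basal_ganglia_gate(da_level, threshold=0.5):
    """Toy gate: high DA preferentially engages the D1-mediated direct
    ('Go') pathway; low DA engages the D2-mediated indirect ('NoGo') pathway.
    The 0.5 threshold is an arbitrary illustrative choice.
    """
    return "Go" if da_level > threshold else "NoGo"

# Burst to a cue predicting an unexpectedly immediate reward -> approach promoted.
go = basal_ganglia_gate(0.9)
# Dip when reward is unexpectedly delayed -> approach suppressed.
nogo = basal_ganglia_gate(0.1)
```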
</sec>
<sec>
<title>Ventral Striatum</title>
<p>Post-training lesions of VS, in particular nucleus accumbens core, induce impulsive choice of small-immediate reward over large-delayed reward (Cousins et al., <xref ref-type="bibr" rid="B28">1996</xref>; Cardinal et al., <xref ref-type="bibr" rid="B20">2001</xref>, <xref ref-type="bibr" rid="B22">2004</xref>; Bezzina et al., <xref ref-type="bibr" rid="B8">2007</xref>; Floresco et al., <xref ref-type="bibr" rid="B40">2008</xref>; Kalenscher and Pennartz, <xref ref-type="bibr" rid="B67">2008</xref>). Although there are several theories about the function of VS, one prominent theory suggests that VS serves as a limbic-motor interface, integrating value information with motor output (Mogenson et al., <xref ref-type="bibr" rid="B89">1980</xref>). Consistent with this notion, several labs have shown that VS incorporates expected value information into its neural firing (Bowman et al., <xref ref-type="bibr" rid="B12">1996</xref>; Hassani et al., <xref ref-type="bibr" rid="B46">2001</xref>; Carelli, <xref ref-type="bibr" rid="B23">2002</xref>; Cromwell and Schultz, <xref ref-type="bibr" rid="B29">2003</xref>; Setlow et al., <xref ref-type="bibr" rid="B122">2003</xref>; Janak et al., <xref ref-type="bibr" rid="B61">2004</xref>; Tanaka et al., <xref ref-type="bibr" rid="B127">2004</xref>; Nicola, <xref ref-type="bibr" rid="B97">2007</xref>; Khamassi et al., <xref ref-type="bibr" rid="B72">2008</xref>; Ito and Doya, <xref ref-type="bibr" rid="B60">2009</xref>; van der Meer and Redish, <xref ref-type="bibr" rid="B131">2009</xref>). Until recently, it was unknown whether VS incorporated expected delay information into this value calculation, possibly serving as a potential source by which representations of delayed reward might impact inter-temporal choice.</p>
<p>We have recently shown that single neurons in VS signal the value of the chosen action during performance of our choice task (Roesch et al., <xref ref-type="bibr" rid="B114">2009</xref>). The majority of cue-responsive neurons in VS fired significantly more strongly when rats anticipated high value reward in one of the two movement directions. This is illustrated in Figures <xref ref-type="fig" rid="F5">5</xref>A&#x02013;D, which plot the average firing rate of all cue-responsive neurons in VS for responses made in each cell&#x02019;s preferred and non-preferred movement fields. Activity was stronger prior to a response in the cell&#x02019;s preferred direction (left column) when the expected outcome was either a short delay (Figure <xref ref-type="fig" rid="F5">5</xref>A; black) or a large reward (Figure <xref ref-type="fig" rid="F5">5</xref>C; black) compared to a long delay (Figure <xref ref-type="fig" rid="F5">5</xref>A; gray) or a small reward (Figure <xref ref-type="fig" rid="F5">5</xref>C; gray), respectively. This activity most likely reflects common changes in motivation because neural firing during this period was correlated with the motivational level of the rat, which was high under short delay and large reward conditions (Figures <xref ref-type="fig" rid="F5">5</xref>E,F; Roesch et al., <xref ref-type="bibr" rid="B114">2009</xref>). This result is consistent with previous work showing that activity in VS is modulated during inter-temporal choice for immediate rewards (McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Kable and Glimcher, <xref ref-type="bibr" rid="B65">2007</xref>; Ballard and Knutson, <xref ref-type="bibr" rid="B3">2009</xref>) suggesting that VS is involved in decisions regarding discounted reward (but see Day et al., <xref ref-type="bibr" rid="B31">2011</xref>). 
We suspect that increased activation of neurons that signal movement during short delay trials might cause animals to choose the more immediate reward over the delayed reward through some sort of winner-take-all mechanism (Pennartz et al., <xref ref-type="bibr" rid="B103">1994</xref>; Redgrave et al., <xref ref-type="bibr" rid="B104">1999</xref>; Nicola, <xref ref-type="bibr" rid="B97">2007</xref>; Taha et al., <xref ref-type="bibr" rid="B125">2007</xref>).</p>
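<p>The winner-take-all readout invoked here can be sketched as a simple comparison of option-specific activations. The activation values below are hypothetical; real competitive dynamics are of course far richer.</p>

```python
def winner_take_all(activations):
    """Return the index of the most strongly activated option,
    a minimal stand-in for competitive (winner-take-all) selection."""
    return max(range(len(activations)), key=lambda i: activations[i])

# Hypothetical VS activations for [short-delay option, long-delay option]:
# stronger drive for the immediate reward wins the competition.
choice = winner_take_all([0.9, 0.4])
```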
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p><bold>Ventral striatum (VS)</bold>. Population activity of odor-responsive neurons reflected motivational value and response direction on forced-choice trials. <bold>(A&#x02013;D)</bold> Curves representing normalized population firing rate during performance of forced-choice trials for the odor-responsive neurons as a function of time under the eight task conditions (high value&#x02009;&#x0003D;&#x02009;black; low value&#x02009;&#x0003D;&#x02009;gray). Data are aligned on odor port exit. Preferred and non-preferred directions are represented in left and right columns, respectively. For each neuron, the direction that yielded the maximal response was designated as preferred. Correlations in the preferred <bold>(E)</bold> and non-preferred <bold>(F)</bold> direction between value indices ((short&#x02009;&#x02212;&#x02009;long)/(short&#x02009;&#x0002B;&#x02009;long) and (big&#x02009;&#x02212;&#x02009;small)/(big&#x02009;&#x0002B;&#x02009;small)) computed for firing rate (during odor sampling) and reaction time (speed at which rats exited the odor port after sampling the odor; adapted from Roesch et al., <xref ref-type="bibr" rid="B114">2009</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g005.tif"/>
</fig>
<p>Others suggest that temporally discounted value signals in VS have less to do with the actual choice &#x02013; which appears to be more reliably encoded in DS &#x02013; and more to do with encoding the sum of the temporally discounted values of the available options, that is, the overall goodness of the situation (Cai et al., <xref ref-type="bibr" rid="B17">2011</xref>). Unlike in our task, monkeys in that study were presented with two options simultaneously on each trial. Each option varied in magnitude and delay, and the location of the better reward varied randomly. The color and number of the cues signaled size and delay, respectively. These contingencies did not change over the course of the experiment.</p>
<p>Not only was activity in VS modulated by the value of the delayed reward in this study, but neurons in VS were also more likely to encode the sum of the temporally discounted values of the two targets than the differences between them or the choice that the monkey was about to make (Cai et al., <xref ref-type="bibr" rid="B17">2011</xref>). Our results are similar in that activity in VS was modulated by size and delay; however, we clearly showed that activity in VS signaled the value and the direction of the chosen option. This difference likely reflects differences in task design. In our task, rats formed response biases to one direction over the other during the course of each block and constantly had to modify their behavior when contingencies changed; thus, response&#x02013;outcome contingencies were very important in our task. Further, we could not assess whether activity in VS represented the overall value of the two options because the overall value of the reward did not change from block to block. This was an important and interesting feature of the monkey task, and it is highly likely that monkeys paid close attention to the overall value associated with each trial before deciding which option to ultimately choose.</p>
<p>Several studies, including ours, have also shown that VS neurons fire in anticipation of the reward (Roesch et al., <xref ref-type="bibr" rid="B114">2009</xref>). This is apparent in Figure <xref ref-type="fig" rid="F5">5</xref>A, which illustrates that activity was higher after the response in the cell&#x02019;s preferred direction on long delay (gray) compared to short delay trials (black). Interestingly, the difference in firing between short and long delay trials after the behavioral response was also correlated with reaction time; however, the direction of this correlation was the opposite of that prior to the movement. That is, slower responding on long delay trials was accompanied, after the choice, by stronger firing during the delay preceding reward delivery. If activity in VS during decision-making reflects motivation, as we have suggested, then activity during this period may reflect the exertion of increased will to remain in the well to receive reward. As described above for OFC and ABL, this expectancy signal might be critical for maintaining responding when rewards become delayed. Loss of this signal would reduce the rat&#x02019;s capacity to maintain motivation during reward delays, as described in other contexts (Cousins et al., <xref ref-type="bibr" rid="B28">1996</xref>; Cardinal et al., <xref ref-type="bibr" rid="B20">2001</xref>, <xref ref-type="bibr" rid="B22">2004</xref>; Bezzina et al., <xref ref-type="bibr" rid="B8">2007</xref>; Floresco et al., <xref ref-type="bibr" rid="B40">2008</xref>; Kalenscher and Pennartz, <xref ref-type="bibr" rid="B67">2008</xref>).</p>
<p>Our data suggest two conflicting roles for VS in delay discounting. We speculate that different training procedures might change the relative contributions of these two functions. For example, if animals were highly trained to reverse behaviors based on discounted reward, as in the recording setting used here, they might be less reliant on VS to maintain the value of the discounted reward. In this situation, the primary effect of VS manipulations might be to reduce impulsive choice of the more immediate reward. On the other hand, maintaining reward information across the delay might be more critical early on during learning, when rats are learning contingencies between responses and their outcomes. Increasing delays between the instrumental response and the reinforcer impairs learning in normal animals, and this impairment is exacerbated after VS lesions (Cardinal et al., <xref ref-type="bibr" rid="B22">2004</xref>).</p>
<p>Besides directly driving behavior, as proposed above, other theories suggest that VS might also be involved in providing expectancy information to downstream areas as part of the &#x0201C;Critic&#x0201D; in the actor&#x02013;critic model (Joel et al., <xref ref-type="bibr" rid="B63">2002</xref>; O&#x02019;Doherty et al., <xref ref-type="bibr" rid="B98">2004</xref>). In this model the Critic learns and stores the values of states, which in turn are used to compute the prediction errors necessary for learning and adaptive behavior. Neural instantiations of this model suggest that it is VS that signals the predicted value of the upcoming decision, which in turn impacts prediction error encoding by dopamine neurons. Subsequently, DA prediction errors modify behavior via connections with the DS (Actor) and update predicted value signals in VS. Thus, signaling of immediate and delayed reward by VS would have a profound impact on reinforcement learning in this circuit as we will discuss below.</p>
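<p>The actor&#x02013;critic arrangement described above (VS as Critic, DS as Actor, DA carrying the prediction error) can be sketched as a minimal tabular model. The learning rates and the single-state task below are illustrative assumptions, not a model fitted to these data.</p>

```python
class ActorCritic:
    """Minimal tabular actor-critic. The Critic (cf. VS) learns state values;
    its TD error (cf. the DA signal) trains both the Critic and the Actor (cf. DS)."""

    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.9):
        self.v = [0.0] * n_states                                  # Critic: state values
        self.pref = [[0.0] * n_actions for _ in range(n_states)]   # Actor: action preferences
        self.alpha, self.gamma = alpha, gamma

    def update(self, state, action, reward, next_state, terminal=False):
        """One learning step; returns the DA-like prediction error."""
        v_next = 0.0 if terminal else self.v[next_state]
        delta = reward + self.gamma * v_next - self.v[state]  # prediction error (DA)
        self.v[state] += self.alpha * delta                   # update Critic (VS)
        self.pref[state][action] += self.alpha * delta        # update Actor (DS)
        return delta
```

Repeatedly rewarding one action drives up both the Critic's state value and the Actor's preference for that action, while the prediction error shrinks toward zero, mirroring the transfer of DA signaling from rewards to predictive cues.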
</sec>
<sec>
<title>Integration of Size and Delay Encoding</title>
<p>Do brain areas integrate size and delay information, providing a context-free representation of value (Montague and Berns, <xref ref-type="bibr" rid="B90">2002</xref>; Kringelbach, <xref ref-type="bibr" rid="B76">2005</xref>; Padoa-Schioppa, <xref ref-type="bibr" rid="B100">2011</xref>)? If this hypothesis is correct, then neural activity that encodes the delay to reward should also be influenced by changes in reward magnitude, either at a single-unit or population level. We found that when delay and reward size were manipulated across different blocks of trials, OFC, ABL, and DS maintained dissociable representations of the value of differently delayed and sized rewards. Even in VS, where the population of neurons fired more strongly to short delay and large reward conditions and activity was correlated with motivational strength, there was only a slight, non-significant tendency for single neurons to represent both size and delay components. Although many neurons did encode reward size and delay length at the single cell level, many neurons encoded one but not the other. This apparent trend toward common encoding likely reflects the integration of value into motor signals at the level of VS, which is farther downstream than areas such as OFC and ABL.</p>
<p>Consistent with this hypothesis, when we recorded from neurons more closely tied to the output of the basal ganglia, we found that activity in SNr showed a significant positive correlation between reward-size and delay effects (Bryden et al., <xref ref-type="bibr" rid="B15">2011</xref>). This is illustrated by the single cell example in Figure <xref ref-type="fig" rid="F6">6</xref>A. Activity was higher for short delay and large reward conditions for movements made into the right well. Unlike in OFC, ABL, and VS, SNr neurons that fired more strongly for cues that predicted short delay (over long delay) also showed a significant tendency to fire more strongly for cues that predicted the large reward (over small reward; Figure <xref ref-type="fig" rid="F6">6</xref>B), similar to what we described for DA neurons in VTA (Figure <xref ref-type="fig" rid="F4">4</xref>E).</p>
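<p>The comparison underlying these correlations rests on per-neuron contrast indices of the form (high &#x02212; low)/(high &#x0002B; low), computed separately for delay and size blocks and then correlated across neurons. A sketch with invented firing rates (the Hz values are purely illustrative, not recorded data):</p>

```python
def value_index(high, low):
    """Contrast index, e.g. (short - long)/(short + long) or (big - small)/(big + small)."""
    return (high - low) / (high + low)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-neuron firing rates (Hz) in (short, long) and (big, small) blocks:
delay_idx = [value_index(s, l) for s, l in [(10, 6), (8, 7), (12, 5)]]
size_idx = [value_index(b, sm) for b, sm in [(11, 5), (9, 8), (13, 6)]]
r = pearson_r(delay_idx, size_idx)  # positive r = common size/delay encoding
```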
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p><bold>Substantia nigra pars reticulata (SNr)</bold>. Activity of single neurons in SNr reflects an interaction between expected value and direction. <bold>(A)</bold> Activity of a single SNr neuron averaged over all trials for each condition aligned on odor port exit during all eight conditions (four rewards&#x02009;&#x000D7;&#x02009;two directions). Histogram represents average activity over the last 10 trials (after learning) for each condition in a block of trials. Each tick mark is an action potential and trials are represented by rows. All trials are shown. <bold>(B)</bold> Correlation between size (big&#x02009;&#x02212;&#x02009;small/big&#x02009;&#x0002B;&#x02009;small) and delay (short&#x02009;&#x02212;&#x02009;long/short&#x02009;&#x0002B;&#x02009;long) effects averaged across direction (odor onset to odor port exit). Data were taken after learning (last 10 trials for each condition within each block; Bryden et al., <xref ref-type="bibr" rid="B15">2011</xref>).</p></caption>
<graphic xlink:href="fnins-05-00130-g006.tif"/>
</fig>
<p>Although these results are consistent with the notion that activity in SNr reflects a common output, even in SNr, correlations between delay and size were relatively weak, leaving open the possibility that SNr might also maintain independent representations of expected size and delay. These data suggest that we have to move very close to the motor system before delay and size are represented as a common signal, and it is not clear whether such representations exist in many regions upstream. According to our data, the majority of brain areas involved in the circuit critical for learning and decision-making based on expected outcomes and violations of those expectations encode delayed reward independently from reward size.</p>
<p>The fact that we were able to dissociate the effects of reward size and delay on single-unit activity in these areas indicates that encoding of discounted reward might involve different neural processes than those that signal expected reward value. This dissociation is perhaps not surprising considering recent behavioral data supporting the view that learning about sensory and temporal features of stimuli involves different underlying systems (Delamater and Oakeshott, <xref ref-type="bibr" rid="B32">2007</xref>) and that studies reporting abnormal delay discounting functions often report no observable change in behaviors guided by reward size.</p>
<p>With that said, other studies have shown that neural activity related to size and delay is correlated in several frontal areas in primate cortex (Roesch and Olson, <xref ref-type="bibr" rid="B112">2005a</xref>,<xref ref-type="bibr" rid="B113">b</xref>; Kim et al., <xref ref-type="bibr" rid="B74">2008</xref>). For example, in primates, OFC neurons that fired more strongly for shorter delays tended to fire more strongly for larger rewards. Our ability to detect independent encoding might reflect a species difference and/or a number of other task parameters; however, we suspect that the differences emerge from different levels of training. With extended training, OFC neurons may become optimized to provide generic value representations. This would have interesting implications, as it would suggest that OFC and possibly other brain areas might refrain from putting delay and size on a common value scale until they have been integrated for an extended time. This might be why single neurons in primate frontal cortex and striatum have been shown to be modulated by both size and delay (Kim et al., <xref ref-type="bibr" rid="B74">2008</xref>; Cai et al., <xref ref-type="bibr" rid="B17">2011</xref>).</p>
<p>Another possibility is that common value signals observed in primates reflect the fact that, over time, short delay trials sometimes led to more reward. That is, since short delay trials took less time to complete, more reward could be obtained over the course of the recording session. Unlike in the rat work, delays were not normalized in some of these studies (Roesch and Olson, <xref ref-type="bibr" rid="B112">2005a</xref>,<xref ref-type="bibr" rid="B113">b</xref>), thus raising the possibility that brain areas might commonly encode size and delay only when shorter delays are genuinely more valuable, not just subjectively preferred. The possibility that these variables might be encoded separately in primates is also consistent with recent work showing that risk is sometimes encoded separately from reward size in primate OFC (Kennerley and Wallis, <xref ref-type="bibr" rid="B70">2009</xref>; Kennerley et al., <xref ref-type="bibr" rid="B69">2009</xref>; O&#x02019;Neill and Schultz, <xref ref-type="bibr" rid="B99">2010</xref>; Schultz, <xref ref-type="bibr" rid="B120">2010</xref>; Wallis and Kennerley, <xref ref-type="bibr" rid="B134">2010</xref>).</p>
<p>A final possibility stems from the fact that we did not vary delay and magnitude simultaneously. True discounting studies manipulate size and delay at the same time to demonstrate the antagonistic effects of reward magnitude and delay. Certainly, Lee and colleagues have found more integrative encoding of value in the brain than we have using this procedure (Kim et al., <xref ref-type="bibr" rid="B74">2008</xref>; Cai et al., <xref ref-type="bibr" rid="B17">2011</xref>). This would suggest that when size and delay are manipulated simultaneously the brain encodes them together, but when they are split apart, they are represented independently. More work is necessary to determine whether this theory holds up. Still, there are other differences between tasks that might impact how the brain encodes these two variables. In our task rats are constantly forming and updating response&#x02013;outcome associations as they learn to bias behavior in one direction when rewards change in size or delay. Independent representations of size and delay might help the brain cope with these changing circumstances.</p>
<p>The fact that size and delay are not strongly correlated in most brain areas that we have tested does not mean that the rat, or other brain areas, cannot treat them similarly. Remarkably, of all the brain areas that we have recorded from in this task, only the firing of DA neurons in VTA showed a strong, clear-cut relationship between manipulations of delay and size (Figure <xref ref-type="fig" rid="F4">4</xref>E). DA neurons fired more strongly to cues that predicted a short delay and large reward and were inhibited by cues that predicted a small reward and a long delay (Figure <xref ref-type="fig" rid="F4">4</xref>E). These were the same neurons in which activity reflected prediction errors during unexpected reward delivery and omission, when reward was made larger or smaller than expected and when reward was delivered earlier or later than expected (Figure <xref ref-type="fig" rid="F4">4</xref>). The fact that the activity of DA neurons represents cues that predict expected size and delay similarly does not fit well with the finding that other areas do not, considering that it is dopaminergic input that is thought to train up associations in these areas. Why and how delay information remains represented separately from value is an intriguing question and requires further investigation.</p>
</sec>
<sec>
<title>Conclusion</title>
<p>Here we speculate on the circuit that drives discounting behavior based on the neural correlates related to size and delay as described above. It is important to remember that much of this is based on neural correlates and we are currently trying to work out the circuit using lesion and inactivation techniques combined with single-unit recordings.</p>
<p>According to our data, when an immediate reward is delivered unexpectedly, DA neurons burst, consistent with a signal that detects errors in reward prediction (Roesch et al., <xref ref-type="bibr" rid="B108">2007b</xref>). ABL neurons also respond to unexpected immediate reward, but several trials later, consistent with signals that detect the need for increased attention or event processing during learning (Roesch et al., <xref ref-type="bibr" rid="B111">2010b</xref>). As the rat learns to anticipate the reward, expectancy signals develop in OFC, ABL, and VS. We suspect that expectancy signals first develop in OFC as a consequence of error detection by DA neurons, and that OFC is critical for the development of expectancy signals in ABL and VS. Although all three areas fire in anticipation of reward, they might carry unique signals related to reward outcome values, attention, and motivation, respectively. As expectancy signals increase, prediction error signaling at the time of reward delivery decreases and DA neurons start to fire to cues that predict the immediate reward (Figure <xref ref-type="fig" rid="F4">4</xref>). Cue-evoked responses that develop in DA neurons subsequently stamp in associations in OFC, VS, and DS. Interactions between ABL and these areas might be particularly important in this process; lesions of ABL impair the development of cue selectivity in OFC and VS (Schoenbaum et al., <xref ref-type="bibr" rid="B119">2003</xref>; Schoenbaum and Roesch, <xref ref-type="bibr" rid="B118">2005</xref>; Ambroggi et al., <xref ref-type="bibr" rid="B2">2008</xref>). It is still unclear whether ABL&#x02013;DA interactions are necessary for cue selectivity to develop in DA neurons themselves and in downstream areas (Roesch et al., <xref ref-type="bibr" rid="B110">2010a</xref>). 
After learning, OFC and VS guide decision-making via reward-specific outcome values and affective/motivational associations, respectively, while DS guides behavior by signaling action&#x02013;value and stimulus&#x02013;response associations (Stalnaker et al., <xref ref-type="bibr" rid="B124">2010</xref>). Positive prediction errors likely impact striatal output via the D1-mediated direct pathway to SNr, promoting movement via disinhibition of downstream motor areas (Bromberg-Martin et al., <xref ref-type="bibr" rid="B14">2010</xref>).</p>
<p>So what happens when rewards are delayed? After learning, there are strong expectancy signals in OFC for the immediate reward. Expectancy activity in OFC for the immediate reward persists even when reward is no longer available at that time (e.g., Figure <xref ref-type="fig" rid="F2">2</xref>B). Thus, when the immediate reward is not delivered, a strong negative prediction error is generated by DA neurons (Figure <xref ref-type="fig" rid="F4">4</xref>B). Inhibition of DA should reduce associability with reward in downstream areas, thus inhibiting responses to cues signaling the location of the delayed reward. Further, attenuated expectancy signals in OFC would reduce the expectancy signals reliant on it, as shown for ABL and possibly for VS, signals that might aid in maintaining responding for the now-delayed reward (Saddoris et al., <xref ref-type="bibr" rid="B117">2005</xref>). Reduction of these signals might further decrease associability with the delayed reward. Subsequently, DA neurons start to inhibit firing to cues that predict the delayed reward, weakening associations in downstream areas such as OFC, ABL, and striatum. Decreased DA transmission in striatum would impact the D2-mediated indirect pathway, inducing increases in SNr firing that suppress movement by inhibiting downstream motor structures (Bromberg-Martin et al., <xref ref-type="bibr" rid="B14">2010</xref>).</p>
<p>It is important to note that these are not the only brain areas involved in temporal discounting and inter-temporal choice. Serotonin clearly plays a role, but exactly what role remains unclear. Serotonin depletion sometimes, but not always, steepens the discounting of delayed rewards, making animals more impulsive (Wogar et al., <xref ref-type="bibr" rid="B140">1993</xref>; Harrison et al., <xref ref-type="bibr" rid="B45">1997</xref>; Bizot et al., <xref ref-type="bibr" rid="B9">1999</xref>; Evenden and Ryan, <xref ref-type="bibr" rid="B36">1999</xref>; Mobini et al., <xref ref-type="bibr" rid="B87">2000a</xref>,<xref ref-type="bibr" rid="B88">b</xref>; Cardinal et al., <xref ref-type="bibr" rid="B22">2004</xref>; Winstanley et al., <xref ref-type="bibr" rid="B136">2004a</xref>,<xref ref-type="bibr" rid="B138">c</xref>; Denk et al., <xref ref-type="bibr" rid="B33">2005</xref>; Cardinal, <xref ref-type="bibr" rid="B19">2006</xref>), and increased extracellular serotonin concentrations promote selection of large delayed rewards over smaller immediate rewards (Bizot et al., <xref ref-type="bibr" rid="B10">1988</xref>, <xref ref-type="bibr" rid="B9">1999</xref>). Furthermore, recent data demonstrate that serotonin efflux in rat dorsal raphe nucleus increases when animals have to wait for reward, and that single dorsal raphe neurons fire in anticipation of delayed reward (Miyazaki et al., <xref ref-type="bibr" rid="B84">2011a</xref>,<xref ref-type="bibr" rid="B85">b</xref>).</p>
<p>Work in humans has also clearly defined a role for PFC and other cortical areas in inter-temporal choice, especially when decisions have to be made about rewards that will arrive in the distant future (e.g., months to years; McClure et al., <xref ref-type="bibr" rid="B82">2004</xref>, <xref ref-type="bibr" rid="B81">2007</xref>; Tanaka et al., <xref ref-type="bibr" rid="B127">2004</xref>; Kable and Glimcher, <xref ref-type="bibr" rid="B65">2007</xref>; Ballard and Knutson, <xref ref-type="bibr" rid="B3">2009</xref>; Figner et al., <xref ref-type="bibr" rid="B37">2010</xref>). These systems likely interact on several levels to control behavior when expected rewards are delayed.</p>
<p>In conclusion, it is clear that discounting behavior is complicated and engages a number of neural systems. From the results described above, when an individual chooses between an immediate and a delayed reward, the decision ultimately depends on previous experience with the delayed reward and the impact that a delayed reward has on neural processes related to reward expectation, prediction error encoding, attention, motivation, and the development of associations between stimuli, responses, and outcome values. To elucidate the underlying causes of the many disorders that impact impulsivity, we must determine which of these processes are impaired and further test the circuit involved in inter-temporal choice.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>This article was supported by grants from the NIDA (K01DA021609, Matthew R. Roesch; R01-DA031695, Matthew R. Roesch).</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ainslie</surname> <given-names>G. W.</given-names></name></person-group> (<year>1974</year>). <article-title>Impulse control in pigeons</article-title>. <source>J. Exp. Anal. Behav.</source> <volume>21</volume>, <fpage>485</fpage>&#x02013;<lpage>489</lpage>.<pub-id pub-id-type="doi">10.1901/jeab.1974.21-485</pub-id><pub-id pub-id-type="pmid">16811760</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ambroggi</surname> <given-names>F.</given-names></name> <name><surname>Ishikawa</surname> <given-names>A.</given-names></name> <name><surname>Fields</surname> <given-names>H. L.</given-names></name> <name><surname>Nicola</surname> <given-names>S. M.</given-names></name></person-group> (<year>2008</year>). <article-title>Basolateral amygdala neurons facilitate reward-seeking behavior by exciting nucleus accumbens neurons</article-title>. <source>Neuron</source> <volume>59</volume>, <fpage>648</fpage>&#x02013;<lpage>661</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2008.07.004</pub-id><pub-id pub-id-type="pmid">18760700</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ballard</surname> <given-names>K.</given-names></name> <name><surname>Knutson</surname> <given-names>B.</given-names></name></person-group> (<year>2009</year>). <article-title>Dissociable neural representations of future reward magnitude and delay during temporal discounting</article-title>. <source>Neuroimage</source> <volume>45</volume>, <fpage>143</fpage>&#x02013;<lpage>150</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuroimage.2008.10.052</pub-id><pub-id pub-id-type="pmid">19071223</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bayer</surname> <given-names>H. M.</given-names></name> <name><surname>Glimcher</surname> <given-names>P. W.</given-names></name></person-group> (<year>2005</year>). <article-title>Midbrain dopamine neurons encode a quantitative reward prediction error signal</article-title>. <source>Neuron</source> <volume>47</volume>, <fpage>129</fpage>&#x02013;<lpage>141</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2005.05.020</pub-id><pub-id pub-id-type="pmid">15996553</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bechara</surname> <given-names>A.</given-names></name> <name><surname>Damasio</surname> <given-names>H.</given-names></name> <name><surname>Damasio</surname> <given-names>A. R.</given-names></name> <name><surname>Lee</surname> <given-names>G. P.</given-names></name></person-group> (<year>1999</year>). <article-title>Different contributions of the human amygdala and ventromedial prefrontal cortex to decision-making</article-title>. <source>J. Neurosci.</source> <volume>19</volume>, <fpage>5473</fpage>&#x02013;<lpage>5481</lpage>.<pub-id pub-id-type="pmid">10377356</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bechara</surname> <given-names>A.</given-names></name> <name><surname>Dolan</surname> <given-names>S.</given-names></name> <name><surname>Hindes</surname> <given-names>A.</given-names></name></person-group> (<year>2002</year>). <article-title>Decision-making and addiction (part II): myopia for the future or hypersensitivity to reward?</article-title> <source>Neuropsychologia</source> <volume>40</volume>, <fpage>1690</fpage>&#x02013;<lpage>1705</lpage>.<pub-id pub-id-type="doi">10.1016/S0028-3932(02)00015-5</pub-id><pub-id pub-id-type="pmid">11992657</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Belova</surname> <given-names>M. A.</given-names></name> <name><surname>Paton</surname> <given-names>J. J.</given-names></name> <name><surname>Morrison</surname> <given-names>S. E.</given-names></name> <name><surname>Salzman</surname> <given-names>C. D.</given-names></name></person-group> (<year>2007</year>). <article-title>Expectation modulates neural responses to pleasant and aversive stimuli in primate amygdala</article-title>. <source>Neuron</source> <volume>55</volume>, <fpage>970</fpage>&#x02013;<lpage>984</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2007.08.004</pub-id><pub-id pub-id-type="pmid">17880899</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bezzina</surname> <given-names>G.</given-names></name> <name><surname>Cheung</surname> <given-names>T. H.</given-names></name> <name><surname>Asgari</surname> <given-names>K.</given-names></name> <name><surname>Hampson</surname> <given-names>C. L.</given-names></name> <name><surname>Body</surname> <given-names>S.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name> <name><surname>Deakin</surname> <given-names>J. F.</given-names></name> <name><surname>Anderson</surname> <given-names>I. M.</given-names></name></person-group> (<year>2007</year>). <article-title>Effects of quinolinic acid-induced lesions of the nucleus accumbens core on inter-temporal choice: a quantitative analysis</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>195</volume>, <fpage>71</fpage>&#x02013;<lpage>84</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-007-0882-0</pub-id><pub-id pub-id-type="pmid">17659381</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bizot</surname> <given-names>J.</given-names></name> <name><surname>Le Bihan</surname> <given-names>C.</given-names></name> <name><surname>Puech</surname> <given-names>A. J.</given-names></name> <name><surname>Hamon</surname> <given-names>M.</given-names></name> <name><surname>Thiebot</surname> <given-names>M.</given-names></name></person-group> (<year>1999</year>). <article-title>Serotonin and tolerance to delay of reward in rats</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>146</volume>, <fpage>400</fpage>&#x02013;<lpage>412</lpage>.<pub-id pub-id-type="doi">10.1007/PL00005485</pub-id><pub-id pub-id-type="pmid">10550490</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bizot</surname> <given-names>J. C.</given-names></name> <name><surname>Thiebot</surname> <given-names>M. H.</given-names></name> <name><surname>Le Bihan</surname> <given-names>C.</given-names></name> <name><surname>Soubrie</surname> <given-names>P.</given-names></name> <name><surname>Simon</surname> <given-names>P.</given-names></name></person-group> (<year>1988</year>). <article-title>Effects of imipramine-like drugs and serotonin uptake blockers on delay of reward in rats. Possible implication in the behavioral mechanism of action of antidepressants</article-title>. <source>J. Pharmacol. Exp. Ther.</source> <volume>246</volume>, <fpage>1144</fpage>&#x02013;<lpage>1151</lpage>.<pub-id pub-id-type="pmid">3418513</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Boettiger</surname> <given-names>C. A.</given-names></name> <name><surname>Mitchell</surname> <given-names>J. M.</given-names></name> <name><surname>Tavares</surname> <given-names>V. C.</given-names></name> <name><surname>Robertson</surname> <given-names>M.</given-names></name> <name><surname>Joslyn</surname> <given-names>G.</given-names></name> <name><surname>D&#x02019;Esposito</surname> <given-names>M.</given-names></name> <name><surname>Fields</surname> <given-names>H. L.</given-names></name></person-group> (<year>2007</year>). <article-title>Immediate reward bias in humans: fronto-parietal networks and a role for the catechol-<italic>O</italic>-methyltransferase 158(Val/Val) genotype</article-title>. <source>J. Neurosci.</source> <volume>27</volume>, <fpage>14383</fpage>&#x02013;<lpage>14391</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.2551-07.2007</pub-id><pub-id pub-id-type="pmid">18160646</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bowman</surname> <given-names>E. M.</given-names></name> <name><surname>Aigner</surname> <given-names>T. G.</given-names></name> <name><surname>Richmond</surname> <given-names>B. J.</given-names></name></person-group> (<year>1996</year>). <article-title>Neural signals in the monkey ventral striatum related to motivation for juice and cocaine rewards</article-title>. <source>J. Neurophysiol.</source> <volume>75</volume>, <fpage>1061</fpage>&#x02013;<lpage>1073</lpage>.<pub-id pub-id-type="pmid">8867118</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Breiter</surname> <given-names>H. C.</given-names></name> <name><surname>Aharon</surname> <given-names>I.</given-names></name> <name><surname>Kahneman</surname> <given-names>D.</given-names></name> <name><surname>Dale</surname> <given-names>A.</given-names></name> <name><surname>Shizgal</surname> <given-names>P.</given-names></name></person-group> (<year>2001</year>). <article-title>Functional imaging of neural responses to expectancy and experience of monetary gains and losses</article-title>. <source>Neuron</source> <volume>30</volume>, <fpage>619</fpage>&#x02013;<lpage>639</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(01)00303-8</pub-id><pub-id pub-id-type="pmid">11395019</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bromberg-Martin</surname> <given-names>E. S.</given-names></name> <name><surname>Matsumoto</surname> <given-names>M.</given-names></name> <name><surname>Hikosaka</surname> <given-names>O.</given-names></name></person-group> (<year>2010</year>). <article-title>Dopamine in motivational control: rewarding, aversive, and alerting</article-title>. <source>Neuron</source> <volume>68</volume>, <fpage>815</fpage>&#x02013;<lpage>834</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2010.11.022</pub-id><pub-id pub-id-type="pmid">21144997</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bryden</surname> <given-names>D. W.</given-names></name> <name><surname>Johnson</surname> <given-names>E. E.</given-names></name> <name><surname>Diao</surname> <given-names>X.</given-names></name> <name><surname>Roesch</surname> <given-names>M. R.</given-names></name></person-group> (<year>2011</year>). <article-title>Impact of expected value on neural activity in rat substantia nigra pars reticulata</article-title>. <source>Eur. J. Neurosci.</source> <volume>33</volume>, <fpage>2308</fpage>&#x02013;<lpage>2317</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2011.07705.x</pub-id><pub-id pub-id-type="pmid">21645133</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bucci</surname> <given-names>D. J.</given-names></name> <name><surname>Macleod</surname> <given-names>J. E.</given-names></name></person-group> (<year>2007</year>). <article-title>Changes in neural activity associated with a surprising change in the predictive validity of a conditioned stimulus</article-title>. <source>Eur. J. Neurosci.</source> <volume>26</volume>, <fpage>2669</fpage>&#x02013;<lpage>2676</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2007.05902.x</pub-id><pub-id pub-id-type="pmid">17970737</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>X.</given-names></name> <name><surname>Kim</surname> <given-names>S.</given-names></name> <name><surname>Lee</surname> <given-names>D.</given-names></name></person-group> (<year>2011</year>). <article-title>Heterogeneous coding of temporally discounted values in the dorsal and ventral striatum during intertemporal choice</article-title>. <source>Neuron</source> <volume>69</volume>, <fpage>170</fpage>&#x02013;<lpage>182</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2010.11.041</pub-id><pub-id pub-id-type="pmid">21220107</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Haney</surname> <given-names>R. Z.</given-names></name> <name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural correlates of variations in event processing during learning in central nucleus of amygdala</article-title>. <source>Neuron</source> <volume>68</volume>, <fpage>991</fpage>&#x02013;<lpage>1001</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2010.11.019</pub-id><pub-id pub-id-type="pmid">21145010</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cardinal</surname> <given-names>R. N.</given-names></name></person-group> (<year>2006</year>). <article-title>Neural systems implicated in delayed and probabilistic reinforcement</article-title>. <source>Neural. Netw.</source> <volume>19</volume>, <fpage>1277</fpage>&#x02013;<lpage>1301</lpage>.<pub-id pub-id-type="doi">10.1016/j.neunet.2006.03.004</pub-id><pub-id pub-id-type="pmid">16938431</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cardinal</surname> <given-names>R. N.</given-names></name> <name><surname>Pennicott</surname> <given-names>D. R.</given-names></name> <name><surname>Sugathapala</surname> <given-names>C. L.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name> <name><surname>Everitt</surname> <given-names>B. J.</given-names></name></person-group> (<year>2001</year>). <article-title>Impulsive choice induced in rats by lesions of the nucleus accumbens core</article-title>. <source>Science</source> <volume>292</volume>, <fpage>2499</fpage>&#x02013;<lpage>2501</lpage>.<pub-id pub-id-type="doi">10.1126/science.1060818</pub-id><pub-id pub-id-type="pmid">11375482</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cardinal</surname> <given-names>R. N.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name> <name><surname>Everitt</surname> <given-names>B. J.</given-names></name></person-group> (<year>2000</year>). <article-title>The effects of d-amphetamine, chlordiazepoxide, alpha-flupentixol and behavioural manipulations on choice of signalled and unsignalled delayed reinforcement in rats</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>152</volume>, <fpage>362</fpage>&#x02013;<lpage>375</lpage>.<pub-id pub-id-type="doi">10.1007/s002130000536</pub-id><pub-id pub-id-type="pmid">11140328</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cardinal</surname> <given-names>R. N.</given-names></name> <name><surname>Winstanley</surname> <given-names>C. A.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name> <name><surname>Everitt</surname> <given-names>B. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Limbic corticostriatal systems and delayed reinforcement</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1021</volume>, <fpage>33</fpage>&#x02013;<lpage>50</lpage>.<pub-id pub-id-type="doi">10.1196/annals.1308.004</pub-id><pub-id pub-id-type="pmid">15251872</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carelli</surname> <given-names>R. M.</given-names></name></person-group> (<year>2002</year>). <article-title>Nucleus accumbens cell firing during goal-directed behaviors for cocaine vs &#x02018;natural&#x02019; reinforcement</article-title>. <source>Physiol. Behav.</source> <volume>76</volume>, <fpage>379</fpage>&#x02013;<lpage>387</lpage>.<pub-id pub-id-type="doi">10.1016/S0031-9384(02)00760-6</pub-id><pub-id pub-id-type="pmid">12117574</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Church</surname> <given-names>R. M.</given-names></name> <name><surname>Gibbon</surname> <given-names>J.</given-names></name></person-group> (<year>1982</year>). <article-title>Temporal generalization</article-title>. <source>J. Exp. Psychol. Anim. Behav. Process.</source> <volume>8</volume>, <fpage>165</fpage>&#x02013;<lpage>186</lpage>.<pub-id pub-id-type="doi">10.1037/0097-7403.8.2.165</pub-id><pub-id pub-id-type="pmid">7069377</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Churchwell</surname> <given-names>J. C.</given-names></name> <name><surname>Morris</surname> <given-names>A. M.</given-names></name> <name><surname>Heurtelou</surname> <given-names>N. M.</given-names></name> <name><surname>Kesner</surname> <given-names>R. P.</given-names></name></person-group> (<year>2009</year>). <article-title>Interactions between the prefrontal cortex and amygdala during delay discounting and reversal</article-title>. <source>Behav. Neurosci.</source> <volume>123</volume>, <fpage>1185</fpage>&#x02013;<lpage>1196</lpage>.<pub-id pub-id-type="doi">10.1037/a0017734</pub-id><pub-id pub-id-type="pmid">20001103</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Coffey</surname> <given-names>S. F.</given-names></name> <name><surname>Gudleski</surname> <given-names>G. D.</given-names></name> <name><surname>Saladin</surname> <given-names>M. E.</given-names></name> <name><surname>Brady</surname> <given-names>K. T.</given-names></name></person-group> (<year>2003</year>). <article-title>Impulsivity and rapid discounting of delayed hypothetical rewards in cocaine-dependent individuals</article-title>. <source>Exp. Clin. Psychopharmacol.</source> <volume>11</volume>, <fpage>18</fpage>&#x02013;<lpage>25</lpage>.<pub-id pub-id-type="doi">10.1037/1064-1297.11.1.18</pub-id><pub-id pub-id-type="pmid">12622340</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cousens</surname> <given-names>G. A.</given-names></name> <name><surname>Otto</surname> <given-names>T.</given-names></name></person-group> (<year>2003</year>). <article-title>Neural substrates of olfactory discrimination learning with auditory secondary reinforcement. I. Contributions of the basolateral amygdaloid complex and orbitofrontal cortex</article-title>. <source>Int. Physiol. Behav. Sci.</source> <volume>38</volume>, <fpage>272</fpage>&#x02013;<lpage>294</lpage>.<pub-id pub-id-type="doi">10.1007/BF02688858</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cousins</surname> <given-names>M. S.</given-names></name> <name><surname>Atherton</surname> <given-names>A.</given-names></name> <name><surname>Turner</surname> <given-names>L.</given-names></name> <name><surname>Salamone</surname> <given-names>J. D.</given-names></name></person-group> (<year>1996</year>). <article-title>Nucleus accumbens dopamine depletions alter relative response allocation in a T-maze cost/benefit task</article-title>. <source>Behav. Brain Res.</source> <volume>74</volume>, <fpage>189</fpage>&#x02013;<lpage>197</lpage>.<pub-id pub-id-type="doi">10.1016/0166-4328(95)00151-4</pub-id><pub-id pub-id-type="pmid">8851929</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cromwell</surname> <given-names>H. C.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2003</year>). <article-title>Effects of expectations for different reward magnitudes on neuronal activity in primate striatum</article-title>. <source>J. Neurophysiol.</source> <volume>89</volume>, <fpage>2823</fpage>&#x02013;<lpage>2838</lpage>.<pub-id pub-id-type="doi">10.1152/jn.01014.2002</pub-id><pub-id pub-id-type="pmid">12611937</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dalley</surname> <given-names>J. W.</given-names></name> <name><surname>Mar</surname> <given-names>A. C.</given-names></name> <name><surname>Economidou</surname> <given-names>D.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2008</year>). <article-title>Neurobehavioral mechanisms of impulsivity: fronto-striatal systems and functional neurochemistry</article-title>. <source>Pharmacol. Biochem. Behav.</source> <volume>90</volume>, <fpage>250</fpage>&#x02013;<lpage>260</lpage>.<pub-id pub-id-type="doi">10.1016/j.pbb.2007.12.021</pub-id><pub-id pub-id-type="pmid">18272211</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Day</surname> <given-names>J. J.</given-names></name> <name><surname>Jones</surname> <given-names>J. L.</given-names></name> <name><surname>Carelli</surname> <given-names>R. M.</given-names></name></person-group> (<year>2011</year>). <article-title>Nucleus accumbens neurons encode predicted and ongoing reward costs in rats</article-title>. <source>Eur. J. Neurosci.</source> <volume>33</volume>, <fpage>308</fpage>&#x02013;<lpage>321</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2010.07531.x</pub-id><pub-id pub-id-type="pmid">21198983</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Delamater</surname> <given-names>A. R.</given-names></name> <name><surname>Oakeshott</surname> <given-names>S.</given-names></name></person-group> (<year>2007</year>). <article-title>Learning about multiple attributes of reward in Pavlovian conditioning</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1104</volume>, <fpage>1</fpage>&#x02013;<lpage>20</lpage>.<pub-id pub-id-type="doi">10.1196/annals.1390.008</pub-id><pub-id pub-id-type="pmid">17344542</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Denk</surname> <given-names>F.</given-names></name> <name><surname>Walton</surname> <given-names>M. E.</given-names></name> <name><surname>Jennings</surname> <given-names>K. A.</given-names></name> <name><surname>Sharp</surname> <given-names>T.</given-names></name> <name><surname>Rushworth</surname> <given-names>M. F.</given-names></name> <name><surname>Bannerman</surname> <given-names>D. M.</given-names></name></person-group> (<year>2005</year>). <article-title>Differential involvement of serotonin and dopamine systems in cost-benefit decisions about delay or effort</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>179</volume>, <fpage>587</fpage>&#x02013;<lpage>596</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-004-2059-4</pub-id><pub-id pub-id-type="pmid">15864561</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ernst</surname> <given-names>M.</given-names></name> <name><surname>Zametkin</surname> <given-names>A. J.</given-names></name> <name><surname>Matochik</surname> <given-names>J. A.</given-names></name> <name><surname>Jons</surname> <given-names>P. H.</given-names></name> <name><surname>Cohen</surname> <given-names>R. M.</given-names></name></person-group> (<year>1998</year>). <article-title>DOPA decarboxylase activity in attention deficit hyperactivity disorder adults. A [fluorine-18]fluorodopa positron emission tomographic study</article-title>. <source>J. Neurosci.</source> <volume>18</volume>, <fpage>5901</fpage>&#x02013;<lpage>5907</lpage>.<pub-id pub-id-type="pmid">9671677</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Evenden</surname> <given-names>J. L.</given-names></name> <name><surname>Ryan</surname> <given-names>C. N.</given-names></name></person-group> (<year>1996</year>). <article-title>The pharmacology of impulsive behaviour in rats: the effects of drugs on response choice with varying delays of reinforcement</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>128</volume>, <fpage>161</fpage>&#x02013;<lpage>170</lpage>.<pub-id pub-id-type="doi">10.1007/s002130050121</pub-id><pub-id pub-id-type="pmid">8956377</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Evenden</surname> <given-names>J. L.</given-names></name> <name><surname>Ryan</surname> <given-names>C. N.</given-names></name></person-group> (<year>1999</year>). <article-title>The pharmacology of impulsive behaviour in rats VI: the effects of ethanol and selective serotonergic drugs on response choice with varying delays of reinforcement</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>146</volume>, <fpage>413</fpage>&#x02013;<lpage>421</lpage>.<pub-id pub-id-type="doi">10.1007/PL00005481</pub-id><pub-id pub-id-type="pmid">10550491</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Figner</surname> <given-names>B.</given-names></name> <name><surname>Knoch</surname> <given-names>D.</given-names></name> <name><surname>Johnson</surname> <given-names>E. J.</given-names></name> <name><surname>Krosch</surname> <given-names>A. R.</given-names></name> <name><surname>Lisanby</surname> <given-names>S. H.</given-names></name> <name><surname>Fehr</surname> <given-names>E.</given-names></name> <name><surname>Weber</surname> <given-names>E. U.</given-names></name></person-group> (<year>2010</year>). <article-title>Lateral prefrontal cortex and self-control in intertemporal choice</article-title>. <source>Nat. Neurosci.</source> <volume>13</volume>, <fpage>538</fpage>&#x02013;<lpage>539</lpage>.<pub-id pub-id-type="doi">10.1038/nn.2516</pub-id><pub-id pub-id-type="pmid">20348919</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fiorillo</surname> <given-names>C. D.</given-names></name> <name><surname>Newsome</surname> <given-names>W. T.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2008</year>). <article-title>The temporal precision of reward prediction in dopamine neurons</article-title>. <source>Nat. Neurosci.</source> <volume>11</volume>, <fpage>966</fpage>&#x02013;<lpage>973</lpage>.<pub-id pub-id-type="doi">10.1038/nn.2159</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fiorillo</surname> <given-names>C. D.</given-names></name> <name><surname>Tobler</surname> <given-names>P. N.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2003</year>). <article-title>Discrete coding of reward probability and uncertainty by dopamine neurons</article-title>. <source>Science</source> <volume>299</volume>, <fpage>1898</fpage>&#x02013;<lpage>1902</lpage>.<pub-id pub-id-type="doi">10.1126/science.1077349</pub-id><pub-id pub-id-type="pmid">12649473</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Floresco</surname> <given-names>S. B.</given-names></name> <name><surname>St Onge</surname> <given-names>J. R.</given-names></name> <name><surname>Ghods-Sharifi</surname> <given-names>S.</given-names></name> <name><surname>Winstanley</surname> <given-names>C. A.</given-names></name></person-group> (<year>2008</year>). <article-title>Cortico-limbic-striatal circuits subserving different forms of cost-benefit decision making</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>8</volume>, <fpage>375</fpage>&#x02013;<lpage>389</lpage>.<pub-id pub-id-type="doi">10.3758/CABN.8.4.375</pub-id><pub-id pub-id-type="pmid">19033236</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frank</surname> <given-names>M. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Dynamic dopamine modulation in the basal ganglia: a neurocomputational account of cognitive deficits in medicated and nonmedicated Parkinsonism</article-title>. <source>J. Cogn. Neurosci.</source> <volume>17</volume>, <fpage>51</fpage>&#x02013;<lpage>72</lpage>.<pub-id pub-id-type="doi">10.1162/0898929052880093</pub-id><pub-id pub-id-type="pmid">15701239</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gallagher</surname> <given-names>M.</given-names></name> <name><surname>Graham</surname> <given-names>P. W.</given-names></name> <name><surname>Holland</surname> <given-names>P. C.</given-names></name></person-group> (<year>1990</year>). <article-title>The amygdala central nucleus and appetitive Pavlovian conditioning: lesions impair one class of conditioned behavior</article-title>. <source>J. Neurosci.</source> <volume>10</volume>, <fpage>1906</fpage>&#x02013;<lpage>1911</lpage>.<pub-id pub-id-type="pmid">2355257</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ghods-Sharifi</surname> <given-names>S.</given-names></name> <name><surname>St Onge</surname> <given-names>J. R.</given-names></name> <name><surname>Floresco</surname> <given-names>S. B.</given-names></name></person-group> (<year>2009</year>). <article-title>Fundamental contribution by the basolateral amygdala to different forms of decision making</article-title>. <source>J. Neurosci.</source> <volume>29</volume>, <fpage>5251</fpage>&#x02013;<lpage>5259</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.0315-09.2009</pub-id><pub-id pub-id-type="pmid">19386921</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hariri</surname> <given-names>A. R.</given-names></name> <name><surname>Brown</surname> <given-names>S. M.</given-names></name> <name><surname>Williamson</surname> <given-names>D. E.</given-names></name> <name><surname>Flory</surname> <given-names>J. D.</given-names></name> <name><surname>de Wit</surname> <given-names>H.</given-names></name> <name><surname>Manuck</surname> <given-names>S. B.</given-names></name></person-group> (<year>2006</year>). <article-title>Preference for immediate over delayed rewards is associated with magnitude of ventral striatal activity</article-title>. <source>J. Neurosci.</source> <volume>26</volume>, <fpage>13213</fpage>&#x02013;<lpage>13217</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.3446-06.2006</pub-id><pub-id pub-id-type="pmid">17182771</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harrison</surname> <given-names>A. A.</given-names></name> <name><surname>Everitt</surname> <given-names>B. J.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>1997</year>). <article-title>Central 5-HT depletion enhances impulsive responding without affecting the accuracy of attentional performance: interactions with dopaminergic mechanisms</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>133</volume>, <fpage>329</fpage>&#x02013;<lpage>342</lpage>.<pub-id pub-id-type="doi">10.1007/s002130050410</pub-id><pub-id pub-id-type="pmid">9372531</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hassani</surname> <given-names>O. K.</given-names></name> <name><surname>Cromwell</surname> <given-names>H. C.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2001</year>). <article-title>Influence of expectation of different rewards on behavior-related neuronal activity in the striatum</article-title>. <source>J. Neurophysiol.</source> <volume>85</volume>, <fpage>2477</fpage>&#x02013;<lpage>2489</lpage>.<pub-id pub-id-type="pmid">11387394</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hatfield</surname> <given-names>T.</given-names></name> <name><surname>Han</surname> <given-names>J. S.</given-names></name> <name><surname>Conley</surname> <given-names>M.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name> <name><surname>Holland</surname> <given-names>P.</given-names></name></person-group> (<year>1996</year>). <article-title>Neurotoxic lesions of basolateral, but not central, amygdala interfere with Pavlovian second-order conditioning and reinforcer devaluation effects</article-title>. <source>J. Neurosci.</source> <volume>16</volume>, <fpage>5256</fpage>&#x02013;<lpage>5265</lpage>.<pub-id pub-id-type="pmid">8756453</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Heerey</surname> <given-names>E. A.</given-names></name> <name><surname>Robinson</surname> <given-names>B. M.</given-names></name> <name><surname>McMahon</surname> <given-names>R. P.</given-names></name> <name><surname>Gold</surname> <given-names>J. M.</given-names></name></person-group> (<year>2007</year>). <article-title>Delay discounting in schizophrenia</article-title>. <source>Cogn. Neuropsychiatry</source> <volume>12</volume>, <fpage>213</fpage>&#x02013;<lpage>221</lpage>.<pub-id pub-id-type="doi">10.1080/13546800601005900</pub-id><pub-id pub-id-type="pmid">17453902</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrnstein</surname> <given-names>R. J.</given-names></name></person-group> (<year>1961</year>). <article-title>Relative and absolute strength of response as a function of frequency of reinforcement</article-title>. <source>J. Exp. Anal. Behav.</source> <volume>4</volume>, <fpage>267</fpage>&#x02013;<lpage>272</lpage>.<pub-id pub-id-type="doi">10.1901/jeab.1961.4-267</pub-id><pub-id pub-id-type="pmid">13713775</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ho</surname> <given-names>M. Y.</given-names></name> <name><surname>Mobini</surname> <given-names>S.</given-names></name> <name><surname>Chiang</surname> <given-names>T. J.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name></person-group> (<year>1999</year>). <article-title>Theory and method in the quantitative analysis of &#x0201C;impulsive choice&#x0201D; behaviour: implications for psychopharmacology</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>146</volume>, <fpage>362</fpage>&#x02013;<lpage>372</lpage>.<pub-id pub-id-type="doi">10.1007/PL00005482</pub-id><pub-id pub-id-type="pmid">10550487</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>1993a</year>). <article-title>Amygdala central nucleus lesions disrupt increments, but not decrements, in conditioned stimulus processing</article-title>. <source>Behav. Neurosci.</source> <volume>107</volume>, <fpage>246</fpage>&#x02013;<lpage>253</lpage>.<pub-id pub-id-type="doi">10.1037/0735-7044.107.2.246</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>1993b</year>). <article-title>Effects of amygdala central nucleus lesions on blocking and unblocking</article-title>. <source>Behav. Neurosci.</source> <volume>107</volume>, <fpage>235</fpage>&#x02013;<lpage>245</lpage>.<pub-id pub-id-type="doi">10.1037/0735-7044.107.2.235</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>1993c</year>). <article-title>Effects of amygdala central nucleus lesions on blocking and unblocking</article-title>. <source>Behav. Neurosci.</source> <volume>107</volume>, <fpage>235</fpage>&#x02013;<lpage>245</lpage>.<pub-id pub-id-type="doi">10.1037/0735-7044.107.2.235</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>1999</year>). <article-title>Amygdala circuitry in attentional and representational processes</article-title>. <source>Trends Cogn. Sci. (Regul. Ed.)</source> <volume>3</volume>, <fpage>65</fpage>&#x02013;<lpage>73</lpage>.<pub-id pub-id-type="doi">10.1016/S1364-6613(98)01271-6</pub-id><pub-id pub-id-type="pmid">10234229</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>2006</year>). <article-title>Different roles for amygdala central nucleus and substantia innominata in the surprise-induced enhancement of learning</article-title>. <source>J. Neurosci.</source> <volume>26</volume>, <fpage>3791</fpage>&#x02013;<lpage>3797</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.0390-06.2006</pub-id><pub-id pub-id-type="pmid">16597732</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holland</surname> <given-names>P. C.</given-names></name> <name><surname>Kenmuir</surname> <given-names>C.</given-names></name></person-group> (<year>2005</year>). <article-title>Variations in unconditioned stimulus processing in unblocking</article-title>. <source>J. Exp. Psychol. Anim. Behav. Process.</source> <volume>31</volume>, <fpage>155</fpage>&#x02013;<lpage>171</lpage>.<pub-id pub-id-type="doi">10.1037/0097-7403.31.2.155</pub-id><pub-id pub-id-type="pmid">15839773</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hollerman</surname> <given-names>J. R.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>1998a</year>). <article-title>Dopamine neurons report an error in the temporal prediction of reward during learning</article-title>. <source>Nat. Neurosci.</source> <volume>1</volume>, <fpage>304</fpage>&#x02013;<lpage>309</lpage>.<pub-id pub-id-type="doi">10.1038/1124</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hollerman</surname> <given-names>J. R.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>1998b</year>). <article-title>Dopamine neurons report an error in the temporal prediction of reward during learning</article-title>. <source>Nat. Neurosci.</source> <volume>1</volume>, <fpage>304</fpage>&#x02013;<lpage>309</lpage>.<pub-id pub-id-type="doi">10.1038/1124</pub-id></citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hong</surname> <given-names>S.</given-names></name> <name><surname>Hikosaka</surname> <given-names>O.</given-names></name></person-group> (<year>2011</year>). <article-title>Dopamine-mediated learning and switching in cortico-striatal circuit explain behavioral changes in reinforcement learning</article-title>. <source>Front. Behav. Neurosci.</source> <volume>5</volume>:<fpage>15</fpage>.<pub-id pub-id-type="doi">10.3389/fnbeh.2011.00015</pub-id></citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ito</surname> <given-names>M.</given-names></name> <name><surname>Doya</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Validation of decision-making models and analysis of decision variables in the rat basal ganglia</article-title>. <source>J. Neurosci.</source> <volume>29</volume>, <fpage>9861</fpage>&#x02013;<lpage>9874</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.3454-09.2009</pub-id><pub-id pub-id-type="pmid">19657038</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Janak</surname> <given-names>P. H.</given-names></name> <name><surname>Chen</surname> <given-names>M. T.</given-names></name> <name><surname>Caulder</surname> <given-names>T.</given-names></name></person-group> (<year>2004</year>). <article-title>Dynamics of neural coding in the accumbens during extinction and reinstatement of rewarded behavior</article-title>. <source>Behav. Brain Res.</source> <volume>154</volume>, <fpage>125</fpage>&#x02013;<lpage>135</lpage>.<pub-id pub-id-type="doi">10.1016/j.bbr.2004.02.003</pub-id><pub-id pub-id-type="pmid">15302118</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jentsch</surname> <given-names>J. D.</given-names></name> <name><surname>Taylor</surname> <given-names>J. R.</given-names></name></person-group> (<year>1999</year>). <article-title>Impulsivity resulting from frontostriatal dysfunction in drug abuse: implications for the control of behavior by reward-related stimuli</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>146</volume>, <fpage>373</fpage>&#x02013;<lpage>390</lpage>.<pub-id pub-id-type="doi">10.1007/PL00005483</pub-id><pub-id pub-id-type="pmid">10550488</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Joel</surname> <given-names>D.</given-names></name> <name><surname>Niv</surname> <given-names>Y.</given-names></name> <name><surname>Ruppin</surname> <given-names>E.</given-names></name></person-group> (<year>2002</year>). <article-title>Actor-critic models of the basal ganglia: new anatomical and computational perspectives</article-title>. <source>Neural. Netw.</source> <volume>15</volume>, <fpage>535</fpage>&#x02013;<lpage>547</lpage>.<pub-id pub-id-type="doi">10.1016/S0893-6080(02)00047-3</pub-id><pub-id pub-id-type="pmid">12371510</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>B.</given-names></name> <name><surname>Mishkin</surname> <given-names>M.</given-names></name></person-group> (<year>1972</year>). <article-title>Limbic lesions and the problem of stimulus-reinforcement associations</article-title>. <source>Exp. Neurol.</source> <volume>36</volume>, <fpage>362</fpage>&#x02013;<lpage>377</lpage>.<pub-id pub-id-type="doi">10.1016/0014-4886(72)90030-1</pub-id><pub-id pub-id-type="pmid">4626489</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kable</surname> <given-names>J. W.</given-names></name> <name><surname>Glimcher</surname> <given-names>P. W.</given-names></name></person-group> (<year>2007</year>). <article-title>The neural correlates of subjective value during intertemporal choice</article-title>. <source>Nat. Neurosci.</source> <volume>10</volume>, <fpage>1625</fpage>&#x02013;<lpage>1633</lpage>.<pub-id pub-id-type="doi">10.1038/nn2007</pub-id><pub-id pub-id-type="pmid">17982449</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kahneman</surname> <given-names>D.</given-names></name> <name><surname>Tversky</surname> <given-names>A.</given-names></name></person-group> (<year>1984</year>). <article-title>Choices, values, and frames</article-title>. <source>Am. Psychol.</source> <volume>39</volume>, <fpage>341</fpage>&#x02013;<lpage>350</lpage>.<pub-id pub-id-type="doi">10.1037/0003-066X.39.4.341</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kalenscher</surname> <given-names>T.</given-names></name> <name><surname>Pennartz</surname> <given-names>C. M.</given-names></name></person-group> (<year>2008</year>). <article-title>Is a bird in the hand worth two in the future? The neuroeconomics of intertemporal decision-making</article-title>. <source>Prog. Neurobiol.</source> <volume>84</volume>, <fpage>284</fpage>&#x02013;<lpage>315</lpage>.<pub-id pub-id-type="doi">10.1016/j.pneurobio.2007.11.004</pub-id><pub-id pub-id-type="pmid">18207301</pub-id></citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kalenscher</surname> <given-names>T.</given-names></name> <name><surname>Windmann</surname> <given-names>S.</given-names></name> <name><surname>Diekamp</surname> <given-names>B.</given-names></name> <name><surname>Rose</surname> <given-names>J.</given-names></name> <name><surname>G&#x000FC;nt&#x000FC;rk&#x000FC;n</surname> <given-names>O.</given-names></name> <name><surname>Colombo</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Single units in the pigeon brain integrate reward amount and time-to-reward in an impulsive choice task</article-title>. <source>Curr. Biol.</source> <volume>15</volume>, <fpage>594</fpage>&#x02013;<lpage>602</lpage>.<pub-id pub-id-type="doi">10.1016/j.cub.2005.02.052</pub-id><pub-id pub-id-type="pmid">15823531</pub-id></citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kennerley</surname> <given-names>S. W.</given-names></name> <name><surname>Dahmubed</surname> <given-names>A. F.</given-names></name> <name><surname>Lara</surname> <given-names>A. H.</given-names></name> <name><surname>Wallis</surname> <given-names>J. D.</given-names></name></person-group> (<year>2009</year>). <article-title>Neurons in the frontal lobe encode the value of multiple decision variables</article-title>. <source>J. Cogn. Neurosci.</source> <volume>21</volume>, <fpage>1162</fpage>&#x02013;<lpage>1178</lpage>.<pub-id pub-id-type="doi">10.1162/jocn.2009.21100</pub-id><pub-id pub-id-type="pmid">18752411</pub-id></citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kennerley</surname> <given-names>S. W.</given-names></name> <name><surname>Wallis</surname> <given-names>J. D.</given-names></name></person-group> (<year>2009</year>). <article-title>Evaluating choices by single neurons in the frontal lobe: outcome value encoded across multiple decision variables</article-title>. <source>Eur. J. Neurosci.</source> <volume>29</volume>, <fpage>2061</fpage>&#x02013;<lpage>2073</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2009.06743.x</pub-id><pub-id pub-id-type="pmid">19453638</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kesner</surname> <given-names>R. P.</given-names></name> <name><surname>Williams</surname> <given-names>J. M.</given-names></name></person-group> (<year>1995</year>). <article-title>Memory for magnitude of reinforcement: dissociation between amygdala and hippocampus</article-title>. <source>Neurobiol. Learn. Mem.</source> <volume>64</volume>, <fpage>237</fpage>&#x02013;<lpage>244</lpage>.<pub-id pub-id-type="doi">10.1006/nlme.1995.0006</pub-id><pub-id pub-id-type="pmid">8564377</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Khamassi</surname> <given-names>M.</given-names></name> <name><surname>Mulder</surname> <given-names>A. B.</given-names></name> <name><surname>Tabuchi</surname> <given-names>E.</given-names></name> <name><surname>Douchamps</surname> <given-names>V.</given-names></name> <name><surname>Wiener</surname> <given-names>S. I.</given-names></name></person-group> (<year>2008</year>). <article-title>Anticipatory reward signals in ventral striatal neurons of behaving rats</article-title>. <source>Eur. J. Neurosci.</source> <volume>28</volume>, <fpage>1849</fpage>&#x02013;<lpage>1866</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2008.06480.x</pub-id><pub-id pub-id-type="pmid">18973599</pub-id></citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kheramin</surname> <given-names>S.</given-names></name> <name><surname>Body</surname> <given-names>S.</given-names></name> <name><surname>Ho</surname> <given-names>M. Y.</given-names></name> <name><surname>Velazquez-Martinez</surname> <given-names>D. N.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name> <name><surname>Deakin</surname> <given-names>J. F.</given-names></name> <name><surname>Anderson</surname> <given-names>I. M.</given-names></name></person-group> (<year>2004</year>). <article-title>Effects of orbital prefrontal cortex dopamine depletion on inter-temporal choice: a quantitative analysis</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>175</volume>, <fpage>206</fpage>&#x02013;<lpage>214</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-004-1813-y</pub-id><pub-id pub-id-type="pmid">14991223</pub-id></citation></ref>
<ref id="B74"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kim</surname> <given-names>S.</given-names></name> <name><surname>Hwang</surname> <given-names>J.</given-names></name> <name><surname>Lee</surname> <given-names>D.</given-names></name></person-group> (<year>2008</year>). <article-title>Prefrontal coding of temporally discounted values during intertemporal choice</article-title>. <source>Neuron</source> <volume>59</volume>, <fpage>161</fpage>&#x02013;<lpage>172</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2008.05.010</pub-id><pub-id pub-id-type="pmid">18614037</pub-id></citation></ref>
<ref id="B75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kobayashi</surname> <given-names>S.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2008</year>). <article-title>Influence of reward delays on responses of dopamine neurons</article-title>. <source>J. Neurosci.</source> <volume>28</volume>, <fpage>7837</fpage>&#x02013;<lpage>7846</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.5589-07.2008</pub-id><pub-id pub-id-type="pmid">18667616</pub-id></citation></ref>
<ref id="B76"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kringelbach</surname> <given-names>M. L.</given-names></name></person-group> (<year>2005</year>). <article-title>The human orbitofrontal cortex: linking reward to hedonic experience</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>6</volume>, <fpage>691</fpage>&#x02013;<lpage>702</lpage>.<pub-id pub-id-type="doi">10.1038/nrn1747</pub-id><pub-id pub-id-type="pmid">16136173</pub-id></citation></ref>
<ref id="B77"><citation citation-type="book"><person-group person-group-type="author"><name><surname>LeDoux</surname> <given-names>J. E.</given-names></name></person-group> (<year>2000</year>). <article-title>&#x0201C;The amygdala and emotion: a view through fear,&#x0201D;</article-title> in <source>The Amygdala: A Functional Analysis</source>, ed. <person-group person-group-type="editor"><name><surname>Aggleton</surname> <given-names>J. P.</given-names></name></person-group> (<publisher-loc>New York</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>), <fpage>289</fpage>&#x02013;<lpage>310</lpage>.</citation></ref>
<ref id="B78"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Loewenstein</surname> <given-names>G.</given-names></name> <name><surname>Elster</surname> <given-names>J.</given-names></name></person-group> (<year>1992</year>). <source>Choice Over Time</source>. <publisher-loc>New York</publisher-loc>: <publisher-name>Russell Sage Foundation</publisher-name>.</citation></ref>
<ref id="B79"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Malkova</surname> <given-names>L.</given-names></name> <name><surname>Gaffan</surname> <given-names>D.</given-names></name> <name><surname>Murray</surname> <given-names>E. A.</given-names></name></person-group> (<year>1997</year>). <article-title>Excitotoxic lesions of the amygdala fail to produce impairment in visual learning for auditory secondary reinforcement but interfere with reinforcer devaluation effects in rhesus monkeys</article-title>. <source>J. Neurosci.</source> <volume>17</volume>, <fpage>6011</fpage>&#x02013;<lpage>6020</lpage>.<pub-id pub-id-type="pmid">9221797</pub-id></citation></ref>
<ref id="B80"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mar</surname> <given-names>A. C.</given-names></name> <name><surname>Walker</surname> <given-names>A. L.</given-names></name> <name><surname>Theobald</surname> <given-names>D. E.</given-names></name> <name><surname>Eagle</surname> <given-names>D. M.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2011</year>). <article-title>Dissociable effects of lesions to orbitofrontal cortex subregions on impulsive choice in the rat</article-title>. <source>J. Neurosci.</source> <volume>31</volume>, <fpage>6398</fpage>&#x02013;<lpage>6404</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.6620-10.2011</pub-id><pub-id pub-id-type="pmid">21525280</pub-id></citation></ref>
<ref id="B81"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McClure</surname> <given-names>S. M.</given-names></name> <name><surname>Ericson</surname> <given-names>K. M.</given-names></name> <name><surname>Laibson</surname> <given-names>D. I.</given-names></name> <name><surname>Loewenstein</surname> <given-names>G.</given-names></name> <name><surname>Cohen</surname> <given-names>J. D.</given-names></name></person-group> (<year>2007</year>). <article-title>Time discounting for primary rewards</article-title>. <source>J. Neurosci.</source> <volume>27</volume>, <fpage>5796</fpage>&#x02013;<lpage>5804</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.4246-06.2007</pub-id><pub-id pub-id-type="pmid">17522323</pub-id></citation></ref>
<ref id="B82"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McClure</surname> <given-names>S. M.</given-names></name> <name><surname>Laibson</surname> <given-names>D. I.</given-names></name> <name><surname>Loewenstein</surname> <given-names>G.</given-names></name> <name><surname>Cohen</surname> <given-names>J. D.</given-names></name></person-group> (<year>2004</year>). <article-title>Separate neural systems value immediate and delayed monetary rewards</article-title>. <source>Science</source> <volume>306</volume>, <fpage>503</fpage>&#x02013;<lpage>507</lpage>.<pub-id pub-id-type="doi">10.1126/science.1100907</pub-id><pub-id pub-id-type="pmid">15486304</pub-id></citation></ref>
<ref id="B83"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mirenowicz</surname> <given-names>J.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>1994</year>). <article-title>Importance of unpredictability for reward responses in primate dopamine neurons</article-title>. <source>J. Neurophysiol.</source> <volume>72</volume>, <fpage>1024</fpage>&#x02013;<lpage>1027</lpage>.<pub-id pub-id-type="pmid">7983508</pub-id></citation></ref>
<ref id="B84"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Miyazaki</surname> <given-names>K.</given-names></name> <name><surname>Miyazaki</surname> <given-names>K. W.</given-names></name> <name><surname>Doya</surname> <given-names>K.</given-names></name></person-group> (<year>2011a</year>). <article-title>Activation of dorsal raphe serotonin neurons underlies waiting for delayed rewards</article-title>. <source>J. Neurosci.</source> <volume>31</volume>, <fpage>469</fpage>&#x02013;<lpage>479</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.3714-10.2011</pub-id></citation></ref>
<ref id="B85"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Miyazaki</surname> <given-names>K. W.</given-names></name> <name><surname>Miyazaki</surname> <given-names>K.</given-names></name> <name><surname>Doya</surname> <given-names>K.</given-names></name></person-group> (<year>2011b</year>). <article-title>Activation of the central serotonergic system in response to delayed but not omitted rewards</article-title>. <source>Eur. J. Neurosci.</source> <volume>33</volume>, <fpage>153</fpage>&#x02013;<lpage>160</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2010.07472.x</pub-id></citation></ref>
<ref id="B86"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mobini</surname> <given-names>S.</given-names></name> <name><surname>Body</surname> <given-names>S.</given-names></name> <name><surname>Ho</surname> <given-names>M. Y.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name> <name><surname>Deakin</surname> <given-names>J. F.</given-names></name> <name><surname>Anderson</surname> <given-names>I. M.</given-names></name></person-group> (<year>2002</year>). <article-title>Effects of lesions of the orbitofrontal cortex on sensitivity to delayed and probabilistic reinforcement</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>160</volume>, <fpage>290</fpage>&#x02013;<lpage>298</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-001-0983-0</pub-id><pub-id pub-id-type="pmid">11889498</pub-id></citation></ref>
<ref id="B87"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mobini</surname> <given-names>S.</given-names></name> <name><surname>Chiang</surname> <given-names>T. J.</given-names></name> <name><surname>Al-Ruwaitea</surname> <given-names>A. S.</given-names></name> <name><surname>Ho</surname> <given-names>M. Y.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name></person-group> (<year>2000a</year>). <article-title>Effect of central 5-hydroxytryptamine depletion on inter-temporal choice: a quantitative analysis</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>149</volume>, <fpage>313</fpage>&#x02013;<lpage>318</lpage>.<pub-id pub-id-type="doi">10.1007/s002130000385</pub-id></citation></ref>
<ref id="B88"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mobini</surname> <given-names>S.</given-names></name> <name><surname>Chiang</surname> <given-names>T. J.</given-names></name> <name><surname>Ho</surname> <given-names>M. Y.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name></person-group> (<year>2000b</year>). <article-title>Effects of central 5-hydroxytryptamine depletion on sensitivity to delayed and probabilistic reinforcement</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>152</volume>, <fpage>390</fpage>&#x02013;<lpage>397</lpage>.<pub-id pub-id-type="doi">10.1007/s002130000542</pub-id></citation></ref>
<ref id="B89"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mogenson</surname> <given-names>G. J.</given-names></name> <name><surname>Jones</surname> <given-names>D. L.</given-names></name> <name><surname>Yim</surname> <given-names>C. Y.</given-names></name></person-group> (<year>1980</year>). <article-title>From motivation to action: functional interface between the limbic system and the motor system</article-title>. <source>Prog. Neurobiol.</source> <volume>14</volume>, <fpage>69</fpage>&#x02013;<lpage>97</lpage>.<pub-id pub-id-type="doi">10.1016/0301-0082(80)90018-0</pub-id><pub-id pub-id-type="pmid">6999537</pub-id></citation></ref>
<ref id="B90"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Montague</surname> <given-names>P. R.</given-names></name> <name><surname>Berns</surname> <given-names>G. S.</given-names></name></person-group> (<year>2002</year>). <article-title>Neural economics and the biological substrates of valuation</article-title>. <source>Neuron</source> <volume>36</volume>, <fpage>265</fpage>&#x02013;<lpage>284</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(02)00974-1</pub-id><pub-id pub-id-type="pmid">12383781</pub-id></citation></ref>
<ref id="B91"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Montague</surname> <given-names>P. R.</given-names></name> <name><surname>Dayan</surname> <given-names>P.</given-names></name> <name><surname>Sejnowski</surname> <given-names>T. J.</given-names></name></person-group> (<year>1996</year>). <article-title>A framework for mesencephalic dopamine systems based on predictive Hebbian learning</article-title>. <source>J. Neurosci.</source> <volume>16</volume>, <fpage>1936</fpage>&#x02013;<lpage>1947</lpage>.<pub-id pub-id-type="pmid">8774460</pub-id></citation></ref>
<ref id="B92"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Monterosso</surname> <given-names>J.</given-names></name> <name><surname>Ehrman</surname> <given-names>R.</given-names></name> <name><surname>Napier</surname> <given-names>K. L.</given-names></name> <name><surname>O&#x02019;Brien</surname> <given-names>C. P.</given-names></name> <name><surname>Childress</surname> <given-names>A. R.</given-names></name></person-group> (<year>2001</year>). <article-title>Three decision-making tasks in cocaine-dependent patients: do they measure the same construct?</article-title> <source>Addiction</source> <volume>96</volume>, <fpage>1825</fpage>&#x02013;<lpage>1837</lpage>.<pub-id pub-id-type="doi">10.1046/j.1360-0443.2001.9612182512.x</pub-id><pub-id pub-id-type="pmid">11784475</pub-id></citation></ref>
<ref id="B93"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morris</surname> <given-names>G.</given-names></name> <name><surname>Nevet</surname> <given-names>A.</given-names></name> <name><surname>Arkadir</surname> <given-names>D.</given-names></name> <name><surname>Vaadia</surname> <given-names>E.</given-names></name> <name><surname>Bergman</surname> <given-names>H.</given-names></name></person-group> (<year>2006</year>). <article-title>Midbrain dopamine neurons encode decisions for future action</article-title>. <source>Nat. Neurosci.</source> <volume>9</volume>, <fpage>1057</fpage>&#x02013;<lpage>1063</lpage>.<pub-id pub-id-type="doi">10.1038/nn1743</pub-id><pub-id pub-id-type="pmid">16862149</pub-id></citation></ref>
<ref id="B94"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Murray</surname> <given-names>E. A.</given-names></name></person-group> (<year>2007</year>). <article-title>The amygdala, reward and emotion</article-title>. <source>Trends Cogn. Sci. (Regul. Ed.)</source> <volume>11</volume>, <fpage>489</fpage>&#x02013;<lpage>497</lpage>.<pub-id pub-id-type="doi">10.1016/j.tics.2007.08.013</pub-id><pub-id pub-id-type="pmid">17988930</pub-id></citation></ref>
<ref id="B95"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakahara</surname> <given-names>H.</given-names></name> <name><surname>Itoh</surname> <given-names>H.</given-names></name> <name><surname>Kawagoe</surname> <given-names>R.</given-names></name> <name><surname>Takikawa</surname> <given-names>Y.</given-names></name> <name><surname>Hikosaka</surname> <given-names>O.</given-names></name></person-group> (<year>2004</year>). <article-title>Dopamine neurons can represent context-dependent prediction error</article-title>. <source>Neuron</source> <volume>41</volume>, <fpage>269</fpage>&#x02013;<lpage>280</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(03)00869-9</pub-id><pub-id pub-id-type="pmid">14741107</pub-id></citation></ref>
<ref id="B96"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nakamura</surname> <given-names>K.</given-names></name> <name><surname>Hikosaka</surname> <given-names>O.</given-names></name></person-group> (<year>2006</year>). <article-title>Role of dopamine in the primate caudate nucleus in reward modulation of saccades</article-title>. <source>J. Neurosci.</source> <volume>26</volume>, <fpage>5360</fpage>&#x02013;<lpage>5369</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.4853-05.2006</pub-id><pub-id pub-id-type="pmid">16707788</pub-id></citation></ref>
<ref id="B97"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nicola</surname> <given-names>S. M.</given-names></name></person-group> (<year>2007</year>). <article-title>The nucleus accumbens as part of a basal ganglia action selection circuit</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>191</volume>, <fpage>521</fpage>&#x02013;<lpage>550</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-006-0510-4</pub-id><pub-id pub-id-type="pmid">16983543</pub-id></citation></ref>
<ref id="B98"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>O&#x02019;Doherty</surname> <given-names>J.</given-names></name> <name><surname>Dayan</surname> <given-names>P.</given-names></name> <name><surname>Schultz</surname> <given-names>J.</given-names></name> <name><surname>Deichmann</surname> <given-names>R.</given-names></name> <name><surname>Friston</surname> <given-names>K. J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Dissociable roles of ventral and dorsal striatum in instrumental conditioning</article-title>. <source>Science</source> <volume>304</volume>, <fpage>452</fpage>&#x02013;<lpage>454</lpage>.<pub-id pub-id-type="doi">10.1126/science.1094285</pub-id><pub-id pub-id-type="pmid">15087550</pub-id></citation></ref>
<ref id="B99"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>O&#x02019;Neill</surname> <given-names>M.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2010</year>). <article-title>Coding of reward risk by orbitofrontal neurons is mostly distinct from coding of reward value</article-title>. <source>Neuron</source> <volume>68</volume>, <fpage>789</fpage>&#x02013;<lpage>800</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2010.09.031</pub-id><pub-id pub-id-type="pmid">21092866</pub-id></citation></ref>
<ref id="B100"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Padoa-Schioppa</surname> <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Neurobiology of economic choice: a good-based model</article-title>. <source>Annu. Rev. Neurosci.</source> <volume>34</volume>, <fpage>333</fpage>&#x02013;<lpage>359</lpage>.<pub-id pub-id-type="doi">10.1146/annurev-neuro-061010-113648</pub-id><pub-id pub-id-type="pmid">21456961</pub-id></citation></ref>
<ref id="B101"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pan</surname> <given-names>W. X.</given-names></name> <name><surname>Schmidt</surname> <given-names>R.</given-names></name> <name><surname>Wickens</surname> <given-names>J. R.</given-names></name> <name><surname>Hyland</surname> <given-names>B. I.</given-names></name></person-group> (<year>2005</year>). <article-title>Dopamine cells respond to predicted events during classical conditioning: evidence for eligibility traces in the reward-learning network</article-title>. <source>J. Neurosci.</source> <volume>25</volume>, <fpage>6235</fpage>&#x02013;<lpage>6242</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.1478-05.2005</pub-id><pub-id pub-id-type="pmid">15987953</pub-id></citation></ref>
<ref id="B102"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Parkinson</surname> <given-names>J. A.</given-names></name> <name><surname>Crofts</surname> <given-names>H. S.</given-names></name> <name><surname>McGuigan</surname> <given-names>M.</given-names></name> <name><surname>Tomic</surname> <given-names>D. L.</given-names></name> <name><surname>Everitt</surname> <given-names>B. J.</given-names></name> <name><surname>Roberts</surname> <given-names>A. C.</given-names></name></person-group> (<year>2001</year>). <article-title>The role of the primate amygdala in conditioned reinforcement</article-title>. <source>J. Neurosci.</source> <volume>21</volume>, <fpage>7770</fpage>&#x02013;<lpage>7780</lpage>.<pub-id pub-id-type="pmid">11567067</pub-id></citation></ref>
<ref id="B103"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pennartz</surname> <given-names>C. M.</given-names></name> <name><surname>Groenewegen</surname> <given-names>H. J.</given-names></name> <name><surname>Lopes da Silva</surname> <given-names>F. H.</given-names></name></person-group> (<year>1994</year>). <article-title>The nucleus accumbens as a complex of functionally distinct neuronal ensembles: an integration of behavioural, electrophysiological and anatomical data</article-title>. <source>Prog. Neurobiol.</source> <volume>42</volume>, <fpage>719</fpage>&#x02013;<lpage>761</lpage>.<pub-id pub-id-type="doi">10.1016/0301-0082(94)90025-6</pub-id><pub-id pub-id-type="pmid">7938546</pub-id></citation></ref>
<ref id="B104"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Redgrave</surname> <given-names>P.</given-names></name> <name><surname>Prescott</surname> <given-names>T. J.</given-names></name> <name><surname>Gurney</surname> <given-names>K.</given-names></name></person-group> (<year>1999</year>). <article-title>The basal ganglia: a vertebrate solution to the selection problem?</article-title> <source>Neuroscience</source> <volume>89</volume>, <fpage>1009</fpage>&#x02013;<lpage>1023</lpage>.<pub-id pub-id-type="doi">10.1016/S0306-4522(98)00319-4</pub-id><pub-id pub-id-type="pmid">10362291</pub-id></citation></ref>
<ref id="B105"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Richards</surname> <given-names>J. B.</given-names></name> <name><surname>Mitchell</surname> <given-names>S. H.</given-names></name> <name><surname>de Wit</surname> <given-names>H.</given-names></name> <name><surname>Seiden</surname> <given-names>L. S.</given-names></name></person-group> (<year>1997</year>). <article-title>Determination of discount functions in rats with an adjusting-amount procedure</article-title>. <source>J. Exp. Anal. Behav.</source> <volume>67</volume>, <fpage>353</fpage>&#x02013;<lpage>366</lpage>.<pub-id pub-id-type="doi">10.1901/jeab.1997.67-353</pub-id><pub-id pub-id-type="pmid">9163939</pub-id></citation></ref>
<ref id="B106"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rodriguez</surname> <given-names>M. L.</given-names></name> <name><surname>Logue</surname> <given-names>A. W.</given-names></name></person-group> (<year>1988</year>). <article-title>Adjusting delay to reinforcement: comparing choice in pigeons and humans</article-title>. <source>J. Exp. Psychol. Anim. Behav. Process.</source> <volume>14</volume>, <fpage>105</fpage>&#x02013;<lpage>117</lpage>.<pub-id pub-id-type="doi">10.1037/0097-7403.14.1.105</pub-id><pub-id pub-id-type="pmid">3351438</pub-id></citation></ref>
<ref id="B107"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Burke</surname> <given-names>K. A.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2007a</year>). <article-title>Should I stay or should I go? Transformation of time-discounted rewards in orbitofrontal cortex and associated brain circuits</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1104</volume>, <fpage>21</fpage>&#x02013;<lpage>34</lpage>.<pub-id pub-id-type="doi">10.1196/annals.1390.001</pub-id></citation></ref>
<ref id="B108"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2007b</year>). <article-title>Dopamine neurons encode the better option in rats deciding between differently delayed or sized rewards</article-title>. <source>Nat. Neurosci.</source> <volume>10</volume>, <fpage>1615</fpage>&#x02013;<lpage>1624</lpage>.<pub-id pub-id-type="doi">10.1038/nn2013</pub-id></citation></ref>
<ref id="B109"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Takahashi</surname> <given-names>Y.</given-names></name> <name><surname>Gugsa</surname> <given-names>N.</given-names></name> <name><surname>Bissonette</surname> <given-names>G. B.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2007c</year>). <article-title>Previous cocaine exposure makes rats hypersensitive to both delay and reward magnitude</article-title>. <source>J. Neurosci.</source> <volume>27</volume>, <fpage>245</fpage>&#x02013;<lpage>250</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.4080-06.2007</pub-id></citation></ref>
<ref id="B110"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Esber</surname> <given-names>G. R.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2010a</year>). <article-title>All that glitters... dissociating attention and outcome expectancy from prediction errors signals</article-title>. <source>J. Neurophysiol.</source> <volume>104</volume>, <fpage>587</fpage>&#x02013;<lpage>595</lpage>.<pub-id pub-id-type="doi">10.1152/jn.00173.2010</pub-id></citation></ref>
<ref id="B111"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Esber</surname> <given-names>G. R.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2010b</year>). <article-title>Neural correlates of variations in event processing during learning in basolateral amygdala</article-title>. <source>J. Neurosci.</source> <volume>30</volume>, <fpage>2464</fpage>&#x02013;<lpage>2471</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.5781-09.2010</pub-id></citation></ref>
<ref id="B112"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Olson</surname> <given-names>C. R.</given-names></name></person-group> (<year>2005a</year>). <article-title>Neuronal activity dependent on anticipated and elapsed delay in macaque prefrontal cortex, frontal and supplementary eye fields, and premotor cortex</article-title>. <source>J. Neurophysiol.</source> <volume>94</volume>, <fpage>1469</fpage>&#x02013;<lpage>1497</lpage>.<pub-id pub-id-type="doi">10.1152/jn.00064.2005</pub-id></citation></ref>
<ref id="B113"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Olson</surname> <given-names>C. R.</given-names></name></person-group> (<year>2005b</year>). <article-title>Neuronal activity in primate orbitofrontal cortex reflects the value of time</article-title>. <source>J. Neurophysiol.</source> <volume>94</volume>, <fpage>2457</fpage>&#x02013;<lpage>2471</lpage>.<pub-id pub-id-type="doi">10.1152/jn.00373.2005</pub-id></citation></ref>
<ref id="B114"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Singh</surname> <given-names>T.</given-names></name> <name><surname>Brown</surname> <given-names>P. L.</given-names></name> <name><surname>Mullins</surname> <given-names>S. E.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2009</year>). <article-title>Ventral striatal neurons encode the value of the chosen action in rats deciding between differently delayed or sized rewards</article-title>. <source>J. Neurosci.</source> <volume>29</volume>, <fpage>13365</fpage>&#x02013;<lpage>13376</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.2572-09.2009</pub-id><pub-id pub-id-type="pmid">19846724</pub-id></citation></ref>
<ref id="B115"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Taylor</surname> <given-names>A. R.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2006</year>). <article-title>Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation</article-title>. <source>Neuron</source> <volume>51</volume>, <fpage>509</fpage>&#x02013;<lpage>520</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2006.06.027</pub-id><pub-id pub-id-type="pmid">16908415</pub-id></citation></ref>
<ref id="B116"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rudebeck</surname> <given-names>P. H.</given-names></name> <name><surname>Walton</surname> <given-names>M. E.</given-names></name> <name><surname>Smyth</surname> <given-names>A. N.</given-names></name> <name><surname>Bannerman</surname> <given-names>D. M.</given-names></name> <name><surname>Rushworth</surname> <given-names>M. F.</given-names></name></person-group> (<year>2006</year>). <article-title>Separate neural pathways process different decision costs</article-title>. <source>Nat. Neurosci.</source> <volume>9</volume>, <fpage>1161</fpage>&#x02013;<lpage>1168</lpage>.<pub-id pub-id-type="doi">10.1038/nn1756</pub-id><pub-id pub-id-type="pmid">16921368</pub-id></citation></ref>
<ref id="B117"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saddoris</surname> <given-names>M. P.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2005</year>). <article-title>Rapid associative encoding in basolateral amygdala depends on connections with orbitofrontal cortex</article-title>. <source>Neuron</source> <volume>46</volume>, <fpage>321</fpage>&#x02013;<lpage>331</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2005.02.018</pub-id><pub-id pub-id-type="pmid">15848809</pub-id></citation></ref>
<ref id="B118"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schoenbaum</surname> <given-names>G.</given-names></name> <name><surname>Roesch</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Orbitofrontal cortex, associative learning, and expectancies</article-title>. <source>Neuron</source> <volume>47</volume>, <fpage>633</fpage>&#x02013;<lpage>636</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2005.07.018</pub-id><pub-id pub-id-type="pmid">16129393</pub-id></citation></ref>
<ref id="B119"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schoenbaum</surname> <given-names>G.</given-names></name> <name><surname>Setlow</surname> <given-names>B.</given-names></name> <name><surname>Saddoris</surname> <given-names>M. P.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>2003</year>). <article-title>Encoding predicted outcome and acquired value in orbitofrontal cortex during cue sampling depends upon input from basolateral amygdala</article-title>. <source>Neuron</source> <volume>39</volume>, <fpage>855</fpage>&#x02013;<lpage>867</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(03)00474-4</pub-id><pub-id pub-id-type="pmid">12948451</pub-id></citation></ref>
<ref id="B120"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2010</year>). <article-title>Subjective neuronal coding of reward: temporal value discounting and risk</article-title>. <source>Eur. J. Neurosci.</source> <volume>31</volume>, <fpage>2124</fpage>&#x02013;<lpage>2135</lpage>.<pub-id pub-id-type="doi">10.1111/j.1460-9568.2010.07282.x</pub-id><pub-id pub-id-type="pmid">20497474</pub-id></citation></ref>
<ref id="B121"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sellitto</surname> <given-names>M.</given-names></name> <name><surname>Ciaramelli</surname> <given-names>E.</given-names></name> <name><surname>di Pellegrino</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Myopic discounting of future rewards after medial orbitofrontal damage in humans</article-title>. <source>J. Neurosci.</source> <volume>30</volume>, <fpage>16429</fpage>&#x02013;<lpage>16436</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.2516-10.2010</pub-id><pub-id pub-id-type="pmid">21147982</pub-id></citation></ref>
<ref id="B122"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Setlow</surname> <given-names>B.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name> <name><surname>Gallagher</surname> <given-names>M.</given-names></name></person-group> (<year>2003</year>). <article-title>Neural encoding in ventral striatum during olfactory discrimination learning</article-title>. <source>Neuron</source> <volume>38</volume>, <fpage>625</fpage>&#x02013;<lpage>636</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(03)00264-2</pub-id><pub-id pub-id-type="pmid">12765613</pub-id></citation></ref>
<ref id="B123"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shen</surname> <given-names>W.</given-names></name> <name><surname>Flajolet</surname> <given-names>M.</given-names></name> <name><surname>Greengard</surname> <given-names>P.</given-names></name> <name><surname>Surmeier</surname> <given-names>D. J.</given-names></name></person-group> (<year>2008</year>). <article-title>Dichotomous dopaminergic control of striatal synaptic plasticity</article-title>. <source>Science</source> <volume>321</volume>, <fpage>848</fpage>&#x02013;<lpage>851</lpage>.<pub-id pub-id-type="doi">10.1126/science.1160575</pub-id><pub-id pub-id-type="pmid">18687967</pub-id></citation></ref>
<ref id="B124"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stalnaker</surname> <given-names>T. A.</given-names></name> <name><surname>Calhoon</surname> <given-names>G. G.</given-names></name> <name><surname>Ogawa</surname> <given-names>M.</given-names></name> <name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural correlates of stimulus-response and response-outcome associations in dorsolateral versus dorsomedial striatum</article-title>. <source>Front. Integr. Neurosci.</source> <volume>4</volume>:<fpage>12</fpage>.<pub-id pub-id-type="doi">10.3389/fnint.2010.00012</pub-id></citation></ref>
<ref id="B125"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Taha</surname> <given-names>S. A.</given-names></name> <name><surname>Nicola</surname> <given-names>S. M.</given-names></name> <name><surname>Fields</surname> <given-names>H. L.</given-names></name></person-group> (<year>2007</year>). <article-title>Cue-evoked encoding of movement planning and execution in the rat nucleus accumbens</article-title>. <source>J. Physiol. (Lond.)</source> <volume>584</volume>, <fpage>801</fpage>&#x02013;<lpage>818</lpage>.<pub-id pub-id-type="doi">10.1113/jphysiol.2007.140236</pub-id><pub-id pub-id-type="pmid">17761777</pub-id></citation></ref>
<ref id="B126"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Takahashi</surname> <given-names>Y. K.</given-names></name> <name><surname>Roesch</surname> <given-names>M. R.</given-names></name> <name><surname>Stalnaker</surname> <given-names>T. A.</given-names></name> <name><surname>Haney</surname> <given-names>R. Z.</given-names></name> <name><surname>Calu</surname> <given-names>D. J.</given-names></name> <name><surname>Taylor</surname> <given-names>A. R.</given-names></name> <name><surname>Burke</surname> <given-names>K. A.</given-names></name> <name><surname>Schoenbaum</surname> <given-names>G.</given-names></name></person-group> (<year>2009</year>). <article-title>The orbitofrontal cortex and ventral tegmental area are necessary for learning from unexpected outcomes</article-title>. <source>Neuron</source> <volume>62</volume>, <fpage>269</fpage>&#x02013;<lpage>280</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuron.2009.03.005</pub-id><pub-id pub-id-type="pmid">19409271</pub-id></citation></ref>
<ref id="B127"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tanaka</surname> <given-names>S. C.</given-names></name> <name><surname>Doya</surname> <given-names>K.</given-names></name> <name><surname>Okada</surname> <given-names>G.</given-names></name> <name><surname>Ueda</surname> <given-names>K.</given-names></name> <name><surname>Okamoto</surname> <given-names>Y.</given-names></name> <name><surname>Yamawaki</surname> <given-names>S.</given-names></name></person-group> (<year>2004</year>). <article-title>Prediction of immediate and future rewards differentially recruits cortico-basal ganglia loops</article-title>. <source>Nat. Neurosci.</source> <volume>7</volume>, <fpage>887</fpage>&#x02013;<lpage>893</lpage>.<pub-id pub-id-type="doi">10.1038/nn1279</pub-id><pub-id pub-id-type="pmid">15235607</pub-id></citation></ref>
<ref id="B128"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Thaler</surname> <given-names>R.</given-names></name></person-group> (<year>1981</year>). <article-title>Some empirical evidence on dynamic inconsistency</article-title>. <source>Econ. Lett.</source> <volume>8</volume>, <fpage>201</fpage>&#x02013;<lpage>207</lpage>.<pub-id pub-id-type="doi">10.1016/0165-1765(81)90067-7</pub-id></citation></ref>
<ref id="B129"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tobler</surname> <given-names>P. N.</given-names></name> <name><surname>Dickinson</surname> <given-names>A.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2003</year>). <article-title>Coding of predicted reward omission by dopamine neurons in a conditioned inhibition paradigm</article-title>. <source>J. Neurosci.</source> <volume>23</volume>, <fpage>10402</fpage>&#x02013;<lpage>10410</lpage>.<pub-id pub-id-type="pmid">14614099</pub-id></citation></ref>
<ref id="B130"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tye</surname> <given-names>K. M.</given-names></name> <name><surname>Cone</surname> <given-names>J. J.</given-names></name> <name><surname>Schairer</surname> <given-names>W. W.</given-names></name> <name><surname>Janak</surname> <given-names>P. H.</given-names></name></person-group> (<year>2010</year>). <article-title>Amygdala neural encoding of the absence of reward during extinction</article-title>. <source>J. Neurosci.</source> <volume>30</volume>, <fpage>116</fpage>&#x02013;<lpage>125</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.4240-09.2010</pub-id><pub-id pub-id-type="pmid">20053894</pub-id></citation></ref>
<ref id="B131"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Meer</surname> <given-names>M. A.</given-names></name> <name><surname>Redish</surname> <given-names>A. D.</given-names></name></person-group> (<year>2009</year>). <article-title>Covert expectation of reward in rat ventral striatum at decision points</article-title>. <source>Front. Integr. Neurosci.</source> <volume>3</volume>:<fpage>1</fpage>.<pub-id pub-id-type="doi">10.3389/neuro.07.001.2009</pub-id></citation></ref>
<ref id="B132"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wade</surname> <given-names>T. R.</given-names></name> <name><surname>de Wit</surname> <given-names>H.</given-names></name> <name><surname>Richards</surname> <given-names>J. B.</given-names></name></person-group> (<year>2000</year>). <article-title>Effects of dopaminergic drugs on delayed reward as a measure of impulsive behavior in rats</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>150</volume>, <fpage>90</fpage>&#x02013;<lpage>101</lpage>.<pub-id pub-id-type="doi">10.1007/s002130000402</pub-id><pub-id pub-id-type="pmid">10867981</pub-id></citation></ref>
<ref id="B133"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Waelti</surname> <given-names>P.</given-names></name> <name><surname>Dickinson</surname> <given-names>A.</given-names></name> <name><surname>Schultz</surname> <given-names>W.</given-names></name></person-group> (<year>2001</year>). <article-title>Dopamine responses comply with basic assumptions of formal learning theory</article-title>. <source>Nature</source> <volume>412</volume>, <fpage>43</fpage>&#x02013;<lpage>48</lpage>.<pub-id pub-id-type="doi">10.1038/35083500</pub-id><pub-id pub-id-type="pmid">11452299</pub-id></citation></ref>
<ref id="B134"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wallis</surname> <given-names>J. D.</given-names></name> <name><surname>Kennerley</surname> <given-names>S. W.</given-names></name></person-group> (<year>2010</year>). <article-title>Heterogeneous reward signals in prefrontal cortex</article-title>. <source>Curr. Opin. Neurobiol.</source> <volume>20</volume>, <fpage>191</fpage>&#x02013;<lpage>198</lpage>.<pub-id pub-id-type="doi">10.1016/j.conb.2010.02.009</pub-id><pub-id pub-id-type="pmid">20303739</pub-id></citation></ref>
<ref id="B135"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winstanley</surname> <given-names>C. A.</given-names></name></person-group> (<year>2007</year>). <article-title>The orbitofrontal cortex, impulsivity, and addiction: probing orbitofrontal dysfunction at the neural, neurochemical, and molecular level</article-title>. <source>Ann. N. Y. Acad. Sci.</source> <volume>1121</volume>, <fpage>639</fpage>&#x02013;<lpage>655</lpage>.<pub-id pub-id-type="doi">10.1196/annals.1401.024</pub-id><pub-id pub-id-type="pmid">17846162</pub-id></citation></ref>
<ref id="B136"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winstanley</surname> <given-names>C. A.</given-names></name> <name><surname>Dalley</surname> <given-names>J. W.</given-names></name> <name><surname>Theobald</surname> <given-names>D. E.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2004a</year>). <article-title>Fractionating impulsivity: contrasting effects of central 5-HT depletion on different measures of impulsive behavior</article-title>. <source>Neuropsychopharmacology</source> <volume>29</volume>, <fpage>1331</fpage>&#x02013;<lpage>1343</lpage>.<pub-id pub-id-type="doi">10.1038/sj.npp.1300434</pub-id></citation></ref>
<ref id="B137"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winstanley</surname> <given-names>C. A.</given-names></name> <name><surname>Theobald</surname> <given-names>D. E.</given-names></name> <name><surname>Cardinal</surname> <given-names>R. N.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2004b</year>). <article-title>Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice</article-title>. <source>J. Neurosci.</source> <volume>24</volume>, <fpage>4718</fpage>&#x02013;<lpage>4722</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.5606-03.2004</pub-id></citation></ref>
<ref id="B138"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winstanley</surname> <given-names>C. A.</given-names></name> <name><surname>Theobald</surname> <given-names>D. E.</given-names></name> <name><surname>Dalley</surname> <given-names>J. W.</given-names></name> <name><surname>Glennon</surname> <given-names>J. C.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2004c</year>). <article-title>5-HT2A and 5-HT2C receptor antagonists have opposing effects on a measure of impulsivity: interactions with global 5-HT depletion</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>176</volume>, <fpage>376</fpage>&#x02013;<lpage>385</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-004-1884-9</pub-id></citation></ref>
<ref id="B139"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Winstanley</surname> <given-names>C. A.</given-names></name> <name><surname>Theobald</surname> <given-names>D. E. H.</given-names></name> <name><surname>Cardinal</surname> <given-names>R. N.</given-names></name> <name><surname>Robbins</surname> <given-names>T. W.</given-names></name></person-group> (<year>2004d</year>). <article-title>Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice</article-title>. <source>J. Neurosci.</source> <volume>24</volume>, <fpage>4718</fpage>&#x02013;<lpage>4722</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.5606-03.2004</pub-id></citation></ref>
<ref id="B140"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wogar</surname> <given-names>M. A.</given-names></name> <name><surname>Bradshaw</surname> <given-names>C. M.</given-names></name> <name><surname>Szabadi</surname> <given-names>E.</given-names></name></person-group> (<year>1993</year>). <article-title>Effect of lesions of the ascending 5-hydroxytryptaminergic pathways on choice between delayed reinforcers</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>111</volume>, <fpage>239</fpage>&#x02013;<lpage>243</lpage>.<pub-id pub-id-type="doi">10.1007/BF02245530</pub-id><pub-id pub-id-type="pmid">7870958</pub-id></citation></ref>
<ref id="B141"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yacubian</surname> <given-names>J.</given-names></name> <name><surname>Gl&#x000E4;scher</surname> <given-names>J.</given-names></name> <name><surname>Schroeder</surname> <given-names>K.</given-names></name> <name><surname>Sommer</surname> <given-names>T.</given-names></name> <name><surname>Braus</surname> <given-names>D. F.</given-names></name> <name><surname>B&#x000FC;chel</surname> <given-names>C.</given-names></name></person-group> (<year>2006</year>). <article-title>Dissociable systems for gain- and loss-related value predictions and errors of prediction in the human brain</article-title>. <source>J. Neurosci.</source> <volume>26</volume>, <fpage>9530</fpage>&#x02013;<lpage>9537</lpage>.<pub-id pub-id-type="doi">10.1523/JNEUROSCI.2915-06.2006</pub-id><pub-id pub-id-type="pmid">16971537</pub-id></citation></ref>
<ref id="B142"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zeeb</surname> <given-names>F. D.</given-names></name> <name><surname>Floresco</surname> <given-names>S. B.</given-names></name> <name><surname>Winstanley</surname> <given-names>C. A.</given-names></name></person-group> (<year>2010</year>). <article-title>Contributions of the orbitofrontal cortex to impulsive choice: interactions with basal levels of impulsivity, dopamine signalling, and reward-related cues</article-title>. <source>Psychopharmacology (Berl.)</source> <volume>211</volume>, <fpage>87</fpage>&#x02013;<lpage>98</lpage>.<pub-id pub-id-type="doi">10.1007/s00213-010-1871-2</pub-id><pub-id pub-id-type="pmid">20428999</pub-id></citation></ref>
</ref-list>
</back>
</article>