<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Comput. Neurosci.</journal-id>
<journal-title>Frontiers in Computational Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Comput. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5188</issn>
<publisher>
<publisher-name>Frontiers Research Foundation</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fncom.2010.00129</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Compensating Inhomogeneities of Neuromorphic VLSI Devices Via Short-Term Synaptic Plasticity</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Bill</surname> <given-names>Johannes</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn001">&#x0002A;</xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Schuch</surname> <given-names>Klaus</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Br&#x000FC;derle</surname> <given-names>Daniel</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Schemmel</surname> <given-names>Johannes</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Maass</surname> <given-names>Wolfgang</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Meier</surname> <given-names>Karlheinz</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Kirchhoff Institute for Physics, University of Heidelberg</institution> <country>Heidelberg, Germany</country></aff>
<aff id="aff2"><sup>2</sup><institution>Institute for Theoretical Computer Science, Graz University of Technology</institution> <country>Graz, Austria</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Stefano Fusi, Columbia University, USA</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Larry F. Abbott, Columbia University, USA; Walter Senn, University of Bern, Switzerland</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Johannes Bill, Institute for Theoretical Computer Science, Graz University of Technology, Inffeldgasse 16b/1, A&#x02013;8010 Graz, Austria. e-mail: <email>bill&#x00040;igi.tugraz.at</email></p></fn>
</author-notes>
<pub-date pub-type="epreprint">
<day>01</day>
<month>03</month>
<year>2010</year>
</pub-date>
<pub-date pub-type="epub">
<day>08</day>
<month>10</month>
<year>2010</year>
</pub-date>
<pub-date pub-type="collection">
<year>2010</year>
</pub-date>
<volume>4</volume>
<elocation-id>129</elocation-id>
<history>
<date date-type="received">
<day>25</day>
<month>02</month>
<year>2010</year>
</date>
<date date-type="accepted">
<day>11</day>
<month>08</month>
<year>2010</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2010 Bill, Schuch, Br&#x000FC;derle, Schemmel, Maass and Meier.</copyright-statement>
<copyright-year>2010</copyright-year>
<license license-type="open-access" xlink:href="http://www.frontiersin.org/licenseagreement"><p>This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.</p></license>
</permissions>
<abstract>
<p>Recent developments in neuromorphic hardware engineering make mixed-signal VLSI neural network models promising candidates for neuroscientific research tools and massively parallel computing devices, especially for tasks which exhaust the computing power of software simulations. Still, like all analog hardware systems, neuromorphic models suffer from restricted configurability and production-related fluctuations of device characteristics. Since future systems, involving ever-smaller structures, will inevitably exhibit such inhomogeneities on the unit level as well, self-regulation properties become a crucial requirement for their successful operation. By applying a cortically inspired self-adjusting network architecture, we show that the activity of generic spiking neural networks emulated on a neuromorphic hardware system can be kept within a biologically realistic firing regime and gains a remarkable robustness against transistor-level variations. As a first approach of this kind in engineering practice, the short-term synaptic depression and facilitation mechanisms implemented within an analog VLSI model of I&#x00026;F neurons are functionally utilized for the purpose of network-level stabilization. We present experimental data acquired both from the hardware model and from comparative software simulations which demonstrate the applicability of the employed paradigm to neuromorphic VLSI devices.</p>
</abstract>
<kwd-group>
<kwd>neuromorphic hardware</kwd>
<kwd>spiking neural networks</kwd>
<kwd>self-regulation</kwd>
<kwd>short-term synaptic plasticity</kwd>
<kwd>robustness</kwd>
<kwd>leaky integrate-and-fire neuron</kwd>
<kwd>parallel computing</kwd>
<kwd>PCSIM</kwd>
</kwd-group>
<counts>
<fig-count count="8"/>
<table-count count="1"/>
<equation-count count="4"/>
<ref-count count="46"/>
<page-count count="14"/>
<word-count count="10813"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction">
<title>Introduction</title>
<p>Software simulators have become an indispensable tool for investigating the dynamics of spiking neural networks (Brette et al., <xref ref-type="bibr" rid="B8">2007</xref>). But when it comes to studying large-scale networks or long-term learning, their usage easily results in lengthy computing times (Morrison et al., <xref ref-type="bibr" rid="B33">2005</xref>). A common solution, the distribution of a task across multiple CPUs, increases both the required space and the power consumption. Thus, the usage of neural networks in embedded systems remains complicated.</p>
<p>An alternative approach implements neuron and synapse models as physical entities in electronic circuitry (Mead, <xref ref-type="bibr" rid="B29">1989</xref>). This technique provides a fast emulation at a maintainable wattage (Douglas et al., <xref ref-type="bibr" rid="B15">1995</xref>). Furthermore, as all units inherently evolve in parallel, the speed of computation is widely independent of the network size. Several groups have made significant progress in this field during the last years (see for example Indiveri et al., <xref ref-type="bibr" rid="B23">2006</xref>; Merolla and Boahen, <xref ref-type="bibr" rid="B30">2006</xref>; Schemmel et al., <xref ref-type="bibr" rid="B37">2007</xref>, <xref ref-type="bibr" rid="B38">2008</xref>; Vogelstein et al., <xref ref-type="bibr" rid="B45">2007</xref>; Mitra et al., <xref ref-type="bibr" rid="B31">2009</xref>). The successful application of such neuromorphic hardware in neuroscientific modeling, robotics and novel data processing systems will essentially depend on the achievement of a high spatial integration density of neurons and synapses. As a consequence of ever-smaller integrated circuits, analog neuromorphic VLSI devices inevitably suffer from imperfections of their components due to variations in the production process (Dally and Poulton, <xref ref-type="bibr" rid="B12">1998</xref>). The impact of such imperfections can range from parameter inaccuracies up to serious malfunctioning of individual units. Consequently, the particular emulation device selected might distort the network behavior.</p>
<p>For that reason, designers of neuromorphic hardware often include auxiliary parameters which allow readjusting the characteristics of many components. But since such calibration abilities require additional circuitry, their use usually has to be limited to parameters that are crucial for the operation. Hence, further concepts are needed in order to compensate for the influence of hardware variations on network dynamics. Besides increasing the accuracy of unit parameters like threshold voltages or synaptic time constants, a possible solution is to take advantage of self-regulating effects in the dynamics of neural networks. While individual units might lack adequate precision, populations of properly interconnected neurons can still feature a faultless performance.</p>
<p>Long-term synaptic potentiation and depression (Morrison et al., <xref ref-type="bibr" rid="B32">2008</xref>) might be effective mechanisms to tailor neural dynamics to the properties of the respective hardware substrate. Still, such persistent changes of synaptic efficacy can drastically reshape the connectivity of a network. In contrast, short-term synaptic plasticity (Zucker and Regehr, <xref ref-type="bibr" rid="B46">2002</xref>) alters synaptic strength transiently. As the effect fades after some hundred milliseconds, the network topology is preserved.</p>
<p>We show that short-term synaptic plasticity enables neural networks emulated on a neuromorphic hardware system to reliably adjust their activity to a moderate level. The achievement of such a <italic>self-regulation on the network level</italic> is an important step toward the establishment of neuromorphic hardware as a valuable scientific modeling tool as well as its application as a novel type of adaptive and highly parallel computing device.</p>
<p>For this purpose, we examine a generic network architecture as proposed and studied by Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>), which was proven to feature self-adjustment capabilities. As such networks only consist of randomly connected excitatory and inhibitory neurons and exhibit little specialized structures, they can be found in various cortical network models. In other words, properties of this architecture are likely to be valid in a variety of experiments.</p>
<p>Still, the results of Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>) do not necessarily hold for neuromorphic hardware devices: The referred work addressed networks of 5000 neurons. As the employed prototype hardware system (Schemmel et al., <xref ref-type="bibr" rid="B39">2006</xref>, <xref ref-type="bibr" rid="B37">2007</xref>) only supports some hundred neurons, it remained unclear whether the architecture is suitable for smaller networks, too. Furthermore, the applicability to the specific inhomogeneities of the hardware substrate had not been investigated before. We show that even small networks are capable of leveling their activity. This suggests that the studied architecture can enhance the usability of upcoming neuromorphic hardware systems, which will comprise millions of synapses.</p>
<p>The successful implementation of short-term synaptic plasticity in neuromorphic hardware has been achieved by several work groups, see, e.g., Boegershausen et al. (<xref ref-type="bibr" rid="B7">2003</xref>) or Bartolozzi and Indiveri (<xref ref-type="bibr" rid="B3">2007</xref>). Nevertheless, this work presents the first functional application of this feature within emulated networks. It is noteworthy that the biological interpretation of the used hardware parameters is in accord with physiological data as measured by Markram et al. (<xref ref-type="bibr" rid="B28">1998</xref>) and Gupta et al. (<xref ref-type="bibr" rid="B20">2000</xref>).</p>
<p>Since the utilized system is in a prototype state of development, the emulations have been prepared and counter-checked using the well-established software simulator Parallel neural Circuit SIMulator (PCSIM; Pecevski et al., <xref ref-type="bibr" rid="B35">2009</xref>). In addition, this tool permitted a detailed analysis of the network dynamics, because the internal states of all neurons and synapses can be accessed and monitored continuously.</p>
</sec>
<sec sec-type="materials|methods">
<title>Materials and Methods</title>
<p>The applied setup and workflow involve an iterative process using two complementary simulation back-ends: Within the FACETS research project (FACETS, <xref ref-type="bibr" rid="B16">2009</xref>), the FACETS Stage 1 Hardware system (Schemmel et al., <xref ref-type="bibr" rid="B39">2006</xref>, <xref ref-type="bibr" rid="B37">2007</xref>) and the software simulator PCSIM (Pecevski et al., <xref ref-type="bibr" rid="B35">2009</xref>) are being developed.</p>
<p>First, it had to be investigated whether the employed network architecture exhibits its self-adjustment ability in small networks fitting onto the current prototype hardware system. For this purpose, simulations have been set up on PCSIM which only roughly respected details of the hardware characteristics, but comprised a sufficiently small number of neurons and synapses. Since the trial yielded promising results, the simulations were transferred to the FACETS Hardware. At this stage the setup had to be readjusted in order to meet all properties and limitations of the hardware substrate. Finally, the parameters used during the hardware emulations were transferred back to PCSIM in order to verify the results.</p>
<p>In Sections <xref ref-type="sec" rid="S1">&#x0201C;The Utilized Hardware System&#x0201D;</xref> and <xref ref-type="sec" rid="S4">&#x0201C;The Parallel neural Circuit SIMulator&#x0201D;</xref> both back-ends are briefly described. Section <xref ref-type="sec" rid="S5">&#x0201C;Network Configuration&#x0201D;</xref> addresses the examined network architecture and the parameters applied. In Section <xref ref-type="sec" rid="S8">&#x0201C;Measurement&#x0201D;</xref>, the experimental setup for both back-ends is presented.</p>
<sec id="S1">
<title>The utilized hardware system</title>
<p>The present prototype FACETS Stage 1 Hardware system physically implements neuron and synapse models using analog circuitry (Schemmel et al., <xref ref-type="bibr" rid="B39">2006</xref>, <xref ref-type="bibr" rid="B37">2007</xref>). Besides the <italic>analog neural network core</italic> (the so-called <italic>Spikey</italic> chip) it consists of different (mostly digital) components that provide communication and power supply as well as a multi-layer software framework for configuration and readout (Gr&#x000FC;bl, <xref ref-type="bibr" rid="B19">2007</xref>; Br&#x000FC;derle et al., <xref ref-type="bibr" rid="B10">2009</xref>).</p>
<p>The Spikey chip is built using a standard 180&#x02009;nm CMOS process on a 25-mm<sup>2</sup> die. Each chip holds 384 conductance-based leaky integrate-and-fire point neurons, which can be interconnected or externally stimulated via approximately 100,000 synapses whose conductance courses rise and decay exponentially in time. As all physical units inherently evolve both in parallel and time-continuously, experiments performed on the hardware are commonly referred to as <italic>emulations</italic>. The dimensioning of the utilized electronic components allows a highly accelerated operation compared to the biological archetype. Throughout this work, emulations were executed with a speedup factor of 10<sup>5</sup>.</p>
<p>In order to identify voltages, currents and the time flow in the chip as parameters of the neuron model, all values need to be translated between the hardware domain and the biological domain. The configuration and readout of the system has been designed for an intuitive, biological description of experimental setups: The Python-based (Rossum, <xref ref-type="bibr" rid="B36">2000</xref>) meta-language <italic>PyNN</italic> (Davison et al., <xref ref-type="bibr" rid="B13">2008</xref>) provides a back-end independent modeling tool, for which a hardware-specific implementation is available (Br&#x000FC;derle et al., <xref ref-type="bibr" rid="B10">2009</xref>). All hardware-specific configuration and data structures (including calibration and parameter mapping), which are encapsulated within low-level machine-oriented software structures, are addressed automatically via a <italic>Python Hardware Abstraction Layer</italic> (PyHAL).</p>
<p>Owing to this translation between biological values and hardware dimensions, which is performed by the PyHAL, all values given throughout this work are stated in the biological interpretation domain.</p>
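<p>As an illustration, the domain translation can be sketched as follows. This is a minimal sketch assuming a plain linear scaling; the actual PyHAL mapping relies on per-unit calibration data, and the voltage gain and offset below are placeholders, not measured Spikey values.</p>

```python
# Sketch of the biological <-> hardware domain translation performed by
# PyHAL (illustrative only; the real layer applies calibration data).

SPEEDUP = 1e5  # hardware evolves 10^5 times faster than biological real time


def bio_to_hw_time(t_bio):
    """Biological seconds -> hardware seconds."""
    return t_bio / SPEEDUP


def hw_to_bio_time(t_hw):
    """Hardware seconds -> biological seconds."""
    return t_hw * SPEEDUP


# Voltage mapping assumed affine; gain and offset are hypothetical values.
V_GAIN, V_OFFSET = 10.0, 1.2


def bio_to_hw_voltage(v_bio):
    return V_GAIN * v_bio + V_OFFSET
```

With a speedup factor of 10<sup>5</sup>, one biological second corresponds to 10 microseconds of hardware time.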
<sec id="S2">
<title>Short-term synaptic plasticity</title>
<p>All synapses of the FACETS Stage 1 Hardware support two types of synaptic plasticity (Schemmel et al., <xref ref-type="bibr" rid="B37">2007</xref>). While a spike-timing dependent plasticity (STDP) mechanism (Bi and Poo, <xref ref-type="bibr" rid="B5">1997</xref>; Song et al., <xref ref-type="bibr" rid="B41">2000</xref>) is implemented in every synapse, short-term plasticity (STP) only depends on the spiking behavior of the pre-synaptic neuron. The corresponding circuitry is part of the so-called <italic>synapse drivers</italic> and, thus, STP-parameters are shared by all synaptic connections operated by the same driver. Each pre-synaptic neuron can project its action potentials (APs) to two different synapse drivers. Hence, two freely programmable STP-configurations are available per pre-synaptic neuron. The STP mechanism implemented in the FACETS Stage 1 Hardware is inspired by Markram et al. (<xref ref-type="bibr" rid="B28">1998</xref>). But while the latter model combines synaptic <italic>facilitation</italic> and <italic>depression</italic>, the hardware provides the two modes separately. Each synapse driver can either be run in facilitation or in depression mode or simply emulate static synapses without short-term dynamics. Despite this restriction, these short-term synapse dynamics support dynamic gain-control mechanisms as, e.g., reported in Abbott et al. (<xref ref-type="bibr" rid="B1">1997</xref>).</p>
<p>In the Spikey chip, the conductance <italic>g</italic>(<italic>t</italic>) of a synapse is composed of a discrete synaptic weight multiplier <italic>w<sub>n</sub></italic>, the base efficacy <italic>w</italic><sub>0</sub>(<italic>t</italic>) of a synapse driver and the conductance course of the rising and falling edge <italic>p</italic>(<italic>t</italic>):</p>
<p><italic>g</italic>(<italic>t</italic>)&#x02009;&#x0003D;&#x02009;<italic>w<sub>n</sub></italic>&#x00B7;<italic>w</italic><sub>0</sub>(<italic>t</italic>)&#x00B7;<italic>p</italic>(<italic>t</italic>)&#x02009;&#x0003D;:&#x02009;<italic>w</italic>(<italic>t</italic>)&#x00B7;<italic>p</italic>(<italic>t</italic>)</p>
<p>with <italic>w<sub>n</sub></italic>&#x02009;&#x02208;&#x02009;{0,1,2,&#x02026;,15}. In this framework, STP alters the base efficacy <italic>w</italic><sub>0</sub>(<italic>t</italic>) while the double-exponential conductance course of a single post-synaptic potential is modeled via <italic>p</italic>(<italic>t</italic>)&#x02009;&#x02208;&#x02009;[0,1]. Whenever an AP is provoked by the pre-synaptic neuron, <italic>p</italic>(<italic>t</italic>) is triggered to run the conductance course. To simplify matters, the product <italic>w<sub>n</sub></italic>&#x00B7;<italic>w</italic><sub>0</sub>(<italic>t</italic>) is often combined into the synaptic weight <italic>w</italic>(<italic>t</italic>), or just <italic>w</italic> in case of static synapses.</p>
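<p>The composition of <italic>g</italic>(<italic>t</italic>) can be sketched as follows, with <italic>p</italic>(<italic>t</italic>) modeled as a normalized difference of exponentials; the rise and fall time constants are illustrative choices, not measured chip values.</p>

```python
import math


def p(t, tau_rise=0.5e-3, tau_fall=30e-3):
    """Normalized double-exponential conductance course p(t) in [0, 1].

    Peak value is normalized to 1; time constants are illustrative.
    """
    if t <= 0.0:
        return 0.0
    # time of the peak of a difference of exponentials
    t_peak = (tau_rise * tau_fall / (tau_fall - tau_rise)) * math.log(tau_fall / tau_rise)
    norm = math.exp(-t_peak / tau_fall) - math.exp(-t_peak / tau_rise)
    return (math.exp(-t / tau_fall) - math.exp(-t / tau_rise)) / norm


def g(t, w_n, w0, t_spike=0.0):
    """Synaptic conductance g(t) = w_n * w0 * p(t - t_spike)."""
    assert w_n in range(16)  # discrete 4-bit weight multiplier
    return w_n * w0 * p(t - t_spike)
```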
<p>Both STP-modes, facilitation and depression, alter the synaptic weight in a similar manner using an <italic>active partition</italic> <italic>I</italic>(<italic>t</italic>)&#x02009;&#x02208;&#x02009;[0,1]. The strength <italic>w</italic><sub>stat</sub> of a static synapse is changed to</p>
<disp-formula id="E1"><label>(1)</label><mml:math id="m1"><mml:mtable columnalign='left'><mml:mtr><mml:mtd><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mtext>fac</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>=</mml:mo><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mtext>stat</mml:mtext></mml:mrow></mml:msub><mml:mo>&#x022C5;</mml:mo><mml:mrow><mml:mo>[</mml:mo> <mml:mrow><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mo>&#x003BB;</mml:mo><mml:mo>&#x022C5;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>I</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x02212;</mml:mo><mml:mo>&#x003B2;</mml:mo></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow> <mml:mo>]</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mtext>dep</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>=</mml:mo><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mtext>stat</mml:mtext></mml:mrow></mml:msub><mml:mo>&#x022C5;</mml:mo><mml:mrow><mml:mo>[</mml:mo> <mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x02212;</mml:mo><mml:mo>&#x003BB;</mml:mo><mml:mo>&#x022C5;</mml:mo><mml:mi>I</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow> <mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>in case of facilitation and depression, respectively. The parameters &#x003BB; and &#x003B2; are freely configurable. For technical reasons, the change of synaptic weights by STP cannot be larger than the underlying static weight. Stronger modifications are truncated. Hence, 0&#x02009;&#x02264;&#x02009;<italic>w</italic><sub>fac/dep</sub>&#x02009;&#x02264;&#x02009;2&#x00B7;<italic>w</italic><sub>stat</sub>.</p>
<p>The active partition <italic>I</italic> obeys the following dynamics: Without any activity <italic>I</italic> decays exponentially with time constant &#x003C4;<sub>STP</sub>, while every AP processed increases <italic>I</italic> by a fixed fraction <italic>C</italic> toward the maximum,</p>
<disp-formula id="E2"><mml:math id="m2"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:mi>I</mml:mi></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>&#x02212;</mml:mo><mml:mfrac><mml:mi>I</mml:mi><mml:mrow><mml:msub><mml:mo>&#x003C4;</mml:mo><mml:mrow><mml:mtext>STP</mml:mtext></mml:mrow></mml:msub></mml:mrow></mml:mfrac><mml:mo>+</mml:mo><mml:mi>C</mml:mi><mml:mo>&#x022C5;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x02212;</mml:mo><mml:mi>I</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x022C5;</mml:mo><mml:mo>&#x003B4;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:mrow></mml:math></disp-formula>
<p>For <italic>C</italic>&#x02009;&#x02208;&#x02009;[0,1], <italic>I</italic> is restricted to the interval mentioned above. Since the active partition affects the analog value <italic>w</italic><sub>0</sub>(<italic>t</italic>), the STP-mechanism is not subject to the weight-discretization <italic>w<sub>n</sub></italic> of the synapse arrays but alters weights continuously.</p>
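<p>The interplay of Eq. <xref ref-type="disp-formula" rid="E1">1</xref> with the dynamics of the active partition can be sketched in a few lines. All parameter values are free, and the assumption that a spike is weighted before <italic>I</italic> is incremented is an illustrative choice.</p>

```python
import math


def simulate_stp(spike_times, w_stat, mode, lam, beta, C, tau_stp):
    """Return the effective weight seen by each pre-synaptic spike.

    Between spikes the active partition I decays exponentially with
    tau_stp; each AP increments I by the fraction C toward the maximum.
    Weights are truncated to [0, 2 * w_stat] as on the hardware.
    """
    I, t_last = 0.0, None
    weights = []
    for t in spike_times:
        if t_last is not None:
            I *= math.exp(-(t - t_last) / tau_stp)  # decay since last AP
        if mode == "facilitation":
            w = w_stat * (1.0 + lam * (I - beta))
        elif mode == "depression":
            w = w_stat * (1.0 - lam * I)
        else:  # static synapse
            w = w_stat
        w = min(max(w, 0.0), 2.0 * w_stat)  # hardware truncation
        weights.append(w)
        I += C * (1.0 - I)  # delta-pulse increment toward 1
        t_last = t
    return weights
```

Driving such a model with a regular spike train reproduces the qualitative behavior of Figure 1: growing weights in facilitation mode, shrinking weights in depression mode, and a constant weight for static synapses.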
<p>Figure <xref ref-type="fig" rid="F1">1</xref> shows examples of the dynamics of the three STP-modes as measured on the FACETS Stage 1 Hardware. The applied parameters agree with those of the emulations presented throughout this work.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>Short-term plasticity-mechanism of the FACETS Stage 1 Hardware</bold>. A neuron is excited by an input neuron that spikes regularly at 20&#x02009;Hz. Three hundred milliseconds after the last regular spike a single spike is appended. Additionally, the neuron is stimulated with Poisson spike trains from further input neurons. The figure shows the membrane potential of the post-synaptic neuron, averaged over 500 experiment runs. As the Poisson background cancels out, the EPSPs provoked by the observed synapse are revealed. Time and voltage are given in both hardware values and their biological interpretation. The three traces represent different modes of the involved synapse driver. <italic>Facilitation</italic>: The plastic synapse grows in strength with every AP processed. After 300&#x02009;ms without activity the active partition has partly decayed. <italic>Depression</italic>: High activity weakens the synapse. <italic>Static</italic>: The synapse keeps its weight fixed.</p></caption>
<graphic xlink:href="fncom-04-00129-g001.tif"/>
</fig>
</sec>
<sec id="S3">
<title>Hardware constraints</title>
<p>Neurons and synapses are represented by physical entities in the chip. As similar units reveal slightly different properties due to the production process, each unit exhibits an individual discrepancy between the desired configuration and its actual behavior. Since all parameters are controlled by voltages and currents, which require additional circuitry within the limited die, many parameters and sub-circuits are shared by multiple units. This results in narrowed parameter ranges and limitations on the network topology.</p>
<p>Beyond these intentional design-inherent fluctuations and restrictions, the current prototype system suffers from some malfunctions of different severity. These errors are mostly understood and will be fixed in future systems. In the following, the constraints which are relevant for the applied setup will be outlined. For detailed information the reader may refer to the respective literature given below.</p>
<p><bold>Design-inherent constraints</bold></p>
<list list-type="bullet">
<list-item><p>As described above, synaptic weights are discrete values <italic>w</italic>&#x02009;&#x0003D;&#x02009;<italic>w<sub>n</sub></italic>&#x00B7;<italic>w</italic><sub>0</sub> with <italic>w<sub>n</sub></italic>&#x02009;&#x02208;&#x02009;{0,1,2,&#x02026;,15} (Schemmel et al., <xref ref-type="bibr" rid="B39">2006</xref>). Since biological weights are continuous values, they are mapped probabilistically to the two closest discrete hardware weights. Therefore, this constraint is assumed to have little impact on large, randomly connected networks.</p></list-item>
<list-item><p>Each pre-synaptic neuron allocates two synapse drivers to provide both facilitating and depressing synapses. Since only 384 synapse drivers are available for the operation of recurrent connections, this restricts the maximum network size to 384/2&#x02009;&#x0003D;&#x02009;192 neurons. After establishing the recurrent connections, only 64 independent input channels remain for excitatory and inhibitory external stimulation via Poisson spike trains (see Bill, <xref ref-type="bibr" rid="B6">2008</xref>, Chapter VI.3).</p></list-item>
<list-item><p>Bottlenecks of the communication interface limit the maximum input bandwidth to approximately 12&#x02009;Hz per channel when all 64 channels are used for external stimulation with Poisson spike trains. Future revisions are planned to run at a speedup factor of 10<sup>4</sup> instead of 10<sup>5</sup>, effectively increasing the input bandwidth by a factor of 10 from the biological point of view (see Gr&#x000FC;bl, <xref ref-type="bibr" rid="B19">2007</xref>, Chapter 3.2.1; Br&#x000FC;derle, <xref ref-type="bibr" rid="B9">2009</xref>, Chapter 4.3.7).</p></list-item>
</list>
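<p>The probabilistic weight mapping mentioned in the first item above can be sketched as stochastic rounding between the two nearest discrete multipliers, so that the expected hardware weight equals the requested continuous one. This is an illustration under that assumption, not the actual PyHAL mapping code.</p>

```python
import random


def discretize_weight(w_bio, w0, rng=random):
    """Map a continuous weight w_bio onto the 4-bit multiplier w_n.

    The requested multiplier w_bio / w0 is rounded stochastically to one
    of the two closest values in {0, ..., 15}, with probabilities chosen
    so that the expected value equals the (clipped) request.
    """
    x = w_bio / w0               # requested multiplier, continuous
    x = min(max(x, 0.0), 15.0)   # clip to the representable range
    lo = int(x)
    frac = x - lo                # probability of rounding up
    return lo + (1 if rng.random() < frac else 0)
```

Averaged over many synapses of a large, randomly connected network, the rounding error thus cancels out, which is why this constraint is assumed to have little impact.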
<p><bold>Malfunctions</bold></p>
<list list-type="bullet">
<list-item><p>The efficacy of excitatory synapses was found to be unstable. A frequent global activity of excitatory synapses has been shown to decrease EPSP amplitudes by up to a factor of two. Presumably, this effect depends on both the configuration of the chip and the overall spike activity. We refer to this malfunction as <italic>load-dependency of the synaptic efficacy</italic> in the following. Since the error cannot be counterbalanced by calibration or by tuning the configuration, it is considered crucial for the presented experimental setup (see Br&#x000FC;derle, <xref ref-type="bibr" rid="B9">2009</xref>, Chapter 4.3.4).</p></list-item>
<list-item><p>The current system suffers from a disproportionality between the falling-edge synaptic time constant &#x003C4;<sub>syn</sub>&#x02009;&#x02248;&#x02009;30&#x02009;ms and the membrane time constant &#x003C4;<sub>mem</sub>&#x02009;&#x02248;&#x02009;5&#x02009;ms, i.e., a fast membrane and slow synapses. This was taken into consideration when applying external stimulation, as presented in Section <xref ref-type="sec" rid="S7">&#x0201C;Applied Parameters&#x0201D;</xref> (see Br&#x000FC;derle, <xref ref-type="bibr" rid="B9">2009</xref>, Chapter 4.3.5; Kaplan et al., <xref ref-type="bibr" rid="B24">2009</xref>).</p></list-item>
<list-item><p>Insufficient precision of the neuron threshold comparator along with a limited reset conductance result in a rather wide spread of the neuron threshold and reset voltages <italic>V</italic><sub>thresh</sub> and <italic>V</italic><sub>reset</sub>. As both values are shared by multiple neurons, this effect can only be partially counterbalanced by calibration. The used calibration algorithms lead to <inline-formula><mml:math id="m3"><mml:mrow><mml:msub><mml:mo>&#x003C3;</mml:mo><mml:mrow><mml:msub><mml:mi>V</mml:mi><mml:mrow><mml:mtext>thresh</mml:mtext></mml:mrow></mml:msub></mml:mrow></mml:msub><mml:mo>&#x02248;</mml:mo><mml:mn>3</mml:mn><mml:mtext>&#x02009;mV</mml:mtext></mml:mrow></mml:math></inline-formula> and <inline-formula><mml:math id="m4"><mml:mrow><mml:msub><mml:mo>&#x003C3;</mml:mo><mml:mrow><mml:msub><mml:mi>V</mml:mi><mml:mrow><mml:mtext>reset</mml:mtext></mml:mrow></mml:msub></mml:mrow></mml:msub><mml:mo>&#x02248;</mml:mo><mml:mn>8</mml:mn><mml:mtext>&#x02009;mV</mml:mtext></mml:mrow></mml:math></inline-formula> (see Bill, <xref ref-type="bibr" rid="B6">2008</xref>, Chapter IV.4; Br&#x000FC;derle, <xref ref-type="bibr" rid="B9">2009</xref>, Chapter 4.3.2).</p></list-item>
<list-item><p>Insufficient dynamic ranges of control currents impede a reasonable configuration of the STP parameters &#x003BB; and &#x003B2; in Eq. <xref ref-type="disp-formula" rid="E1">1</xref> without additional technical effort. The presented emulations make use of a workaround which allows a biologically realistic setup of the STP-parameters at the expense of further adjustability. The achieved configuration has been measured and is used throughout the software simulations, as well (see Bill, <xref ref-type="bibr" rid="B6">2008</xref>, Chapter IV.5.4).</p></list-item>
<list-item><p>An error in the spike event readout circuitry prevents a simultaneous recording of the entire network. Since only three neurons of the studied network architecture can be recorded per emulation cycle, every configuration was rerun 192/3&#x02009;&#x0003D;&#x02009;64 times with different neurons recorded. Thus, all neurons have been taken into consideration in order to determine average firing rates. But since the data is obtained in different cycles, it is unclear to what extent network correlation and firing dynamics on a level of precise spike timing can be determined (see M&#x000FC;ller, <xref ref-type="bibr" rid="B34">2008</xref>, Chapter 4.2.2).</p></list-item>
</list>
<sec><title>A remark on parameter precision</title>
<p>The majority of the parameter values used in the implemented neuron model are generated by complex interactions of hardware units such as transistors and capacitors. Each type of circuitry suffers from different variations due to the production process, and these fluctuations accumulate into intricate deviations of the final parameters. For that reason, both the shape and the extent of the variances often cannot be calculated in advance. On the other hand, only a few parameters of the neuron and synapse model can be observed directly. Exceptions are all kinds of voltages, e.g., the membrane voltage or reversal potentials. Knowledge of all other parameters was obtained from indirect measurements by evaluating spike events and membrane voltage traces. The configuration given in Section <xref ref-type="sec" rid="S7">&#x0201C;Applied Parameters&#x0201D;</xref> reflects the current state of knowledge. This means that some specifications &#x02013; especially standard deviations of parameters &#x02013; are estimates based on long-term experience with the device. Compared to the above-described malfunctions of the prototype system, however, distortions arising from uncertainties in the configuration can be expected to be of minor importance.</p>
</sec>
</sec>
</sec>
<sec id="S4">
<title>The parallel neural circuit simulator</title>
<p>All simulations were performed using the PCSIM simulation environment and were set up and controlled via the associated Python interface (Pecevski et al., <xref ref-type="bibr" rid="B35">2009</xref>).</p>
<p>The neurons were modeled as leaky integrate-and-fire cells (LIF) with conductance-based synapses. The dynamics of the membrane voltage <italic>V</italic>(<italic>t</italic>) is defined by</p>
<disp-formula id="E3"><mml:math id="m5"><mml:mrow><mml:mtable columnalign='left'><mml:mtr columnalign='left'><mml:mtd columnalign='left'><mml:mrow><mml:msub><mml:mi>C</mml:mi><mml:mtext>m</mml:mtext></mml:msub><mml:mfrac><mml:mrow><mml:mtext>d</mml:mtext><mml:mi>V</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mrow><mml:mtext>d</mml:mtext><mml:mi>t</mml:mi></mml:mrow></mml:mfrac></mml:mrow></mml:mtd><mml:mtd columnalign='left'><mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mtext>leak</mml:mtext></mml:mrow></mml:msub><mml:mo>&#x022C5;</mml:mo><mml:mo stretchy='false'>(</mml:mo><mml:mi>V</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>V</mml:mi><mml:mrow><mml:mtext>rest</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr columnalign='left'><mml:mtd columnalign='left'><mml:mrow></mml:mrow></mml:mtd><mml:mtd columnalign='left'><mml:mrow><mml:mo>&#x02212;</mml:mo><mml:mstyle displaystyle='true'><mml:munderover><mml:mo>&#x02211;</mml:mo><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:msub><mml:mi>N</mml:mi><mml:mtext>e</mml:mtext></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mtext>e</mml:mtext><mml:mo>,</mml:mo><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x022C5;</mml:mo><mml:mo stretchy='false'>(</mml:mo><mml:mi>V</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>E</mml:mi><mml:mtext>e</mml:mtext></mml:msub><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:mstyle></mml:mrow></mml:mtd></mml:mtr><mml:mtr columnalign='left'><mml:mtd 
columnalign='left'><mml:mrow></mml:mrow></mml:mtd><mml:mtd columnalign='left'><mml:mrow><mml:mo>&#x02212;</mml:mo><mml:mstyle displaystyle='true'><mml:munderover><mml:mo>&#x02211;</mml:mo><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:msub><mml:mi>N</mml:mi><mml:mtext>i</mml:mtext></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mtext>i</mml:mtext><mml:mo>,</mml:mo><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x022C5;</mml:mo><mml:mo stretchy='false'>(</mml:mo><mml:mi>V</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>E</mml:mi><mml:mtext>i</mml:mtext></mml:msub><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:mstyle></mml:mrow></mml:mtd></mml:mtr><mml:mtr columnalign='left'><mml:mtd columnalign='left'><mml:mrow></mml:mrow></mml:mtd><mml:mtd columnalign='left'><mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mtext>noise</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>,</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow></mml:math></disp-formula>
<p>where <italic>C</italic><sub>m</sub> is the membrane capacitance, <italic>g</italic><sub>leak</sub> is the leakage conductance, <italic>V</italic><sub>rest</sub> is the leakage reversal potential, and <italic>g</italic><sub>e,<italic>k</italic></sub>(<italic>t</italic>) and <italic>g</italic><sub>i,<italic>k</italic></sub>(<italic>t</italic>) are the synaptic conductances of the <italic>N</italic><sub>e</sub> excitatory and <italic>N</italic><sub>i</sub> inhibitory synapses with reversal potentials <italic>E</italic><sub>e</sub> and <italic>E</italic><sub>i</sub>, respectively. The white noise current <italic>I</italic><sub>noise</sub>(<italic>t</italic>) has zero mean and a standard deviation &#x003C3;<sub>noise</sub>&#x02009;&#x0003D;&#x02009;5&#x02009;pA. It models analog noise of the hardware circuits.</p>
<p>The dynamics of the conductance <italic>g</italic>(<italic>t</italic>) of a synapse is defined by</p>
<disp-formula id="E4"><mml:math id="m6"><mml:mrow><mml:mfrac><mml:mrow><mml:mtext>d</mml:mtext><mml:mi>g</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mrow><mml:mtext>d</mml:mtext><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>&#x02212;</mml:mo><mml:mfrac><mml:mrow><mml:mi>g</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mrow><mml:msub><mml:mo>&#x003C4;</mml:mo><mml:mrow><mml:mtext>syn</mml:mtext></mml:mrow></mml:msub></mml:mrow></mml:mfrac><mml:mo>+</mml:mo><mml:mi>w</mml:mi><mml:mo>&#x022C5;</mml:mo><mml:mo>&#x003B4;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mtext>AP</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:mrow></mml:math></disp-formula>
<p>where <italic>g</italic>(<italic>t</italic>) is the synaptic conductance and <italic>w</italic> is the synaptic weight. The conductances decrease exponentially with time constant &#x003C4;<sub>syn</sub> and increase instantaneously by adding <italic>w</italic> to the running value of <italic>g</italic>(<italic>t</italic>) whenever an AP occurs in the pre-synaptic neuron at time <italic>t</italic><sub>AP</sub>. Modeling the exponentially rising edge of the conductance time course of the FACETS Stage 1 Hardware synapses was considered negligible, as the respective time constant was set to an extremely small value for the hardware emulation.</p>
<p>For simulations with static synapses, the weight <italic>w</italic> of a synapse was constant over time, whereas for simulations with dynamic synapses, the weight <italic>w</italic>(<italic>t</italic>) of each synapse was modified according to the short-term synaptic plasticity rules described in Section <xref ref-type="sec" rid="S2">&#x0201C;Short-Term Synaptic Plasticity&#x0201D;</xref>.</p>
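<p>As an illustration of the dynamics defined above, the membrane and synapse equations can be integrated with a minimal forward-Euler scheme. This is a sketch, not the PCSIM implementation; the 20&#x02009;Hz Poisson inputs and the weights <monospace>w_e</monospace>, <monospace>w_i</monospace> are assumed for demonstration, while the remaining parameters are mean values from Table 1.</p>

```python
import numpy as np

# Forward-Euler sketch of the LIF and synapse equations above.
# Mean parameters from Table 1 (nF, nS, mV, ms); inputs are assumed for demo.
C_m, g_leak = 0.2, 40.0
V_rest, V_thresh, V_reset = -60.0, -55.0, -80.0
E_e, E_i, tau_syn = 0.0, -80.0, 30.0
w_e, w_i = 1.0, 3.0            # illustrative synaptic weights (nS)
rate = 20.0 / 1000.0           # 20 Hz Poisson input, in spikes per ms

dt, steps = 0.1, 2000          # 200 ms of simulated time
rng = np.random.default_rng(0)
V, g_e, g_i = V_rest, 0.0, 0.0
spikes = []
for step in range(steps):
    # instantaneous conductance jump by w on each pre-synaptic AP
    if rng.random() > 1.0 - rate * dt:
        g_e += w_e
    if rng.random() > 1.0 - rate * dt:
        g_i += w_i
    # exponential decay with time constant tau_syn
    g_e -= dt * g_e / tau_syn
    g_i -= dt * g_i / tau_syn
    # membrane equation (I_noise omitted); /(C_m*1e3) converts nS*mV/nF to mV/ms
    dV = -(g_leak * (V - V_rest) + g_e * (V - E_e) + g_i * (V - E_i)) / (C_m * 1e3)
    V += dt * dV
    if V > V_thresh:           # threshold crossing: record spike, reset
        spikes.append(step * dt)
        V = V_reset
```

<p>With these values, <monospace>C_m/g_leak</monospace> yields a resting membrane time constant of 5&#x02009;ms, matching the hardware value quoted in the text.</p>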
<p>The values of all parameters were drawn from random distributions as specified in Table <xref ref-type="table" rid="T1">1</xref>.</p>
<table-wrap position="float" id="T1">
<label>Table 1.</label>
<caption><p><bold>Full set of parameters</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Description</th>
<th align="left">Name</th>
<th align="left">Unit</th>
<th align="left">Mean &#x003BC;</th>
<th align="left">&#x003C3;/&#x003BC;</th>
<th align="left">&#x003C0;/&#x003BC;</th>
<th align="left">Comment</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="7"><bold>NETWORK ARCHITECTURE</bold></td>
</tr>
<tr>
<td align="left">Number of exc neurons</td>
<td align="left"><italic>N</italic><sub>e</sub></td>
<td align="left"/>
<td align="left">144</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Number of inh neurons</td>
<td align="left"><italic>N</italic><sub>i</sub></td>
<td align="left"/>
<td align="left">48</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Conn prob from exc to exc neurons</td>
<td align="left"><italic>p</italic><sub>ee</sub></td>
<td align="left"/>
<td align="left">0.1</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Conn prob from exc to inh neurons</td>
<td align="left"><italic>p</italic><sub>ie</sub></td>
<td align="left"/>
<td align="left">0.2</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Conn prob from inh to exc neurons</td>
<td align="left"><italic>p</italic><sub>ei</sub></td>
<td align="left"/>
<td align="left">0.3</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Conn prob from inh to inh neurons</td>
<td align="left"><italic>p</italic><sub>ii</sub></td>
<td align="left"/>
<td align="left">0.6</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left" colspan="7"><bold>NEURONS (EXCITATORY AND INHIBITORY)</bold></td>
</tr>
<tr>
<td align="left">Membrane capacitance</td>
<td align="left"><italic>C</italic><sub>m</sub></td>
<td align="left">nF</td>
<td align="left">0.2</td>
<td align="left">0</td>
<td align="left">0</td>
<td align="left">by definition</td>
</tr>
<tr>
<td align="left">Leakage reversal potential</td>
<td align="left"><italic>V</italic><sub>rest</sub></td>
<td align="left">mV</td>
<td align="left">&#x02212;63,&#x02026;,&#x02212;55</td>
<td align="left"/>
<td align="left"/>
<td align="left">variable parameter</td>
</tr>
<tr>
<td align="left">Firing threshold voltage</td>
<td align="left"><italic>V</italic><sub>thresh</sub></td>
<td align="left">mV</td>
<td align="left">&#x02212;55.0</td>
<td align="left">0.05</td>
<td align="left">0.1</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Reset potential</td>
<td align="left"><italic>V</italic><sub>reset</sub></td>
<td align="left">mV</td>
<td align="left">&#x02212;80.0</td>
<td align="left">0.1</td>
<td align="left">0.2</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Excitatory reversal potential</td>
<td align="left"><italic>E</italic><sub>e</sub></td>
<td align="left">mV</td>
<td align="left">0.0</td>
<td align="left">0</td>
<td align="left">0</td>
<td align="left">&#x02212;20&#x02009;mV in some simulations</td>
</tr>
<tr>
<td align="left">Inhibitory reversal potential</td>
<td align="left"><italic>E</italic><sub>i</sub></td>
<td align="left">mV</td>
<td align="left">&#x02212;80.0</td>
<td align="left">0</td>
<td align="left">0</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Leakage conductance</td>
<td align="left"><italic>g</italic><sub>leak</sub></td>
<td align="left">nS</td>
<td align="left">40.0</td>
<td align="left">0.5</td>
<td align="left">0.5</td>
<td align="left">&#x0002A;)</td>
</tr>
<tr>
<td align="left">Refractory period</td>
<td align="left">&#x003C4;<sub>ref</sub></td>
<td align="left">ms</td>
<td align="left">1.0</td>
<td align="left">0.5</td>
<td align="left">0.5</td>
<td align="left"/>
</tr>
<tr>
<td align="left" colspan="7"><bold>RECURRENT SYNAPSES</bold></td>
</tr>
<tr>
<td align="left">Weight of exc to exc synapses</td>
<td align="left"><italic>w</italic><sub>ee</sub></td>
<td align="left">nS</td>
<td align="left">1.03</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;) values refer to</td>
</tr>
<tr>
<td align="left">Weight of exc to inh synapses</td>
<td align="left"><italic>w</italic><sub>ie</sub></td>
<td align="left">nS</td>
<td align="left">0.52</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;) static synapses</td>
</tr>
<tr>
<td align="left">Weight of inh to exc synapses</td>
<td align="left"><italic>w</italic><sub>ei</sub></td>
<td align="left">nS</td>
<td align="left">3.10</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;)</td>
</tr>
<tr>
<td align="left">Weight of inh to inh synapses</td>
<td align="left"><italic>w</italic><sub>ii</sub></td>
<td align="left">nS</td>
<td align="left">1.55</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;)</td>
</tr>
<tr>
<td align="left">Cond time constant for all synapses</td>
<td align="left">&#x003C4;<sub>syn</sub></td>
<td align="left">ms</td>
<td align="left">30.0</td>
<td align="left">0.25</td>
<td align="left">0.5</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Conversion factor for facilitation</td>
<td align="left"/>
<td align="left"/>
<td align="left">1.10</td>
<td align="left"/>
<td align="left"/>
<td align="left">to match with static syns</td>
</tr>
<tr>
<td align="left">Conversion factor for depression</td>
<td align="left"/>
<td align="left"/>
<td align="left">1.65</td>
<td align="left"/>
<td align="left"/>
<td align="left">at regular firing of 20&#x02009;Hz</td>
</tr>
<tr>
<td align="left">Strength of STP</td>
<td align="left">&#x003BB;</td>
<td align="left"/>
<td align="left">0.78</td>
<td align="left">0.1</td>
<td align="left">0.2</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Bias for facilitation</td>
<td align="left">&#x003B2;</td>
<td align="left"/>
<td align="left">0.83</td>
<td align="left">0.1</td>
<td align="left">0.2</td>
<td align="left"/>
</tr>
<tr>
<td align="left">STP decay time constant</td>
<td align="left">&#x003C4;<sub>STP</sub></td>
<td align="left">ms</td>
<td align="left">480</td>
<td align="left">0.2</td>
<td align="left">0.4</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Step per spike for facilitation</td>
<td align="left"><italic>C</italic><sub>fac</sub></td>
<td align="left"/>
<td align="left">0.27</td>
<td align="left">0.1</td>
<td align="left">0.2</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Step per spike for depression</td>
<td align="left"><italic>C</italic><sub>dep</sub></td>
<td align="left"/>
<td align="left">0.11</td>
<td align="left">0.1</td>
<td align="left">0.2</td>
<td align="left"/>
</tr>
<tr>
<td align="left" colspan="7"><bold>EXTERNAL STIMULUS: POISSON SPIKE TRAINS</bold></td>
</tr>
<tr>
<td align="left">Number of exc external spike sources</td>
<td align="left"><italic>N</italic><sub>ext,e</sub></td>
<td align="left"/>
<td align="left">32</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Number of inh external spike sources</td>
<td align="left"><italic>N</italic><sub>ext,i</sub></td>
<td align="left"/>
<td align="left">32</td>
<td align="left"/>
<td align="left"/>
<td align="left"/>
</tr>
<tr>
<td align="left">Number of exc inputs per neuron</td>
<td align="left"/>
<td align="left"/>
<td align="left">4&#x02013;6</td>
<td align="left"/>
<td align="left"/>
<td align="left">uniform distribution</td>
</tr>
<tr>
<td align="left">Number of inh inputs per neuron</td>
<td align="left"/>
<td align="left"/>
<td align="left">4&#x02013;6</td>
<td align="left"/>
<td align="left"/>
<td align="left">uniform distribution</td>
</tr>
<tr>
<td align="left">Firing rate per input spike train</td>
<td align="left">&#x003BD;<sub>inp</sub></td>
<td align="left">Hz</td>
<td align="left">11.8</td>
<td align="left">0.2</td>
<td align="left">0.2</td>
<td align="left">&#x0002A;)</td>
</tr>
<tr>
<td align="left">Weight of exc input synapses</td>
<td align="left"><italic>w</italic><sub>inp,e</sub></td>
<td align="left">nS</td>
<td align="left">0.26,&#x02026;,1.29</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;) varied via <italic>W</italic><sub>input</sub> and</td>
</tr>
<tr>
<td align="left">Weight of inh input synapses</td>
<td align="left"><italic>w</italic><sub>inp,i</sub></td>
<td align="left">nS</td>
<td align="left">0.77,&#x02026;,3.87</td>
<td align="left">0.6</td>
<td align="left">0.7</td>
<td align="left">&#x0002A;) refer to <italic>V</italic><sub>rest</sub>&#x02009;&#x0003D;&#x02009;&#x02212;60&#x02009;mV</td>
</tr>
<tr>
<td align="left">Cond time constant for all synapses</td>
<td align="left">&#x003C4;<sub>syn</sub></td>
<td align="left">ms</td>
<td align="left">30.0</td>
<td align="left">0.25</td>
<td align="left">0.5</td>
<td align="left"/>
</tr>
<tr>
<td align="left" colspan="7"><bold>EXPERIMENT</bold></td>
</tr>
<tr>
<td align="left">Simulated time per exp run</td>
<td align="left"><italic>T</italic><sub>exp</sub></td>
<td align="left">ms</td>
<td align="left">4500</td>
<td align="left"/>
<td align="left"/>
<td align="left">only <italic>t</italic>&#x02009;&#x02265;&#x02009;1000&#x02009;ms evaluated</td>
</tr>
<tr>
<td align="left">Number of exp runs per param set</td>
<td align="left"><italic>n</italic><sub>run</sub></td>
<td align="left"/>
<td align="left">20</td>
<td align="left"/>
<td align="left"/>
<td align="left">&#x000D7;64 in hardware with same network</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>All values given in biological units. If not stated otherwise, values are drawn from a bound normal distribution with mean &#x003BC;, standard deviation &#x003C3;, and bound &#x003C0;. Parameters marked by a &#x002A;) have been spread for the hardware emulations by configuration</italic>.</p></table-wrap-foot>
</table-wrap>
</sec>
<sec id="S5">
<title>Network configuration</title>
<p>In the following, the examined network architecture is presented. Rather than customizing the configuration to the employed device, we aimed for a generic, back-end agnostic choice of parameters. Due to hardware limitations in the input bandwidth, a dedicated concept for external stimulation had to be developed.</p>
<sec id="S6">
<title>Network architecture</title>
<p>We applied a network architecture similar to the setup proposed and studied by Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>), which was shown to feature self-adjustment capabilities. A schematic of the architecture is shown in Figure <xref ref-type="fig" rid="F2">2</xref>. It employs the STP mechanism presented above. Two populations of neurons &#x02013; both similarly stimulated externally with Poisson spike trains &#x02013; are randomly connected obeying simple probability distributions (see below). Connections within the populations are depressing, while bridging connections are facilitating. Thus, if excitatory network activity rises, further excitation is reduced while inhibitory activity is facilitated. Conversely, in case of a low average firing rate, the network sustains excitatory activity.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>Schematic of the self-adjusting network architecture proposed in Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>)</bold>. Depressing (<monospace>dep</monospace>) and facilitating (<monospace>fac</monospace>) recurrent synaptic connections level the network activity.</p></caption>
<graphic xlink:href="fncom-04-00129-g002.tif"/>
</fig>
<p>Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>) studied the dynamics of this architecture for sparsely connected networks of 5000 neurons through extensive computer simulations of leaky integrate-and-fire neurons and mean field models. In particular, they examined how the network response depends on the mean value and the variance of a Gaussian distributed current injection. It was shown that such networks are capable of adjusting their activity to a moderate level of approximately 5&#x02013;20&#x02009;Hz over a wide range of stimulus parameters while preserving the ability to respond to changes in the external input.</p>
</sec>
<sec id="S7">
<title>Applied parameters</title>
<p>With respect to the constraints described in Section <xref ref-type="sec" rid="S3">&#x0201C;Hardware Constraints&#x0201D;</xref>, we set up recurrent networks comprising 192 conductance-based leaky integrate-and-fire point neurons, 144 (75%) of which were chosen to be excitatory, 48 (25%) to be inhibitory. Besides feedback from recurrent connections, each neuron was externally stimulated via excitatory and inhibitory Poisson spike sources. The setup of recurrent connections and external stimulation is described in detail below.</p>
<p>All parameters specifying the networks are listed in Table <xref ref-type="table" rid="T1">1</xref>. Most values are modeled by a <italic>bound normal distribution</italic> which is defined by its mean &#x003BC;, its standard deviation &#x003C3; and a bound &#x003C0;: The random value <italic>x</italic> is drawn from a normal distribution <italic>N</italic>(&#x003BC;,&#x003C3;<sup>2</sup>). If <italic>x</italic> exceeds the bounds, it is redrawn from a uniform distribution within the bounds.</p>
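<p>The drawing procedure for the bound normal distribution can be sketched directly (a numpy-based illustration, not the original simulation code; here &#x003C3; and the bound are passed relative to &#x003BC;, matching the &#x003C3;/&#x003BC; and &#x003C0;/&#x003BC; columns of Table 1):</p>

```python
import numpy as np

def bound_normal(mu, sigma_rel, pi_rel, size, seed=0):
    """Draw from N(mu, sigma^2); values exceeding the bound pi are redrawn
    uniformly within [mu - pi, mu + pi]. sigma_rel and pi_rel are given
    relative to mu, as in the sigma/mu and pi/mu columns of Table 1."""
    rng = np.random.default_rng(seed)
    bound = pi_rel * abs(mu)
    x = rng.normal(mu, sigma_rel * abs(mu), size)
    out = np.abs(x - mu) > bound                 # values exceeding the bound
    x[out] = rng.uniform(mu - bound, mu + bound, out.sum())
    return x

# Example: leakage conductance g_leak with mu = 40 nS, sigma/mu = pi/mu = 0.5
g = bound_normal(40.0, 0.5, 0.5, size=1000)
```

<p>All drawn values are thus guaranteed to lie within &#x003BC;&#x02009;&#x000B1;&#x02009;&#x003C0; while retaining an approximately normal shape around the mean.</p>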
<p>In case of hardware emulations, some of the deviations &#x003C3; only reflect chip-inherent variations, i.e., fluctuations that remain when all units are intended to provide equal values. For other parameters &#x02013; namely for all synaptic efficacies <italic>w</italic>, the leakage conductance <italic>g</italic><sub>leak</sub> and the input firing rate &#x003BD;<sub>inp</sub> &#x02013; the major fraction of the deviations &#x003C3; was intentionally applied by the experimenter. If present, the variations of hardware parameters are based on Br&#x000FC;derle et al. (<xref ref-type="bibr" rid="B10">2009</xref>).</p>
<p>In case of software simulations, all inhomogeneities are treated as independent statistical variations. In particular, systematic effects, like the load-dependency of the excitatory synaptic efficacy or the unbalanced sensitivity between the neuron populations (see <xref ref-type="sec" rid="S9">&#x0201C;Self-Adjustment Ability&#x0201D;</xref>), have not been modeled during the first simulation series.</p>
<sec>
<title>Recurrent connections</title>
<p>Any two neurons are synaptically connected with probability <italic>p</italic><sub>post,pre</sub> and weight <italic>w</italic><sub>post,pre</sub>. These values depend only on the populations the pre- and post-synaptic neurons are part of.</p>
<p>Synaptic weights always refer to the strength of static synapses. When a synapse features STP, its weight is multiplicatively adjusted such that the strengths of static and dynamic synapses match at a constant regular pre-synaptic firing rate of 20&#x02009;Hz for <italic>t</italic>&#x02009;&#x02192;&#x02009;&#x0221E;. This adjustment is necessary in order to enable dynamic synapses to be either stronger or weaker than static synapses according to their current activity.</p>
<p>Although the connection probabilities and synaptic weights used for the experiments do not rely on biological measurements or profound theoretical studies, they follow a few simple rules. The mean values of the probability distributions are determined by three principles:</p>
<list list-type="order">
<list-item><p>Every neuron has as many excitatory as inhibitory recurrent input synapses: <italic>p</italic><sub>post,e</sub>&#x00B7;<italic>N</italic><sub>e</sub>&#x02009;&#x0003D;&#x02009;<italic>p</italic><sub>post,i</sub>&#x00B7;<italic>N</italic><sub>i</sub>.</p></list-item>
<list-item><p>Inhibitory neurons receive twice as many recurrent synaptic inputs as excitatory neurons. This enables them to sense the state of the network on a more global scale: <italic>p</italic><sub>i,pre</sub>&#x00B7;<italic>N</italic><sub>pre</sub>&#x02009;&#x0003D;&#x02009;2&#x00B7;<italic>p</italic><sub>e,pre</sub>&#x00B7;<italic>N</italic><sub>pre</sub>.</p></list-item>
<list-item><p>Assuming a uniform global firing rate of 20&#x02009;Hz and an average membrane potential of <italic>V</italic>&#x02009;&#x0003D;&#x02009;&#x02212;60&#x02009;mV, synaptic currents are well-balanced in the following terms:</p></list-item>
</list>
<list list-type="simple">
<list-item><label>(a)</label> <p>For each neuron the excitatory and inhibitory currents have equal strength,</p></list-item>
<list-item><label>(b)</label> <p>each excitatory neuron is exposed to as much synaptic current as each inhibitory neuron.</p></list-item>
</list>
<p>Formally, we examine the average current induced by a population <italic>pre</italic> to a single neuron of the population <italic>post</italic>:</p>
<p><italic>I</italic><sub>post,pre</sub>&#x02009;&#x0221D;&#x02009;<italic>p</italic><sub>post,pre</sub>&#x00B7;<italic>N</italic><sub>pre</sub>&#x00B7;<italic>w</italic><sub>post,pre</sub>&#x00B7;|<italic>E</italic><sub>pre</sub>&#x02009;&#x02212;&#x02009;<italic>V</italic>|.</p>
<p>Principle 3 demands that <italic>I</italic><sub>post,pre</sub> is equal for all tuples (post, pre) under the mentioned conditions. Given the sizes of the populations and the reversal potentials, the Principles 1 and 2 determine all recurrent connection probabilities <italic>p</italic><sub>post,pre</sub> and weights <italic>w</italic><sub>post,pre</sub> except for two global multiplicative parameters: one scaling all recurrent connection probabilities, the other one all recurrent weights. While the ratios of all <italic>p</italic><sub>post,pre</sub> as well as the ratios of the <italic>w</italic><sub>post,pre</sub> are fixed, the scaling factors have been chosen such that the currents induced by recurrent synapses exceed those induced by external inputs in order to highlight the functioning of the applied architecture.</p>
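<p>The three principles can be checked numerically against the mean values of Table 1 (a verification sketch; the proportionality constant common to all currents is ignored, and <italic>V</italic>&#x02009;&#x0003D;&#x02009;&#x02212;60&#x02009;mV is the assumed average membrane potential):</p>

```python
# Mean values from Table 1; dictionary keys are (post, pre)
N = {'e': 144, 'i': 48}
p = {('e', 'e'): 0.1, ('i', 'e'): 0.2, ('e', 'i'): 0.3, ('i', 'i'): 0.6}
w = {('e', 'e'): 1.03, ('i', 'e'): 0.52, ('e', 'i'): 3.10, ('i', 'i'): 1.55}
E = {'e': 0.0, 'i': -80.0}   # reversal potentials (mV)
V = -60.0                    # assumed average membrane potential (mV)

# Principle 1: equally many excitatory and inhibitory recurrent inputs
for post in ('e', 'i'):
    assert round(p[(post, 'e')] * N['e'], 9) == round(p[(post, 'i')] * N['i'], 9)

# Principle 2: inhibitory neurons receive twice as many recurrent inputs
for pre in ('e', 'i'):
    assert p[('i', pre)] == 2 * p[('e', pre)]

# Principle 3: I = p * N_pre * w * abs(E_pre - V) is (nearly) equal for all pairs
I = {k: p[k] * N[k[1]] * w[k] * abs(E[k[1]] - V) for k in p}
spread = max(I.values()) / min(I.values())
assert 1.02 > spread         # equal to within 2 percent (Table 1 weights are rounded)
```

<p>The residual spread of about 1% stems from the rounding of the synaptic weights listed in Table 1.</p>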
</sec>
<sec>
<title>External stimulation</title>
<p>In order to investigate the modulation of activity by the network, external stimulation of different strengths should be applied. One could think of varying the total incoming spike rate or the synaptic weights of excitation and inhibition. In order to achieve a biologically realistic setup, one should choose the parameters such that the stimulated neurons reach a <italic>high-conductance state</italic> (see Destexhe et al., <xref ref-type="bibr" rid="B14">2003</xref>; Kaplan et al., <xref ref-type="bibr" rid="B24">2009</xref>). Neglecting the influence of recurrent connections and membrane resets after spiking, the membrane would settle at an average potential &#x003BC;<sub><italic>V</italic></sub> superposed by temporal fluctuations &#x003C3;<sub><italic>V</italic></sub>.</p>
<p>As mentioned above, the FACETS Stage 1 Hardware suffers from a small number of input channels if 2&#x02009;&#x000D7;&#x02009;192 synapse drivers are reserved for recurrent connections. At the same time, even resting neurons exhibit a very short membrane time constant of &#x003C4;<sub>mem</sub>&#x02009;&#x02248;&#x02009;5&#x02009;ms. Due to these limitations, we needed to apply an alternative type of stimulation to approximate appropriate neuronal states:</p>
<p>Regarding the dynamics of a conductance-based leaky integrate-and-fire neuron, the conductance course toward any reversal potential can be split up into a time-independent average value and time-dependent fluctuations with vanishing mean. Then, the average conductances toward all reversal potentials can be combined to an effective resting potential and an effective membrane time constant (Shelley et al., <xref ref-type="bibr" rid="B40">2002</xref>). In this framework, only the fluctuations remain to be modeled via external stimuli.</p>
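<p>This decomposition can be made concrete with a short calculation. The average synaptic conductances below are illustrative assumptions; the membrane parameters are the mean values from Table 1.</p>

```python
# Split each synaptic conductance into mean plus zero-mean fluctuations and
# absorb the means into effective membrane parameters (Shelley et al., 2002).
C_m, g_leak = 0.2, 40.0          # nF, nS (Table 1)
V_rest, E_e, E_i = -60.0, 0.0, -80.0

g_e_mean, g_i_mean = 10.0, 10.0  # assumed average synaptic conductances (nS)

g_eff = g_leak + g_e_mean + g_i_mean
V_eff = (g_leak * V_rest + g_e_mean * E_e + g_i_mean * E_i) / g_eff
tau_eff = C_m * 1e3 / g_eff      # effective membrane time constant (ms)
```

<p>With the assumed conductances, the membrane relaxes toward an effective resting potential of about &#x02212;53.3&#x02009;mV with &#x003C4;<sub>eff</sub>&#x02009;&#x02248;&#x02009;3.3&#x02009;ms; for vanishing average synaptic conductances, this reduces to <italic>V</italic><sub>rest</sub> and &#x003C4;<sub>mem</sub>&#x02009;&#x0003D;&#x02009;<italic>C</italic><sub>m</sub>/<italic>g</italic><sub>leak</sub>&#x02009;&#x0003D;&#x02009;5&#x02009;ms.</p>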
<p>From this point of view, the hardware neurons appear to be in a high-conductance state with an average membrane potential &#x003BC;<sub><italic>V</italic></sub>&#x02009;&#x0003D;&#x02009;<italic>V</italic><sub>rest</sub> even without stimulation, due to the short membrane time constant &#x003C4;<sub>mem</sub>. The available input channels can then be used to add fluctuations. The magnitude &#x003C3;<sub><italic>V</italic></sub> of the fluctuations is adjusted via the synaptic weights of the inputs.</p>
<p>Throughout all simulations and emulations, 32 of the 64 input channels were used for excitatory stimulation, the remaining 32 input channels for inhibitory stimulation. Each neuron was connected to four to six excitatory and four to six inhibitory inputs using static synapses. The number of inputs was randomly drawn from a uniform distribution for each neuron and reversal potential. The synaptic weights of the connections were drawn from bound normal distributions. The mean value of these distributions was chosen such that the average drive <italic>w</italic>&#x00B7;(<italic>E</italic><sub>rev</sub>&#x02009;&#x02212;&#x02009;&#x003BC;<sub><italic>V</italic></sub>) was equal in magnitude for excitatory and inhibitory synapses. The values listed in Table <xref ref-type="table" rid="T1">1</xref> refer to &#x003BC;<sub><italic>V</italic></sub>&#x02009;&#x0003D;&#x02009;<italic>V</italic><sub>rest</sub>&#x02009;&#x0003D;&#x02009;&#x02212;60&#x02009;mV. In case of other resting potentials, the synaptic weights were adjusted accordingly to achieve an equal average current toward the reversal potentials: the excitatory input weights were set to <italic>w</italic><sub>inp,e</sub>&#x00B7;|[<italic>E</italic><sub>e</sub>&#x02009;&#x02212;&#x02009;(&#x02212;60&#x02009;mV)]/[<italic>E</italic><sub>e</sub>&#x02009;&#x02212;&#x02009;<italic>V</italic><sub>rest</sub>]|. Similarly, the inhibitory input weights were adjusted to <italic>w</italic><sub>inp,i</sub>&#x00B7;|[<italic>E</italic><sub>i</sub>&#x02009;&#x02212;&#x02009;(&#x02212;60&#x02009;mV)]/[<italic>E</italic><sub>i</sub>&#x02009;&#x02212;&#x02009;<italic>V</italic><sub>rest</sub>]|.</p>
<p>Thus, neglecting the influence of recurrent connections and resets of the membrane after APs, the average input-induced membrane potential &#x003BC;<sub><italic>V</italic></sub> always equals <italic>V</italic><sub>rest</sub>. The magnitude of the fluctuations was controlled via a multiplicative <italic>weight factor</italic> <italic>W</italic><sub>input</sub> affecting all input synapses.</p>
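<p>The rescaling of the input weights for resting potentials other than &#x02212;60&#x02009;mV is a direct reading of the formulas above; the weight values in the example are means from Table 1, and the resting potential of &#x02212;55&#x02009;mV is chosen for illustration:</p>

```python
def scaled_input_weight(w_ref, E_rev, V_rest, V_ref=-60.0):
    """Rescale an input weight given at V_ref = -60 mV so that the average
    current w * (E_rev - V_rest) matches the reference w_ref * (E_rev - V_ref)."""
    return w_ref * abs((E_rev - V_ref) / (E_rev - V_rest))

# Example: mean excitatory input weight 0.26 nS at V_rest = -55 mV (E_e = 0 mV)
w_e = scaled_input_weight(0.26, 0.0, -55.0)      # scale factor 60/55
# Inhibitory counterpart: 0.77 nS, E_i = -80 mV, scale factor 20/25
w_i = scaled_input_weight(0.77, -80.0, -55.0)
```

<p>Note that the excitatory weight grows while the inhibitory one shrinks as <italic>V</italic><sub>rest</sub> moves toward the excitatory reversal potential, keeping the average input-induced potential at <italic>V</italic><sub>rest</sub>.</p>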
</sec>
</sec>
</sec>
<sec id="S8">
<title>Measurement</title>
<p>In order to study the self-adjustment capabilities of the setup, three types of networks were investigated:</p>
<list list-type="simple">
<list-item><label>&#x02013;</label> <p><italic>unconnected</italic> All recurrent synapses were discarded (<italic>w</italic>&#x02009;&#x0003D;&#x02009;0) in order to determine the sole impact of external stimulation.</p></list-item>
<list-item><label>&#x02013;</label> <p><italic>dynamic</italic> All recurrent synapses featured STP. The mode (facilitating, depressing) depended on the type of the connection as shown in Figure <xref ref-type="fig" rid="F2">2</xref>.</p></list-item>
<list-item><label>&#x02013;</label> <p><italic>static</italic> The STP-mechanism was switched off in order to study the relevance of STP for the self-adjustment ability.</p></list-item>
</list>
<p>Rather than analyzing the dynamics of one specific network, we aimed to investigate how universally the examined network architecture can be applied.</p>
<p>Therefore, random networks were generated obeying the probability distributions described above. Besides the three fundamentally different network types (unconnected, dynamic and static), external stimulation of different strengths was applied by sweeping both the average membrane potential <italic>V</italic><sub>rest</sub> and the magnitude of fluctuations <italic>W</italic><sub>input</sub>.</p>
<p>For every set of network and input parameters, <italic>n</italic><sub>run</sub>&#x02009;&#x0003D;&#x02009;20 networks and input patterns were generated and run for <italic>T</italic><sub>exp</sub>&#x02009;&#x0003D;&#x02009;4.5&#x02009;s. The average firing rates of both populations of neurons were recorded. To exclude transient initialization effects, only the time span 1&#x02009;s&#x02009;&#x02264;&#x02009;<italic>t</italic>&#x02009;&#x02264;&#x02009;<italic>T</italic><sub>exp</sub> was evaluated. Networks featuring the self-adjustment property are expected to modulate their activity to a medium level of about 5&#x02013;20&#x02009;Hz over a wide range of external stimulation.</p>
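The rate evaluation described above can be sketched as follows (hypothetical helper; the actual analysis operated on the recorded spike data of both populations):

```python
def mean_firing_rate(spike_times, n_neurons, t_start=1.0, t_stop=4.5):
    """Average population firing rate in Hz, discarding the initial
    transient before t_start. spike_times is a flat sequence of spike
    time stamps (in seconds) pooled over all n_neurons neurons."""
    n_spikes = sum(1 for t in spike_times if t_start <= t <= t_stop)
    return n_spikes / (n_neurons * (t_stop - t_start))
```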
<p>This setup was both emulated on the FACETS Stage 1 Hardware system and simulated using PCSIM in order to verify the results.</p>
</sec>
</sec>
<sec>
<title>Results</title>
<p>First, we present the results of the hardware emulation and compare them with the properties of simulated networks. Besides the basic capability of adjusting network activity, we examine to what extent the observed mechanisms are robust to changes in the hardware substrate. Finally, we take a look at the ability of such networks to process input streams.</p>
<sec id="S9">
<title>Self-adjustment ability</title>
<p>The results of the hardware emulation performed according to the setup description given in Sections <xref ref-type="sec" rid="S5">&#x0201C;Network Configuration&#x0201D;</xref> and <xref ref-type="sec" rid="S8">&#x0201C;Measurement&#x0201D;</xref> are shown in Figure <xref ref-type="fig" rid="F3">3</xref>. The axes display different input strengths, controlled by the average membrane potential <italic>V</italic><sub>rest</sub> and the magnitude of fluctuation <italic>W</italic><sub>input</sub>. Average firing rates are indicated by the shade of gray of the respective tile.</p>
<p>The average response of networks without recurrent connections is shown in Figure <xref ref-type="fig" rid="F3">3</xref>A. Over a wide range of weak stimulation (lower left corner), almost no spikes occur within the network. For stronger input, the response rises steadily up to &#x003BD;&#x02009;&#x02248;&#x02009;29&#x02009;Hz. In Figure <xref ref-type="fig" rid="F3">3</xref>D, the activity of the excitatory and the inhibitory population are compared. Since external stimulation was configured identically for both populations, one would expect a similar response, &#x003BD;<sub>exc</sub>&#x02009;&#x02212;&#x02009;&#x003BD;<sub>inh</sub>&#x02009;&#x02248;&#x02009;0, except for slight stochastic variations. Evidently, however, the hardware device used exhibits a strong and systematic discrepancy in sensitivity between the populations, which were located on different halves of the chip: the mean firing rate of excitatory neurons is about three times as high as the response of inhibitory neurons.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p><bold>Results of the emulations on the FACETS Stage 1 Hardware</bold>. External stimulation of diverse strength is controlled via <italic>V</italic><sub>rest</sub> and <italic>W</italic><sub>input</sub>. For every tile, 20 randomly connected networks with new external stimulation were generated. The resulting average firing rates are illustrated by different shades of gray. Inevitably, differing saturation ranges had to be used for the panels. <italic>HORIZONTAL</italic>: different types of recurrent synapses. <bold>(A,D)</bold> Solely input driven networks without recurrent connections. <bold>(B,E)</bold> Recurrent networks with dynamic synapses using short-term plasticity. <bold>(C,F)</bold> Recurrent networks with static synapses. <italic>VERTICAL</italic>: Mean activity of the entire network <bold>(A&#x02013;C)</bold> and the balance of the populations, measured by the difference between the mean excitatory and inhibitory firing rates <bold>(D&#x02013;F)</bold>.</p></caption>
<graphic xlink:href="fncom-04-00129-g003.tif"/>
</fig>
<p>The mid-column &#x02013; Figures <xref ref-type="fig" rid="F3">3</xref>B,E &#x02013; shows the response of recurrent networks featuring dynamic synapses with the presented STP mechanism. Over a wide range of stimulation, the mean activity is adjusted to a level of 9&#x02013;15&#x02009;Hz. A comparison with the solely input-driven setup shows that recurrent networks with dynamic synapses are capable of both raising and lowering their activity toward a smooth plateau. A closer look at the firing rates of the populations reveals the underlying mechanism: in case of weak external stimulation, excitatory network activity exceeds inhibition, while the effect of strong stimuli is attenuated by intense firing of inhibitory neurons. This functionality agrees with the concept of depressing interior and facilitating bridging connections, as described in Section <xref ref-type="sec" rid="S6">&#x0201C;Network Architecture&#x0201D;</xref>.</p>
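The short-term plasticity behind this mechanism is commonly described by the Tsodyks&#x02013;Markram model; a minimal discrete-time sketch of the per-spike efficacy (the parameters U, tau_rec, and tau_facil are illustrative, not the hardware configuration values):

```python
import math

def tm_efficacies(spike_times, U=0.2, tau_rec=0.5, tau_facil=0.5):
    """Relative efficacy u*R of each presynaptic spike in the
    Tsodyks-Markram model of short-term plasticity. A dominant tau_rec
    yields depression, a dominant tau_facil yields facilitation."""
    u, R, t_prev = 0.0, 1.0, None
    efficacies = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            u *= math.exp(-dt / tau_facil)                 # facilitation decays
            R = 1.0 - (1.0 - R) * math.exp(-dt / tau_rec)  # resources recover
        u += U * (1.0 - u)        # each spike increments the release probability
        efficacies.append(u * R)  # efficacy of the current spike
        R *= 1.0 - u              # part of the resources is consumed
        t_prev = t
    return efficacies
```

For a rapid spike train, a depressing parameterization (tau_facil small) yields decreasing efficacies and a facilitating one (tau_rec small, U small) yields increasing efficacies, corresponding to the modes assigned to interior and bridging connections.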
<p>In spite of the disparity of excitability between the populations, the applied setup is capable of properly adjusting network activity. It is noteworthy that the used connection probabilities and synaptic weights completely ignored this characteristic of the underlying substrate.</p>
<p>To ensure that the self-adjustment ability originates from short-term synaptic plasticity, the STP mechanism was switched off during a repetition of the experiment. The respective results for recurrent networks using static synapses are shown in Figures <xref ref-type="fig" rid="F3">3</xref>C,F. The networks clearly lack the previously observed self-adjustment capability and instead tend toward runaway excitatory firing. Note that such high firing rates exceed the readout bandwidth of the current FACETS Stage 1 Hardware system; an unknown number of spike events was thus discarded within the readout circuitry of the chip, and the actual activity of the networks is expected to be even higher than the measured response.</p>
</sec>
<sec>
<title>Comparison to PCSIM</title>
<p>While the results of the hardware emulation draw a self-consistent picture, it has to be ruled out that the observed self-adjustment arises from hardware-specific properties. Therefore, the same setup was applied to the software simulator PCSIM. The results of the software simulation are shown in Figure <xref ref-type="fig" rid="F4">4</xref>. The six panels are arranged like those of the hardware results in Figure <xref ref-type="fig" rid="F3">3</xref>.</p>
<p>In agreement with the hardware emulation, the average response of networks without recurrent connections rises with stronger stimulation, see Figure <xref ref-type="fig" rid="F4">4</xref>A. Since the disparity in population excitability was not modeled in the simulation, however, the balance of the populations is subject only to statistical variations, see Figure <xref ref-type="fig" rid="F4">4</xref>D.</p>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p><bold>Results of the software simulation</bold>. The experimental setup and the arrangement of the panels are equal to Figure <xref ref-type="fig" rid="F3">3</xref>. Also, the general behavior is consistent with the hardware emulation, though the average network response is more stable against different strengths of stimulation and all firing rates are higher. Accordingly, in case of dynamic recurrent synapses, the plateau is located at &#x003BD;<sub>total</sub>&#x02009;&#x02248;&#x02009;17&#x02009;Hz.</p></caption>
<graphic xlink:href="fncom-04-00129-g004.tif"/>
</fig>
<p>Generally, the software simulation yields significantly higher firing rates than the hardware emulation. Two possible causes are:</p>
<list list-type="bullet">
<list-item><p>The load-dependency of the excitatory synaptic efficacy (see <xref ref-type="sec" rid="S3">Hardware Constraints</xref>) certainly entails reduced network activity in case of the hardware emulation.</p></list-item>
<list-item><p>The response curve of hardware neurons slightly differs from the behavior of an ideal conductance-based LIF model (Br&#x000FC;derle, <xref ref-type="bibr" rid="B9">2009</xref>, Figure 6.4).</p></list-item>
</list>
<p>Consistently, an increased activity is also observed in the simulations of recurrent networks. Figures <xref ref-type="fig" rid="F4">4</xref>B,E show the results for networks with synapses featuring short-term synaptic plasticity. The networks clearly exhibit the expected self-adjustment ability, but the plateau is found at approximately 17&#x02009;Hz, compared to 12&#x02009;Hz in the hardware emulation. Finally, in case of static recurrent synapses &#x02013; see Figures <xref ref-type="fig" rid="F4">4</xref>C,F &#x02013; the average network activity rises up to 400&#x02009;Hz and lacks any visible moderation.</p>
<p>In conclusion, the hardware emulation and the software simulation yield similar results regarding the basic dynamics. Quantitatively, the results differ approximately by a factor of 2.</p>
<p>In order to approximate the influence of the unstable excitatory synaptic efficacy, which is suspected to be the leading cause of the discrepancy, the excitatory reversal potential was globally set to <italic>E</italic><sub>e</sub>&#x02009;&#x0003D;&#x02009;&#x02212;20&#x02009;mV during a repetition of the software simulation. Indeed, the results of the different back-ends become more similar. The average activity of networks with dynamic synapses (corresponding to Figures <xref ref-type="fig" rid="F3">3</xref>B and <xref ref-type="fig" rid="F4">4</xref>B) is shown in Figure <xref ref-type="fig" rid="F5">5</xref>.</p>
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p><bold>Software simulation: Lower excitatory reversal potential</bold>. Average network response of recurrent networks with dynamic synapses. In order to approximate the load-dependency of the excitatory synaptic efficacy in the chip, <italic>E</italic><sub>e</sub> was set to &#x02212;20&#x02009;mV for subsequent software simulations. Compare with Figure <xref ref-type="fig" rid="F3">3</xref>B.</p></caption>
<graphic xlink:href="fncom-04-00129-g005.tif"/>
</fig>
<p>Given the clearly improved agreement, all further software simulations were performed with the lowered excitatory reversal potential <italic>E</italic><sub>e</sub>&#x02009;&#x0003D;&#x02009;&#x02212;20&#x02009;mV.</p>
</sec>
<sec>
<title>Robustness</title>
<p>We show that the observed self-adjustment property of the network architecture provides certain types of activity robustness that are beneficial for the operation of neuromorphic hardware systems.</p>
<sec>
<title>Reliable and relevant activity regimes</title>
<p>By applying the network architecture presented in Section <xref ref-type="sec" rid="S6">&#x0201C;Network Architecture&#x0201D;</xref>, we aim at the following two kinds of robustness of network dynamics:</p>
<list list-type="bullet">
<list-item><p>A high reliability of the average network activity, independent of the precise individual network connectivity or stimulation pattern. All networks with dynamic synapses that are generated and stimulated randomly, but obeying equal probability distributions, shall yield a similar average firing rate &#x003BD;<sub>total</sub>.</p></list-item>
<list-item><p>The average firing rate &#x003BD;<sub>total</sub> shall be kept within a biologically relevant range for a wide spectrum of stimulation strength and variability. For awake mammalian cortices, rates in the order of 5&#x02013;20&#x02009;Hz are typical (see, e.g., Baddeley et al., <xref ref-type="bibr" rid="B2">1997</xref>; Steriade, <xref ref-type="bibr" rid="B42">2001</xref>; Steriade et al., <xref ref-type="bibr" rid="B43">2001</xref>).</p></list-item>
</list>
<p>The emergence of both types of robustness in the applied network architecture is first tested by evaluating the PCSIM data. Still, it is not a priori clear that the robustness is preserved when transferring the self-adjusting paradigm to the hardware back-end. The transistor-level variations discussed in Section <xref ref-type="sec" rid="S3">&#x0201C;Hardware Constraints&#x0201D;</xref> might impede the reliability of the moderating effects, e.g., by causing an increased excitability for some of the neurons, or through overly heterogeneous characteristics of the synaptic plasticity itself. Therefore, the robustness is also tested directly on a hardware device and the results are compared with those of the software simulation.</p>
<p>While each tile in Figure <xref ref-type="fig" rid="F3">3</xref> represents the averaged overall firing rate &#x003BD;<sub>total</sub> of 20 randomly generated networks and input patterns, Figure <xref ref-type="fig" rid="F6">6</xref> shows the standard deviation &#x003C3;<sub>&#x003BD;</sub> of the activity of networks obeying equal probability distributions as a function of &#x003BD;<sub>total</sub>. Networks using dynamic synapses are marked by triangles, those with static synapses by circles. Only setups with &#x003BD;<sub>total</sub>&#x02009;&#x0003E;&#x02009;1&#x02009;Hz are shown.</p>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p><bold>Reliable and realistic network activity</bold>. Each point is determined by 20 random networks generated from equal probability distributions. The average firing rate of all networks is plotted on the <italic>x</italic>-axis, the standard deviation between the networks on the <italic>y</italic>-axis. Recurrent networks featuring short-term plasticity (triangles) can reliably be found within a close range. Setups with static synapses (circles) exhibit both larger average firing rates and larger standard deviations. <bold>(A)</bold> Emulation on the FACETS Stage 1 Hardware. <bold>(B)</bold> Software simulation with lowered excitatory reversal potential <italic>E</italic><sub>e</sub>.</p></caption>
<graphic xlink:href="fncom-04-00129-g006.tif"/>
</fig>
<p>For both the hardware device and the software simulation, the data clearly show that the required robustness is achieved by enabling the self-adjusting mechanism with dynamic synapses. The fluctuation &#x003C3;<sub>&#x003BD;</sub> from network to network is significantly lower for networks that employ dynamic recurrent connections. Moreover, only with dynamic synapses is the average firing rate &#x003BD;<sub>total</sub> reliably kept within the proposed regime, while with static synapses most of the observed rates lie well above its upper limit.</p>
<p>This observation holds qualitatively both for the hardware and for the software data. In case of networks with static synapses emulated on the hardware system, the upper limit of observed firing rates at about 100&#x02009;Hz is determined technically by bandwidth limitations of the spike recording circuitry. This also explains the decreasing variation &#x003C3;<sub>&#x003BD;</sub> for firing rates close to that limit: if many neurons fire at rates that exceed the readout bandwidth, the diversity in network activity will seemingly shrink.</p>
<p>While the software simulation data demonstrate that the self-adjusting principle provides the robustness features already for networks as small as those tested, the hardware emulation results show that the robustness is preserved despite the transistor-level variations. Even though the different biological network descriptions are mapped randomly onto the inhomogeneous hardware resources, the standard deviation of firing rates is similar in hardware and in software.</p>
</sec>
<sec>
<title>Independence of the emulation device</title>
<p>Besides the ambiguous mapping of a given biological network description onto an inhomogeneous neuromorphic hardware system as discussed above, the choice of the particular emulation device itself constitutes another possible source of unreliable results. Often, multiple instances of the same system are available to an experimenter. Ideally, such chips of equal design should yield identical network dynamics. But due to process-related inhomogeneities and the imperfections discussed in Section <xref ref-type="sec" rid="S3">&#x0201C;Hardware Constraints&#x0201D;</xref>, this objective is unachievable in terms of precise spike timing whenever analog circuitry is involved. Nevertheless, one can aim for similar behavior on a more global scale, i.e., for comparable results regarding statistical properties of populations of neurons.</p>
<p>All previous emulations were performed on a system exclusively assigned to the purpose of this work. In order to investigate the influence of the particular hardware substrate, a different, randomly chosen chip was set up with the same <italic>biological configuration</italic>. In this context, equal biological configuration means that both systems had been calibrated for general-purpose operation and that the high-level PyNN description of the experiment remained unchanged &#x02013; only the translation of biological values to hardware parameters involved different calibration data. This customization is performed automatically by the low-level software layers. From the experimenter&#x00027;s point of view, the setup is therefore identical.</p>
<p>In the following, the two devices will be referred to as <italic>primary</italic> and <italic>comparative</italic>, respectively. Just as on the primary device, networks emulated on the comparative system featured the self-adjustment ability if dynamic synapses were used for recurrent connections. But network activity was moderated to rather low firing rates of 2&#x02013;6&#x02009;Hz. The response of networks without recurrent connections revealed that the used chip suffered from a similar disparity of excitability as the primary device. But in this case, it was the inhibitory population which showed a significantly heightened responsiveness.</p>
<p>Apparently, the small networks were not capable of completely compensating for the systematic imbalance of the populations. Nevertheless, they were still able to both raise and lower their firing rate compared to the input-induced response. Figure <xref ref-type="fig" rid="F7">7</xref> shows the difference in activity between recurrent networks with short-term synaptic plasticity and solely input-driven networks without recurrent connections,</p>
<p>&#x00394;&#x003BD;&#x02009;:&#x0003D;&#x02009;&#x003BD;<sub>total,dyn</sub>&#x02009;&#x02212;&#x02009;&#x003BD;<sub>total,input</sub>.</p>
<p>For this chart, the <italic>V</italic><sub>rest</sub>&#x02009;&#x02212;&#x02009;<italic>W</italic><sub>input</sub> diagonal of Figure <xref ref-type="fig" rid="F3">3</xref> has been mapped to the <italic>x</italic>-axis, representing an increasing input strength. &#x00394;&#x003BD; is plotted on the <italic>y</italic>-axis. Independent of the used back-end, recurrent networks raise activity in case of weak external excitation, while the effect of strong stimulation is reduced.</p>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption><p><bold>Self-adjusting effect on different platforms</bold>. The difference &#x00394;&#x003BD;&#x02009;:&#x0003D;&#x02009;&#x003BD;<sub>total,dyn</sub>&#x02009;&#x02212;&#x02009;&#x003BD;<sub>total,input</sub> is plotted against an increasing strength and variability of the external network stimulation. The diamond symbols represent the data acquired with PCSIM. The square (circle) symbols represent data measured with the primary (comparative) hardware device. Measurements with the comparative device, but with a mirrored placing of the two network populations, are plotted with triangle symbols. See main text for details.</p></caption>
<graphic xlink:href="fncom-04-00129-g007.tif"/>
</fig>
<p>To account for the inverse disparity of excitability of the comparative device, the mapping of the excitatory and the inhibitory population, which were located on different halves of the chip, was mirrored during a repetition of the emulation. Thus, the excitatory population exhibited an increased responsiveness resembling the disparity of the primary device. The &#x00394;&#x003BD;-curve of the mirrored repetition on the comparative system can also be found in Figure <xref ref-type="fig" rid="F7">7</xref>. As expected, with this choice of population placing, the moderating effect of the applied self-adjusting paradigm better matches the characteristics of the primary device.</p>
<p>These observations suggest that differing emulation results arise from large-scale systematic inhomogeneities of the hardware substrate rather than from statistically distributed fixed-pattern noise of individual units.</p>
<p>Therefore, it can be stated that the applied architecture is capable of reliably compensating statistical fluctuations of hardware unit properties, unless variations extend to a global scale. But even in case of large-scale deviations, the applied construction principle preserves its self-adjustment ability and provides reproducible network properties, albeit at a shifted working point.</p>
</sec>
</sec>
<sec id="S10">
<title>Responsiveness to input</title>
<p>While it was shown that the applied configuration provides a well-defined network state in terms of average firing rates, it remains unclear whether the probed architecture is still able to process information carried by external input. One might suspect that the strong recurrent connectivity &#x0201C;overwrites&#x0201D; any temporal structure of the input spike trains. Yet the usability of the architecture for a variety of computational tasks depends on its responsiveness to changes in the input. A systematic approach to settling this question exceeds the scope of this work; therefore, we address the issue only briefly.</p>
<p>First, we determine the temporal response of the architecture to sudden changes in external excitation. Then, we look for traces of previously presented input patterns in the current network state and test whether the networks are capable of performing a non-linear computation on the meaning assigned to these patterns.</p>
<p>For all subsequent simulations, the input parameters are set to <italic>V</italic><sub>rest</sub>&#x02009;&#x0003D;&#x02009;&#x02212;59&#x02009;mV and <italic>W</italic><sub>input</sub>&#x02009;&#x0003D;&#x02009;4.0 (cf. Figure <xref ref-type="fig" rid="F5">5</xref>). Only networks featuring dynamic recurrent connections are investigated. Due to technical limitations of the current hardware system as discussed in Section <xref ref-type="sec" rid="S3">&#x0201C;Hardware Constraints&#x0201D;</xref>, the results of this section are based on software simulations only. For example, the additional external stimulation applied in the following exceeds the input bandwidth of the prototype hardware device. Furthermore, the evaluation of network states requires simultaneous access to (at least) the spike output of all neurons, whereas the current hardware system only supports the recording of a small subset of neurons at a time.</p>
<p>In Figure <xref ref-type="fig" rid="F8">8</xref>A the average response of the excitatory and inhibitory populations to increased external excitation are shown. For this purpose, the firing rate of all excitatory Poisson input channels was doubled from 11.8 to 23.6&#x02009;Hz at <italic>t</italic>&#x02009;&#x0003D;&#x02009;4&#x02009;s. It was reset to 11.8&#x02009;Hz at <italic>t</italic>&#x02009;&#x0003D;&#x02009;7&#x02009;s, i.e., the applied stimulation rate was shaped as a rectangular pulse. In order to examine the average response of the recurrent networks to this steep differential change in the input, <italic>n</italic><sub>run</sub>&#x02009;&#x0003D;&#x02009;1000 networks and input patterns have been generated. While the network response obtained from a single simulation run is subject to statistical fluctuations, the influence of the input pulse is revealed precisely by averaging over the activity of many different networks. For analysis, the network response was convolved with a box filter (50&#x02009;ms window size). In conclusion, the temporal response of the recurrent networks is characterized by two obvious features:</p>
<fig id="F8" position="float">
<label>Figure 8</label>
<caption><p><bold>Network traces of transient input</bold>. Results of software simulations testing the response of recurrent networks with dynamic synapses to transient input. <bold>(A)</bold> Firing rate of the excitatory input channels and average response of either population to an excitatory input pulse lasting for 3&#x02009;s. The steep differential change in excitation is answered by a distinct peak. After some hundred milliseconds the networks attune to a new level of equilibrium. <bold>(B)</bold> Average performance of the architecture in a retroactive pattern classification task. The network states contain information on input spike patterns which were presented some hundred milliseconds ago. The latest patterns presented are to be processed in a non-linear XOR task.</p></caption>
<graphic xlink:href="fncom-04-00129-g008.tif"/>
</fig>
<list list-type="order">
<list-item><p>Immediately after the additional input is switched on or off, the response curves show distinct peaks which decay at a time scale of &#x003C4;&#x02009;&#x02248;&#x02009;100&#x02009;ms.</p></list-item>
<list-item><p>After some hundred milliseconds, the networks level off at a new equilibrium. Due to the self-adjustment mechanism, the activity of the inhibitory population clearly increases.</p></list-item>
</list>
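The smoothing step can be sketched with NumPy (assuming the averaged response has already been binned into a rate trace; names hypothetical):

```python
import numpy as np

def smooth_rate(rate_hz, bin_ms=1.0, window_ms=50.0):
    """Apply a moving-average (box) filter of the given window size
    to a binned population firing-rate trace."""
    n = max(1, int(round(window_ms / bin_ms)))
    box = np.ones(n) / n
    return np.convolve(rate_hz, box, mode="same")
```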
<p>These findings confirm that the investigated networks show a significant response to changes in the input. This suggests that such neural circuits might be capable of performing classification tasks or continuous-time calculations if a readout is attached and trained.</p>
<p>We tested this conjecture by carrying out a computational test proposed in Haeusler and Maass (<xref ref-type="bibr" rid="B21">2007</xref>). The 64 external input channels were assigned to two disjoint <italic>streams A</italic> and <italic>B</italic>. Each stream consisted of 16 excitatory and 16 inhibitory channels. For each stream, two Poisson spike train templates (referred to as &#x0002B;<sub><italic>S</italic></sub> and &#x02212;<sub><italic>S</italic></sub>, <italic>S</italic>&#x02009;&#x02208;&#x02009;{<italic>A</italic>,<italic>B</italic>}) lasting for 2400&#x02009;ms were drawn and partitioned into 24 segments &#x000B1;<sub><italic>S</italic>,<italic>i</italic></sub> of 100&#x02009;ms duration. In every simulation run, the input was randomly composed of the segments of these templates, e.g.,</p>
<p><italic>Stream A</italic>: &#x0002B;<sub><italic>A</italic>23</sub>&#x02212;<sub><italic>A</italic>22</sub>&#x02026;&#x02212;<sub><italic>A</italic>1</sub>&#x0002B;<sub><italic>A</italic>0</sub></p>
<p><italic>Stream B</italic>: &#x02212;<sub><italic>B</italic>23</sub>&#x0002B;<sub><italic>B</italic>22</sub>&#x02026;&#x02212;<sub><italic>B</italic>1</sub>&#x02212;<sub><italic>B</italic>0</sub></p>
<p>leading to 2<sup>24</sup> possible input patterns for either stream. Before the input was presented to the network, all spikes were jittered using a Gaussian distribution with zero mean and standard deviation 1&#x02009;ms. The task was to identify the last four segments presented (0&#x02009;&#x02264;&#x02009;<italic>i</italic>&#x02009;&#x02264;&#x02009;3) at the end of the experiment. For that purpose, the spike response of the network was filtered with an exponential decay kernel (&#x003C4;<sub>decay</sub>&#x02009;&#x0003D;&#x02009;&#x003C4;<sub>syn</sub>&#x02009;&#x0003D;&#x02009;30&#x02009;ms). The resulting network state at <italic>t</italic>&#x02009;&#x0003D;&#x02009;2400&#x02009;ms was presented to linear readout neurons which were trained via linear regression as in Maass et al. (<xref ref-type="bibr" rid="B26">2002</xref>). The training was based on 1500 simulation runs. Another 300 runs were used for evaluation. In order to determine the performance of the architecture for this retroactive pattern classification task, the above setup was repeated 30 times with newly generated networks and input templates.</p>
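The state extraction and readout training can be sketched as follows (hypothetical helper names; the original experiment used PCSIM and the linear-regression procedure of Maass et al., 2002):

```python
import numpy as np

def network_state(spike_trains, t_read=2.4, tau=0.03):
    """Exponentially filtered spike response of every neuron,
    evaluated at the readout time t_read (times in seconds,
    tau = tau_syn = 30 ms)."""
    return np.array([sum(np.exp(-(t_read - t) / tau)
                         for t in train if t <= t_read)
                     for train in spike_trains])

def train_readout(states, targets):
    """Least-squares linear readout on a (n_runs, n_neurons) state
    matrix; a constant bias column is appended."""
    X = np.hstack([states, np.ones((states.shape[0], 1))])
    w, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return w

def classify(w, state):
    """Binary decision of a trained readout on a single state."""
    return float(np.dot(np.append(state, 1.0), w)) > 0.5
```

Each target (e.g., the sign of segment i of a stream) gets its own trained weight vector, so multiple readouts can operate on the same network state in parallel.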
<p>The average performance of networks with recurrent dynamic synapses is shown in Figure <xref ref-type="fig" rid="F8">8</xref>B. The error bars denote the standard error of the mean. Clearly, the network state at <italic>t</italic>&#x02009;&#x0003D;&#x02009;2400&#x02009;ms contains significant information on the latest patterns presented and preserves traces of preceding patterns for some hundred milliseconds. For comparison, recurrent networks using static synapses performed only slightly above chance level (not shown). In addition to the pattern classification task, another linear readout neuron was trained to compute the non-linear expression XOR(&#x000B1;<sub><italic>A</italic>0</sub>,&#x000B1;<sub><italic>B</italic>0</sub>) from the network output. Note that this task cannot be solved by a linear readout operating directly on the input spike trains.</p>
<p>Summing up, the self-adjusting recurrent networks are able to perform multiple computational tasks in parallel. Since the main objective of this work was to verify the self-adjustment ability of small networks on a neuromorphic hardware device, both connection probabilities and synaptic weights of recurrent connections were chosen to be high compared to the strength of external stimulation. Still, the networks respond significantly to changes in the input and provide manifold information on the present and previous structure of the stimulus.</p>
<p>Recent theoretical work (Buesing et al., <xref ref-type="bibr" rid="B11">2010</xref>) stressed that the computational power of recurrent networks of spiking neurons strongly depends on their connectivity structure. As a general rule, it has been shown to be beneficial to operate a recurrent neural network in the edge-of-chaos regime (Bertschinger and Natschl&#x000E4;ger, <xref ref-type="bibr" rid="B4">2004</xref>). Nevertheless, as addressed in Legenstein and Maass (<xref ref-type="bibr" rid="B25">2007</xref>), the optimal configuration for a specific task can differ from this estimate. Accordingly, task-dependent recurrent connectivity parameters might be preferable to achieve good experimental results (see, e.g., Haeusler et al., <xref ref-type="bibr" rid="B22">2009</xref>). While networks of randomly connected neurons feature favorable kernel qualities, i.e., they perform rich non-linear operations on the input, theoretical studies by Ganguli et al. (<xref ref-type="bibr" rid="B18">2008</xref>) show that networks with hidden feedforward structures provide superior memory storage capabilities. Future research might identify such connectivity patterns in seemingly random cortical circuits and improve our understanding of working memory.</p>
<p>While the examined recurrent network architecture was not optimized for computation, neither regarding its kernel quality nor its memory traces, the cited studies suggest that the performance will increase if network parameters are attuned to particular tasks. Further research is needed to explore under which conditions the examined architecture provides a stable operating point, a high responsiveness to stimuli, and appropriate memory traces.</p>
</sec>
</sec>
<sec sec-type="discussion">
<title>Discussion</title>
<p>We showed that recurrent neural networks featuring short-term synaptic plasticity are applicable to present neuromorphic mixed-signal VLSI devices. For the first time, dynamic synapses played a functional role in network dynamics during a hardware emulation. Since neuromorphic hardware devices model neural information processing with analog circuitry, they generally suffer from process-related fluctuations which affect the dynamics of their components. In order to minimize the influence of unit variations on emulation results, we applied a self-adjustment principle on the network level, as proposed by Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>).</p>
<p>Even though the employed prototype system supports only a limited network size, the expected self-adjustment property was observed on all utilized back-ends. The biological description of the experimental setup was identical for all chips, i.e., the configuration was not customized to the characteristics of any specific hardware system. Beyond validating the basic functioning of the self-adjusting mechanism, we addressed the robustness of the construction principle against both statistical variations of network units and systematic disparities between chips. We showed that the examined architecture reliably adjusts the average network response to a moderate firing regime. While identically configured networks emulated on the same chip behaved very similarly, the operating point reached on different systems was still affected by large-scale characteristics of the utilized back-end.</p>
<p>All outcomes of the hardware emulation were qualitatively confirmed by software simulations. Furthermore, the influence of a major imperfection of the current revision of the FACETS Stage 1 Hardware, the load-dependency of the excitatory synaptic efficacy, was studied in accompanying simulations with PCSIM.</p>
<p>Presumably, the performance of the applied architecture will improve with increasing network size. Upcoming neuromorphic emulators like the FACETS Stage 2 Wafer-scale Integration system (see Fieres et al., <xref ref-type="bibr" rid="B17">2008</xref>; Schemmel et al., <xref ref-type="bibr" rid="B38">2008</xref>) will comprise more than 100,000 neurons and millions of synapses. Even earlier, the present chip-based system will support the interconnection of multiple chips and thus provide a substrate of several thousand neurons. Since such large-scale mixed-signal VLSI devices will inevitably exhibit variations in unit properties, reducing distortions of experimental results at the level of single units requires detailed knowledge of the circuit design. On the other hand, the beneficial application of neuromorphic VLSI devices, both as neuroscientific modeling tools and as novel computing devices, requires that operating the system does not demand expertise in electronic engineering. We showed that the self-regulation properties of neural networks can help to overcome the disadvantageous effects of unit-level variations in neuromorphic VLSI devices. The employed network architecture might ensure highly similar network behavior independent of the utilized system. This work therefore represents an important step toward a reliable and practicable operation of neuromorphic hardware.</p>
<p>The applied configuration required strong recurrent synapses at high connectivity. The results of Sussillo et al. (<xref ref-type="bibr" rid="B44">2007</xref>) show that even sparsely connected networks can efficiently adjust their activity, provided they comprise a sufficiently large number of neurons, a scale that future hardware systems will provide. The examined construction principle will thereby become applicable to a variety of experimental setups and network designs. As touched upon in Section <xref ref-type="sec" rid="S10">&#x0201C;Responsiveness to Input&#x0201D;</xref>, the presented self-adjusting networks remain sensitive and responsive to changes in external excitation. Furthermore, we verified that even networks with disproportionately strong recurrent synapses can perform simple non-linear operations on transient input streams. With biologically more realistic connectivity parameters, randomly connected networks of spiking neurons have been shown to accomplish ambitious computational tasks (Maass et al., <xref ref-type="bibr" rid="B27">2004</xref>), and short-term synaptic plasticity can improve the performance of such networks in neural information processing (Maass et al., <xref ref-type="bibr" rid="B26">2002</xref>). This architecture thus provides a promising application for neuromorphic hardware devices, while the high configurability of novel systems also supports the emulation of circuits tailored to specific tasks.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>The work presented in this paper is supported by the European Union &#x02013; projects # FP6-015879 (FACETS) and # FP7-237955 (FACETS-ITN). The authors would like to thank Mihai Petrovici for fruitful discussions and diligent remarks.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abbott</surname> <given-names>L.</given-names></name> <name><surname>Varela</surname> <given-names>J.</given-names></name> <name><surname>Sen</surname> <given-names>K.</given-names></name> <name><surname>Nelson</surname> <given-names>S.</given-names></name></person-group> (<year>1997</year>). <article-title>Synaptic depression and cortical gain control</article-title>. <source>Science</source> <volume>275</volume>, <fpage>221</fpage>&#x02013;<lpage>224</lpage>.<pub-id pub-id-type="doi">10.1126/science.275.5297.221</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Baddeley</surname> <given-names>R.</given-names></name> <name><surname>Abbott</surname> <given-names>L. F.</given-names></name> <name><surname>Booth</surname> <given-names>M. C. A.</given-names></name> <name><surname>Sengpiel</surname> <given-names>F.</given-names></name> <name><surname>Freeman</surname> <given-names>T.</given-names></name> <name><surname>Wakeman</surname> <given-names>E. A.</given-names></name> <name><surname>Rolls</surname> <given-names>E. T.</given-names></name></person-group> (<year>1997</year>). <article-title>Responses of neurons in primary and inferior temporal visual cortices to natural scenes</article-title>. <source>Proc. R. Soc. Lond., B, Biol. Sci.</source> <volume>264</volume>, <fpage>1775</fpage>&#x02013;<lpage>1783</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.1997.0246</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bartolozzi</surname> <given-names>C.</given-names></name> <name><surname>Indiveri</surname> <given-names>G.</given-names></name></person-group> (<year>2007</year>). <article-title>Synaptic dynamics in analog VLSI</article-title>. <source>Neural Comput.</source> <volume>19</volume>, <fpage>2581</fpage>&#x02013;<lpage>2603</lpage>.<pub-id pub-id-type="doi">10.1162/neco.2007.19.10.2581</pub-id><pub-id pub-id-type="pmid">17716003</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bertschinger</surname> <given-names>N.</given-names></name> <name><surname>Natschl&#x000E4;ger</surname> <given-names>T.</given-names></name></person-group> (<year>2004</year>). <article-title>Real-time computation at the edge of chaos in recurrent neural networks</article-title>. <source>Neural Comput.</source> <volume>16</volume>, <fpage>1413</fpage>&#x02013;<lpage>1436</lpage>.<pub-id pub-id-type="doi">10.1162/089976604323057443</pub-id><pub-id pub-id-type="pmid">15165396</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bi</surname> <given-names>G.</given-names></name> <name><surname>Poo</surname> <given-names>M.</given-names></name></person-group> (<year>1997</year>). <article-title>Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type</article-title>. <source>Neural Comput.</source> <volume>9</volume>, <fpage>503</fpage>&#x02013;<lpage>514</lpage>.<pub-id pub-id-type="pmid">9097470</pub-id></citation></ref>
<ref id="B6"><citation citation-type="thesis"><person-group person-group-type="author"><name><surname>Bill</surname> <given-names>J.</given-names></name></person-group> (<year>2008</year>). <source>Self-Stabilizing Network Architectures on a Neuromorphic Hardware System</source>. Diploma thesis (English), <publisher-name>University of Heidelberg</publisher-name>, HD-KIP-08-44.</citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Boegershausen</surname> <given-names>M.</given-names></name> <name><surname>Suter</surname> <given-names>P.</given-names></name> <name><surname>Liu</surname> <given-names>S.-C.</given-names></name></person-group> (<year>2003</year>). <article-title>Modeling short-term synaptic depression in silicon</article-title>. <source>Neural Comput.</source> <volume>15</volume>, <fpage>331</fpage>&#x02013;<lpage>348</lpage>.<pub-id pub-id-type="doi">10.1162/089976603762552942</pub-id><pub-id pub-id-type="pmid">12590810</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brette</surname> <given-names>R.</given-names></name> <name><surname>Rudolph</surname> <given-names>M.</given-names></name> <name><surname>Carnevale</surname> <given-names>T.</given-names></name> <name><surname>Hines</surname> <given-names>M.</given-names></name> <name><surname>Beeman</surname> <given-names>D.</given-names></name> <name><surname>Bower</surname> <given-names>J. M.</given-names></name> <name><surname>Diesmann</surname> <given-names>M.</given-names></name> <name><surname>Morrison</surname> <given-names>A.</given-names></name> <name><surname>Goodman</surname> <given-names>P. H.</given-names></name> <name><surname>Harris</surname> <given-names>F. C.</given-names> <suffix>Jr</suffix></name> <name><surname>Zirpe</surname> <given-names>M.</given-names></name> <name><surname>Natschlager</surname> <given-names>T.</given-names></name> <name><surname>Pecevski</surname> <given-names>D.</given-names></name> <name><surname>Ermentrout</surname> <given-names>B.</given-names></name> <name><surname>Djurfeldt</surname> <given-names>M.</given-names></name> <name><surname>Lansner</surname> <given-names>A.</given-names></name> <name><surname>Rochel</surname> <given-names>O.</given-names></name> <name><surname>Vieville</surname> <given-names>T.</given-names></name> <name><surname>Muller</surname> <given-names>E.</given-names></name> <name><surname>Davison</surname> <given-names>A. P.</given-names></name> <name><surname>Boustani</surname> <given-names>S. E.</given-names></name> <name><surname>Destexhe</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Simulation of networks of spiking neurons: a review of tools and strategies</article-title>. <source>J. Comput. Neurosci.</source> <volume>23</volume>, <fpage>349</fpage>&#x02013;<lpage>398</lpage>.<pub-id pub-id-type="doi">10.1007/s10827-007-0038-6</pub-id><pub-id pub-id-type="pmid">17629781</pub-id></citation></ref>
<ref id="B9"><citation citation-type="thesis"><person-group person-group-type="author"><name><surname>Br&#x000FC;derle</surname> <given-names>D.</given-names></name></person-group> (<year>2009</year>). <source>Neuroscientific Modeling with a Mixed-Signal VLSI Hardware System</source>. Ph.D. thesis, <publisher-name>University of Heidelberg</publisher-name>, HD-KIP 09-30.<pub-id pub-id-type="pmid">19562085</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Br&#x000FC;derle</surname> <given-names>D.</given-names></name> <name><surname>M&#x000FC;ller</surname> <given-names>E.</given-names></name> <name><surname>Davison</surname> <given-names>A.</given-names></name> <name><surname>Muller</surname> <given-names>E.</given-names></name> <name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system</article-title>. <source>Front. Neuroinform.</source> <volume>3</volume>:<fpage>17</fpage>.</citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Buesing</surname> <given-names>L.</given-names></name> <name><surname>Schrauwen</surname> <given-names>B.</given-names></name> <name><surname>Legenstein</surname> <given-names>R.</given-names></name></person-group> (<year>2010</year>). <article-title>Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons</article-title>. <source>Neural Comput.</source> <volume>22</volume>, <fpage>1272</fpage>&#x02013;<lpage>1311</lpage>.<pub-id pub-id-type="doi">10.1162/neco.2009.01-09-947</pub-id><pub-id pub-id-type="pmid">20028227</pub-id></citation></ref>
<ref id="B12"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Dally</surname> <given-names>W. J.</given-names></name> <name><surname>Poulton</surname> <given-names>J. W.</given-names></name></person-group> (<year>1998</year>). <source>Digital Systems Engineering</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Cambridge University Press</publisher-name>.</citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Davison</surname> <given-names>A. P.</given-names></name> <name><surname>Br&#x000FC;derle</surname> <given-names>D.</given-names></name> <name><surname>Eppler</surname> <given-names>J.</given-names></name> <name><surname>Kremkow</surname> <given-names>J.</given-names></name> <name><surname>Muller</surname> <given-names>E.</given-names></name> <name><surname>Pecevski</surname> <given-names>D.</given-names></name> <name><surname>Perrinet</surname> <given-names>L.</given-names></name> <name><surname>Yger</surname> <given-names>P.</given-names></name></person-group> (<year>2008</year>). <article-title>PyNN: a common interface for neuronal network simulators</article-title>. <source>Front. Neuroinform.</source> <volume>2</volume>:<fpage>11</fpage>.<pub-id pub-id-type="doi">10.3389/neuro.11.011.2008</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Destexhe</surname> <given-names>A.</given-names></name> <name><surname>Rudolph</surname> <given-names>M.</given-names></name> <name><surname>Pare</surname> <given-names>D.</given-names></name></person-group> (<year>2003</year>). <article-title>The high-conductance state of neocortical neurons in vivo</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>4</volume>, <fpage>739</fpage>&#x02013;<lpage>751</lpage>.<pub-id pub-id-type="doi">10.1038/nrn1198</pub-id><pub-id pub-id-type="pmid">12951566</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Douglas</surname> <given-names>R.</given-names></name> <name><surname>Mahowald</surname> <given-names>M.</given-names></name> <name><surname>Mead</surname> <given-names>C.</given-names></name></person-group> (<year>1995</year>). <article-title>Neuromorphic analogue VLSI</article-title>. <source>Annu. Rev. Neurosci.</source> <volume>18</volume>, <fpage>255</fpage>&#x02013;<lpage>281</lpage>.<pub-id pub-id-type="doi">10.1146/annurev.ne.18.030195.001351</pub-id><pub-id pub-id-type="pmid">7605063</pub-id></citation></ref>
<ref id="B16"><citation citation-type="web"><person-group person-group-type="author"><collab>FACETS.</collab></person-group> (<year>2009</year>). <article-title>Fast Analog Computing with Emergent Transient States &#x02013; project website</article-title>.<uri xlink:href="http://www.facets-project.org">http://www.facets-project.org</uri>.</citation></ref>
<ref id="B17"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Fieres</surname> <given-names>J.</given-names></name> <name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name></person-group> (<year>2008</year>). <article-title>&#x0201C;Realizing biological spiking network models in a configurable wafer-scale hardware system,&#x0201D;</article-title> in <source>Proceedings of the 2008 International Joint Conference on Neural Networks (IJCNN)</source> (<publisher-loc>Hong Kong</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).</citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ganguli</surname> <given-names>S.</given-names></name> <name><surname>Huh</surname> <given-names>D.</given-names></name> <name><surname>Sompolinsky</surname> <given-names>H.</given-names></name></person-group> (<year>2008</year>). <article-title>Memory traces in dynamical systems</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A.</source> <volume>105</volume>, <fpage>18970</fpage>&#x02013;<lpage>18975</lpage>.<pub-id pub-id-type="doi">10.1073/pnas.0804451105</pub-id><pub-id pub-id-type="pmid">19020074</pub-id></citation></ref>
<ref id="B19"><citation citation-type="thesis"><person-group person-group-type="author"><name><surname>Gr&#x000FC;bl</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <source>VLSI Implementation of a Spiking Neural Network</source>. Ph.D. thesis, <publisher-name>Ruprecht-Karls-University</publisher-name>, <publisher-loc>Heidelberg</publisher-loc>, document No. HD-KIP 07-10.</citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gupta</surname> <given-names>A.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Markram</surname> <given-names>H.</given-names></name></person-group> (<year>2000</year>). <article-title>Organizing principles for a diversity of GABAergic interneurons and synapses in the neocortex</article-title>. <source>Science</source> <volume>287</volume>, <fpage>273</fpage>.<pub-id pub-id-type="doi">10.1126/science.287.5451.273</pub-id><pub-id pub-id-type="pmid">10634775</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haeusler</surname> <given-names>S.</given-names></name> <name><surname>Maass</surname> <given-names>W.</given-names></name></person-group> (<year>2007</year>). <article-title>A statistical analysis of information processing properties of lamina-specific cortical microcircuit models</article-title>. <source>Cereb. Cortex</source> <volume>17</volume>, <fpage>149</fpage>&#x02013;<lpage>162</lpage>.<pub-id pub-id-type="doi">10.1093/cercor/bhj132</pub-id><pub-id pub-id-type="pmid">16481565</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haeusler</surname> <given-names>S.</given-names></name> <name><surname>Schuch</surname> <given-names>K.</given-names></name> <name><surname>Maass</surname> <given-names>W.</given-names></name></person-group> (<year>2009</year>). <article-title>Motif distribution, dynamical properties, and computational performance of two data-based cortical microcircuit templates</article-title>. <source>J. Physiol. Paris</source> <volume>103</volume>, <fpage>73</fpage>&#x02013;<lpage>87</lpage>.<pub-id pub-id-type="doi">10.1016/j.jphysparis.2009.05.006</pub-id><pub-id pub-id-type="pmid">19500669</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Indiveri</surname> <given-names>G.</given-names></name> <name><surname>Chicca</surname> <given-names>E.</given-names></name> <name><surname>Douglas</surname> <given-names>R.</given-names></name></person-group> (<year>2006</year>). <article-title>A VLSI array of low-power spiking neurons and bistable synapses with spike-timing dependent plasticity</article-title>. <source>IEEE Trans. Neural Netw.</source> <volume>17</volume>, <fpage>211</fpage>&#x02013;<lpage>221</lpage>.<pub-id pub-id-type="doi">10.1109/TNN.2005.860850</pub-id><pub-id pub-id-type="pmid">16526488</pub-id></citation></ref>
<ref id="B24"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Kaplan</surname> <given-names>B.</given-names></name> <name><surname>Br&#x000FC;derle</surname> <given-names>D.</given-names></name> <name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>&#x0201C;High-conductance states on a neuromorphic hardware system,&#x0201D;</article-title> in <source>Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN)</source> (<publisher-loc>Atlanta, GA</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).</citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Legenstein</surname> <given-names>R.</given-names></name> <name><surname>Maass</surname> <given-names>W.</given-names></name></person-group> (<year>2007</year>). <article-title>Edge of chaos and prediction of computational performance for neural circuit models</article-title>. <source>Neural Netw.</source> <volume>20</volume>, <fpage>323</fpage>&#x02013;<lpage>334</lpage>.<pub-id pub-id-type="doi">10.1016/j.neunet.2007.04.017</pub-id><pub-id pub-id-type="pmid">17517489</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maass</surname> <given-names>W.</given-names></name> <name><surname>Natschl&#x000E4;ger</surname> <given-names>T.</given-names></name> <name><surname>Markram</surname> <given-names>H.</given-names></name></person-group> (<year>2002</year>). <article-title>Real-time computing without stable states: a new framework for neural computation based on perturbations</article-title>. <source>Neural Comput.</source> <volume>14</volume>, <fpage>2531</fpage>&#x02013;<lpage>2560</lpage>.<pub-id pub-id-type="doi">10.1162/089976602760407955</pub-id><pub-id pub-id-type="pmid">12433288</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maass</surname> <given-names>W.</given-names></name> <name><surname>Natschl&#x000E4;ger</surname> <given-names>T.</given-names></name> <name><surname>Markram</surname> <given-names>H.</given-names></name></person-group> (<year>2004</year>). <article-title>Fading memory and kernel properties of generic cortical microcircuit models</article-title>. <source>J. Physiol. Paris</source> <volume>98</volume>, <fpage>315</fpage>&#x02013;<lpage>330</lpage>.<pub-id pub-id-type="doi">10.1016/j.jphysparis.2005.09.020</pub-id><pub-id pub-id-type="pmid">16310350</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Markram</surname> <given-names>H.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Tsodyks</surname> <given-names>M.</given-names></name></person-group> (<year>1998</year>). <article-title>Differential signaling via the same axon of neocortical pyramidal neurons</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A.</source> <volume>95</volume>, <fpage>5323</fpage>&#x02013;<lpage>5328</lpage>.<pub-id pub-id-type="doi">10.1073/pnas.95.9.5323</pub-id><pub-id pub-id-type="pmid">9560274</pub-id></citation></ref>
<ref id="B29"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Mead</surname> <given-names>C. A.</given-names></name></person-group> (<year>1989</year>). <source>Analog VLSI and Neural Systems</source>. <publisher-loc>Reading, MA</publisher-loc>: <publisher-name>Addison Wesley</publisher-name>.</citation></ref>
<ref id="B30"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Merolla</surname> <given-names>P. A.</given-names></name> <name><surname>Boahen</surname> <given-names>K.</given-names></name></person-group> (<year>2006</year>). <article-title>&#x0201C;Dynamic computation in a recurrent network of heterogeneous silicon neurons,&#x0201D;</article-title> in <source>Proceedings of the 2006 IEEE International Symposium on Circuits and Systems (ISCAS 2006)</source> (<publisher-loc>Island of Kos</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).<pub-id pub-id-type="pmid">16608123</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mitra</surname> <given-names>S.</given-names></name> <name><surname>Fusi</surname> <given-names>S.</given-names></name> <name><surname>Indiveri</surname> <given-names>G.</given-names></name></person-group> (<year>2009</year>). <article-title>Real-time classification of complex patterns using spike-based learning in neuromorphic VLSI</article-title>. <source>IEEE Trans. Biomed. Circuits Syst.</source> <volume>3</volume>, <fpage>32</fpage>&#x02013;<lpage>42</lpage>.<pub-id pub-id-type="doi">10.1109/TBCAS.2008.2005781</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morrison</surname> <given-names>A.</given-names></name> <name><surname>Diesmann</surname> <given-names>M.</given-names></name> <name><surname>Gerstner</surname> <given-names>W.</given-names></name></person-group> (<year>2008</year>). <article-title>Phenomenological models of synaptic plasticity based on spike timing</article-title>. <source>Biol. Cybern.</source> <volume>98</volume>, <fpage>459</fpage>&#x02013;<lpage>478</lpage>.<pub-id pub-id-type="doi">10.1007/s00422-008-0233-1</pub-id><pub-id pub-id-type="pmid">18491160</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morrison</surname> <given-names>A.</given-names></name> <name><surname>Mehring</surname> <given-names>C.</given-names></name> <name><surname>Geisel</surname> <given-names>T.</given-names></name> <name><surname>Aertsen</surname> <given-names>A.</given-names></name> <name><surname>Diesmann</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Advancing the boundaries of high connectivity network simulation with distributed computing</article-title>. <source>Neural Comput.</source> <volume>17</volume>, <fpage>1776</fpage>&#x02013;<lpage>1801</lpage>.<pub-id pub-id-type="doi">10.1162/0899766054026648</pub-id><pub-id pub-id-type="pmid">15969917</pub-id></citation></ref>
<ref id="B34"><citation citation-type="thesis"><person-group person-group-type="author"><name><surname>M&#x000FC;ller</surname> <given-names>E.</given-names></name></person-group> (<year>2008</year>). <source>Operation of an Imperfect Neuromorphic Hardware Device</source>. Diploma thesis (English), <publisher-name>University of Heidelberg</publisher-name>, HD-KIP-08-43.</citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pecevski</surname> <given-names>D. A.</given-names></name> <name><surname>Natschl&#x000E4;ger</surname> <given-names>T.</given-names></name> <name><surname>Schuch</surname> <given-names>K. N.</given-names></name></person-group> (<year>2009</year>). <article-title>PCSIM: a parallel simulation environment for neural circuits fully integrated with Python</article-title>. <source>Front. Neuroinform.</source> <volume>3</volume>:<fpage>11</fpage>.<pub-id pub-id-type="doi">10.3389/neuro.11.011.2009</pub-id></citation></ref>
<ref id="B36"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Rossum</surname> <given-names>G. V.</given-names></name></person-group> (<year>2000</year>). <source>Python Reference Manual</source>, February 19, 1999, Release 1.5.2. <publisher-loc>Bloomington, IN</publisher-loc>: <publisher-name>iUniverse, Incorporated</publisher-name>.</citation></ref>
<ref id="B37"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Br&#x000FC;derle</surname> <given-names>D.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name> <name><surname>Ostendorf</surname> <given-names>B.</given-names></name></person-group> (<year>2007</year>). <article-title>&#x0201C;Modeling synaptic plasticity within networks of highly accelerated I&#x0026;F neurons,&#x0201D;</article-title> in <source>Proceedings of the 2007 IEEE International Symposium on Circuits and Systems (ISCAS&#x02019;07)</source> (<publisher-loc>New Orleans, LA</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).</citation></ref>
<ref id="B38"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Fieres</surname> <given-names>J.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name></person-group> (<year>2008</year>). <article-title>&#x0201C;Wafer-scale integration of analog neural networks,&#x0201D;</article-title> in <source>Proceedings of the 2008 International Joint Conference on Neural Networks (IJCNN)</source> (<publisher-loc>Hong Kong</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).<pub-id pub-id-type="pmid">18647022</pub-id></citation></ref>
<ref id="B39"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Schemmel</surname> <given-names>J.</given-names></name> <name><surname>Gr&#x000FC;bl</surname> <given-names>A.</given-names></name> <name><surname>Meier</surname> <given-names>K.</given-names></name> <name><surname>Muller</surname> <given-names>E.</given-names></name></person-group> (<year>2006</year>). <article-title>&#x0201C;Implementing synaptic plasticity in a VLSI spiking neural network model,&#x0201D;</article-title> in <source>Proceedings of the 2006 International Joint Conference on Neural Networks (IJCNN&#x02019;06)</source> (<publisher-loc>Vancouver</publisher-loc>: <publisher-name>IEEE Press</publisher-name>).</citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shelley</surname> <given-names>M.</given-names></name> <name><surname>McLaughlin</surname> <given-names>D.</given-names></name> <name><surname>Shapley</surname> <given-names>R.</given-names></name> <name><surname>Wielaard</surname> <given-names>J.</given-names></name></person-group> (<year>2002</year>). <article-title>States of high conductance in a large-scale model of the visual cortex</article-title>. <source>J. Comput. Neurosci.</source> <volume>13</volume>, <fpage>93</fpage>&#x02013;<lpage>109</lpage>.<pub-id pub-id-type="doi">10.1023/A:1020158106603</pub-id><pub-id pub-id-type="pmid">12215724</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Song</surname> <given-names>S.</given-names></name> <name><surname>Miller</surname> <given-names>K.</given-names></name> <name><surname>Abbott</surname> <given-names>L.</given-names></name></person-group> (<year>2000</year>). <article-title>Competitive Hebbian learning through spiketiming-dependent synaptic plasticity</article-title>. <source>Nat. Neurosci.</source> <volume>3</volume>, <fpage>919</fpage>&#x02013;<lpage>926</lpage>.<pub-id pub-id-type="doi">10.1038/78829</pub-id><pub-id pub-id-type="pmid">10966623</pub-id></citation></ref>
<ref id="B42"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Steriade</surname> <given-names>M.</given-names></name></person-group> (<year>2001</year>). <source>The Intact and Sliced Brain.</source> <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name>.</citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Steriade</surname> <given-names>M.</given-names></name> <name><surname>Timofeev</surname> <given-names>I.</given-names></name> <name><surname>Grenier</surname> <given-names>F.</given-names></name></person-group> (<year>2001</year>). <article-title>Natural waking and sleep states: a view from inside neocortical neurons</article-title>. <source>J. Neurophysiol.</source> <volume>85</volume>, <fpage>1969</fpage>&#x02013;<lpage>1985</lpage>.<pub-id pub-id-type="pmid">11353014</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sussillo</surname> <given-names>D.</given-names></name> <name><surname>Toyoizumi</surname> <given-names>T.</given-names></name> <name><surname>Maass</surname> <given-names>W.</given-names></name></person-group> (<year>2007</year>). <article-title>Self-tuning of neural circuits through short-term synaptic plasticity</article-title>. <source>J. Neurophysiol.</source> <volume>97</volume>, <fpage>4079</fpage>&#x02013;<lpage>4095</lpage>.<pub-id pub-id-type="doi">10.1152/jn.01357.2006</pub-id><pub-id pub-id-type="pmid">17409166</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vogelstein</surname> <given-names>R. J.</given-names></name> <name><surname>Mallik</surname> <given-names>U.</given-names></name> <name><surname>Vogelstein</surname> <given-names>J. T.</given-names></name> <name><surname>Cauwenberghs</surname> <given-names>G.</given-names></name></person-group> (<year>2007</year>). <article-title>Dynamically reconfigurable silicon array of spiking neurons with conductance-based synapses</article-title>. <source>IEEE Trans. Neural Netw.</source> <volume>18</volume>, <fpage>253</fpage>&#x02013;<lpage>265</lpage>.<pub-id pub-id-type="doi">10.1109/TNN.2006.883007</pub-id><pub-id pub-id-type="pmid">17278476</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zucker</surname> <given-names>R. S.</given-names></name> <name><surname>Regehr</surname> <given-names>W. G.</given-names></name></person-group> (<year>2002</year>). <article-title>Short-term synaptic plasticity</article-title>. <source>Annu. Rev. Physiol.</source> <volume>64</volume>, <fpage>355</fpage>&#x02013;<lpage>405</lpage>.<pub-id pub-id-type="doi">10.1146/annurev.physiol.64.092501.114547</pub-id><pub-id pub-id-type="pmid">11826273</pub-id></citation></ref>
</ref-list>
</back>
</article>