<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neuroinform.</journal-id>
<journal-title>Frontiers in Neuroinformatics</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neuroinform.</abbrev-journal-title>
<issn pub-type="epub">1662-5196</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fninf.2014.00085</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Technology Report Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Muir</surname> <given-names>Dylan R.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/26847"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Kampa</surname> <given-names>Bj&#x000F6;rn M.</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/10890"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Neurophysiology, Brain Research Institute, University of Z&#x000FC;rich</institution> <country>Z&#x000FC;rich, Switzerland</country></aff>
<aff id="aff2"><sup>2</sup><institution>Biozentrum, University of Basel</institution> <country>Basel, Switzerland</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Neurophysiology, Institute of Biology 2, RWTH Aachen University</institution> <country>Aachen, Germany</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Arjen Van Ooyen, VU University Amsterdam, Netherlands</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Thomas Wachtler, Ludwig-Maximilians-Universit&#x000E4;t M&#x000FC;nchen, Germany; Arjen Van Ooyen, VU University Amsterdam, Netherlands</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Dylan R. Muir, Biozentrum, University of Basel, Klingelbergstrasse 50/70, 4056 Basel, Switzerland e-mail: <email>dylan.muir&#x00040;unibas.ch</email></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to the journal Frontiers in Neuroinformatics.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>20</day>
<month>01</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<volume>8</volume>
<elocation-id>85</elocation-id>
<history>
<date date-type="received">
<day>22</day>
<month>09</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>02</day>
<month>12</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2015 Muir and Kampa.</copyright-statement>
<copyright-year>2015</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract><p>Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using <italic>ad-hoc</italic> solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, <monospace>FocusStack</monospace>, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, <monospace>StimServer</monospace>, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. <monospace>FocusStack</monospace> is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref>.</p></abstract>
<kwd-group>
<kwd>two-photon calcium imaging</kwd>
<kwd>neuronal responses</kwd>
<kwd>Matlab</kwd>
<kwd>visual stimulus generation</kwd>
<kwd>analysis toolbox</kwd>
<kwd>small memory footprint</kwd>
<kwd>open source</kwd>
</kwd-group>
<counts>
<fig-count count="6"/>
<table-count count="4"/>
<equation-count count="0"/>
<ref-count count="18"/>
<page-count count="13"/>
<word-count count="7435"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction" id="s1">
<title>1. Introduction</title>
<p>Two-photon calcium imaging has become a major method for recording neuronal activity. However, analysis of the acquired data has special requirements because of the image-based data format. In addition, increasing spatial and temporal resolution requires increasing computational power from the analysis system. While consumer computing hardware is cheap and accessible to most researchers, it is usually limited in maximum addressable memory. Microscopes with resonance scanners, which are increasingly becoming standard equipment for two-photon imaging of neuronal signals on fast timescales, can easily generate on the order of 10 MB (10 &#x000D7; 2<sup>20</sup> bytes) of data per second. Coupled with the trend toward imaging in awake, behaving animals, which necessitates lengthy imaging trials, single imaging sessions can produce tens of gigabytes (10 &#x000D7; 2<sup>30</sup> bytes) of data. In-memory analysis of two-photon imaging data therefore entails considerable and increasing hardware requirements, leading to a rapid rise in cost and a corresponding fall in accessibility.</p>
<p>Here we present a new, open-source, Matlab-based two-photon analysis toolchain designed to process large two-photon imaging stacks with only a small memory footprint. This makes analysis possible on standard consumer hardware. We also present a new open-source Matlab-based server for visual stimulus generation and presentation which can be controlled remotely over TCP or UDP network links.</p>
<p>In Section 2 we present a high-level overview of our stimulation and analysis toolchain. In Section 3 we discuss how the end user interacts with <monospace>FocusStack</monospace>, and how the design of <monospace>FocusStack</monospace> makes analysis of two-photon imaging data simpler. In Section 4 we present the low-level representation of a <monospace>FocusStack</monospace> object, and discuss how <monospace>FocusStack</monospace> can easily be adapted to new two-photon imaging data formats. In Section 5 we discuss the design of <monospace>StimServer</monospace>, and how stimuli are configured and queued during an experiment. In Section 6 we present an example two-photon imaging experiment and analysis using <monospace>StimServer</monospace> and <monospace>FocusStack</monospace>. The experimental data analyzed, and Matlab scripts required to reproduce the analysis, are available as Supplementary Material.</p>
<sec>
<title>1.1. Existing two-photon stack analysis packages</title>
<p>The only other publicly-available two-photon processing system, at the time of writing, is the Two-Photon Processor (2PP; Tomek et al., <xref ref-type="bibr" rid="B17">2013</xref>). 2PP is a GUI-based Matlab toolbox for analyzing two-photon calcium data, and performs automated ROI segmentation, stack alignment, and calcium signal extraction. Our toolchain differs from 2PP in a number of ways:</p>
<list list-type="bullet">
<list-item><p><monospace>FocusStack</monospace> is aware of stimulus identity and timing, and automates derandomization of time-series data, when stimuli are presented in random order;</p></list-item>
<list-item><p><monospace>FocusStack</monospace> is command-based, as opposed to the GUI-based interface of 2PP. We believe a command-based interface is more appropriate for analysis of experimental data, where identical analysis steps need to be repeated for several data sets;</p></list-item>
<list-item><p><monospace>FocusStack</monospace> is designed for small memory requirements, using novel Matlab classes for efficient data access. 2PP is not designed for lazy data access, implying that entire stacks must be analyzed in-memory, with a consequently large memory footprint;</p></list-item>
<list-item><p><monospace>FocusStack</monospace> interfaces directly with additional Matlab analysis tools for spike estimation from calcium response traces. 2PP incorporates these algorithms internally.</p></list-item>
</list>
</sec>
</sec>
<sec>
<title>2. Toolchain overview</title>
<p>The design goal of <monospace>FocusStack</monospace> and <monospace>StimServer</monospace> was to provide simple, extensible tools to assist experiments using two-photon calcium imaging of neuronal responses; accessible to those with little programming experience, but powerful enough to automate most low-level analysis of calcium response stacks. Although alternative programming languages are increasing in popularity (such as Python, R and Octave), Matlab remains an accessible and frequently used tool for analysis and statistical testing, with a reputation for simple uptake by non-programmers. For this reason, we designed a toolchain that allows users to design and script their entire analysis in Matlab, without the need for additional software packages.</p>
<p>An overview of a two-photon acquisition and visual stimulation setup is shown in Figure <xref ref-type="fig" rid="F1">1</xref>. Due to the real-time requirements of both visual stimulation and acquisition of two-photon imaging data, these tasks are usually performed on separate dedicated computing systems (Figure <xref ref-type="fig" rid="F1">1A</xref>). <monospace>StimServer</monospace> is controlled over a TCP or UDP network link, to trigger stimulus presentation and sequencing. Two-photon acquisition occurs using the software appropriate for the experimental equipment used, and stores the resulting imaging stacks as binary data files on disk. Ideally, meta-data about the stack&#x02014;stimulus identity and random sequencing, stack resolution, information about the acquisition system, etc.&#x02014;is stored with the stack data files in a file header or a &#x0201C;side-car&#x0201D; meta-data file.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>Overview of visual stimulation, stack acquisition and analysis in Matlab. (A)</bold> <monospace>StimServer</monospace> is used to generate and present visual stimuli to an animal, under the control of a calcium imaging system, over a network link. Data is stored in a binary format. <bold>(B)</bold> A <monospace>FocusStack</monospace> object is created in Matlab, to access several sequential data blocks as a single concatenated stack. <bold>(C)</bold> The <monospace>FocusStack</monospace> object &#x0201C;<monospace>fs</monospace>&#x0201D; can now be accessed as a Matlab tensor, and passed into Matlab functions. <bold>(D)</bold> Extracting a single frame is as simple as referencing a Matlab tensor. <bold>(E)</bold> Extracting the response trace through time of a single pixel is equally simple.</p></caption>
<graphic xlink:href="fninf-08-00085-g0001.tif"/>
</fig>
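<p>The trigger mechanism itself is lightweight: a single network datagram from the acquisition machine is sufficient to cue a stimulus. The sketch below illustrates this idea with a minimal UDP sender, written in Python for a self-contained illustration; the &#x0201C;<monospace>STIM &#x0003C;id&#x0003E;</monospace>&#x0201D; command format is purely hypothetical and is not <monospace>StimServer</monospace>'s actual command set.</p>

```python
import socket

def send_stimulus_trigger(host, port, stimulus_id, timeout=1.0):
    """Send a plain-text trigger command to a stimulus server over UDP.

    The "STIM <id>" command format used here is illustrative only;
    it is not StimServer's actual protocol.
    """
    message = "STIM {}".format(stimulus_id).encode("ascii")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(message, (host, port))

# Loopback demonstration: a toy "server" socket receives the trigger
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # OS assigns a free port
_, port = server.getsockname()
send_stimulus_trigger("127.0.0.1", port, 3)
data, _ = server.recvfrom(1024)
print(data.decode("ascii"))            # prints "STIM 3"
server.close()
```

<p>A TCP variant would differ only in maintaining a persistent connection; UDP suits single fire-and-forget trigger messages.</p>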
<p>Binary stack data files are analyzed in Matlab, by using <monospace>FocusStack</monospace> to map several stack files to a single <monospace>FocusStack</monospace> object (Figure <xref ref-type="fig" rid="F1">1B</xref>). This object appears as a simple Matlab tensor (i.e., a multi-dimensional Matlab matrix), with frames, channels, and single pixels accessed using standard Matlab referencing (Figures <xref ref-type="fig" rid="F1">1D-E</xref>). Since <monospace>FocusStack</monospace> objects can be accessed as Matlab tensors, many existing Matlab analysis functions that expect tensors can seamlessly be passed <monospace>FocusStack</monospace> objects without modification (Figure <xref ref-type="fig" rid="F1">1C</xref>).</p>
<p>However, <monospace>FocusStack</monospace> objects are aware of stimulus timing and sequencing, provide services for stack alignment, provide support for assigning baseline fluorescence distributions, and have simple helper functions to perform derandomization and segmentation of stack data. These facilities are described in the next section. An example flow-chart for analysis using a <monospace>FocusStack</monospace> object is shown in Figure <xref ref-type="fig" rid="F3">3</xref>.</p>
</sec>
<sec>
<title>3. High-level interface to FocusStack</title>
<p><monospace>FocusStack</monospace> is designed to provide a smart wrapper interface to two-photon imaging data. An example of creating and accessing a two-photon imaging stack using a <monospace>FocusStack</monospace> object is given in Figure <xref ref-type="fig" rid="F1">1B</xref>. Several files of acquired calcium responses can be wrapped by a single <monospace>FocusStack</monospace> object, which appears to Matlab and to calling functions as a simple Matlab tensor (Figures <xref ref-type="fig" rid="F1">1C&#x02013;E</xref>).</p>
<sec>
<title>3.1. Stack meta-data</title>
<p>Although a <monospace>FocusStack</monospace> object can be accessed just like a Matlab tensor, each frame and pixel in the stack has many items of meta-data associated with it (see Figure <xref ref-type="fig" rid="F2">2</xref> and Tables <xref ref-type="table" rid="T1">1</xref>,<xref ref-type="table" rid="T2">2</xref>). Meta-data consists of stack-global information such as the resolution (pixels per &#x003BC;m) and dimensions of the stack, imaging frame rate and Z-step per frame. Stimulus-specific global meta-data includes the number of unique stimuli presented, the duration of each stimulus, the order of presentation, and which periods of the stimulus presentation should be used for analysis (see Listing 2).</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>Stimulus information and other meta-data associated with each frame</bold>. A series of drifting grating stimuli were presented in randomized order (see values of <monospace>nStimulusSeqID</monospace>), over several repeats (see values of <monospace>nBlockIndex</monospace>). Using <monospace>FocusStack.FrameStimulusInfo</monospace>, the stimulus meta-data associated with each frame can be accessed (meta-data is listed for the frame indicated by the arrow). In addition, each frame is associated with a mis-alignment shift (<monospace>mfFrameShifts</monospace>), a &#x0201C;black&#x0201D; level (<monospace>vfBlackTrace</monospace>), and a per-pixel baseline distribution (insets at top right). Colors at top left indicate the corresponding traces of meta-data values in the time-series plot.</p></caption>
<graphic xlink:href="fninf-08-00085-g0002.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p><bold>List of meta-data provided by a <monospace>FocusStack</monospace> object</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Meta-data parameter name</bold></th>
<th align="left"><bold>Contents</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="2"><bold>STACK-GLOBAL META-DATA</bold></td>
</tr>
<tr>
<td align="left"><monospace>.fPixelsPerUM</monospace></td>
<td align="left">Spatial calibration of the stack (X and Y), in pixels per &#x003BC;m</td>
</tr>
<tr>
<td align="left"><monospace>.tFrameDuration</monospace></td>
<td align="left">Acquisition time per frame, in seconds</td>
</tr>
<tr>
<td align="left"><monospace>.fZStep</monospace></td>
<td align="left">Z spacing between subsequent frames, in &#x003BC;m</td>
</tr>
<tr>
<td align="left"><monospace>.mfFrameShifts</monospace></td>
<td align="left">Misalignment shifts for each frame, in fractional pixels. Assigned manually, or using alignment method <monospace>FocusStack/Align</monospace></td>
</tr>
<tr>
<td align="left"><monospace>.vfBlackTrace</monospace></td>
<td align="left">Black set-point value for each frame, in raw units. Assigned manually, or using utility method <monospace>FocusStack/DefineBlackRegion</monospace></td>
</tr>
<tr>
<td align="left" colspan="2"><bold>STIMULUS-RELATED META-DATA</bold></td>
</tr>
<tr>
<td align="left"><monospace>.tBlankTime</monospace></td>
<td align="left">Blank time between episodic visual stimuli</td>
</tr>
<tr>
<td align="left"><monospace>.vnStimulusIDs</monospace></td>
<td align="left">List of stimuli presented (one stimulus ID per data file). Each stimulus ID can contain a sequence of several individual stimuli</td>
</tr>
<tr>
<td align="left"><monospace>.nNumStimuli</monospace></td>
<td align="left">(Read-only) Total number of individual stimulus sequence IDs presented in the entire stack</td>
</tr>
<tr>
<td align="left"><monospace>.cvnSequenceIDs</monospace></td>
<td align="left">Cell array, each element containing a vector of stimulus sequence IDs in the order they were presented</td>
</tr>
<tr>
<td align="left"><monospace>.vtStimulusDurations</monospace></td>
<td align="left">Vector of stimulus sequence ID durations, in seconds. Each entry specifies the duration of the corresponding stimulus sequence ID</td>
</tr>
<tr>
<td align="left"><monospace>.vtStimulusStartTimes</monospace></td>
<td align="left">Vector of onset times for each individual stimulus presentation, as an offset in seconds from the first frame of the stack. Assigned manually, or computed automatically</td>
</tr>
<tr>
<td align="left"><monospace>.vtStimulusEndTimes</monospace></td>
<td align="left">Vector of end times for each individual stimulus presentation, as an offset in seconds from the first frame of the stack. Assigned manually, or computed automatically</td>
</tr>
<tr>
<td align="left"><monospace>.mtStimulusUseTimes</monospace></td>
<td align="left">Matrix of times indicating which periods of stimulus presentation should be used for analysis. Each row corresponds to a stimulus sequence ID, and is a row vector <monospace>[tStartTime tStopTime]</monospace>, indicating offsets from the start of the presentation of the corresponding stimulus</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>All the parameters listed here are <monospace>FocusStack</monospace> class properties, and should be assigned from meta-data stored with the data file, whenever possible.</italic></p>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p><bold>List of meta-data provided by a <monospace>FocusStack</monospace> object (continued from Table <xref ref-type="table" rid="T1">1</xref>)</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Meta-data parameter name</bold></th>
<th align="left"><bold>Contents</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="2"><bold>FRAME-RELATED META-DATA</bold></td>
</tr>
<tr>
<td align="left"><monospace>vtGlobalTime</monospace></td>
<td align="left">The time in seconds since the first frame in the stack. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vnBlockIndex</monospace></td>
<td align="left">The index of the block (data file) the associated frame falls within. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vnFrameInBlock</monospace></td>
<td align="left">The index of the associated frame within the block, with the first frame given index 1. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vtTimeInBlock</monospace></td>
<td align="left">The time in seconds since the first frame in the block. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vnStimulusSeqID</monospace></td>
<td align="left">The stimulus sequence ID associated with each frame. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vtTimeInStimPresentation</monospace></td>
<td align="left">The time in seconds of the associated frame since the onset of the stimulus in which the frame falls. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vnPresentationIndex</monospace></td>
<td align="left">The index of the current stimulus presentation in the entire stack. The first stimulus is given index 1. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>vbUseFrame</monospace></td>
<td align="left">A boolean value associated with each frame, indicating whether that frame should be used for analysis. (FSI)</td>
</tr>
<tr>
<td align="left"><monospace>tfBlankMean</monospace>, <monospace>tfBlankStd</monospace></td>
<td align="left">Mean and standard deviation of the baseline distribution assigned to a stack. Obtained by referencing a stack using <monospace>fs.BlankFrames(&#x0003C;stack reference&#x0003E;)</monospace> or by using the <monospace>FocusStack/GetCorrespondingBlankFrames</monospace> class method</td>
</tr>
<tr>
<td/>
<td align="left">The baseline distribution is assigned using the <monospace>FocusStack/AssignBlankFrame</monospace> class method (see Section 3.1.2 and Listing <xref ref-type="table" rid="T5">1</xref>)</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Parameters listed that specify &#x0201C;FSI&#x0201D; for access are obtained using the <monospace>FocusStack/FrameStimulusInfo</monospace> class method, and are computed from the parameters given in Table <xref ref-type="table" rid="T1">1</xref>.</italic></p>
</table-wrap-foot>
</table-wrap>
<p>Associating this meta-data with the stack enables <monospace>FocusStack</monospace> to provide detailed information about each frame (Figure <xref ref-type="fig" rid="F2">2</xref>), using the <monospace>FrameStimulusInfo</monospace> method. Each frame is tagged with the information about the stimulus being presented while that frame was being acquired.</p>
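<p>The per-frame bookkeeping behind <monospace>FrameStimulusInfo</monospace> amounts to locating each frame's acquisition time within the concatenated sequence of stimulus presentations. A minimal sketch of this computation, written in Python for a self-contained illustration (with hypothetical argument names; this is not the toolbox's code), is:</p>

```python
def frame_stimulus_info(n_frames, t_frame, seq_ids, durations):
    """For each frame, find the stimulus presented while it was acquired.

    n_frames  : number of frames in the stack
    t_frame   : acquisition time per frame, in seconds
    seq_ids   : stimulus sequence IDs in presentation order, e.g. [2, 0, 1]
    durations : duration in seconds of each presentation, matching seq_ids

    Returns a list of (stimulus_id, time_in_presentation) tuples, one per
    frame; frames falling after the final presentation are tagged
    (None, None). A sketch of the bookkeeping only, not FocusStack's code.
    """
    # Onset time of each presentation, as an offset from the first frame
    onsets = [sum(durations[:i]) for i in range(len(durations))]
    total = sum(durations)
    info = []
    for n in range(n_frames):
        t = n * t_frame                     # acquisition time of frame n
        if t >= total:
            info.append((None, None))
            continue
        # The most recent presentation whose onset precedes this frame
        idx = max(i for i, t0 in enumerate(onsets) if t0 <= t)
        info.append((seq_ids[idx], t - onsets[idx]))
    return info
```

<p>Real acquisition adds jitter between nominal and actual stimulus onsets, which is why storing measured onset times (<monospace>vtStimulusStartTimes</monospace>) with the stack is preferable to recomputing them.</p>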
<p>For file formats supported out of the box by <monospace>FocusStack</monospace>, meta-data stored in the data files is automatically loaded and assigned to the <monospace>FocusStack</monospace> object. When implementing two-photon acquisition software and extending <monospace>FocusStack</monospace> to support the corresponding file formats, care should be taken to store and automatically load as much meta-data as possible.</p>
<sec>
<title>3.1.1. Stack alignment</title>
<p>Ensuring that each successive frame is aligned with the previous frames in the stack is essential for extracting calcium responses with a high signal-to-noise ratio. <monospace>FocusStack</monospace> provides transparent internal support for sub-pixel rigid linear translation of each frame. A <monospace>FocusStack</monospace> object stores a [2 &#x000D7; <italic>N</italic>] double array <monospace>mfFrameShifts</monospace> of displacements, one per frame, relative to the unshifted stack origin. If misalignments are assigned to the stack, then <monospace>FocusStack</monospace> transparently shifts each frame when the stack is accessed. Re-alignment of each frame occurs lazily, the first time a frame is requested; the result is subsequently cached on disk for quick repeated access.</p>
<p><monospace>FocusStack</monospace> includes a sub-pixel, rigid linear translation alignment algorithm based on Fourier phase matching (Guizar-Sicairos et al., <xref ref-type="bibr" rid="B7">2008</xref>; example in Listing <xref ref-type="table" rid="T5">1</xref>). This algorithm supports progressive alignment, or alignment to a reference frame or image. Per-frame spatial filtering, sliding-window frame averaging and single-frame shift-size rejection are available to regularize the alignment process. However, any alignment algorithm can be used, with the resulting shifts manually assigned to the <monospace>FocusStack.mfFrameShifts</monospace> property.</p>
<table-wrap position="float" id="T5">
<label>Listing 1</label>
<caption><p><bold>Creating a <monospace>FocusStack</monospace> object; performing stack alignment; estimating the baseline fluorescence distribution and assigning it to the <monospace>FocusStack</monospace> object</bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0001.tif"/>
</table-wrap>
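<p>The core of the Fourier phase-matching approach can be summarized compactly: the normalized cross-power spectrum of two frames is a pure phase term whose inverse transform peaks at the relative displacement. The following is a minimal, integer-pixel sketch of this idea, written in Python/NumPy for a self-contained illustration; the algorithm of Guizar-Sicairos et al. (2008) used by <monospace>FocusStack</monospace> additionally upsamples around the correlation peak to reach sub-pixel precision.</p>

```python
import numpy as np

def phase_correlation_shift(reference, frame):
    """Estimate the integer (row, col) shift of `frame` relative to
    `reference` by Fourier phase correlation (integer-pixel sketch only)."""
    R = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    R /= np.abs(R) + 1e-12                  # discard amplitude, keep phase
    corr = np.fft.ifft2(R).real             # peaks at minus the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for p, n in zip(peak, corr.shape):
        d = (-p) % n                        # undo the sign convention...
        if d > n // 2:
            d -= n                          # ...and wrap to a signed range
        shift.append(d)
    return tuple(shift)

# A frame circularly shifted by (3, -5) pixels is recovered exactly
rng = np.random.default_rng(42)
ref = rng.standard_normal((32, 32))
print(phase_correlation_shift(ref, np.roll(ref, (3, -5), axis=(0, 1))))
# prints (3, -5)
```

<p>Because only two FFTs and one inverse FFT are needed per frame, this class of algorithm scales well to the long stacks produced by resonance-scanner systems.</p>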
</sec>
<sec>
<title>3.1.2. Defining the baseline fluorescence distribution</title>
<p>Two-photon calcium imaging commonly uses fluorescent dyes that change their conformation and emission properties when bound to Ca<sup>2&#x0002B;</sup> ions. In single-channel imaging, the calcium concentration change (a proxy for firing rate) of a neuron is related to the proportional increase in fluorescence (e.g., OGB; Grynkiewicz et al., <xref ref-type="bibr" rid="B6">1985</xref>). In FRET imaging, a molecule consisting of two bound fluorophores with differing emission wavelengths causes a differential change in fluorescence of both fluorophores when bound to Ca<sup>2&#x0002B;</sup> (e.g., Yellow Cameleon; Nagai et al., <xref ref-type="bibr" rid="B12">2004</xref>). In this case the response signal is the ratio of the responses in the two imaging channels.</p>
<p>Both of these techniques require estimation of the &#x0201C;baseline&#x0201D; fluorescence (<italic>F</italic><sub>0</sub>) of a neuron to define the differential calcium response &#x00394;<italic>F</italic>/<italic>F</italic><sub>0</sub>. In <monospace>FocusStack</monospace>, this is obtained through two mechanisms. Firstly, each frame is assigned a &#x0201C;black&#x0201D; reference level using the <monospace>FocusStack.DefineBlackRegion</monospace> method or by directly setting the <monospace>FocusStack.vfBlackTrace</monospace> property. <monospace>DefineBlackRegion</monospace> defines a region of the stack which is expected to have zero fluorescence (for example, the interior of a blood vessel), either by providing a set of pixel indices or through GUI-based selection of a circular region.</p>
<p>Secondly, a baseline fluorescence distribution (<inline-formula><mml:math id="M1"><mml:mrow><mml:mover accent='true'><mml:mi>F</mml:mi><mml:mo stretchy='true'>&#x0005E;</mml:mo></mml:mover></mml:mrow></mml:math></inline-formula><sub>0</sub> and &#x003C3;<sub><italic>F</italic><sub>0</sub></sub>) must be estimated and recorded for each frame. An example procedure for estimating the baseline distribution using <monospace>FocusStack</monospace> is given in Listing <xref ref-type="table" rid="T5">1</xref>. In <monospace>FocusStack</monospace>, the baseline distribution is stored efficiently as stack meta-data (Figure <xref ref-type="fig" rid="F2">2</xref>). Each baseline frame is associated with a range of stack frames, leading to minimal storage requirements.</p>
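<p>Conceptually, assigning the baseline and computing the differential response reduces to a per-pixel normalization against the blank-period statistics. A minimal sketch of this computation, written in Python/NumPy for a self-contained illustration (<monospace>FocusStack</monospace> itself stores the baseline distribution compactly as per-frame meta-data rather than recomputing it on every access), is:</p>

```python
import numpy as np

def delta_f_over_f(stack, blank_frame_indices, black_level=0.0):
    """Compute the differential calcium response dF/F0 for a stack.

    stack               : array [height, width, n_frames] of raw fluorescence
    blank_frame_indices : frames acquired during blank (no-stimulus) periods
    black_level         : zero-fluorescence set-point to subtract first

    Returns (dff, f0_mean, f0_std). A sketch of the computation only;
    assumes a nonzero baseline at every pixel.
    """
    blanks = stack[:, :, blank_frame_indices].astype(float) - black_level
    f0_mean = blanks.mean(axis=2)            # per-pixel baseline F0
    f0_std = blanks.std(axis=2)              # per-pixel baseline spread
    raw = stack.astype(float) - black_level
    dff = (raw - f0_mean[:, :, None]) / f0_mean[:, :, None]
    return dff, f0_mean, f0_std
```

<p>The baseline standard deviation is what allows responses to be expressed as z-scores against the blank distribution, rather than as raw ratios alone.</p>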
</sec>
</sec>
<sec>
<title>3.2. Stack segmentation, stimulus derandomization and extracting calcium responses</title>
<sec>
<title>3.2.1. Defining regions of interest (ROIs)</title>
<p>Since <monospace>FocusStack</monospace> objects appear as Matlab tensors, standard image-processing pipelines can be applied directly. We have included two simple pipelines: the first, <monospace>FindCells_G</monospace>, seeks peaks of intensity in channel 1&#x02014;useful for imaging with calcium indicators that brightly label cell nuclei; the second, <monospace>FindCells_GR</monospace>, subtracts channel 2 from channel 1&#x02014;useful when a second channel is used for a neuron-excluding fluorescent label such as sulforhodamine. Code is also included to import ROI definitions from ImageJ (Schneider et al., <xref ref-type="bibr" rid="B16">2012</xref>).</p>
</sec>
<sec>
<title>3.2.2. Extracting calcium responses</title>
<p>In any good experimental design, stimulus presentation order is randomized. During analysis, segmentation and derandomization of the acquired time series therefore become important but fiddly tasks. Our solution is to store the stimulus presentation order with the stack, along with information about stimulus duration, &#x0201C;blank&#x0201D; stimuli, and periods of stimulus presentation during which analysis of the calcium signals should be performed (see Figure <xref ref-type="fig" rid="F2">2</xref>).</p>
<p>Extracting calcium response time-series from a <monospace>FocusStack</monospace> object is accomplished using the <monospace>ExtractRegionResponses</monospace> function (see Listing <xref ref-type="table" rid="T6">2</xref>). This workhorse function transparently performs stimulus derandomization, simultaneously averages and extracts responses from a number of arbitrary ROIs in the stack, segments the stack into single-trial per-neuron traces and returns estimated responses for each stimulus and each trial. <monospace>FocusStack</monospace> therefore provides an automated extraction of the peri-stimulus time histogram (PSTH) for each presented stimulus.</p>
<table-wrap position="float" id="T6">
<label>Listing 2</label>
<caption><p><bold>Assigning stimulus durations and extracting derandomized calcium traces. Note that the order of stimulus presentation can and should be stored as meta-data by the two-photon acquisition system. If meta-data is present in the data files, then <monospace>FocusStack</monospace> will assign the meta-data to the stack when the stack is created</bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0002.tif"/>
</table-wrap>
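<p>The derandomization step itself reduces to grouping single-trial responses by stimulus identity, using the stored presentation order. A minimal sketch, written in Python/NumPy for a self-contained illustration (<monospace>ExtractRegionResponses</monospace> additionally segments the raw traces and applies a user-defined response measure), is:</p>

```python
import numpy as np

def derandomize_responses(trial_responses, presentation_order, n_stimuli):
    """Average single-trial responses by stimulus identity.

    trial_responses    : array [n_presentations, n_rois] of per-trial
                         response amplitudes, in presentation order
    presentation_order : stimulus ID (0..n_stimuli-1) of each presentation
    n_stimuli          : number of unique stimuli

    Returns an [n_stimuli, n_rois] array of mean responses. This sketches
    the derandomization bookkeeping only, not the toolbox's code.
    """
    trial_responses = np.asarray(trial_responses, dtype=float)
    order = np.asarray(presentation_order)
    means = np.empty((n_stimuli, trial_responses.shape[1]))
    for stim in range(n_stimuli):
        # Collect all trials of this stimulus, wherever they occurred
        means[stim] = trial_responses[order == stim].mean(axis=0)
    return means
```

<p>Keeping the single-trial matrix alongside the per-stimulus means also permits trial-to-trial variability and significance testing downstream.</p>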
<p><monospace>ExtractRegionResponses</monospace> is highly modular, and allows the user to define what a &#x0201C;response&#x0201D; means for a given calcium trace. For example, toolbox functions are included to extract the mean, the peak, and the ratio of a calcium stack; all support either extraction of raw signals or &#x00394;<italic>F</italic>/<italic>F</italic><sub>0</sub> processed data.</p>
<p>A flowchart showing an example of information flow during standard two-photon analysis steps applied to a <monospace>FocusStack</monospace> object is given in Figure <xref ref-type="fig" rid="F3">3</xref>.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p><bold>Information flow of a <monospace>FocusStack</monospace> object, during standard analysis steps applied to a two-photon imaging stack</bold>. Additional analysis steps can easily be added (see text).</p></caption>
<graphic xlink:href="fninf-08-00085-g0003.tif"/>
</fig>
</sec>
</sec>
<sec>
<title>3.3. Interfacing with other software</title>
<p>ROIs are defined using the Matlab region structure format returned by <monospace>bwconncomp</monospace>. This means that <monospace>FocusStack</monospace> can easily accept ROI segmentations determined using the Matlab image processing toolbox. However, code is also included in <monospace>FocusStack</monospace> to import ROIs from ImageJ.</p>
<p>Since all response traces and response values produced by <monospace>FocusStack</monospace> are in standard Matlab formats, existing software for processing calcium response traces can be used directly&#x02014;for example, the fast non-negative deconvolution algorithm for estimating spike times of Vogelstein et al. (<xref ref-type="bibr" rid="B18">2010</xref>), the compressive-sensing approach of Dyer et al. (<xref ref-type="bibr" rid="B3">2013</xref>) or the peeling algorithm of Grewe et al. (<xref ref-type="bibr" rid="B5">2010</xref>) and L&#x000FC;tcke et al. (<xref ref-type="bibr" rid="B11">2013</xref>).</p>
</sec>
</sec>
<sec>
<title>4. Low-level FocusStack representation</title>
<p><monospace>FocusStack</monospace> already provides built-in access to the data format of the previously published 3D imaging software &#x0201C;Focus&#x0201D; (G&#x000F6;bel et al., <xref ref-type="bibr" rid="B4">2007</xref>) and of the open source project &#x0201C;HelioScan&#x0201D; (Langer et al., <xref ref-type="bibr" rid="B10">2013</xref>). At present, a plethora of binary data formats is used by two-photon imaging systems to store recorded calcium signals. Many of these are <italic>ad-hoc</italic>, &#x0201C;in-house&#x0201D; formats, which may change with little warning. For this reason, it is important that a general analysis toolchain be abstracted away from the particular binary format in which data is stored. We designed <monospace>FocusStack</monospace> such that the low-level representation is itself modular, with a standard interface to the rest of the <monospace>FocusStack</monospace> core code. As a result, adding support for a new data format is a matter of an hour&#x00027;s work or less, after which existing analysis scripts will run without modification.</p>
<p><monospace>FocusStack</monospace> contains support for two low-level Matlab classes, which map binary data on disk to a Matlab tensor representation. The first, <monospace>MappedTensor</monospace>, handles arbitrary binary data files with linear representations and fixed numbers of bits per pixel. The second, <monospace>TIFFStack</monospace>, provides rapid access to standard multi-frame, multi-channel TIFF graphics files, which are generated by several common microscopy systems. Both classes use a lazy access paradigm, where data is only loaded from disk when needed.</p>
<sec>
<title>4.1. MappedTensor class</title>
<p>The <monospace>MappedTensor</monospace> class<xref ref-type="fn" rid="fn0002"><sup>2</sup></xref> transparently maps large tensors of arbitrary dimensions to temporary files on disk, or makes existing binary files available as Matlab tensors. Referencing is identical to a standard Matlab tensor, so a <monospace>MappedTensor</monospace> can be passed into functions without requiring that the function be written specifically to use <monospace>MappedTensors</monospace>. In contrast, objects of the built-in Matlab class <monospace>memmapfile</monospace> cannot be used in this way. <monospace>memmapfile</monospace> also occasionally runs out of virtual addressing space, even when the data is stored only on disk; <monospace>MappedTensor</monospace> does not suffer from this problem. Finally, <monospace>MappedTensors</monospace> transparently support complex numbers, another advantage over <monospace>memmapfile</monospace>.</p>
<p>Using <monospace>MappedTensors</monospace> as arguments to functions requires that the tensor be indexed inside the function (as opposed to manipulating the object without sub-referencing). This implies that a function using a <monospace>MappedTensor</monospace> cannot be fully vectorized, but must instead operate on the mapped tensor in segments inside a <monospace>for</monospace> loop. Note that <monospace>parfor</monospace> loops are not officially supported, although they may work on local clusters or with a shared storage architecture. Functions that operate on every element of a tensor, and whose output is the same size as the input tensor, can be applied to a <monospace>MappedTensor</monospace> without requiring the entire tensor to be allocated in memory; this is done with the convenience function <monospace>SliceFunction</monospace>.</p>
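<p>A minimal sketch of this pattern is shown below; the <monospace>SliceFunction</monospace> argument order is illustrative and should be checked against the class documentation.</p>

```matlab
% Hedged sketch: apply an element-wise function over a MappedTensor
% slice by slice, without allocating the full tensor in memory.
% The SliceFunction argument order shown here is illustrative.
mtStack = MappedTensor(128, 128, 7000);              % mapped to disk
mtScaled = SliceFunction(mtStack, @(t)(t .* 2), 3);  % slices along dim 3
```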
<p><monospace>MappedTensor</monospace> offers support for basic operations such as <monospace>permute</monospace> and <monospace>sum</monospace>, without requiring space for the tensor to be allocated in memory. Many operations can be performed in <italic>O</italic>(1) time, such as negation, multiplication and addition, <monospace>transpose</monospace> and <monospace>permute</monospace>. Addition and subtraction of scalars are performed in <italic>O</italic>(<italic>N</italic>) time.</p>
<p><monospace>MappedTensor</monospace> is implemented as a Matlab class, wrapping efficient file access and tensor handling functions. <monospace>MappedTensor</monospace> inherits from the Matlab <monospace>handle</monospace> class, which implies that duplicating a <monospace>MappedTensor</monospace> object does not duplicate the underlying data storage. Copies of a single <monospace>MappedTensor</monospace> contain the same data and properties, and modifying one copy modifies them all.</p>
<p>Examples of using <monospace>MappedTensor</monospace> objects are given in Listing <xref ref-type="table" rid="T7">3</xref>.</p>
<table-wrap position="float" id="T7">
<label>Listing 3</label>
<caption><p><bold>Creating and accessing MappedTensor objects</bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0003.tif"/>
</table-wrap>
</sec>
<sec>
<title>4.2. TIFFStack class</title>
<p>A <monospace>TIFFStack</monospace> object<xref ref-type="fn" rid="fn0003"><sup>3</sup></xref> behaves like a read-only memory-mapped TIFF file. The entire image stack is treated as a Matlab tensor. Each frame of the file must have the same dimensions. Reading the image data is optimized to the extent possible; the header information is read only once. <monospace>permute</monospace>, <monospace>ipermute</monospace> and <monospace>transpose</monospace> are transparently supported, with <italic>O</italic>(1) time requirements.</p>
<p>Examples of using <monospace>TIFFStack</monospace> objects are given in Listing <xref ref-type="table" rid="T8">4</xref>.</p>
<table-wrap position="float" id="T8">
<label>Listing 4</label>
<caption><p><bold>Creating and accessing <monospace>TIFFStack</monospace> objects</bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0004.tif"/>
</table-wrap>
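<p>In outline, a <monospace>TIFFStack</monospace> object is indexed exactly like a standard Matlab tensor; the file name below is illustrative.</p>

```matlab
% Hedged sketch: map a multi-frame TIFF file and index it lazily.
tsStack = TIFFStack('stack.tif');    % header information read only once
mfFrame = tsStack(:, :, 10);         % only frame 10 is read from disk
tsPerm = permute(tsStack, [2 1 3]);  % O(1); no pixel data is touched
```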
</sec>
<sec>
<title>4.3. Adapting FocusStack to new file formats</title>
<p>Enabling <monospace>FocusStack</monospace> to open new file formats requires adapting the <monospace>FocusStack/OpenFiles</monospace> static method. Depending on the file extension, <monospace>OpenFiles</monospace> must create a handle to a mapped file using <monospace>MappedTensor</monospace>, <monospace>TIFFStack</monospace>, <monospace>memmapfile</monospace> or any other appropriate method. <monospace>OpenFiles</monospace> must also extract any available meta-data concerning the stack, such as stimulus sequence and stack resolution. It is important for the design of <monospace>FocusStack</monospace> that binary data access be performed on a lazy basis, so that the memory footprint remains small.</p>
<p>The <monospace>MappedTensor</monospace> class described above is extremely flexible, and can easily be used to access binary data files with a wide range of formats.</p>
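<p>As a hedged sketch, a new-format adapter inside <monospace>OpenFiles</monospace> might expose a raw binary recording as follows; the file name, dimensions and constructor arguments shown are illustrative.</p>

```matlab
% Hedged sketch: expose an existing raw binary recording as a Matlab
% tensor, as a new-format adapter might do. The file name, dimensions
% and constructor arguments are illustrative.
mtData = MappedTensor('recording.bin', 128, 128, 7378, ...
                      'Class', 'uint8');   % fixed bits per pixel
tfFrame = mtData(:, :, 1);                 % lazy, on-demand read
```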
</sec>
<sec>
<title>4.4. Size and time benchmarks</title>
<p>Here we include some benchmarks for memory storage and data access time using <monospace>FocusStack</monospace> and <monospace>TIFFStack</monospace>, compared with loading stacks using the Matlab <monospace>imread</monospace> function and the Two-Photon Processor (2PP; Tomek et al., <xref ref-type="bibr" rid="B17">2013</xref>). All benchmarks were performed on a MacBook Pro (two-core Intel Core i7 3 GHz; 8 GB RAM; SSD HD; OS X 10.0) running Matlab 2014a. Scripts used for time benchmarks of <monospace>FocusStack</monospace> and <monospace>imread</monospace> are shown in Listings <xref ref-type="table" rid="T9">5</xref> and <xref ref-type="table" rid="T10">6</xref>; benchmark results are given in Table <xref ref-type="table" rid="T3">3</xref>. Data storage requirements for <monospace>FocusStack</monospace> and <monospace>TIFFStack</monospace> objects were estimated by linearizing the objects using the Matlab <monospace>struct</monospace> command. When timing file loading, the range of several benchmark trials is reported, skipping the first trial.</p>
<table-wrap position="float" id="T9">
<label>Listing 5</label>
<caption><p><bold>Timing the loading of a stack using <monospace>FocusStack</monospace></bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0005.tif"/>
</table-wrap>
<table-wrap position="float" id="T10">
<label>Listing 6</label>
<caption><p><bold>Timing the loading of a TIFF file using <monospace>imread</monospace></bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0006.tif"/>
</table-wrap>
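<p>In outline, the time benchmarks of Listings 5 and 6 measure a forced full read of a mapped stack; the sketch below illustrates the approach only, not the precise benchmark scripts.</p>

```matlab
% Hedged sketch of the timing measurement: force a full read of the
% mapped stack and record the elapsed time. 'fsStack' is an existing
% FocusStack object.
tic;
tfData = fsStack(:, :, :, :);   % forces every frame to be read
fSeconds = toc;
```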
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p><bold>Memory storage and time benchmarks for <monospace>FocusStack</monospace>, <monospace>TIFFStack</monospace>, <monospace>imread</monospace>, and the Two-Photon Processor (2PP; Tomek et al., <xref ref-type="bibr" rid="B17">2013</xref>)</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Benchmark</bold></th>
<th/>
<th align="left"><bold>Result</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="3"><bold>BINARY FILE FORMAT</bold></td>
</tr>
<tr>
<td align="left">Data size on disk</td>
<td/>
<td align="left">241 MB<xref ref-type="table-fn" rid="TN1"><sup>a</sup></xref></td>
</tr>
<tr>
<td align="left">Time to create <monospace>FocusStack</monospace> object</td>
<td/>
<td align="left">&#x02248;350 ms</td>
</tr>
<tr>
<td align="left">Time to read in data for entire stack</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">16&#x02013;17 s</td>
</tr>
<tr>
<td align="left">Memory usage within Matlab</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">108 kB</td>
</tr>
<tr>
<td/>
<td align="left"><bold>data-native <monospace>uint8</monospace></bold> tensor</td>
<td align="left">230 MB</td>
</tr>
<tr>
<td/>
<td align="left"><bold>default <monospace>double</monospace></bold> tensor</td>
<td align="left">1.8 GB</td>
</tr>
<tr>
<td align="left" colspan="3"><bold>TIFF FILE FORMAT</bold></td>
</tr>
<tr>
<td align="left">Data size on disk</td>
<td/>
<td align="left">958 MB<xref ref-type="table-fn" rid="TN2"><sup>b</sup></xref></td>
</tr>
<tr>
<td align="left">Time to create stack</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">&#x02248;280 ms</td>
</tr>
<tr>
<td/>
<td align="left"><bold><monospace>TIFFStack</monospace></bold></td>
<td align="left">&#x02248;230 ms</td>
</tr>
<tr>
<td align="left">Time to read in data for entire stack</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">18&#x02013;25 s</td>
</tr>
<tr>
<td/>
<td align="left"><bold><monospace>TIFFStack</monospace></bold></td>
<td align="left">6.5&#x02013;7.4 s</td>
</tr>
<tr>
<td/>
<td align="left"><bold><monospace>imread</monospace></bold></td>
<td align="left">12&#x02013;17 s</td>
</tr>
<tr>
<td/>
<td align="left"><bold>Two-photon processor (2PP)</bold></td>
<td align="left">57&#x02013;68 s</td>
</tr>
<tr>
<td align="left">Memory usage within Matlab</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">33 MB<xref ref-type="table-fn" rid="TN3"><sup>c</sup></xref></td>
</tr>
<tr>
<td/>
<td align="left"><bold>Two-photon processor (2PP)</bold></td>
<td align="left">900 MB</td>
</tr>
<tr>
<td/>
<td align="left"><bold>data-native <monospace>uint16</monospace></bold> tensor</td>
<td align="left">900 MB</td>
</tr>
<tr>
<td/>
<td align="left"><bold>default <monospace>double</monospace></bold> tensor</td>
<td align="left">3.5 GB</td>
</tr>
<tr>
<td align="left" colspan="3"><bold>DATA ACCESS AND PROCESSING</bold></td>
</tr>
<tr>
<td align="left">Data size on disk</td>
<td/>
<td align="left">116 MB<xref ref-type="table-fn" rid="TN4"><sup>d</sup></xref></td>
</tr>
<tr>
<td align="left">Time required to load data, align stack and extract calcium responses</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">150 s</td>
</tr>
<tr>
<td/>
<td align="left"><bold>Two-photon processor (2PP)</bold></td>
<td align="left">470 s</td>
</tr>
<tr>
<td align="left">Memory usage within Matlab</td>
<td align="left"><bold><monospace>FocusStack</monospace></bold></td>
<td align="left">230 kB</td>
</tr>
<tr>
<td/>
<td align="left"><bold>Two-photon processor (2PP)</bold></td>
<td align="left">116 MB</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="TN1">
<label>a</label>
<p><italic>128 &#x000D7; 128 &#x000D7; 7378 &#x000D7; 2 pixels, 8-bit data across 7 files.</italic></p></fn>
<fn id="TN2">
<label>b</label>
<p><italic>512 &#x000D7; 512 &#x000D7; 900 &#x000D7; 1 pixels, 16-bit data across 2 files.</italic></p></fn>
<fn id="TN3">
<label>c</label>
<p><italic>Memory usage by <monospace>FocusStack</monospace> for TIFF data is mostly consumed by caching of image header information within <monospace>TIFFStack</monospace> objects.</italic></p></fn>
<fn id="TN4">
<label>d</label>
<p><italic>128 &#x000D7; 128 &#x000D7; 7378 &#x000D7; 1 pixels, 8-bit data across 7 files.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>When accessing stacks stored in the &#x0201C;Focus&#x0201D; binary format, <monospace>FocusStack</monospace> required 0.05% of the memory needed by a Matlab matrix in a data-native format (<monospace>uint8</monospace>), and 0.006% of that needed when using the default Matlab format (<monospace>double</monospace>).</p>
<p>When accessing data in TIFF format, <monospace>FocusStack</monospace> required 4% of the memory needed by 2PP or by a data-native Matlab matrix (<monospace>uint16</monospace>), and 1% of that needed when using the default Matlab format (<monospace>double</monospace>). In addition, <monospace>TIFFStack</monospace> and <monospace>FocusStack</monospace> were considerably faster when accessing data: 2PP required between three and nine times as long to read data. <monospace>FocusStack</monospace> and <monospace>imread</monospace> performed comparably, with <monospace>FocusStack</monospace> requiring roughly 1.5 times as long as <monospace>imread</monospace> to read data; however, <monospace>TIFFStack</monospace> was approximately twice as fast as <monospace>imread</monospace>.</p>
<p>The low-level primitives used by <monospace>FocusStack</monospace> therefore allow efficient access to binary stack data, both in terms of speed and of memory usage. The time required to load data, align a stack and extract calcium responses was compared between <monospace>FocusStack</monospace> and 2PP. <monospace>FocusStack/Align</monospace> and <monospace>FocusStack/ExtractRegionResponses</monospace> were called in sequence to process a binary stack. The same stack was processed using 2PP via the <monospace>TSeriesProcessor/getIntensities</monospace> method, called with a minimal set of parameters. <monospace>FocusStack</monospace> completed alignment and signal extraction in only 30% of the time required by 2PP, and in 0.2% of the memory footprint. Note that the performance of both packages will depend greatly on the exact processing pipeline used.</p>
</sec>
</sec>
<sec>
<title>5. High-level interface to stimServer</title>
<p><monospace>StimServer</monospace> is a new, open source, Matlab-based stimulus generation and sequencing server for visual stimulation, using Psychtoolbox to drive the stimulus screen at a low level (Brainard, <xref ref-type="bibr" rid="B1">1997</xref>; Pelli, <xref ref-type="bibr" rid="B14">1997</xref>; Guizar-Sicairos et al., <xref ref-type="bibr" rid="B7">2008</xref>). Stimuli are designed and configured on the server machine; <monospace>StimServer</monospace> is then controlled remotely to initiate stimulus presentation. <monospace>StimServer</monospace> requires either the Matlab Instrument Control Toolbox (ICT<xref ref-type="fn" rid="fn0004"><sup>4</sup></xref>) or the TCP/UDP/IP Toolbox (PNET<xref ref-type="fn" rid="fn0005"><sup>5</sup></xref>; included with Psychtoolbox) for low-level network communication.</p>
<sec>
<title>5.1. Configuring stimuli</title>
<p>An example of generating stimulus objects and configuring <monospace>StimServer</monospace> is given in Listings <xref ref-type="table" rid="T11">7</xref> and <xref ref-type="table" rid="T12">8</xref>. Stimuli are represented as Matlab structures with a standard format that describes the parameters of a stimulus, indicates which parameters are available for modification by the remote controlling process, and names the stimulus generation, presentation, and description functions.</p>
<table-wrap position="float" id="T11">
<label>Listing 7</label>
<caption><p><bold>Initialization of the <monospace>StimServer</monospace> environment</bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0007.tif"/>
</table-wrap>
<table-wrap position="float" id="T12">
<label>Listing 8</label>
<caption><p><bold>Configuring a set of stimuli and starting the <monospace>StimServer</monospace></bold>.</p></caption>
<graphic xlink:href="fninf-08-00085-i0008.tif"/>
</table-wrap>
<p>Stimuli are configured using a set of generation functions (<monospace>STIM&#x02026;</monospace>). These functions return stimulus objects, which are then passed directly to <monospace>StimServer</monospace>. The number and identity of a set of stimuli are fixed once <monospace>StimServer</monospace> is started. However, many or all parameters of a given stimulus can be determined dynamically at presentation time, by sending a set of stimulus arguments over a network interface when triggering stimulus presentation. For example, the server could be configured with a single drifting grating stimulus; at presentation time, the network interface could dynamically set its orientation, drift speed and other parameters.</p>
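<p>A minimal configuration sketch is shown below. The function calls and their (omitted) parameter lists are illustrative; Listings 7 and 8 give the canonical version.</p>

```matlab
% Hedged sketch: build a set of stimulus objects and start the server.
% The generation-function parameters and the StartStimulusServer call
% form shown here are illustrative.
stimBlank = STIM_Blank();          % blank stimulus
stimGrating = STIM_SineGrating();  % drifting grating; parameters may be
                                   % overridden at presentation time
StartStimulusServer({stimBlank, stimGrating});
```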
</sec>
<sec>
<title>5.2. Controlling the server remotely</title>
<p>Stimulus presentation is triggered over a network link (both TCP and UDP are supported). A series of textual commands are used to control stimulus presentation, parameters, and sequencing. An example dialogue between a controlling machine and <monospace>StimServer</monospace> is shown in Figure <xref ref-type="fig" rid="F4">4</xref>.</p>
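<p>As a hedged sketch, a controlling Matlab session using the Instrument Control Toolbox might trigger presentation as follows. The host name, port number and command string are purely illustrative, not the actual <monospace>StimServer</monospace> protocol; Figure <xref ref-type="fig" rid="F4">4</xref> shows an actual dialogue.</p>

```matlab
% Hedged sketch: send a textual command to StimServer over TCP using
% the Instrument Control Toolbox. Host, port and command text are
% illustrative placeholders.
tcpConn = tcpip('stimserver.example.org', 6302);
fopen(tcpConn);
fprintf(tcpConn, 'PRESENT 1\n');   % hypothetical command string
fclose(tcpConn);
```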
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p><bold>Example dialogue between <monospace>StimServer</monospace> and a controlling machine</bold>. Commands are sent over the command channel (red); talkback notifications are sent over the talkback channel (blue).</p></caption>
<graphic xlink:href="fninf-08-00085-g0004.tif"/>
</fig>
<p>Most parameters of a visual stimulus can be controlled remotely at presentation time, including the order of presentation of a stimulus sequence. In this case the remote controlling process generates a pseudo-random sequence in which to present a set of stimuli; this sequence can then be recorded as meta-data along with the acquired neuronal responses. Alternatively, dynamic stimulation can be performed&#x02014;for example, setting an arbitrary orientation or spatial frequency of a drifting grating&#x02014;online during an experiment.</p>
<p>Once a connection is established with the server, a reverse &#x0201C;talkback&#x0201D; connection can be configured, so that feedback and confirmation of commands are available to the remote controlling process. Stimulus commands and any error messages are either logged by <monospace>StimServer</monospace> locally to a file, or delivered over another network connection for storage along with the acquired experimental data.</p>
</sec>
<sec>
<title>5.3. Adding new stimuli</title>
<p>Many useful visual stimuli are available out of the box (see Table <xref ref-type="table" rid="T4">4</xref>), and including new stimuli is straightforward due to the modular architecture of <monospace>StimServer</monospace>. Stimuli have a common defining structure, requiring a generation function, a description function and a presentation function. Any new stimulus that adheres to this interface can then be included in new stimulus sets, transparently to the core <monospace>StimServer</monospace> code.</p>
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p><bold>Stimuli provided out of the box by <monospace>StimServer</monospace></bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left"><bold>Stimulus (<monospace>STIM_&#x02026;</monospace>)</bold></th>
<th align="left"><bold>Description</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left"><monospace>Blank</monospace></td>
<td align="left">Blank stimulus</td>
</tr>
<tr>
<td align="left"><monospace>Sequence</monospace></td>
<td align="left">Group a set of other stimuli into a randomizable sequence</td>
</tr>
<tr>
<td align="left"><monospace>SineGrating</monospace></td>
<td align="left">Drifting and rotating masked sinusoidal grating</td>
</tr>
<tr>
<td align="left"><monospace>SinePlaid</monospace></td>
<td align="left">Drifting and rotating plaid composed of two additively combined sinusoidal gratings, with arbitrary relative orientations</td>
</tr>
<tr>
<td align="left"><monospace>SquareGrating</monospace></td>
<td align="left">Drifting and rotating masked square-wave grating</td>
</tr>
<tr>
<td align="left"><monospace>SquarePlaid</monospace></td>
<td align="left">Drifting and rotating plaid composed of two additively combined square-wave gratings, with arbitrary relative orientations</td>
</tr>
<tr>
<td align="left"><monospace>OscillatingGrating</monospace></td>
<td align="left">Static oriented square-wave grating that oscillates in contrast</td>
</tr>
<tr>
<td align="left"><monospace>OscillatingPlaid</monospace></td>
<td align="left">Static plaid composed of two oriented square-wave gratings that oscillate in contrast and phase</td>
</tr>
<tr>
<td align="left"><monospace>SparseNoise</monospace></td>
<td align="left">Sparse noise composed of pixels arranged in a grid</td>
</tr>
<tr>
<td align="left"><monospace>SparseNoiseFlicker</monospace></td>
<td align="left">Sparse noise composed of pixels that oscillate in contrast</td>
</tr>
<tr>
<td align="left"><monospace>SparseGrating</monospace></td>
<td align="left">Sparse noise, where each pixel is a masked square-wave grating that drifts and rotates</td>
</tr>
<tr>
<td align="left"><monospace>BandLimitedNoise</monospace></td>
<td align="left">Spatially- and temporally-filtered white noise</td>
</tr>
<tr>
<td align="left"><monospace>DotKinematogram</monospace></td>
<td align="left">Random dot kinematogram stimulus</td>
</tr>
<tr>
<td align="left"><monospace>FlashedImageSequence</monospace></td>
<td align="left">A sequence of flashed arbitrary images</td>
</tr>
<tr>
<td align="left"><monospace>GaborField</monospace></td>
<td align="left">A field of Gabors with arbitrary locations and arbitrary individual parameters, that drift in phase and rotate</td>
</tr>
<tr>
<td align="left"><monospace>GaborGrid</monospace></td>
<td align="left">A regular grid of Gabors with arbitrary individual parameters, that drift in phase and rotate</td>
</tr>
<tr>
<td align="left"><monospace>MaskedMovie</monospace></td>
<td align="left">Present an arbitrary movie from a file, with a circular mask</td>
</tr>
</tbody>
</table>
</table-wrap>
<p><monospace>StimServer</monospace> also provides a <monospace>PresentSimpleStimulus</monospace> function, which takes care of all the low-level timing and presentation tasks for stimuli comprising drifting and rotating textures, with optional masking.</p>
<p>A flowchart showing execution flow through <monospace>StimServer</monospace>, indicating functions replaced by user-defined stimuli, is given in Figure <xref ref-type="fig" rid="F5">5</xref>.</p>
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p><bold>Overview of <monospace>StimServer</monospace> information flow. (A)</bold> A list of stimuli and stimulus sequences is constructed (see Listing <xref ref-type="table" rid="T12">8</xref>). <bold>(B)</bold> <monospace>StartStimulusServer</monospace> is called from the Matlab command line. <bold>(C)</bold> If the Instrument Control Toolbox is used for network communication (green), control returns to the Matlab command line (i.e., non-blocking network listening). If PNET is used for network communication then <monospace>StimServer</monospace> enters a blocking poll loop (blue). When a presentation command is received <bold>(D)</bold>, the stimulus-defined presentation function is called <bold>(E)</bold>. Commands shown in orange are modular, and can be replaced to introduce new stimulus classes.</p></caption>
<graphic xlink:href="fninf-08-00085-g0005.tif"/>
</fig>
</sec>
</sec>
<sec>
<title>6. Example experiments and analysis</title>
<p>In this section we present analysis of <italic>in vivo</italic> two-photon calcium imaging recordings from mouse primary visual cortex (V1). The goal of the experiment was to characterize responses in mouse V1 to drifting gratings and to natural visual stimuli, in populations of neurons with overlapping receptive fields. Experimental procedures followed the guidelines of the Veterinary Office of Switzerland and were approved by the Cantonal Veterinary Office in Zurich.</p>
<p>Example data and example scripts that reproduce the analyses in this section are available as supplementary information.</p>
<sec>
<title>6.1. Two-photon calcium imaging of neuronal responses in mouse V1</title>
<p>Methods for two-photon acquisition were as described elsewhere (Kampa et al., <xref ref-type="bibr" rid="B9">2011</xref>; Roth et al., <xref ref-type="bibr" rid="B15">2012</xref>). Briefly, C57BL/6 mice (at P75&#x02013;P90) were initially anesthetized with 4&#x02013;5% isoflurane in O<sub>2</sub>, and maintained at 1.5&#x02013;2% during the surgical procedure. The primary visual cortex (V1) was localized using intrinsic imaging. A craniotomy of 3&#x02013;4 mm was opened above the region of strongest intrinsic signal response. The genetically encoded calcium indicator GCaMP6m (Chen et al., <xref ref-type="bibr" rid="B2">2013</xref>) (AAV1.Syn.GCaMP6m.WPRE.SV40; UPenn) was injected around 250 &#x003BC;m below the cortical surface to target superficial layer neurons. The craniotomy was then sealed with a glass window. After recovery and expression of the calcium indicator, animals were head-fixed and calcium transients were acquired using a custom-built two-photon microscope equipped with a 40&#x000D7; water-immersion objective (LUMPlanFl/IR, 0.8 NA; Olympus). Frames of 128 &#x000D7; 128 pixels were acquired at 7.81 Hz with bidirectional scanning using custom-written software (&#x0201C;Focus&#x0201D;; LabView; National Instruments).</p>
<p>Visual stimuli generated with <monospace>StimServer</monospace> were presented on a 24 inch LCD monitor (1200 &#x000D7; 800 pixels; 60 Hz) to the left eye of the mouse, spanning approximately 80 visual degrees. Details of each visual stimulus are given below.</p>
</sec>
<sec>
<title>6.2. Receptive field localization</title>
<p>Knowing the location in visual space of the receptive fields (RFs) of the neurons in an imaged region of visual cortex is important if response properties are to be compared between neurons with fully overlapping RFs. If masked stimuli are to be used, the location and extent of the mask will also depend on the RF locations of the recorded neurons.</p>
<p><monospace>StimServer</monospace> provides several stimuli for estimating RF locations: <monospace>STIM_SparseNoise</monospace> uses flashed high-contrast squares; <monospace>STIM_SparseNoiseFlicker</monospace> uses contrast-reversing squares; and <monospace>STIM_SparseGrating</monospace> uses patches of drifting and rotating high-contrast gratings. We configured a 5&#x000D7;5 grid of 12 deg. diameter pixels on the stimulus screen, with 40% overlap between adjacent pixels. Each pixel contained a 100% contrast vertical square-wave grating of 25 deg. per cycle, drifting at 1 Hz and presented for 1 s, with the full set of pixels presented in random order. Seven random repeats of the stimulus were collected to estimate RF location.</p>
<p>An example of RF localization analysis is given in Figure <xref ref-type="fig" rid="F6">6</xref>. Segmented single-trial per-pixel responses are shown in Figure <xref ref-type="fig" rid="F6">6D</xref>; the trial-averaged response matrix is shown in Figure <xref ref-type="fig" rid="F6">6G</xref>. Both come directly from <monospace>ExtractRegionResponses</monospace>. A smoothed RF estimate was obtained by summing Gaussian fields located at each pixel, with a diameter of 12 deg., modulated by the amplitude of the average calcium response of that pixel (Figure <xref ref-type="fig" rid="F6">6H</xref>).</p>
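<p>The smoothing step can be sketched as a weighted sum of Gaussian fields, one per stimulus pixel. Grid extents and variable names below are illustrative.</p>

```matlab
% Hedged sketch: smoothed RF estimate as a sum of Gaussian fields
% centered on the stimulus pixel locations (vfPixX, vfPixY; in visual
% degrees), weighted by the mean calcium response of each pixel
% (vfMeanResp). All variable names are illustrative.
[xx, yy] = meshgrid(linspace(-40, 40, 200));  % visual degrees
fSigma = 12 / 2;                              % 12 deg. diameter pixels
imRF = zeros(size(xx));
for nPix = 1:numel(vfMeanResp)
   imRF = imRF + vfMeanResp(nPix) .* ...
      exp(-((xx - vfPixX(nPix)).^2 + (yy - vfPixY(nPix)).^2) ./ (2 * fSigma^2));
end
```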
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p><bold>Example analyses conducted with <monospace>FocusStack</monospace> and <monospace>StimServer</monospace> from recordings made in mouse V1. (A)</bold> An RF localization experiment, where a sparse random stimulus is presented over a 5&#x000D7;5 mesh. <bold>(B)</bold> Measuring preferred orientation using drifting gratings. <bold>(C)</bold> Recording single-neuron and population responses to natural movie stimuli. <bold>(D)</bold> Single-trial single-neuron calcium responses to sparse noise stimuli. <bold>(E)</bold> Single-trial single-neuron responses to drifting high-contrast gratings. <bold>(F)</bold> Single-trial single-neuron calcium responses to a natural movie stimulus. <bold>(G)</bold> and <bold>(H)</bold> show the estimated RF location for the neuron shown in <bold>(D)</bold>. <bold>(I)</bold> The trial-averaged direction tuning curve for the neuron shown in <bold>(E)</bold>. <bold>(J)</bold> The population distribution of lifetime (L.T.) and population (Pop.) sparseness, from responses imaged simultaneously with the neuron shown in <bold>(F)</bold>. Stimulus onset in all traces <bold>(D, E, F)</bold> is indicated by a vertical tick mark. Data provided by M. Roth.</p></caption>
<graphic xlink:href="fninf-08-00085-g0006.tif"/>
</fig>
</sec>
<sec>
<title>6.3. Orientation tuning</title>
<p>The canonical cortically-derived feature in primary visual cortex is tuning for the orientation (or direction) of a drifting edge (Hubel and Wiesel, <xref ref-type="bibr" rid="B8">1962</xref>; Ohki et al., <xref ref-type="bibr" rid="B13">2005</xref>). We used responses to drifting grating stimuli to characterize the direction tuning curves of neurons in mouse V1.</p>
<p><monospace>StimServer</monospace> provides drifting sinusoidal and drifting square-wave grating stimuli with a large range of manipulatable parameters. We presented full-field drifting high-contrast sinusoidal gratings at 16 drift directions, with spatial frequency of 20 deg per cycle and temporal frequency of 1 Hz (the <monospace>STIM_SineGrating</monospace> stimulus provided by <monospace>StimServer</monospace>). These stimuli were presented for 2 s each in random order, over 5 trials.</p>
<p>An example analysis of orientation tuning of a single cortical neuron is given in Figure <xref ref-type="fig" rid="F6">6</xref>. Segmented single-trial single-neuron responses are shown in Figure <xref ref-type="fig" rid="F6">6E</xref>. A polar plot of the trial-averaged responses for the same neuron is shown in Figure <xref ref-type="fig" rid="F6">6I</xref>. Both these data come directly from <monospace>ExtractRegionResponses</monospace>.</p>
</sec>
<sec>
<title>6.4. Natural movie representations</title>
<p>Neurons in visual cortex show complex selectivity for natural scenes and movies (Kampa et al., <xref ref-type="bibr" rid="B9">2011</xref>). We recorded the responses of populations of neurons in mouse V1 to a sequence of short grayscale movies with normalized contrast. We characterized the efficiency with which natural movies are encoded, at both the single-neuron and the population level, by measuring the sparseness of neuronal responses.</p>
<p><monospace>StimServer</monospace> provides stimuli for presenting randomized sequences of flashed images (<monospace>STIM_FlashedImageSequence</monospace>), as well as for efficient stimulation with movies in standard Matlab-readable formats (<monospace>STIM_MaskedMovie</monospace>). Both stimulus classes cache frames where possible, enabling efficient presentation without dropped frames. We presented 7 trials of a 43 s natural movie sequence (30 Hz movie frame rate), centered at the average RF location of the imaged population and spanning approximately 70 visual degrees. The movie consisted of a sequence of three video segments. Responses within 1.5 s of stimulus onset and of each movie transition were excluded from analysis.</p>
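<p>The exclusion of early responses can be sketched as follows; the imaging frame rate and segment transition times below are placeholder assumptions for illustration:</p>
<preformat>% Illustrative sketch: mask imaging frames within 1.5 s of stimulus onset
% and of each movie-segment transition
fImagingRate = 10;                 % imaging frame rate (Hz); assumed
vtTransitions = [0 14.3 28.7];     % segment onset times (s); assumed
tExclude = 1.5;                    % exclusion window (s)

vtFrameTimes = (0:(43 * fImagingRate - 1)) / fImagingRate;
vbExclude = false(size(vtFrameTimes));
for tOnset = vtTransitions
   % Mark the first frames at or after each onset, spanning the window
   vnWindow = find(vtFrameTimes >= tOnset, ceil(tExclude * fImagingRate));
   vbExclude(vnWindow) = true;
end
% Subsequent analysis uses only frames with vbExclude == false</preformat>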
<p>An example analysis of the natural movie response of a single neuron in mouse V1 is given in Figure <xref ref-type="fig" rid="F6">6</xref>. Segmented single-trial responses are shown in Figure <xref ref-type="fig" rid="F6">6F</xref>. Once again, these traces come directly from <monospace>ExtractRegionResponses</monospace>. An analysis of lifetime (LT) and population (Pop.) response sparseness, defined as the skewness of the calcium responses either over time (LT) or over simultaneous responses across the population (Pop.), is shown in Figure <xref ref-type="fig" rid="F6">6J</xref>. These data were calculated simply by taking the trial-averaged response matrix from <monospace>ExtractRegionResponses</monospace> and passing it to the Matlab <monospace>skewness</monospace> function.</p>
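<p>Concretely, if the trial-averaged response matrix is arranged with one row per imaged neuron and one column per time bin (this arrangement is assumed here for illustration), the two sparseness measures reduce to two calls to <monospace>skewness</monospace>:</p>
<preformat>% Illustrative sketch: sparseness as the skewness of calcium responses
% mfResp: [nNeurons x nTimeBins] trial-averaged response matrix, as
% returned by ExtractRegionResponses (arrangement assumed)
vfLifetimeSparseness = skewness(mfResp, [], 2);  % per neuron, over time
vfPopSparseness = skewness(mfResp, [], 1);       % per time bin, over neurons</preformat>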
</sec>
</sec>
<sec sec-type="conclusion" id="s2">
<title>7. Conclusion</title>
<p><monospace>FocusStack</monospace> provides a toolbox for simple yet powerful analysis of calcium imaging data. It presents an abstraction layer that takes advantage of standard Matlab tensor representations, while facilitating analysis by being aware of stimulus information and other experiment-related meta-data required to interpret neuronal responses (Figure <xref ref-type="fig" rid="F2">2</xref>). Many low-level, repetitive tasks of calcium signal extraction and analysis are handled by the toolbox, ensuring consistent analysis across experiments and minimizing errors introduced by re-writing code.</p>
<p><monospace>StimServer</monospace> provides a modular toolbox for stimulus generation and sequencing in Matlab, in conjunction with Psychtoolbox. It is designed to integrate with two-photon imaging systems by allowing arbitrary stimuli to be triggered over a network interface (Figures <xref ref-type="fig" rid="F1">1</xref>, <xref ref-type="fig" rid="F4">4</xref>, <xref ref-type="fig" rid="F5">5</xref>). Presentation order and most stimulus parameters can be reconfigured dynamically over the network interface during an experiment, allowing a two-photon acquisition system to sequence visual stimuli and then store stimulus information along with the acquired imaging data.</p>
<p>When this stimulus meta-data is provided to <monospace>FocusStack</monospace>, the toolbox handles the extraction of stimulus-related responses by automatically performing time-series segmentation and derandomization of a two-photon stack. As a result, responses to complex and arbitrary sets of stimuli can be extracted and analyzed in only a few lines of code (see Figure <xref ref-type="fig" rid="F6">6</xref>).</p>
<p><monospace>FocusStack</monospace> and <monospace>StimServer</monospace> comprise an open-source toolchain provided to the neuroscience community. We expect that their open availability and easy-to-use structure will encourage the uptake of consistent analysis tools in the field, as well as many contributions adding and exchanging features in both toolboxes.</p>
<p><monospace>FocusStack</monospace> and <monospace>StimServer</monospace> are available as version-controlled <monospace>Git</monospace> repositories, or as stand-alone downloads, from <ext-link ext-link-type="uri" xlink:href="https://bitbucket.org/DylanMuir/twophotonanalysis">https://bitbucket.org/DylanMuir/twophotonanalysis</ext-link> and <ext-link ext-link-type="uri" xlink:href="https://bitbucket.org/DylanMuir/stimserver">https://bitbucket.org/DylanMuir/stimserver</ext-link>.</p>
</sec>
<sec>
<title>Author contributions</title>
<p>Dylan R. Muir designed and implemented the toolbox code, and wrote the manuscript. Bj&#x000F6;rn M. Kampa contributed to the toolbox code, and wrote the manuscript.</p>
</sec>
<sec>
<title>Funding</title>
<p>This work was supported by the Novartis Foundation (grant to Dylan R. Muir), Velux Stiftung (grant to Dylan R. Muir), the Swiss National Science Foundation (Grant Nr. 31-120480 to Bj&#x000F6;rn M. Kampa), and the EU-FP7 program (BrainScales project 269921 to Bj&#x000F6;rn M. Kampa).</p>
<sec>
<title>Conflict of interest statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p></sec>
</sec>
</body>
<back>
<ack>
<p>The authors would like to express effusive thanks to M. Roth for acquiring the experimental data illustrated in this manuscript and making it available for distribution (Figures <xref ref-type="fig" rid="F1">1</xref>, <xref ref-type="fig" rid="F2">2</xref>, <xref ref-type="fig" rid="F6">6</xref>). We gratefully acknowledge the contributions of the users of <monospace>FocusStack</monospace> and <monospace>StimServer</monospace> in locating and fixing bugs, and those who also contributed code to the toolboxes. In particular, we would like to thank M. Roth, P. Molina-Luna, and A. Keller for their contributions.</p>
</ack>
<sec sec-type="supplementary-material" id="s3">
<title>Supplementary material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="http://www.frontiersin.org/journal/10.3389/fninf.2014.00085/abstract">http://www.frontiersin.org/journal/10.3389/fninf.2014.00085/abstract</ext-link></p>
<supplementary-material xlink:href="DataSheet1.PDF" mimetype="application/pdf" xmlns:xlink="http://www.w3.org/1999/xlink"/>
<supplementary-material xlink:href="DataSheet2.ZIP" mimetype="application/zip" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brainard</surname> <given-names>D. H.</given-names></name></person-group> (<year>1997</year>). <article-title>The Psychophysics Toolbox</article-title>. <source>Spat. Vis</source>. <volume>10</volume>, <fpage>433</fpage>&#x02013;<lpage>436</lpage>. <pub-id pub-id-type="doi">10.1163/156856897X00357</pub-id><pub-id pub-id-type="pmid">9176952</pub-id></citation>
</ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>T.-W.</given-names></name> <name><surname>Wardill</surname> <given-names>T. J.</given-names></name> <name><surname>Sun</surname> <given-names>Y.</given-names></name> <name><surname>Pulver</surname> <given-names>S. R.</given-names></name> <name><surname>Renninger</surname> <given-names>S. L.</given-names></name> <name><surname>Baohan</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Ultrasensitive fluorescent proteins for imaging neuronal activity</article-title>. <source>Nature</source> <volume>499</volume>, <fpage>295</fpage>&#x02013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1038/nature12354</pub-id><pub-id pub-id-type="pmid">23868258</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Dyer</surname> <given-names>E. L.</given-names></name> <name><surname>Studer</surname> <given-names>C.</given-names></name> <name><surname>Robinson</surname> <given-names>J. T.</given-names></name> <name><surname>Baraniuk</surname> <given-names>R. G.</given-names></name></person-group> (<year>2013</year>). <article-title>A robust and efficient method to recover neural events from noisy and corrupted data,</article-title> in <source>Neural Engineering (NER), 2013 6th International IEEE/EMBS Conference on</source> (<publisher-loc>San Diego, CA</publisher-loc>: <publisher-name>IEEE</publisher-name>), <fpage>593</fpage>&#x02013;<lpage>596</lpage>.</citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>G&#x000F6;bel</surname> <given-names>W.</given-names></name> <name><surname>Kampa</surname> <given-names>B. M.</given-names></name> <name><surname>Helmchen</surname> <given-names>F.</given-names></name></person-group> (<year>2007</year>). <article-title>Imaging cellular network dynamics in three dimensions using fast 3d laser scanning</article-title>. <source>Nat. Methods</source> <volume>4</volume>, <fpage>73</fpage>&#x02013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1038/nmeth989</pub-id><pub-id pub-id-type="pmid">17143280</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grewe</surname> <given-names>B. F.</given-names></name> <name><surname>Langer</surname> <given-names>D.</given-names></name> <name><surname>Kasper</surname> <given-names>H.</given-names></name> <name><surname>Kampa</surname> <given-names>B. M.</given-names></name> <name><surname>Helmchen</surname> <given-names>F.</given-names></name></person-group> (<year>2010</year>). <article-title>High-speed <italic>in vivo</italic> calcium imaging reveals neuronal network activity with near-millisecond precision</article-title>. <source>Nat. Methods</source> <volume>7</volume>, <fpage>399</fpage>&#x02013;<lpage>405</lpage>. <pub-id pub-id-type="doi">10.1038/nmeth.1453</pub-id><pub-id pub-id-type="pmid">20400966</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grynkiewicz</surname> <given-names>G.</given-names></name> <name><surname>Poenie</surname> <given-names>M.</given-names></name> <name><surname>Tsien</surname> <given-names>R. Y.</given-names></name></person-group> (<year>1985</year>). <article-title>A new generation of Ca2&#x0002B; indicators with greatly improved fluorescence properties</article-title>. <source>J. Biol. Chem</source>. <volume>260</volume>, <fpage>3440</fpage>&#x02013;<lpage>3450</lpage>. <pub-id pub-id-type="pmid">3838314</pub-id></citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guizar-Sicairos</surname> <given-names>M.</given-names></name> <name><surname>Thurman</surname> <given-names>S. T.</given-names></name> <name><surname>Fienup</surname> <given-names>J. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Efficient subpixel image registration algorithms</article-title>. <source>Opt. Lett</source>. <volume>33</volume>, <fpage>156</fpage>&#x02013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1364/OL.33.000156</pub-id><pub-id pub-id-type="pmid">18197224</pub-id></citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hubel</surname> <given-names>D. H.</given-names></name> <name><surname>Wiesel</surname> <given-names>T. N.</given-names></name></person-group> (<year>1962</year>). <article-title>Receptive fields, binocular interaction and functional architecture in the cat&#x00027;s visual cortex</article-title>. <source>J. Physiol</source>. <volume>160</volume>, <fpage>106</fpage>&#x02013;<lpage>154</lpage>. <pub-id pub-id-type="pmid">14449617</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kampa</surname> <given-names>B. M.</given-names></name> <name><surname>Roth</surname> <given-names>M. M.</given-names></name> <name><surname>G&#x000F6;bel</surname> <given-names>W.</given-names></name> <name><surname>Helmchen</surname> <given-names>F.</given-names></name></person-group> (<year>2011</year>). <article-title>Representation of visual scenes by local neuronal populations in layer 2/3 of mouse visual cortex</article-title>. <source>Front. Neural Circuits</source> <volume>5</volume>:<issue>18</issue>. <pub-id pub-id-type="doi">10.3389/fncir.2011.00018</pub-id><pub-id pub-id-type="pmid">22180739</pub-id></citation>
</ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Langer</surname> <given-names>D.</given-names></name> <name><surname>van&#x00027;t Hoff</surname> <given-names>M.</given-names></name> <name><surname>Keller</surname> <given-names>A. J.</given-names></name> <name><surname>Nagaraja</surname> <given-names>C.</given-names></name> <name><surname>Pf&#x000E4;ffli</surname> <given-names>O. A.</given-names></name> <name><surname>G&#x000F6;ldi</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>HelioScan: a software framework for controlling <italic>in vivo</italic> microscopy setups with high hardware flexibility, functional diversity and extendibility</article-title>. <source>J. Neurosci. Methods</source> <volume>215</volume>, <fpage>38</fpage>&#x02013;<lpage>52</lpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2013.02.006</pub-id><pub-id pub-id-type="pmid">23416135</pub-id></citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>L&#x000FC;tcke</surname> <given-names>H.</given-names></name> <name><surname>Gerhard</surname> <given-names>F.</given-names></name> <name><surname>Zenke</surname> <given-names>F.</given-names></name> <name><surname>Gerstner</surname> <given-names>W.</given-names></name> <name><surname>Helmchen</surname> <given-names>F.</given-names></name></person-group> (<year>2013</year>). <article-title>Inference of neuronal network spike dynamics and topology from calcium imaging data</article-title>. <source>Front. Neural Circuits</source> <volume>7</volume>:<issue>201</issue>. <pub-id pub-id-type="doi">10.3389/fncir.2013.00201</pub-id><pub-id pub-id-type="pmid">24399936</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nagai</surname> <given-names>T.</given-names></name> <name><surname>Yamada</surname> <given-names>S.</given-names></name> <name><surname>Tominaga</surname> <given-names>T.</given-names></name> <name><surname>Ichikawa</surname> <given-names>M.</given-names></name> <name><surname>Miyawaki</surname> <given-names>A.</given-names></name></person-group> (<year>2004</year>). <article-title>Expanded dynamic range of fluorescent indicators for Ca2&#x0002B; by circularly permuted yellow fluorescent proteins</article-title>. <source>Proc. Natl. Acad. Sci. U.S.A</source>. <volume>101</volume>, <fpage>10554</fpage>&#x02013;<lpage>10559</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.0400417101</pub-id><pub-id pub-id-type="pmid">15247428</pub-id></citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ohki</surname> <given-names>K.</given-names></name> <name><surname>Chung</surname> <given-names>S.</given-names></name> <name><surname>Ch&#x00027;ng</surname> <given-names>Y. H.</given-names></name> <name><surname>Kara</surname> <given-names>P.</given-names></name> <name><surname>Reid</surname> <given-names>R. C.</given-names></name></person-group> (<year>2005</year>). <article-title>Functional imaging with cellular resolution reveals precise micro-architecture in visual cortex</article-title>. <source>Nature</source> <volume>433</volume>, <fpage>597</fpage>&#x02013;<lpage>603</lpage>. <pub-id pub-id-type="doi">10.1038/nature03274</pub-id><pub-id pub-id-type="pmid">15660108</pub-id></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pelli</surname> <given-names>D. G.</given-names></name></person-group> (<year>1997</year>). <article-title>The VideoToolbox software for visual psychophysics: transforming numbers into movies</article-title>. <source>Spat. Vis</source>. <volume>10</volume>, <fpage>437</fpage>&#x02013;<lpage>442</lpage>. <pub-id pub-id-type="doi">10.1163/156856897X00366</pub-id><pub-id pub-id-type="pmid">9176953</pub-id></citation>
</ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roth</surname> <given-names>M. M.</given-names></name> <name><surname>Helmchen</surname> <given-names>F.</given-names></name> <name><surname>Kampa</surname> <given-names>B. M.</given-names></name></person-group> (<year>2012</year>). <article-title>Distinct functional properties of primary and posteromedial visual area of mouse neocortex</article-title>. <source>J. Neurosci</source>. <volume>32</volume>, <fpage>9716</fpage>&#x02013;<lpage>9726</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0110-12.2012</pub-id><pub-id pub-id-type="pmid">22787057</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schneider</surname> <given-names>C. A.</given-names></name> <name><surname>Rasband</surname> <given-names>W. S.</given-names></name> <name><surname>Eliceiri</surname> <given-names>K. W.</given-names></name></person-group> (<year>2012</year>). <article-title>NIH Image to ImageJ: 25 years of image analysis</article-title>. <source>Nat. Methods</source> <volume>9</volume>, <fpage>671</fpage>&#x02013;<lpage>675</lpage>. <pub-id pub-id-type="doi">10.1038/nmeth.2089</pub-id><pub-id pub-id-type="pmid">22930834</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tomek</surname> <given-names>J.</given-names></name> <name><surname>Novak</surname> <given-names>O.</given-names></name> <name><surname>Syka</surname> <given-names>J.</given-names></name></person-group> (<year>2013</year>). <article-title>Two-Photon Processor and SeNeCA: a freely available software package to process data from two-photon calcium imaging at speeds down to several milliseconds per frame</article-title>. <source>J. Neurophysiol</source>. <volume>110</volume>, <fpage>243</fpage>&#x02013;<lpage>256</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00087.2013</pub-id><pub-id pub-id-type="pmid">23576700</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vogelstein</surname> <given-names>J. T.</given-names></name> <name><surname>Packer</surname> <given-names>A. M.</given-names></name> <name><surname>Machado</surname> <given-names>T. A.</given-names></name> <name><surname>Sippy</surname> <given-names>T.</given-names></name> <name><surname>Babadi</surname> <given-names>B.</given-names></name> <name><surname>Yuste</surname> <given-names>R.</given-names></name> <etal/></person-group>. (<year>2010</year>). <article-title>Fast nonnegative deconvolution for spike train inference from population calcium imaging</article-title>. <source>J. Neurophysiol</source>. <volume>104</volume>, <fpage>3691</fpage>&#x02013;<lpage>3704</lpage>. <pub-id pub-id-type="doi">10.1152/jn.01073.2009</pub-id><pub-id pub-id-type="pmid">20554834</pub-id></citation>
</ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup><ext-link ext-link-type="uri" xlink:href="https://bitbucket.org/DylanMuir/twophotonanalysis">https://bitbucket.org/DylanMuir/twophotonanalysis</ext-link>; <ext-link ext-link-type="uri" xlink:href="https://bitbucket.org/DylanMuir/stimserver">https://bitbucket.org/DylanMuir/stimserver</ext-link></p></fn>
<fn id="fn0002"><p><sup>1</sup><ext-link ext-link-type="uri" xlink:href="http://dylan-muir.com/articles/mapped_tensor/">http://dylan-muir.com/articles/mapped_tensor/</ext-link></p></fn>
<fn id="fn0003"><p><sup>2</sup><ext-link ext-link-type="uri" xlink:href="http://dylan-muir.com/articles/tiffstack/">http://dylan-muir.com/articles/tiffstack/</ext-link></p></fn>
<fn id="fn0004"><p><sup>3</sup><ext-link ext-link-type="uri" xlink:href="http://www.mathworks.com/products/instrument/">http://www.mathworks.com/products/instrument/</ext-link></p></fn>
<fn id="fn0005"><p><sup>4</sup><ext-link ext-link-type="uri" xlink:href="http://www.mathworks.com/matlabcentral/fileexchange/345-tcp-udp-ip-toolbox-2-0-6">http://www.mathworks.com/matlabcentral/fileexchange/345-tcp-udp-ip-toolbox-2-0-6</ext-link></p></fn>
</fn-group>
</back>
</article>