The 8th annual Computational and Systems Neuroscience (Cosyne) meeting
© Histed and Pillow; licensee BioMed Central Ltd. 2011
Received: 5 April 2011
Accepted: 20 April 2011
Published: 20 April 2011
The 8th annual Computational and Systems Neuroscience meeting (Cosyne) was held February 24-27, 2011 in Salt Lake City, Utah (abstracts are freely available online: http://www.cosyne.org/c/index.php?title=Cosyne2011_Program). Cosyne brings together experimental and theoretical approaches to systems neuroscience, with the goal of understanding neurons, neural assemblies, and the perceptual, cognitive and behavioral functions they mediate.
The range of questions available to systems and computational neuroscience has grown substantially in recent years, with both theoretical and experimental approaches driven by the increasing availability of data about neural circuits and systems. The Cosyne meeting has reflected this growth, nearly doubling in size since the first meeting in 2004 to a new record of nearly 600 attendees this year. It remains single-track, which allows discussions of presentations to drive scientific interaction among attendees with diverse backgrounds. Poster sessions take place each evening, providing a forum for intense scientific conversations that frequently spill over into more informal settings late at night. The meeting is followed by two days of workshops, held at the Snowbird ski resort, which feature more specialized talks and interactive discussions on a wide array of topics, this year ranging from consciousness and compressed sensing to dynamics, learning, and perception.
Below we highlight a few presentations of special interest. We have made an effort to sample broadly, but Cosyne appeals to a large audience across several disciplines, and we are limited by space and a residual slant towards our own interests and interactions at the meeting. We apologize to those presenters whose contributions we do not have space to mention, but we are excited about the broad extent of new work we observed.
Neural activity and perception
Stanislas Dehaene (INSERM/CEA) gave a wide-ranging summary of his research on how humans perceive and process numerosity, citing behavioral studies of infants and of diverse human societies, and connecting these findings to neural results in macaques. Jonathan Victor, D. Thengone, and M. Conte (Weill Cornell Medical College) presented a novel approach for characterizing the perceptual salience of low- and high-order statistics in natural images using an innovative method for parameterizing textures; their findings suggested that image statistics interact perceptually according to an approximately Euclidean distance function. Wilson Geisler (UT Austin) showed another method for exploiting higher-order statistical properties of natural images using local measurements from a very large collection of images. These statistics were then used to derive Bayesian solutions to a variety of low-level vision problems. Saskia de Vries and T. Clandinin (Stanford) identified a group of Drosophila visual neurons that detect objects on a collision course with the fly; inactivating these neurons prevents the animal from moving to avoid a collision.
Circuits affecting behavioral computations
Tirin Moore (Stanford) described new work linking the dopamine circuits of the frontal cortex of the macaque both to activity of visual neurons and to behavioral responses. David Anderson (Caltech) described beautiful work dissecting circuits in the ventromedial hypothalamus that underlie aggression in the mouse. Inducing spiking activity in a small but specific population of neurons in the amygdala produces a complex, sustained attacking behavior in which male mice are induced to attack females, a rare event in their normal behavior. Franz Weber, C. Machens, and A. Borst (MPI of Neurobiology) presented an elegant application of the generalized linear model (GLM) to functional coupling between two identified fly neurons (H1 and Vi) implicated in the processing of visual motion. Their analyses revealed a unidirectional coupling from H1 to Vi, tuned to produce optimally informative representations in Vi. Finally, Surya Ganguli (UCSF) and R. Hahnloser showed that local, Hebbian learning rules can explain rapid learning of complex sequences by neural circuits, a novel paradigm for sequence learning that poses a significant theoretical challenge to reinforcement learning models. The range of this work highlights the interest in coupling between neurons; we believe a major goal for the field is to determine which behaviors rely on a small number of neurons, and which are more sensitive to coupling due to the dynamic interaction of many neurons or multiple circuits.
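Functional-coupling analyses of the kind Weber and colleagues described typically fit a point-process GLM in which one neuron's recent spiking modulates another's firing rate through a coupling filter; a directed coupling term that improves the fit in one direction but not the other is evidence of unidirectional influence. The snippet below is a minimal illustrative sketch, not the authors' actual model: the simulated "neurons," their rates, and the single-bin coupling delay are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_poisson_glm(X, y, n_iter=5000, lr=0.1):
    """Fit a Poisson GLM, y ~ Poisson(exp(X @ w)), by gradient
    ascent on the (concave) log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        rate = np.exp(X @ w)
        grad = X.T @ (y - rate) / len(y)  # gradient of mean log-likelihood
        w += lr * grad
    return w

# Simulate two neurons: A drives B with a one-bin delay; B never drives A.
T = 5000
spikes_a = rng.random(T) < 0.1                      # neuron A: ~10% spike prob.
rate_b = np.exp(-3.0 + 2.0 * np.roll(spikes_a, 1))  # B's rate jumps after an A spike
spikes_b = rng.poisson(rate_b)                      # neuron B: Poisson counts

# Design matrix for B: a bias column plus A's spikes one bin back.
X = np.column_stack([np.ones(T), np.roll(spikes_a, 1)])
w = fit_poisson_glm(X[1:], spikes_b[1:].astype(float))  # drop the roll wrap-around bin
```

The fitted coupling weight `w[1]` recovers the simulated A-to-B influence (true value 2.0), while the bias `w[0]` recovers the baseline log-rate; a real analysis would use multi-bin coupling filters and compare models with and without each directed term.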
Information processing in neural assemblies
E.J. Chichilnisky (Salk) reviewed his work identifying the functional connectivity between individual cones and ganglion cells in the primate retina, along with computational methods to infer bipolar cell connectivity. Alison Barth (Carnegie Mellon) discussed a series of studies on subsets of highly active neurons in the mouse cerebral cortex and how they might affect the computations performed there. Murray Sherman (Univ. of Chicago) presented an alternative to the usual view that representations are constructed sequentially (for example, first by the sensory thalamus, then by primary sensory cortex, then by secondary sensory cortex, and so on). He outlined how subcortical regions, including the thalamus, might support more parallel or simultaneous processing. Elad Ganmor, R. Segev, and E. Schneidman (Weizmann Inst.) described a novel approach for capturing the joint activity of very large populations of neurons using sparse, low-order interaction networks. Brice Bathellier and S. Rumpel (IMP Vienna) used two-photon calcium imaging in mouse auditory cortex to show that neural subpopulations can combine to represent a large number of diverse sounds and also predict performance in a sound discrimination task. Rubén Moreno-Bote and A. Pouget (Rochester) used an analysis of spiking neural networks to argue that decorrelation does not affect the amount of information available to downstream populations, thus calling into question a central dogma of population coding.
Understanding network structure
Tony Zador (Cold Spring Harbor Laboratory) discussed a new method for solving a major challenge facing the field: determining which neurons are connected to each other. The method exploits the tremendous advances in DNA sequencing technology, using short oligonucleotides to uniquely tag neurons and viral machinery to transport the tags across synapses, where they are identified by sequencing. Ian Ellwood and V. Sohal (UCSF) used both experiments and models to show how dopaminergic inputs can strongly modulate cells' firing through intertwined effects on calcium, potassium, and sodium channels. Sandra Kuhlman, E. Tring, and J. Trachtenberg (UCLA) showed that inhibitory neurons in mouse visual cortex acquire broader visual tuning during development, whereas excitatory neurons sharpen their tuning as a result of activity. John Cunningham (Cambridge), M. Churchland, M. Kaufman, and K. Shenoy presented 'jPCA', a method for reducing the dimension of large neural datasets by looking for rotational or oscillatory dynamics. Mark Churchland, J. Cunningham, M. Kaufman, S. Ryu, and K. Shenoy (Stanford) applied this method to unit recordings from macaque motor cortex and argued that slow (1-3 Hz) network oscillations appear to be an important basis for motor control.
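The core idea behind rotation-seeking dimensionality reduction of the jPCA kind can be sketched compactly: fit linear dynamics dx/dt ≈ Mx to the population trajectory with M constrained to be skew-symmetric (so the dynamics are pure rotations), then project onto the plane of the fastest rotational mode. The sketch below is a simplified stand-in, not the authors' implementation; in particular, it takes the skew-symmetric part of an unconstrained least-squares fit rather than solving the constrained regression directly, and the toy data are invented for the demo.

```python
import numpy as np

def jpca_plane(X, dt=1.0):
    """Find the dominant rotational plane in a multivariate time
    series X (time x neurons), in the spirit of jPCA."""
    X = X - X.mean(axis=0)            # center each neuron's trace
    dX = np.diff(X, axis=0) / dt      # finite-difference derivative
    Xm = X[:-1]
    # Unconstrained least-squares fit of dX ~ Xm @ A, then keep the
    # skew-symmetric part of the dynamics matrix (a shortcut; the
    # full method optimizes over skew-symmetric matrices directly).
    A, *_ = np.linalg.lstsq(Xm, dX, rcond=None)
    M = 0.5 * (A.T - A)               # skew-symmetric => purely rotational
    # Eigenvalues of a real skew-symmetric matrix are purely imaginary;
    # the largest |imaginary part| marks the fastest rotation.
    w, V = np.linalg.eig(M)
    k = np.argmax(np.abs(w.imag))
    plane = np.stack([V[:, k].real, V[:, k].imag]).T
    Q, _ = np.linalg.qr(plane)        # orthonormal basis of the plane
    return Q                          # neurons x 2 projection matrix

# Demo: a pure rotation living in the first two of four dimensions.
t = np.linspace(0.0, 4 * np.pi, 200)
X = np.zeros((200, 4))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
Q = jpca_plane(X, dt=t[1] - t[0])
```

Projecting the data onto `Q` (`X @ Q`) then yields the two-dimensional view in which rotational structure, such as the slow oscillations reported in motor cortex, is most apparent.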
The field of computational and systems neuroscience is advancing quickly, driven both by innovation in experimental approaches and simultaneous development of theoretical ideas to understand these data. The growth and energy of the Cosyne meeting clearly reflect both trends.
We thank A. Churchland and E. Simoncelli for providing Figure 1, D. Soudan for attendance statistics, and A. Churchland, M. Bethge, I. Park, T. Vogels, and R. Wilson for feedback on noteworthy presentations.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.