Publication


Featured research published by Kristin Branson.


Nature Methods | 2009

High-throughput Ethomics in Large Groups of Drosophila

Kristin Branson; Alice A. Robie; John A. Bender; Pietro Perona; Michael H. Dickinson

We present a camera-based method for automatically quantifying the individual and social behaviors of fruit flies, Drosophila melanogaster, interacting in a planar arena. Our system includes machine-vision algorithms that accurately track many individuals without swapping identities and classification algorithms that detect behaviors. The data may be represented as an ethogram that plots the time course of behaviors exhibited by each fly or as a vector that concisely captures the statistical properties of all behaviors displayed in a given period. We found that behavioral differences between individuals were consistent over time and were sufficient to accurately predict gender and genotype. In addition, we found that the relative positions of flies during social interactions vary according to gender, genotype and social environment. We expect that our software, which permits high-throughput screening, will complement existing molecular methods available in Drosophila, facilitating new investigations into the genetic and cellular basis of behavior.
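The abstract's "vector that concisely captures the statistical properties of all behaviors" can be illustrated with a toy sketch (not the authors' code; the behavior label names are hypothetical): summarize a per-frame ethogram as the fraction of time each fly spends in each behavior.

```python
from collections import Counter

def behavior_vector(frame_labels, behaviors):
    """Summarize a per-frame ethogram as the fraction of frames
    spent in each behavior (label names are hypothetical)."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return [counts[b] / total for b in behaviors]

# One fly's per-frame labels over five frames:
labels = ["walk", "walk", "groom", "stop", "walk"]
print(behavior_vector(labels, ["walk", "groom", "stop"]))  # [0.6, 0.2, 0.2]
```

Vectors like this, computed per fly over a recording period, are what allow statistical comparison of individuals, genotypes, and social conditions.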


eLife | 2014

Mushroom body output neurons encode valence and guide memory-based action selection in Drosophila

Yoshinori Aso; Divya Sitaraman; Toshiharu Ichinose; Karla R. Kaun; Katrin Vogt; Ghislain Belliart-Guérin; Pierre-Yves Plaçais; Alice A. Robie; Nobuhiro Yamagata; Christopher Schnaitmann; William J Rowell; Rebecca M. Johnston; Teri-T B. Ngo; Nan Chen; Wyatt Korff; Michael N. Nitabach; Ulrike Heberlein; Thomas Preat; Kristin Branson; Hiromu Tanimoto; Gerald M. Rubin

Animals discriminate stimuli, learn their predictive value and use this knowledge to modify their behavior. In Drosophila, the mushroom body (MB) plays a key role in these processes. Sensory stimuli are sparsely represented by ∼2000 Kenyon cells, which converge onto 34 output neurons (MBONs) of 21 types. We studied the role of MBONs in several associative learning tasks and in sleep regulation, revealing the extent to which information flow is segregated into distinct channels and suggesting possible roles for the multi-layered MBON network. We also show that optogenetic activation of MBONs can, depending on cell type, induce repulsion or attraction in flies. The behavioral effects of MBON perturbation are combinatorial, suggesting that the MBON ensemble collectively represents valence. We propose that local, stimulus-specific dopaminergic modulation selectively alters the balance within the MBON network for those stimuli. Our results suggest that valence encoded by the MBON ensemble biases memory-based action selection. DOI: http://dx.doi.org/10.7554/eLife.04580.001


Nature Methods | 2013

JAABA: interactive machine learning for automatic annotation of animal behavior

Mayank Kabra; Alice A. Robie; Marta Rivera-Alba; Steven Branson; Kristin Branson

We present a machine learning–based system for automatically computing interpretable, quantitative measures of animal behavior. Through our interactive system, users encode their intuition about behavior by annotating a small set of video frames. These manual labels are converted into classifiers that can automatically annotate behaviors in screen-scale data sets. Our general-purpose system can create a variety of accurate individual and social behavior classifiers for different organisms, including mice and adult and larval Drosophila.
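The workflow above, labeling a few frames by hand and letting a learned classifier annotate the rest, can be sketched with a single decision stump on one hypothetical feature (per-frame speed). JAABA itself uses boosted classifiers over many trajectory features; this stand-in only illustrates the label-then-generalize loop.

```python
def fit_stump(features, labels):
    """Fit a one-feature decision stump: pick the threshold that best
    separates the user-labeled frames (a toy stand-in for boosting)."""
    best_thresh, best_correct = None, 0
    for t in sorted(set(features)):
        correct = sum((f >= t) == y for f, y in zip(features, labels))
        if correct > best_correct:
            best_thresh, best_correct = t, correct
    return best_thresh

# A handful of user-labeled frames: per-frame speed, walking yes/no.
speeds = [0.1, 0.2, 1.5, 2.0, 0.05, 1.8]
walking = [False, False, True, True, False, True]
thresh = fit_stump(speeds, walking)

# The learned rule then annotates unlabeled frames automatically.
predictions = [s >= thresh for s in [0.3, 1.7]]  # [False, True]
```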


Trends in Ecology and Evolution | 2014

Automated image-based tracking and its application in ecology

Anthony I. Dell; John A. Bender; Kristin Branson; Iain D. Couzin; Gonzalo G. de Polavieja; Lucas P.J.J. Noldus; Alfonso Pérez-Escudero; Pietro Perona; Andrew D. Straw; Martin Wikelski; Ulrich Brose

The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.


Nature Methods | 2014

Fast, accurate reconstruction of cell lineages from large-scale fluorescence microscopy data

Fernando Amat; William C. Lemon; Daniel P Mossing; Katie McDole; Yinan Wan; Kristin Branson; Eugene W. Myers; Philipp J. Keller

The comprehensive reconstruction of cell lineages in complex multicellular organisms is a central goal of developmental biology. We present an open-source computational framework for the segmentation and tracking of cell nuclei with high accuracy and speed. We demonstrate its (i) generality by reconstructing cell lineages in four-dimensional, terabyte-sized image data sets of fruit fly, zebrafish and mouse embryos acquired with three types of fluorescence microscopes, (ii) scalability by analyzing advanced stages of development with up to 20,000 cells per time point at 26,000 cells min−1 on a single computer workstation and (iii) ease of use by adjusting only two parameters across all data sets and providing visualization and editing tools for efficient data curation. Our approach achieves on average 97.0% linkage accuracy across all species and imaging modalities. Using our system, we performed the first cell lineage reconstruction of early Drosophila melanogaster nervous system development, revealing neuroblast dynamics throughout an entire embryo.


Nature | 2015

A multilevel multimodal circuit enhances action selection in Drosophila

Tomoko Ohyama; Casey M Schneider-Mizell; Richard D. Fetter; Javier Valdes Aleman; Romain Franconville; Marta Rivera-Alba; Brett D. Mensh; Kristin Branson; Julie H. Simpson; James W. Truman; Albert Cardona; Marta Zlatic

Natural events present multiple types of sensory cues, each detected by a specialized sensory modality. Combining information from several modalities is essential for the selection of appropriate actions. Key to understanding multimodal computations is determining the structural patterns of multimodal convergence and how these patterns contribute to behaviour. Modalities could converge early, late or at multiple levels in the sensory processing hierarchy. Here we show that combining mechanosensory and nociceptive cues synergistically enhances the selection of the fastest mode of escape locomotion in Drosophila larvae. In an electron microscopy volume that spans the entire insect nervous system, we reconstructed the multisensory circuit supporting the synergy, spanning multiple levels of the sensory processing hierarchy. The wiring diagram revealed a complex multilevel multimodal convergence architecture. Using behavioural and physiological studies, we identified functionally connected circuit nodes that trigger the fastest locomotor mode, and others that facilitate it, and we provide evidence that multiple levels of multimodal integration contribute to escape mode selection. We propose that the multilevel multimodal convergence architecture may be a general feature of multisensory circuits enabling complex input–output functions and selective tuning to ecologically relevant combinations of cues.


Journal of the Royal Society Interface | 2011

Multi-camera real-time three-dimensional tracking of multiple flying animals

Andrew D. Straw; Kristin Branson; Titus R. Neumann; Michael H. Dickinson

Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time—with minimal latency—opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as ‘virtual reality’-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
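The data-association step described above, matching each track's predicted position to a detection, can be sketched in miniature. This is a greedy 2-D simplification, not the paper's implementation (which pairs an extended Kalman filter with the nearest neighbour standard filter in 3-D across many cameras):

```python
import math

def nearest_neighbour_assign(predictions, detections):
    """Greedy nearest-neighbour data association: match each predicted
    track position to its closest unclaimed detection."""
    assignments = {}
    free = set(range(len(detections)))
    for track_id, p in enumerate(predictions):
        if not free:
            break  # more tracks than detections: leave the rest unmatched
        j = min(free, key=lambda k: math.dist(p, detections[k]))
        assignments[track_id] = j
        free.discard(j)
    return assignments

# Two tracked flies, two detections arriving in swapped order:
preds = [(0.0, 0.0), (5.0, 5.0)]
dets = [(5.1, 4.9), (0.2, -0.1)]
print(nearest_neighbour_assign(preds, dets))  # {0: 1, 1: 0}
```

In the full system, the matched detections then serve as measurement updates for each track's Kalman filter, which in turn produces the next frame's predictions.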


Nature Communications | 2015

Whole-central nervous system functional imaging in larval Drosophila

William C. Lemon; Stefan R. Pulver; Burkhard Höckendorf; Katie McDole; Kristin Branson; Jeremy Freeman; Philipp J. Keller

Understanding how the brain works in tight concert with the rest of the central nervous system (CNS) hinges upon knowledge of coordinated activity patterns across the whole CNS. We present a method for measuring activity in an entire, non-transparent CNS with high spatiotemporal resolution. We combine a light-sheet microscope capable of simultaneous multi-view imaging at volumetric speeds 25-fold faster than the state-of-the-art, a whole-CNS imaging assay for the isolated Drosophila larval CNS and a computational framework for analysing multi-view, whole-CNS calcium imaging data. We image both brain and ventral nerve cord, covering the entire CNS at 2 or 5 Hz with two- or one-photon excitation, respectively. By mapping network activity during fictive behaviours and quantitatively comparing high-resolution whole-CNS activity maps across individuals, we predict functional connections between CNS regions and reveal neurons in the brain that identify type and temporal state of motor programs executed in the ventral nerve cord.


eLife | 2015

Cortex commands the performance of skilled movement

Jian-Zhong Guo; Austin R. Graves; Wendy W Guo; Jihong Zheng; Allen Lee; Juan Rodríguez-González; Nuo Li; John J. Macklin; James W Phillips; Brett D. Mensh; Kristin Branson; Adam Hantman

Mammalian cerebral cortex is accepted as being critical for voluntary motor control, but what functions depend on cortex is still unclear. Here we used rapid, reversible optogenetic inhibition to test the role of cortex during a head-fixed task in which mice reach, grab, and eat a food pellet. Sudden cortical inhibition blocked initiation or froze execution of this skilled prehension behavior, but left untrained forelimb movements unaffected. Unexpectedly, kinematically normal prehension occurred immediately after cortical inhibition, even during rest periods lacking cue and pellet. This ‘rebound’ prehension was only evoked in trained and food-deprived animals, suggesting that a motivation-gated motor engram sufficient to evoke prehension is activated at inhibition’s end. These results demonstrate the necessity and sufficiency of cortical activity for enacting a learned skill. DOI: http://dx.doi.org/10.7554/eLife.10774.001


Current Biology | 2012

A Simple Strategy for Detecting Moving Objects during Locomotion Revealed by Animal-Robot Interactions

Francisco Zabala; Peter Polidoro; Alice Robie; Kristin Branson; Pietro Perona; Michael H. Dickinson

An important role of visual systems is to detect nearby predators, prey, and potential mates, which may be distinguished in part by their motion. When an animal is at rest, an object moving in any direction may easily be detected by motion-sensitive visual circuits. During locomotion, however, this strategy is compromised because the observer must detect a moving object within the pattern of optic flow created by its own motion through the stationary background. However, objects that move creating back-to-front (regressive) motion may be unambiguously distinguished from stationary objects because forward locomotion creates only front-to-back (progressive) optic flow. Thus, moving animals should exhibit an enhanced sensitivity to regressively moving objects. We explicitly tested this hypothesis by constructing a simple fly-sized robot that was programmed to interact with a real fly. Our measurements indicate that whereas walking female flies freeze in response to a regressively moving object, they ignore a progressively moving one. Regressive motion salience also explains observations of behaviors exhibited by pairs of walking flies. Because the assumptions underlying the regressive motion salience hypothesis are general, we suspect that the behavior we have observed in Drosophila may be widespread among eyed, motile organisms.
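The regressive-motion salience rule lends itself to a one-dimensional sketch (sign convention assumed: forward self-motion makes stationary scenery stream front-to-back, here negative image velocity):

```python
def is_salient(observer_forward_speed, object_image_velocity):
    """Regressive-motion rule, 1-D sketch: during forward locomotion,
    only back-to-front (positive, by the assumed sign convention)
    image motion unambiguously signals a moving object."""
    if observer_forward_speed <= 0:
        # At rest, any image motion indicates a moving object.
        return object_image_velocity != 0
    return object_image_velocity > 0

print(is_salient(1.0, 0.5))   # True: regressive motion while walking
print(is_salient(1.0, -0.5))  # False: could be self-induced optic flow
print(is_salient(0.0, -0.5))  # True: at rest, any motion is salient
```

This captures why a walking fly freezes for a regressively moving robot but ignores a progressively moving one: the progressive case is indistinguishable from its own optic flow.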

Collaboration


Dive into Kristin Branson's collaborations.

Top Co-Authors

Alice A. Robie (California Institute of Technology)
Mayank Kabra (University of California)
Katie McDole (Howard Hughes Medical Institute)
Michael H. Dickinson (California Institute of Technology)
Philipp J. Keller (Howard Hughes Medical Institute)
Pietro Perona (California Institute of Technology)
Adam Hantman (Howard Hughes Medical Institute)
Allen Lee (Howard Hughes Medical Institute)
Andrew D. Straw (California Institute of Technology)