
Towards a brain architecture for visual behavior selection


If you have a question about this talk, please contact Marisa Parsonage.

Selecting the right behavior at the right time is critical for animal survival. Animals rely on their senses to deliver information about the environment to sensory processing areas in the brain, which extract relevant features and form the perceptual representations that guide behavior. We aim to uncover the organization of this feature space and the neural mechanisms by which these cues are translated into dynamic motor activity. Our current focus is visually driven behaviors of the fly, in particular those driven by visual looming cues produced by an approaching predator or an imminent collision. The same looming stimulus can evoke a wide range of different behaviors, including a rapid escape jump, a slower, more stable takeoff sequence, or a landing response. As part of the Janelia Descending Interneuron Project Team, we have created a library of transgenic fly lines that target the descending neuron population with cell-type specificity. We use these genetic tools, along with whole-cell patch-clamp physiology in behaving flies, calcium imaging, and high-throughput/high-resolution behavioral assays, to examine the transformation of information from sensory to motor and how this transformation is modified by context, such as behavioral state. I will discuss our recent work investigating the representation of ethologically relevant visual features in the fly optic glomeruli and the mechanisms by which descending neurons read out this feature information to produce an appropriate behavioral choice.

This talk is part of the Adrian Seminars in Neuroscience series.




© 2006-2024 Talks.cam, University of Cambridge.