
Extracting Universal Representations of Cognition from fMRI mega-analyses


If you have a question about this talk, please contact Johan Carlin.

To map the neural substrate of mental function, cognitive neuroimaging relies on controlled psychological manipulations that engage brain systems associated with specific cognitive processes. Building comprehensive atlases of cognitive function in the brain requires assembling maps for many different cognitive processes, which often evoke overlapping patterns of activation. Such data aggregation faces two competing goals: finding correspondences across vastly different cognitive experiments, while precisely describing the function of any given brain region.

In this talk I will present two analysis frameworks that tackle these difficulties and thereby enable the generation of brain atlases of cognitive function. The first uses deep-learning techniques to extract representations (task-optimized networks) that form a set of basis cognitive dimensions relevant to the psychological manipulations. This approach assumes no prior knowledge of the commonalities shared by the studies in the corpus; these are inferred during model training.
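The model described in the talk is a deep network trained across studies; as a much-simplified linear illustration of the underlying idea, recovering a small set of basis dimensions shared across studies from a stacked corpus of contrast maps, here is a sketch on synthetic data (all shapes, study counts, and the "voxel" dimension are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: contrast maps (rows) from three hypothetical
# studies, all resampled to a common set of 500 "voxels". The maps are
# generated from a small number of shared latent dimensions plus noise.
n_maps_per_study, n_voxels, n_dims = 40, 500, 5
true_basis = rng.standard_normal((n_dims, n_voxels))   # shared cognitive dimensions
studies = []
for _ in range(3):                                     # three hypothetical studies
    loadings = rng.standard_normal((n_maps_per_study, n_dims))
    studies.append(loadings @ true_basis
                   + 0.1 * rng.standard_normal((n_maps_per_study, n_voxels)))
X = np.vstack(studies)                                 # (120, 500) stacked corpus

# A truncated SVD over the stacked corpus recovers a low-dimensional basis
# shared across studies, without being told which study each map came from.
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
basis = Vt[:n_dims]                                    # estimated basis dimensions

# The top components should capture most of the corpus variance.
explained = (s[:n_dims] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by {n_dims} components: {explained:.2f}")
```

The deep-learning version replaces this linear decomposition with task-optimized nonlinear representations, but the goal is the same: a compact set of dimensions that aligns maps from vastly different experiments.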

The second framework leverages ontologies of cognitive concepts and multi-label brain decoding to map the neural substrates of these concepts. Crucially, it can accurately decode the cognitive concepts recruited in new tasks. These results demonstrate that aggregating independent task-fMRI studies can provide a more precise global atlas of selective associations between brain and cognition.
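The decoder in the talk is trained against a cognitive ontology over real task-fMRI maps; as a minimal sketch of the multi-label decoding idea itself, one binary decoder per concept so that a single map can be annotated with several overlapping concepts at once, here is an example on synthetic data (the concept count, map sizes, and split are all invented for illustration):

```python
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 "brain maps" over 300 voxels, each annotated with a
# subset of 4 hypothetical cognitive concepts (a binary multi-label target).
n_maps, n_voxels, n_concepts = 200, 300, 4
Y = rng.integers(0, 2, size=(n_maps, n_concepts))       # concept annotations
concept_maps = rng.standard_normal((n_concepts, n_voxels))
X = Y @ concept_maps + 0.5 * rng.standard_normal((n_maps, n_voxels))

# One linear decoder per concept: unlike single-label classification, this
# handles tasks that recruit several overlapping concepts simultaneously.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X[:150], Y[:150])

# Evaluate on held-out maps, mimicking decoding of concepts in new tasks.
acc = (clf.predict(X[150:]) == Y[150:]).mean()
print(f"held-out mean per-label accuracy: {acc:.2f}")
```

Generalization to held-out tasks is the key test here, since the claim in the talk is that concept decoders transfer to experiments never seen during training.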

This talk is part of the Imagers Interest Group series.

