

## Modelling Large- and Small-Scale Brain Networks

- Thomas Nichols (University of Warwick)
- Thursday 14 July 2016, 14:00-14:30
- Seminar Room 1, Newton Institute.
If you have a question about this talk, please contact INI IT.

SNAW01 - Graph limits and statistics

Investigations of the human brain with neuroimaging have recently seen a dramatic shift in focus, from "brain mapping", identifying brain regions related to particular functions, to connectivity or "connectomics": identifying networks of coordinated brain regions, and how these networks behave at rest and during tasks. In this presentation I will discuss two quite different approaches to modelling brain connectivity.

In the first work, we use Bayesian time-series methods to allow for time-varying connectivity. Non-stationary connectivity methods typically use a moving-window approach, whereas this method poses a single generative model for all nodes and all time points. Known as a "Multiregression Dynamic Model" (MDM), it extends a traditional Bayesian network (or graphical model) by positing latent time-varying coefficients that implement a regression of a given node on its parent nodes. Intended for a modest number of nodes (up to about 12), an MDM allows inference of the structure of the graph using closed-form Bayes factors (conditional on a single estimated "discount factor", reflecting the balance of observation and latent variance). While originally developed for directed acyclic graphs, the approach can also accommodate directed (possibly cyclic) graphs.

In the second work, we use mixtures of simple binary random-graph models to account for complex structure in brain networks. In this approach, the network is reduced to a binary adjacency matrix. While this invariably represents a loss of information, it avoids a Gaussianity assumption and allows the use of much larger graphs, e.g. with hundreds of nodes. Daudin et al. (2008) proposed an "Erdős–Rényi Mixture Model", which assumes that, after an unknown number of latent node classes have been estimated, connections arise as Bernoulli counts, homogeneously for each pair of classes.
We extend this work to account for multisubject data (where edge data are now Binomially distributed), allowing …

- http://warwick.ac.uk/tenichols – Home page of Prof. Nichols
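The per-node filtering step of an MDM can be sketched as a discount-factor dynamic linear model: each node is regressed on its parents with coefficients that drift over time, the discount factor `delta` controlling how fast they drift. This is a simplified sketch with a fixed observation variance `s2` (the actual MDM also learns the variance); the function name is illustrative.

```python
import numpy as np

def mdm_filter(y, X, delta=0.95, s2=1.0):
    """Forward-filter one node's time-varying regression on its parents.

    y : (T,) node time series; X : (T, p) parent time series.
    delta : discount factor (closer to 1 = slower coefficient drift).
    Returns filtered coefficient means and the one-step-ahead log-likelihood,
    the quantity that, accumulated over nodes, scores candidate graphs.
    """
    T, p = X.shape
    m = np.zeros(p)          # coefficient posterior mean
    C = np.eye(p)            # coefficient posterior covariance
    loglik = 0.0
    ms = np.empty((T, p))
    for t in range(T):
        R = C / delta                       # discounting inflates prior covariance
        f = X[t] @ m                        # one-step forecast of y[t]
        q = X[t] @ R @ X[t] + s2            # forecast variance
        e = y[t] - f                        # forecast error
        loglik += -0.5 * (np.log(2 * np.pi * q) + e ** 2 / q)
        A = R @ X[t] / q                    # adaptive gain
        m = m + A * e                       # update coefficient mean
        C = R - np.outer(A, A) * q          # update coefficient covariance
        ms[t] = m
    return ms, loglik
```

Comparing `loglik` across parent sets (for a shared, estimated `delta`) gives the closed-form Bayes-factor-style graph comparison described above.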
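The multisubject Erdős–Rényi mixture idea can be sketched as follows, assuming node class labels are already given (the actual method also estimates the number of classes and their membership): summing binary adjacency matrices over subjects yields Binomial edge counts, and the connection probability is homogeneous within each pair of classes.

```python
import numpy as np

def sbm_block_probs(counts, z, n_subjects):
    """Binomial MLE of per-class-pair connection probabilities.

    counts : (n, n) symmetric matrix; counts[i, j] = number of subjects
             (out of n_subjects) in which edge (i, j) is present.
    z : (n,) integer class labels for the nodes (assumed known here).
    Returns an upper-triangular (K, K) matrix of edge probabilities.
    """
    K = int(z.max()) + 1
    succ = np.zeros((K, K))   # observed edge occurrences per class pair
    tot = np.zeros((K, K))    # possible occurrences per class pair
    n = len(z)
    for i in range(n):
        for j in range(i + 1, n):
            a, b = sorted((z[i], z[j]))
            succ[a, b] += counts[i, j]
            tot[a, b] += n_subjects
    with np.errstate(invalid="ignore"):
        return np.where(tot > 0, succ / tot, 0.0)
```

Because each edge is a count out of `n_subjects` rather than a single Bernoulli draw, pooling subjects sharpens the block-probability estimates without any Gaussianity assumption.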
This talk is part of the Isaac Newton Institute Seminar Series.