
An efficient kernel product for automatic differentiation libraries, with applications to measure transport



GFSW03 - Shape analysis and computational anatomy

Authors: Benjamin Charlier, Jean Feydy, Joan Alexis Glaunès and Alain Trouvé

This paper presents a memory-efficient implementation of the kernel matrix-vector product, suitable for use with automatic differentiation libraries—in our case, PyTorch. This piece of software alleviates the major bottleneck of autodiff libraries as far as diffeomorphic image registration is concerned: symbolic Python code can now scale up to large point clouds and shapes (100,000+ vertices). To showcase the value of automatic differentiation to the LDDMM community, we introduce the “normalized Hamiltonian” setting and show that it corresponds to a spatially regularized optimal transport of mass distributions: made tractable by autodiff libraries, the kernel normalization trick turns an extrinsic image deformation routine into an intrinsic measure transportation program.
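The memory bottleneck mentioned above comes from materializing the full N-by-M kernel matrix before multiplying it with a vector. A minimal NumPy sketch of the tiled (on-the-fly) alternative is given below; the function name, the Gaussian kernel choice, and the tile size are illustrative assumptions, not the authors' actual KeOps/PyTorch implementation:

```python
import numpy as np

def kernel_product_tiled(x, y, b, sigma=1.0, tile=1024):
    """Compute a[i] = sum_j exp(-|x_i - y_j|^2 / sigma^2) * b[j]
    one tile of columns at a time, so the full N-by-M kernel
    matrix is never stored in memory (illustrative sketch)."""
    a = np.zeros((x.shape[0], b.shape[1]))
    for start in range(0, y.shape[0], tile):
        y_t = y[start:start + tile]          # current tile of source points
        b_t = b[start:start + tile]          # matching momentum vectors
        # squared distances between all x and this tile of y: (N, tile)
        sq = ((x[:, None, :] - y_t[None, :, :]) ** 2).sum(-1)
        a += np.exp(-sq / sigma**2) @ b_t    # accumulate partial product
    return a
```

With a tile size of a few thousand, peak memory is O(N * tile) instead of O(N * M), which is what lets symbolic code scale to 100,000+ vertices; an autodiff framework can then backpropagate through the same loop.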
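The abstract does not spell out the kernel normalization trick behind the “normalized Hamiltonian” setting. As a loose, hypothetical illustration only: row-normalizing a Gaussian kernel turns it into a transition (Markov) matrix, so applying it to a mass distribution conserves total mass—the basic property a measure-transport interpretation relies on:

```python
import numpy as np

def row_normalized_kernel(x, sigma=1.0):
    """Build a Gaussian kernel on the points x and normalize each row
    to sum to one, yielding a mass-conserving transition matrix
    (illustrative sketch, not the paper's exact normalization)."""
    sq = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / sigma**2)
    return K / K.sum(axis=1, keepdims=True)
```

Because every row sums to one, `row_normalized_kernel(x) @ masses` redistributes mass among the points without creating or destroying any, in contrast to the raw kernel, whose row sums vary across space.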

This talk is part of the Isaac Newton Institute Seminar Series.
