
Spectral Edge Image Fusion: Theory and Applications


If you have a question about this talk, please contact Rachel Fogg.

This talk describes a novel approach to the fusion of multidimensional images for colour displays. The goal of the method is to generate an output image whose colour gradients match those of the high-dimensional input as closely as possible. It achieves this using a constrained contrast-mapping paradigm in the gradient domain, where the structure tensor of a high-dimensional gradient representation is mapped exactly onto that of a low-dimensional gradient field, which is subsequently reintegrated to generate the output. Constraints on the output colours are provided by an initial RGB rendering to ensure naturalistic colours: we provide a theorem for projecting higher-dimensional contrast onto the initial colour gradients such that they remain close to the original gradients whilst maintaining exact high-dimensional contrast. The solution to this constrained optimisation is closed-form, allowing for a very simple and hence fast and efficient algorithm. The approach is generic in that it can map any N-D image data to any M-D output, and can be used in a variety of applications with the same basic algorithm. In this work we focus on the problem of mapping N-D inputs to 3-D colour outputs, presenting results in several applications: hyperspectral remote sensing, fusion of colour and near-infrared images, colour visualisation of MRI diffusion-tensor imaging, and re-rendering colour images for colour-deficient viewers.
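To make the gradient-domain recipe above concrete, the sketch below implements one plausible reading of it in Python/NumPy: per pixel, form the 2x2 structure tensor of the N-channel Jacobian, replace the initial RGB Jacobian with the closest 3x2 gradient matrix (in the Frobenius sense) whose structure tensor equals the high-dimensional one (a closed-form orthogonal-Procrustes step), and then reintegrate each output channel with a least-squares Poisson solve. All function names, the periodic-boundary FFT reintegration, and the exact form of the projection are illustrative assumptions rather than the authors' published algorithm.

# Illustrative sketch only (assumed names and design choices, not the authors' code):
# structure-tensor-matched gradient projection followed by Poisson reintegration.

import numpy as np


def jacobian(img):
    """Per-pixel Jacobian of an H x W x C image as an (H, W, C, 2) array of
    forward-difference x/y gradients (requires NumPy >= 1.16 for `append`)."""
    gx = np.diff(img, axis=1, append=img[:, -1:, :])
    gy = np.diff(img, axis=0, append=img[-1:, :, :])
    return np.stack([gx, gy], axis=-1)


def sqrtm_2x2(Z, eps=1e-12):
    """Symmetric square root of a stack of 2x2 positive semi-definite matrices."""
    w, V = np.linalg.eigh(Z)
    w = np.sqrt(np.clip(w, eps, None))
    return (V * w[..., None, :]) @ np.swapaxes(V, -1, -2)


def project_gradients(J_hi, J_rgb):
    """Per pixel, return the 3x2 gradient matrix closest (Frobenius norm) to the
    initial RGB Jacobian whose 2x2 structure tensor equals that of the N-channel
    input -- a closed-form orthogonal-Procrustes step, offered as one plausible
    reading of the constrained projection described in the abstract."""
    Z_hi = np.einsum('...ci,...cj->...ij', J_hi, J_hi)   # (H, W, 2, 2)
    S = sqrtm_2x2(Z_hi)                                  # target contrast
    M = J_rgb @ S                                        # (H, W, 3, 2)
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt @ S                                    # satisfies J^T J = Z_hi


def poisson_reintegrate(gx, gy):
    """Least-squares reintegration of a (generally non-integrable) gradient field
    with an FFT Poisson solver; periodic boundaries are assumed for brevity."""
    H, W = gx.shape
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
    fx = np.fft.fftfreq(W)[None, :]
    fy = np.fft.fftfreq(H)[:, None]
    lam = (2 * np.cos(2 * np.pi * fx) - 2) + (2 * np.cos(2 * np.pi * fy) - 2)
    lam[0, 0] = 1.0                                      # DC term handled separately
    out = np.real(np.fft.ifft2(np.fft.fft2(div) / lam))
    return out - out.mean()


def fuse(hi_dim, rgb_init):
    """Map an H x W x N float image to a 3-channel output guided by an initial
    RGB rendering (both assumed scaled to [0, 1])."""
    J_new = project_gradients(jacobian(hi_dim), jacobian(rgb_init))
    out = np.empty_like(rgb_init)
    for c in range(3):
        out[..., c] = poisson_reintegrate(J_new[..., c, 0], J_new[..., c, 1])
        out[..., c] += rgb_init[..., c].mean()           # restore mean brightness
    return np.clip(out, 0.0, 1.0)

Because every step is either elementwise linear algebra or a single Poisson solve per output channel, a pipeline of this shape stays simple and fast, which is consistent with the closed-form, efficient algorithm the abstract describes.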

[David Connah, Mark S. Drew, and Graham D. Finlayson, "Spectral Edge Image Fusion: Theory and Applications", Zurich, Sept. 2014, to appear.]

This talk is part of the CUED Computer Vision Research Seminars series.

