Directional Framelets with Low Redundancy and Directional Quasi-tight Framelets

ASCW01 - Challenges in optimal recovery and hyperbolic cross approximation

Edge singularities are ubiquitous and hold key
information for many high-dimensional problems. Consequently, directional
representation systems are required to effectively capture edge singularities
for high-dimensional problems. However, the increased angular resolution often
significantly increases the redundancy rates of a directional system. High
redundancy rates lead to high computational costs and large storage
requirements, which hinder the usefulness of such directional systems for
problems in moderately high dimensions such as video processing. In this talk,
we attack this problem by using directional tensor product complex tight
framelets with mixed sampling factors. The resulting directional system has
good directionality and a very low redundancy rate of $\frac{3^d-1}{2^d-1}$;
e.g., the redundancy rates are $2$, $2\frac{2}{3}$, $3\frac{5}{7}$,
$5\frac{1}{3}$ and $7\frac{25}{31}$ for dimensions $d=1,\ldots,5$. Our numerical
experiments on image/video denoising and inpainting show that the performance
of our proposed low-redundancy directional system is comparable to or
better than that of several state-of-the-art methods with much higher redundancy
rates. In the second part, we shall discuss our recent developments of
directional quasi-tight framelets in high dimensions. This is joint work with
Chenzhe Diao, Zhenpeng Zhao and Xiaosheng Zhuang.
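
As a quick sanity check of the redundancy rate quoted in the abstract, the short Python sketch below evaluates $\frac{3^d-1}{2^d-1}$ for $d=1,\ldots,5$ and prints each value as a mixed fraction; it should reproduce $2$, $2\frac{2}{3}$, $3\frac{5}{7}$, $5\frac{1}{3}$ and $7\frac{25}{31}$. The helper name redundancy_rate is illustrative only and is not taken from the talk.

# Sanity check of the redundancy rate (3^d - 1) / (2^d - 1) quoted in the abstract.
from fractions import Fraction

def redundancy_rate(d: int) -> Fraction:
    # Redundancy rate of the directional system in dimension d (illustrative helper).
    return Fraction(3**d - 1, 2**d - 1)

for d in range(1, 6):
    r = redundancy_rate(d)
    whole, rem = divmod(r.numerator, r.denominator)
    mixed = str(whole) if rem == 0 else f"{whole} {rem}/{r.denominator}"
    print(f"d = {d}: redundancy rate = {r} = {mixed}")

Running this prints 2, 8/3, 26/7, 16/3 and 242/31, which are exactly the mixed numbers listed in the abstract.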

This talk is part of the Isaac Newton Institute Seminar Series.
