
On the Equivalence of Graph Cuts and Max-product Belief Propagation


If you have a question about this talk, please contact Peter Orbanz.

(Joint work with Inmar Givoni, Richard Zemel, and Brendan Frey.)

A common task in computer vision is the minimization of pairwise submodular energies over binary variables. For such problems, both graph cuts and max-product belief propagation are applicable. Graph cuts is guaranteed to produce optimal solutions, and empirically it does so very quickly. Max-product can be applied in more general settings, but it is not guaranteed to be optimal even in the binary submodular case, and empirically it has been shown to produce suboptimal results while taking longer to do so.
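To make the problem class concrete, here is a minimal sketch of the energy being minimized; the instance (a 3-variable binary chain with made-up unary and pairwise costs) is hypothetical, and brute-force enumeration stands in for a real graph-cut solver at this toy size:

```python
import itertools

# Hypothetical toy instance: 3 binary variables with unary costs theta_i(x_i)
# and pairwise costs theta_ij(x_i, x_j).
unary = [
    {0: 0.0, 1: 2.0},
    {0: 1.0, 1: 0.5},
    {0: 3.0, 1: 0.0},
]
# Submodular pairwise terms: theta(0,0) + theta(1,1) <= theta(0,1) + theta(1,0).
pairwise = {
    (0, 1): {(0, 0): 0.0, (0, 1): 1.5, (1, 0): 1.5, (1, 1): 0.0},
    (1, 2): {(0, 0): 0.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 0.0},
}

def energy(x):
    """Total energy E(x) = sum_i theta_i(x_i) + sum_ij theta_ij(x_i, x_j)."""
    e = sum(unary[i][x[i]] for i in range(len(x)))
    e += sum(t[(x[i], x[j])] for (i, j), t in pairwise.items())
    return e

def is_submodular(t):
    return t[(0, 0)] + t[(1, 1)] <= t[(0, 1)] + t[(1, 0)]

# Brute force over all 2^n labelings; on real problems a min-cut on the
# standard s-t graph construction finds this optimum in polynomial time.
best = min(itertools.product([0, 1], repeat=3), key=energy)
```

The submodularity condition on each pairwise term is exactly what allows the energy to be encoded as an s-t min-cut problem, which is why graph cuts is guaranteed optimal in this setting.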

Both algorithms operate on parametric representations of an underlying energy function, and it is known that both algorithms can be viewed as performing iterative reparameterizations of that energy function. This suggests that despite the difference in empirical performance, under a certain view, the two algorithms might be similar.
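A reparameterization moves cost between the unary and pairwise terms without changing the total energy of any configuration. The following sketch, on a hypothetical two-variable model, shows the basic move (a message added to a unary term and subtracted from the adjoining pairwise term):

```python
import itertools

# Two binary variables with hypothetical potentials in energy (min-sum) form.
theta_1 = {0: 0.0, 1: 1.0}
theta_2 = {0: 2.0, 1: 0.0}
theta_12 = {(0, 0): 0.0, (0, 1): 3.0, (1, 0): 3.0, (1, 1): 0.0}

def energy(x1, x2, t1, t2, t12):
    return t1[x1] + t2[x2] + t12[(x1, x2)]

# A message m(x2) reparameterizes the energy: add it to the unary term of x2,
# subtract it from the pairwise term. Every E(x1, x2) is left unchanged.
m = {0: 0.7, 1: -0.3}
theta_2_new = {x2: theta_2[x2] + m[x2] for x2 in (0, 1)}
theta_12_new = {(x1, x2): theta_12[(x1, x2)] - m[x2]
                for x1 in (0, 1) for x2 in (0, 1)}
```

Both max-product message passing and the flow augmentations of graph cuts can be read as sequences of such energy-preserving moves, which is the common ground the talk builds on.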

In this talk, I will present graph cuts and max-product through this lens of reparameterization, then develop the precise connection between the algorithms, showing that with proper scheduling and damping, max-product can be made equivalent to graph cuts. The end result is a scheduling algorithm and damping procedure for max-product that is guaranteed to be efficient and optimal on binary submodular problems, even when the graphical model has loopy structure.
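To illustrate the damping ingredient only, here is a generic damped min-sum update (max-product in energy form) on a hypothetical two-node model; the damping weight `lam = 0.5` and the fixed-point schedule are illustrative stand-ins, not the talk's specific procedure:

```python
# Hypothetical two-node binary model with a submodular pairwise term.
theta_1 = {0: 0.0, 1: 1.5}
theta_2 = {0: 2.0, 1: 0.0}
theta_12 = {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.0}

def update(msg_old, lam):
    """One damped min-sum message pass from node 1 to node 2:
    new message = (1 - lam) * old message + lam * undamped update."""
    msg_new = {}
    for x2 in (0, 1):
        raw = min(theta_1[x1] + theta_12[(x1, x2)] for x1 in (0, 1))
        msg_new[x2] = (1 - lam) * msg_old[x2] + lam * raw
    return msg_new

msg = {0: 0.0, 1: 0.0}
for _ in range(50):
    msg = update(msg, lam=0.5)

# Decode node 2 from its belief (unary cost plus incoming message).
x2_star = min((0, 1), key=lambda x2: theta_2[x2] + msg[x2])
```

On this two-node tree the undamped update is already exact; the talk's contribution is a scheduling and damping choice that preserves such optimality on loopy binary submodular models.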

This talk is part of the Machine Learning @ CUED series.

