
Inferring change points in signal levels through deterministic minimization of a generalized global functional


If you have a question about this talk, please contact Mustapha Amrani.

Inference for Change-Point and Related Processes

Abrupt level change points are ubiquitous. Knowing the change points and levels of a time series is critical to many practical signal analysis problems in science and engineering. For this, and other reasons, the problem of detecting level shifts, first studied in the 1940s in process control, is of enduring interest. In this talk I will detail a set of simple, novel, generalized, deterministic nonlinear algorithms for this problem. These algorithms are based on a global functional which, when minimized, yields the maximum a posteriori locations of the change points and the values of the levels. This global functional approach subsumes some well-known algorithms for this problem that were developed in digital image processing contexts, and also folds in several algorithms from statistical machine learning that have hitherto been seen as distinct. The algorithms are computationally simple, and many are convex optimization problems for which standard, fast implementations are available.
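The abstract does not specify the speaker's particular functional, so the sketch below is only one standard instance of the general idea: an L0-penalized least-squares (Potts-type) objective, minimized exactly and deterministically by dynamic programming, which recovers both the change-point locations and the segment levels. The function name `fit_piecewise_constant`, the quadratic data-fidelity term, and the choice of penalty value are all illustrative assumptions, not the algorithms presented in the talk.

```python
import numpy as np

def fit_piecewise_constant(y, penalty):
    """Exactly minimize  sum_t (y_t - mu_t)^2 + penalty * (#change points)
    over piecewise-constant signals mu, via optimal-partitioning DP.
    (Illustrative global-functional example; not the talk's specific functional.)
    Returns interior change-point indices and the fitted level of each segment."""
    n = len(y)
    y = np.asarray(y, dtype=float)
    # Prefix sums give O(1) cost of fitting a single mean to any segment y[a:b]
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

    def seg_cost(a, b):  # residual sum of squares on y[a:b], b > a
        m = b - a
        return s2[b] - s2[a] - (s1[b] - s1[a]) ** 2 / m

    best = np.full(n + 1, np.inf)   # best[j] = optimal objective for y[:j]
    best[0] = -penalty              # offsets the penalty charged to the first segment
    prev = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for k in range(j):
            c = best[k] + seg_cost(k, j) + penalty
            if c < best[j]:
                best[j], prev[j] = c, k

    # Backtrack the optimal segment boundaries and per-segment levels
    bounds, j = [], n
    while j > 0:
        bounds.append(j)
        j = prev[j]
    bounds = [0] + bounds[::-1]
    levels = [float(np.mean(y[a:b])) for a, b in zip(bounds[:-1], bounds[1:])]
    return bounds[1:-1], levels

# Toy usage: noisy signal with level shifts at t = 40 and t = 70 (simulated data)
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(40), 3 * np.ones(30), np.ones(30)])
y += 0.3 * rng.standard_normal(100)
print(fit_piecewise_constant(y, penalty=2.0))
```

Replacing the L0 change-point penalty with an L1 penalty on successive differences gives the convex (total-variation / fused-lasso) variants alluded to in the abstract, for which standard fast solvers exist.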

This talk is part of the Isaac Newton Institute Seminar Series.

