

## Assessing the finite-dimensionality of functional data

- Celine Vial (Rennes)
- Friday 30 November 2007, 14:00-15:00
- MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB.
If you have a question about this talk, please contact S.M.Pitts.

If a problem in functional data analysis is low-dimensional, then the methodology for its solution can often be reduced to relatively conventional techniques in multivariate analysis. Hence there is intrinsic interest in assessing the finite-dimensionality of functional data. We show that this problem has several unique features. From some viewpoints the problem is trivial, in the sense that continuously distributed functional data which are exactly finite-dimensional are immediately recognisable as such, if the sample size is sufficiently large. However, in practice, functional data are almost always observed with noise, for example resulting from rounding or experimental error. Then the problem is almost insolubly difficult.

In such cases a part of the average noise variance is confounded with the true signal, and is not identifiable. However, it is possible to define the unconfounded part of the noise variance. This represents the best possible lower bound to all potential values of average noise variance, and is estimable in low-noise settings. Moreover, bootstrap methods can be used to describe the reliability of estimates of unconfounded noise variance, under the assumption that the signal is finite-dimensional.

Motivated by these ideas, we suggest techniques for assessing the finiteness of dimensionality. In particular, we show how to construct a critical point $\hat{v}_q$ such that, if the distribution of our functional data has fewer than $q - 1$ degrees of freedom, then we should be prepared to assume that the average variance of the added noise is at least $\hat{v}_q$. If this level seems too high, then we must conclude that the dimension is at least $q - 1$. We show that simpler, more conventional techniques, based on hypothesis testing, are generally not effective.

This talk is part of the Statistics series.

## This talk is included in these lists:

- All CMS events
- All Talks (aka the CURE list)
- CMS Events
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- DPMMS Lists
- DPMMS info aggregator
- DPMMS lists
- Guy Emerson's list
- MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB
- Machine Learning
- School of Physical Sciences
- Statistical Laboratory info aggregator
- Statistics
- Statistics Group
- rp587
Note that ex-directory lists are not shown.
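The ideas in the abstract can be illustrated with a minimal simulation, which is not the speaker's code: all names, the sine basis, the grid size, and the eigenvalue-threshold reading are assumptions made for illustration. For exactly $r$-dimensional signal curves observed with added white noise, the leading $r$ eigenvalues of the empirical covariance carry the signal, while the remaining eigenvalues cluster near the noise variance; a bootstrap over curves then gauges the reliability of that noise-variance estimate, as the abstract suggests.

```python
import numpy as np

# --- Simulate noisy, exactly finite-dimensional functional data ---
# (all settings here are illustrative assumptions, not from the talk)
rng = np.random.default_rng(0)
n, m, r, s2 = 500, 50, 3, 0.01          # curves, grid points, true dimension, noise variance
t = np.linspace(0.0, 1.0, m)

# Nearly orthogonal sine basis evaluated on the grid, shape (r, m)
basis = np.sqrt(2.0) * np.sin(np.pi * np.outer(np.arange(1, r + 1), t))

# Random scores with decaying scale, then add white observation noise
scores = rng.normal(size=(n, r)) * np.array([3.0, 2.0, 1.0])
X = scores @ basis + rng.normal(scale=np.sqrt(s2), size=(n, m))

# --- Eigen-decomposition of the sample covariance ---
cov = np.cov(X, rowvar=False)            # m x m empirical covariance
eigvals = np.linalg.eigvalsh(cov)[::-1]  # descending order

# With a true dimension of r, eigenvalues beyond the r-th estimate the
# noise-variance level (a stand-in for the "unconfounded" noise variance)
noise_hat = eigvals[r:].mean()

# --- Bootstrap over curves to describe the estimate's reliability,
#     assuming the signal really is r-dimensional ---
B = 200
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)
    ev = np.linalg.eigvalsh(np.cov(X[idx], rowvar=False))[::-1]
    boot[b] = ev[r:].mean()
lo_ci, hi_ci = np.quantile(boot, [0.025, 0.975])

print("first 5 eigenvalues:", np.round(eigvals[:5], 3))
print("estimated noise variance:", round(noise_hat, 4))
print("bootstrap 95% interval:", round(lo_ci, 4), "-", round(hi_ci, 4))
```

In this toy setting the sharp drop after the $r$-th eigenvalue signals finite dimensionality; if the implied noise-variance level were implausibly high, one would instead conclude the dimension exceeds $r$, mirroring the critical-point argument in the abstract.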