BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:Principal component analysis for learning tree tensor networks - Anthony Nouy (Université de Nantes)
DTSTART;TZID=Europe/London:20180309T114500
DTEND;TZID=Europe/London:20180309T123000
UID:TALK102130AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/102130
DESCRIPTION:We present an extension of principal component analysis for functions of multiple random variables and an associated algorithm for the approximation of such functions using tree-based low-rank formats (tree tensor networks). A multivariate function is here considered as an element of a Hilbert tensor space of functions defined on a product set equipped with a probability measure. The algorithm only requires evaluations of functions on a structured set of points which is constructed adaptively. The algorithm constructs a hierarchy of subspaces associated with the different nodes of a dimension partition tree and a corresponding hierarchy of projection operators\, based on interpolation or least-squares projection. Optimal subspaces are estimated using empirical principal component analysis of interpolations of partial random evaluations of the function. The algorithm is able to provide an approximation in any tree-based format with either a prescribed rank or a prescribed relative error\, with a number of evaluations of the order of the storage complexity of the approximation format.
LOCATION:Seminar Room 1\, Newton Institute
CONTACT:info@newton.ac.uk
END:VEVENT
END:VCALENDAR