Stochastic variants of classical optimization methods, with complexity guarantees
- 👤 Speaker: Professor Coralia Cartis
- 📅 Date & Time: Wednesday 01 May 2019, 14:00 - 15:00
- 📍 Venue: CMS, MR14
Abstract
Optimization is a key component of machine learning applications, as it helps with the training of (neural-network, nonconvex) models and with parameter tuning. Classical optimization methods are challenged by the scale of machine learning applications and by the lack, or cost, of full derivatives, as well as by the stochastic nature of the problem. On the other hand, the simple approaches that the machine learning community uses need improvement. Here we try to merge the two perspectives and adapt the strengths of classical optimization techniques to meet the challenges of data-science applications: from deterministic to stochastic problems, from typical to large scale. We propose a general algorithmic framework and complexity analysis that allow the use of inexact, stochastic and even possibly biased problem information in classical methods for nonconvex optimization. This work is joint with Katya Scheinberg (Cornell), Jose Blanchet (Columbia) and Matt Menickelly (Argonne).
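To fix ideas, a minimal toy sketch of the setting the abstract describes: a classical iterative method driven by inexact, noisy gradient estimates of a nonconvex objective. This is an illustrative stand-in (plain stochastic gradient descent on an assumed toy objective), not the framework or complexity analysis presented in the talk.

```python
import numpy as np

def stochastic_descent(grad_estimator, x0, step=0.1, iters=200, seed=0):
    """Minimise a smooth nonconvex function using only noisy (inexact)
    gradient estimates, as in the stochastic problem setting above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_estimator(x, rng)  # inexact, possibly noisy gradient
        x = x - step * g            # classical steepest-descent step
    return x

# Toy nonconvex objective f(x) = x^4 - 2x^2 (two minima, at x = ±1),
# whose gradient is only observed under additive Gaussian noise.
def noisy_grad(x, rng):
    return 4 * x**3 - 4 * x + rng.normal(scale=0.1, size=x.shape)

x_star = stochastic_descent(noisy_grad, x0=[0.5])
```

Despite never seeing an exact derivative, the iterates settle near one of the minima; quantifying how many such iterations are needed is what a complexity analysis of this kind of method makes precise.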
Series: This talk is part of the CCIMI Seminars series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge talks
- CCIMI
- CCIMI Seminars
- Chris Davis' list
- CMS Events
- CMS, MR14
- custom
- DPMMS info aggregator
- DPMMS lists
- DPMMS Lists
- Guy Emerson's list
- Hanchen DaDaDash
- Interested Talks
- ndk22's list
- ob366-ai4er
- rp587
- School of Physical Sciences
- Statistical Laboratory info aggregator
- Trust & Technology Initiative - interesting events
- yk449
Note: Ex-directory lists are not shown.